jfultz 2 days ago

The 1200XL was my first computer. My family purchased it at a department store at a fire sale price (IIRC Montgomery Ward's, $199) after Atari stopped manufacturing and began dumping its inventory to make way for the 600XL/800XL. I had been researching a computer to get for ages, but my family was very careful about how we spent money, and it was a big purchase. We had seriously considered getting a TI-99/4A when TI exited the business, but we were concerned that it was just going to be a dead end. But the chance for a cheaper entry into an established ecosystem was great (that and me begging my mother to finally, finally get a computer!).

Re compatibility, I never came across software that didn't run on it. I'd read in magazines that there were issues, but never once experienced them. One interesting software change, though (but true of other XL computers, too) was that the color "artifacting" worked differently on it than it did on the 400/800. For example, Ultima III used color artifacting, and so playing it on my system produced some incorrect colors...most notably the sea was red.

I did come across one hardware issue...a cheap third-party parallel interface adapter that didn't work, and that we thought at the time was defective, but I now think it's likely to have been affected by the incorrect power wiring in the 1200XL's SIO adapter. It was cheap enough that we didn't lose too much money on it, and I ended up getting the far superior ICD P:R: Connection instead.

The Atari community was a super great community to be in. And in so many ways, Atari was doing things that wouldn't be seen again for years, if not decades. Atari's SIO port is famously a predecessor/inspiration of USB. The APX Exchange was basically a third-party app store decades before Apple popularized the concept. The machine was hackable and moddable (I bought a 256KB upgrade kit for mine). When I migrated to PC for college use, it hurt to have to fall back to CGA...even EGA was just ugly, compared to what my 1200XL was capable of (although the 80 column displays were nice compared to Atari's 40-column).

  • sokoloff 15 hours ago

    My first owned computer was a 1200XL as well. Found a game that read raw keystrokes, and that behavior differed between the 400/800 and the 1200XL.

    I patched my copy of the game and sent the patch to the publisher (Microprose, IIRC) but never heard from them.

glimshe 2 days ago

Microsoft was one of the first companies that fully internalized the importance of seamless backwards compatibility. The lessons had been around for a while, the fate of the 1200XL among them.

Atari would have done a lot better, even at a higher price, if they had focused on it. The Atari 8-bit line had a lot going for it and was arguably superior (flame wars incoming, Atari army please help me) to the C64 in many ways.

  • sirwhinesalot 2 days ago

    The Atari line was much better at scrolling, had a much much better master palette, supported display lists (nicer than setting up interrupts in the C64) and the POKEY had some advantages over the SID, not just the extra channel but also in doing beefy sound effects.

    I don't think any of this is denied by C64 fans.

    The C64 on the other hand could push nearly 6x the sprite data per line, had Color RAM for more interesting tile work, the SID was more capable for music, and it had much much better support for high resolution graphics.

    For their time they were very comparable but when (ab)used just right the C64 can do a neat NES impression. The Atari can't do that, but it can do some bonkers 3D using the super low resolution modes.

    • justin66 2 days ago

      > For their time they were very comparable

      We can say that now, but it's worth remembering the Atari 8 bit computers came out over two years before the C64. Not such a big gap in computing today, but back then it was a lot.

      ex-Atari people talking about what they could have done better is always an interesting youtube phenomenon. (as with, for example, ex-Sun people: you hear a lot of theories, but you never encounter anyone who says "yeah, I was the guy who made the whole thing fail")

      • Suppafly 2 days ago

        >you never encounter anyone who says "yeah, I was the guy who made the whole thing fail"

        Because that was usually a business decision and not a technical one.

        • justin66 a day ago

          Rereading the comment I made which you're responding to, I can only assume you do not consider those making business decisions to be... people? That seems a bit harsh.

      • mrandish 2 days ago

        > ex-Atari people talking about what they could have done better

        As someone super into retro computing now, who also lived through the history, getting my first computer in 1981, participating in users groups, reading all the zines, going to regional and national trade shows for 8-bit home computers, etc., this was (and still is) a significant recurring theme. Owners of 8-bit and 16-bit computers in the 80s/early 90s (and retro hobbyists/collectors today) obsess about the mistakes made by their respective manufacturers. Often this armchair analysis results in frustrated conclusions like "If only (Atari/Commodore/Tandy) had done X," the computer would have "made it" (ie survived against the PC). I engaged in this prevalent pastime myself for years, and even today there are forums full of questions seeking to post-mortem the "fatal mistake(s)".

        The fascinating part for me is that my early hobby computing led to programming, product-making and tech startup entrepreneurship. By the mid and late 90s I'd evolved from sneaking into Comdex and CES circa 1983 to now launching products in my startup's huge Comdex booth literally next to Microsoft's, with our own private hospitality suites and VIP parties. I'd somehow gone from fanboy to insider. During this era I had multiple private business meetings with industry luminaries I'd idolized including Bill Gates, Steve Jobs, Andy Grove and many others. I'd even met, had a meal with or otherwise gotten to personally know 8-bit era luminaries including Nolan Bushnell, Jay Miner (designer of the Amiga), a couple of Tramiels and so many others. I got to ask my fanboy era questions directly to the players - and the best part was that I was then a peer and the meetings were private. I got a lot more color and perspective than the answers they gave to the press or even at retro-computing conference panels. It was fascinating! But overall I also learned that, aside from a couple notable exceptions, there was no single "Big Mistake". The reality was far more complicated and kind of boring: there were complex business, financial, manufacturing and distribution challenges far beyond users group hobby analysis. The future was unknowable and they were living through the first iteration of what only later became obvious patterns - and they were just doing the best they could to figure it out enough to survive.

        By the late 2000s I'd graduated to selling my third startup to a Fortune 500 valley tech giant and became a senior exec driving strategy and identifying new opportunities and markets, as well as leading large teams of MBAs, bankers and lawyers to identify, analyze and negotiate multi-billion dollar acquisitions of other tech companies. And this perspective is the one that most informed my prior obsession with identifying "the fatal mistake" which doomed Atari, Commodore, Tandy, Sinclair and so many others. Having recently retired early, just for fun I've delved into the history and, using my post-2000s expertise, done rigorous strategic analyses.

        In every case the answer is the same and, frankly, a little disappointing: Nothing they could have done would have "saved" them for more than a couple years. While Atari, Commodore, Tandy et al did make many mistakes, none of them were the root cause of their eventual demise. Even if those mistakes had been avoided, it would only have delayed the inevitable. In each case, macro factors beyond their control that were baked into the market, the technology or their own DNA doomed them. For example, one of the Amiga's greatest advantages in 1985 was the brilliant custom chipset designed to exploit every quirk of analog video timing. And by the early 90s that great advantage was one of its biggest weaknesses. Also, the much-beloved 68000 series processors at the heart of the Amiga and Atari ST were ultimately doomed by the combination of being made by Motorola and being, perhaps, the ultimate expression of CISC architecture (which made them fun to program by hand in assembly language). But RISC was ultimately the only way forward when Moore's Law scaling kept delivering ever more gates into the 90s. Yet bridging over from CISC to a RISC ISA while maintaining backward compatibility was enormously complex. Only Intel eventually managed it and very nearly died in the attempt. Intel's lead in process fabrication helped them over the hump but Motorola was too far behind in fab tech because they hadn't invested as deeply. For Intel it was their lifeblood and, ultimately, existential. For Motorola's board of directors, CPUs were just another business in their portfolio of businesses. Motorola was a decades-old conglomerate that made prudent financial calculations. Intel was born as a chip startup and would either live or die as one. Andy Grove had to bet the company and find a way to make it work. Motorola didn't.

        • zozbot234 a day ago

          > Also, the much-beloved 68000 series processors at the heart of the Amiga and Atari ST were ultimately doomed by the combination of being made by Motorola and being, perhaps, the ultimate expression of CISC architecture (which made them fun to program by hand in assembly language).

          I agree that the issues w/ m68k-series processors were an underappreciated factor behind the demise of the Amiga and Atari ST computers. The Macintosh line managed to transition to PowerPC, but they did so via software emulation. That would've been highly problematic for the kind of software (games and early multimedia - not DTP, which was the Macintosh domain) where the Amiga and Atari ST were at their strongest. Other m68k machines in common use included, e.g. Sun and NeXT workstations which would've been running highly portable code without 'tight' performance requirements.

          • mrandish 19 hours ago

            > The Macintosh line managed to transition to PowerPC, but they did so via software emulation. That would've been highly problematic for the kind of software (games and early multimedia - not DTP, which was the Macintosh domain) where the Amiga and Atari ST were at their strongest.

            This is an excellent point and one I hadn't fully appreciated until reading your comment. Most of the popular 68K software on Macs (such as DTP) was more amenable to, or at least tolerant of, running under emulation. Even the popular games on Mac like Myst weren't as real-time critical as popular Amiga and Atari ST games, which tended more toward arcade style and sometimes even accurate arcade ports. While I'm sure there were arcade style games for color 68K Macs, they weren't the majority. Also, because the Mac didn't have so many tightly integrated custom co-processors, my sense is that Mac 68K software wasn't as tightly hardware-coupled or as dependent on specific timing interactions. A fair amount of Amiga software would read and write directly to hardware registers instead of using OS calls, and even if it only used OS routines, it could still be highly dependent on precise behavior. Once again, we see that aspects which had made the Amiga and Atari ST great in the 80s made it harder to navigate the transitions necessary to survive the 90s.

            It would be interesting to have modern emulator authors compare notes about the software libraries between these platforms. While I'm sure Mac emulator authors still found a lot of 'misbehaving' Mac apps to deal with, my sense is the Amiga software library was bonkers to support in emulation. The WinUAE Amiga emulator has long had a precisely cycle-accurate mode which is less performant but simply necessary in many cases. And as mature as WinUAE is, the team is still discovering edge cases where 40 year-old games and graphics apps have never been emulated correctly.

            Conversely, I remember around 1992 I bought a Macintosh emulator for my 68020 Amiga and it performed quite well. I used it for work to run Mac DTP applications. The emulator ran in software but used a small hardware dongle on the Amiga's parallel port to import original Mac ROMs, which you needed to buy separately. Of course, both the emulation source and target were 68K-based, but it indicates that most Mac software was reasonably well-behaved in terms of hardware dependence. If a little Amiga startup was able to write a pretty good Mac emulator, it was certainly possible for Apple themselves to do it better a few years later with a much faster PowerPC CPU.

            Finally, it's clear that post-1990 both Atari and Commodore were in increasingly weaker positions, not only financially but in terms of staff depth. While both still had some remarkably talented engineers, the bench wasn't deep. I know that at least at Commodore, toward the end they'd canceled their much-improved new Amiga chipset project (AAA). Even though it was almost complete with (mostly) working test silicon on prototype boards, they canceled it because it had become obvious future Pentium and RISC CPUs would outperform even the 68060 and AAA custom chips. At the time Commodore folded, engineering was working on the 'Hombre', an entirely new design which would have been based on an HP RISC CPU. For graphics the main thrust would have been new retargetable graphics modes for hi-res, high-frequency monitors (1280 x 1024). https://en.wikipedia.org/wiki/Amiga_Hombre_chipset

            The plan was to support legacy Amiga software with a 68K emulator on the RISC CPU driving a new chip created specifically to support legacy Amiga graphics modes. When I later read this, I remember being quite skeptical that hybrid software/hardware emulation would have worked very well for the eclectic Amiga library of legacy software. As much as I loved the Amiga, the OS stack could then only be described as 'crufty'. It had been upgraded a little over the years but still contained major legacy components from different eras and many of the people involved were no longer at Commodore. Given that reality, the plan had been to base the new Amiga on Windows NT.

            But - even if Commodore somehow overcame the myriad technical challenges, lack of resources and depleted talent bench, once a next-gen Amiga isn't based on the 68K, AmigaOS or the custom chips and boots Windows NT in XGA mode - is it still really an Amiga? Certainly, at least some of my software wouldn't have worked anymore so, facing the decision to buy a new, quite different computer, why wouldn't users also look at the, probably, cheaper Packard Bell Pentium running Windows 95 down at Costco? After all, with the Pentium and Windows 95, the PC juggernaut had finally coalesced into a coherent whole that could be compelling to both home users and graphics, gaming, multimedia obsessed hobbyists. And new Doom/Quake quality games were coming out almost weekly. That's when even I bought a PC and started using it as my main daily driver. Of course, I kept my awesome, fully loaded, tweaked out, much beloved Amiga system on the desk alongside it for a couple years. But web pages never quite looked right on the Amiga and sharing files on the network was hardly seamless. Sadly, it was increasingly clear the world had moved on. In many ways the Amiga (and other notable platforms of the 80s) had blazed the trail showing the way to the future - but it was a future they would not be a part of.

        • justin66 2 days ago

          This is a good comment. I guess I don't attach the same weight to the CISC/RISC thing you do, but I agree that it's certainly true that the big "the real problem was x" pronouncements are insufficient to explain what really happened. (but are nevertheless quite interesting when they come from the parties involved!)

          • simonh 2 days ago

            Arguably there is one company that did manage to make the transition from 8 bit hobby home computing and gaming. Apple.

            The Apple II was a contemporary, in fact a predecessor of all these systems. So, what did they do right, or what went right for them, that enabled them to make it? I suspect it was the penetration of the Apple II into education and business that helped make it possible, but suppose Steve Jobs had been in charge at Commodore or Atari?

            • badc0ffee a day ago

              I've thought about this a bit, and what I can come up with is that Apple had the clear lead in the first wave of home PCs in the late 70s (the others being the Commodore PET and the TRS-80 model I), and maintained it. The Apple II had bitmap graphics and colour built-in, and a very fast and relatively cheap disk add-on, but also well thought out expandability. You didn't need to buy a sidecar unit; just throw a card in an empty slot. Importantly, it also worked with inexpensive TVs and monochrome monitors that you could purchase separately. The hardware was also high quality - it had a nice keyboard, and a switching power supply that didn't get hot.

              Fast forward a few years, and the Apple II was still very usable and competitive, with RAM expansion options up to 128k, higher res graphics, and 80 column text, while still supporting the same software.

              One other thing is that the Apple II was wildly profitable. It had no custom chips, just cleverly used commodity chips and some ROMs. This includes the fast and cheap disk system.

            • justin66 a day ago

              > I suspect it was the penetration of the Apple II into education and business that helped make it possible

              I don't know how much it moved the needle but it was astonishing how much schools and home users - parents whose kids used the machines at school - were willing to pay for an Apple II well after it was a technically obsolete machine. It definitely helped them to some extent.

              (don't get me wrong, I love those machines in my bones, but they were pretty overpriced after a while)

              • flenserboy a day ago

                here's a guess: text was sharp on an Apple II with a decent monitor. font shapes were good. no matter how good graphics were on the C64 & Ataris, in comparison, text was always blocky & amateur looking. Tandy did better on this front, but it wasn't enough for them. pretty sure this is the same reason why the Amiga & the ST didn't make more inroads — people looked at them alongside the Mac & technical considerations were quickly forgotten. it's funny to me that this hasn't changed all that much — Windows font rendering looks awful to me, & I'll always pick a Mac or Linux box to use instead if there's a choice, just so I don't have to put up with the fonts. this wasn't always the case — the old system character sets used under DOS were pleasant to use.

            • mrandish 2 days ago

              Excellent question and one I already touched on in a sister reply before I saw your post. https://news.ycombinator.com/item?id=43722230

              Apple is indeed an extraordinary outlier (as is Jobs). If you look into the history of Apple's Gil Amelio days, very near-death and Steve's return, it was, IMHO, a remarkable example of a series of fortunate miracles coinciding to allow Steve to brilliantly save the company when it had been only weeks away from death. Jobs calling Bill Gates and convincing him to quickly invest $150M in Apple averted disaster potentially by a matter of days. And Gates only did that because MSFT was being sued for anti-trust by the Justice Dept and needed Apple to survive as an example that Wintel still had some competition. Apple's survival in that period is the closest call I think the industry has ever seen.

              To answer your last question, Jobs was undoubtedly incredibly brilliant but it took every ounce of that brilliance AND some crazy good luck for Apple to survive. Ultimately, it was Jobs plus flukes, so no, just Jobs without the flukes wouldn't have changed anything at Atari or Commodore. Even on its death bed Apple had a much better brand, distribution, market potential and talent than Atari or Commodore ever did. Plus Steve had his hand-picked entrepreneurial team from Next with him. The situations at Atari and Commodore were just much weaker in every way, so I don't think any single super hero, no matter how super, could have saved them.

            • bsder a day ago

              > So, what did they do right, or what went right for them, that enabled them to make it?

              Desktop publishing.

              The Macintosh/LaserWriter cash cow absolutely dominated desktop publishing for a very, very, very long time.

              This gave Apple access to enterprise accounts that other computer companies did not have.

          • mrandish 2 days ago

            > I guess I don't attach the same weight to the CISC/RISC thing you do

            I certainly didn't appreciate the impact of CISC vs RISC architecture at the time. I understood the conceptual difference between them at a high level but didn't get why that caused Motorola to decide they couldn't keep scaling beyond the 68060. As a user and fan of the 68030 and 040, I just didn't understand why they'd walk away from, arguably, the second most popular flagship computer ISA at the time. And they actually told computer manufacturers that the 68060 would be the end of the 68K line more than a year before the 68060 even shipped. I was like, WTF? They weren't even done with the latest, greatest chip when they decided to kill the whole line, orphaning all that fantastic software from ever having any upgrade path to the future.

            Only later did I gain a deeper appreciation for the impact. A few key things informed me:

            * My high level understanding of CISC vs RISC wasn't complete back then. In the late 80s there was a lot of debate among CPU ISA designers on the relative merits between CISC and RISC - and that even extended to the definitions of the terms (which were fuzzy). A good example is this legendary Comp.Sci Usenet discussion: https://yarchive.net/comp/risc_definition.html. Only when I dove into those debates in the last 10 years did I really start to get the larger picture.

            * The part that mattered most between RISC/CISC wasn't that RISC had fewer instructions than CISC (although it usually did), it was that those instructions were much less complex AND that the addressing modes were much less complex. This meant that, in general, RISC ISAs were easier to decode because instruction and operand length tended to be more fixed. This also had a bunch of other downstream effects, generally making RISC-ish ISAs easier to pipeline more deeply, easier to branch predict, easier to speculatively execute, etc. These are all things that enable putting extra gates to work speeding up execution.

            * I was a huge fan of the 68K's insanely powerful addressing modes which let savvy assembly language programmers pack huge functionality into fewer instructions with addressing modes like indirection on both the input and output along with setting a bunch of flags and pre/post operations like decrement/increment. Programmers not only called 68K addressing modes powerful but also things like "orthogonal" and even "elegant." But all those varying instruction lengths with up to 14 different addressing modes plus even more optional flags modifying behavior before and/or after also created a complexity explosion for CPU architects trying to implement the most powerful new optimizations. That's one big reason why the 68060 was over a year late coming to market. It was only dual pipeline but even doing that was triggering unexpected avalanches of design complexity.

            * Both Intel and Motorola realized the only way to continue increasing performance in the future while maintaining software compatibility with their old CISC ISA was to (basically) make future processors RISC CPUs running an extra hardware layer emulating a CISC ISA. It was both hard and costly in terms of gates and performance. Intel's lead in fab process helped them hide that performance cost and keep showing generational net speed increases as they navigated the transition. Motorola realized they'd probably have a generation or two of CPUs that weren't meaningfully faster until they bridged the gap and were on the other side.
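
            To make that "RISC core running an extra layer that emulates a CISC ISA" idea concrete, here is a toy sketch (Python, purely illustrative; the instruction syntax and micro-op names are made up and this is not any real chip's decoder): the front end "cracks" one complex memory-operand instruction into a few simple, fixed-shape micro-ops that a RISC-style back end can pipeline.

                # Toy sketch: crack a 68K-flavoured "ADD (A0)+,D1" into RISC-like micro-ops.
                def crack(instruction):
                    op, operands = instruction.split(" ", 1)
                    src, dst = [s.strip() for s in operands.split(",")]
                    micro_ops = []
                    if src.startswith("(") and src.endswith(")+"):  # post-increment indirect source
                        reg = src[1:-2]
                        micro_ops.append(("LOAD", "tmp", reg))      # tmp <- mem[reg]
                        micro_ops.append((op, dst, "tmp"))          # dst <- dst OP tmp
                        micro_ops.append(("ADDI", reg, 2))          # reg <- reg + operand size
                    else:
                        micro_ops.append((op, dst, src))            # simple register-to-register form
                    return micro_ops

                print(crack("ADD (A0)+,D1"))
                # [('LOAD', 'tmp', 'A0'), ('ADD', 'D1', 'tmp'), ('ADDI', 'A0', 2)]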

            There's a lot more but I'm certainly not a domain expert in CPU design and already summarizing my non-expert understanding of expert debates, so I'll leave it there. But it's pretty fascinating stuff. Both Intel and Moto realized that pure RISC implementations would probably beat them soon. Each company responded differently. Intel made the RISC-emulating-CISC approach to ISA compatibility (broadly speaking) work well enough to survive the transition. Motorola decided it was too risky (probably correctly given their fab technology and corporate resources), and instead chose to break with the past and partner with IBM in moving to Power PC. For Atari, Commodore, Apple et al this was a planetary level asteroid impact. If developers and customers lose all software compatibility with your new products, that makes the choice of moving to your next generation not much different than moving to another platform entirely. Only Apple managed to survive (and even they almost didn't). Arguably, they only treaded water with great design and marketing until saved by the iPod.

            I should also mention there was another huge asteroid for vertically integrated non-Wintel computer platforms right behind the CISC/RISC asteroid. In the early to mid 90s Moore's Law scaling was allowing desktop computers to improve rapidly by growing dramatically more complex. It was getting to be more than one company could do to win on each separate front. On the Wintel side, the market solved this complexity by dividing the problem among different ecosystems of companies. One ecosystem would compete to make the CPU and chipset (Intel, NEC, Cyrix, AMD), another would make the OS (Windows/OS/2), another ecosystem would compete to make the best graphics and yet another would compete on sound (Creative, Yamaha, Ensoniq, etc). It would require a truly extraordinary company to compete effectively against all that with a custom vertically integrated computer. There was no way a Commodore or Atari could survive that onslaught. The game changed from company vs company to ecosystem vs ecosystem. And that next asteroid even wiped out stronger, better-capitalized companies that were on pure RISC architectures (Sun, SGI, Apollo, etc).

            • zozbot234 a day ago

              > But all those varying instruction lengths with up to 14 different addressing modes plus even more optional flags modifying behavior before and/or after also created a complexity explosion for CPU architects trying to implement the most powerful new optimizations

              You certainly see the impact of the "don't add too many instructions/flags" style of design even today in things like RISC-V, which doesn't even use condition codes (an unexpected source of complexity in an ISA spec, since every instruction must define exactly how it affects or does not affect each of several condition codes - RISC-V has none of that), expects you to use instruction fusion in larger implementations, and defines "compressed" instructions as a mere shorthand for existing full-length instructions in order to simplify decode. ARM64 has made different choices on all of these things; it will be quite interesting to see how they compare in real-world scenarios at the higher end of performance.

        • bsder a day ago

          Agreed. The IBM PC wave was going to sweep the 1980s 8-bit computers aside no matter what they did.

          However, Tandy was the only one that could have survived as they had PC compatibles that were actually better than the original PC computers. Unfortunately, I seem to remember that Tandy had a big accounting/embezzlement thing back about the time that they needed to be hyperfocused on the PC business.

          • mrandish a day ago

            Yes, the Tandy 1000, 2000 and their PC-compatible descendants were what Tandy ended up doing, but being PC-compatible they were more of a "surrender and join'em" strategy in the face of the PC juggernaut instead of "beat'em". When I was referring to Tandy I was thinking of their earlier unique non-PC computers like the Model I, II, III, IV (Z-80), Color Computer (6809) and 16 (68000 Unix-based OS), some of which were sold until 1991.

            If you're interested, this retro site has several articles covering the Tandy PC clones, why they struggled at launch (weren't actually very compatible early on) and how they eventually failed (weren't differentiated or cheaper at the end). https://dfarq.homeip.net/tandy-1000-models/ But Tandy's PC-compatibles sold very well in-between the first and last couple years. More interesting from a retro perspective though, a lot of people don't realize just how dominant Tandy/Radio Shack's unique (non-PC) computers were between 1977 and 1985. They outsold even Apple, Atari and the C64 in many of those years.

            • justin66 a day ago

              The Color Computer was interesting. They kept the 8-bit fire burning a little longer than was truly reasonable, but I wouldn't fault them for it.

              • mrandish a day ago

                The 6809-based Color Computer was my first computer and it was indeed incredibly powerful. Unfortunately, all that power was in the CPU itself, arguably the best 8-bit CPU ever made (because it was really an 8-bit / 16-bit hybrid) and the 68000's immediate predecessor.

                Tandy just used the Motorola reference design so there was nothing unique or proprietary to Tandy. The graphics were just a straight bitmap from CPU RAM, no sprites, tiles, or selectable palette colors. The sound output was a plain six bit DAC that had to be driven by the CPU on interrupts.

    • alexisread 2 days ago

      Actually, it can do a fair impression as well: Crownland has transparent parallax https://www.youtube.com/watch?v=dN5fSp0XGzI

      You're right about the bonkers 3D! https://www.youtube.com/watch?v=qwcN9FraNjQ https://www.youtube.com/watch?v=KatdrdEVEwY&t=532s

      I think the main thing was that Atari (the pre-'83 company) abandoned the 8-bit line too early, and didn't make the 5200 cross-compatible.

      • p0w3n3d 2 days ago

        The Crownland you linked is madness, the work of an excellent developer, mostly exploiting quirks (I guess). To my knowledge, horizontal scrolling that fast with a background that colourful is impossible on a standard Atari with standard coding.

        And it seems those are Polish guys (https://www.atari.org.pl/forum/viewtopic.php?id=4520)

        -- EDIT --

        I found it: there is only a version for the Atari 130XE (i.e. 128KB RAM - this is not standard; I guess it could maybe run with a cartridge on a 65 XL/XE?), year 2006/2007 (fresh :) )

      • sirwhinesalot 2 days ago

        Oh yeah, for sure. Atari was horrifically mismanaged.

        • mrighele 2 days ago

          It's not like Commodore was managed much better, in the end.

      • cmrdporcupine 2 days ago

        I dunno, Tramiel's Atari Corp kept the 8-bit line going for years after the changeover, adding new models. And they even had relative success later in places like Poland.

        One problem is that these kinds of architectures that relied on special custom chips have inevitable obsolescence built in. When your "API" for graphics programming is a custom chipset at a certain clock rate with certain capabilities, it's just not going to scale up past a certain point. You get initial superpowers, but then Moore's law just makes it pointless.

        See also: Amiga.

        • bluGill 2 days ago

          Tramiel arguably mismanaged Atari worse than Warner did. Sure, Tramiel cared about computers, but his management style was hostile to getting anything done.

          I agree that the custom chips were a dead end. However, the ST line could have beaten the Mac if management had been any good. (as could/should have the Amiga, though I was Atari-locked at the time and so I didn't pay attention to what they were doing)

          • cmrdporcupine 2 days ago

            I mean I agree about Tramiel management. I was an Atari ST user from 86? 87? to 92.

            But I also think there's just no way Atari Corp could have taken on e.g. Apple regardless of management. They simply didn't have the capital, market penetration, etc. to carve out a segment that big. Apple was already a big deal and although the Macintosh was not really a big success at that point, there was a giant company behind it with top tier talent, and lots of cash.

    • curiousObject 2 days ago

      The 6502-based Atari computers ran their CPU clock about 80% higher than the C64's. That must have had a very significant impact at a time when the CPU had to do most of the work.

      • RiverCrochet 2 days ago

        It was. Wasn't the Atari's CPU 1.79MHz (the 3.58MHz NTSC color clock / 2)? The NTSC C64 was close to 1MHz. But it's worse: the C64's CPU was also slowed down by the VIC-II every 8 scan lines to fetch video data, and slowed down additionally if sprites were enabled.

        The PAL C64 was actually slightly under 1MHz but you had a lot more VBlank time to do stuff.

        • timbit42 2 days ago

          The C64's CPU wasn't slowed down much at all by the badlines, less than a few percent. I'm not sure how much the sprites slowed it. The Atari was faster, but how much it was slowed down depended on the video mode. Higher res and more colors slowed it more, leaving the CPU at around 1.3 MHz effective. I also don't know how much the Atari player/missile graphics slowed the CPU.

        • sirwhinesalot 2 days ago

          The graphics chip on the Atari also steals CPU cycles, but if you use the 160x100 mode (instead of 160x200) then it can run full speed which means full 1.79MHz no questions asked.

          The lower resolution also means less time plotting pixels so for any sort of software rendered effect the Atari machines are miles ahead.

        • bluGill 2 days ago

          Atari's CPU was slower in PAL countries, but I don't recall what the speed was. (speed based on something in PAL like the NTSC color clock, but I don't recall what it was called)

    • cmrdporcupine 2 days ago

      The C64 had some advantages as you say but its chief one was just... price. It was simply much cheaper from the start.

  • eddie_catflap 2 days ago

    I love the C64 but the Atari 8-bit line was fine indeed (one of my first exposures to home computing was Star Raiders at a family friend's house - blew me away). Archer Maclean, author of Dropzone (and other titles), famously labelled them the 'Porsche of home computers'.

    Where I think the 64 had the edge was in the incredible SID chip and, I'd argue, the amazing hacks that were found for the system over the years that enhanced what the 64 was capable of.

    https://en.wikipedia.org/wiki/Dropzone#Development

  • JKCalhoun 2 days ago

    I was able to get an Atari 400 (not XL, sadly) for a firesale price. The problem with all the Ataris, in my mind, was that they were not dev-friendly machines.

    Commodore machines came with a rather hefty serial bound book that introduced you to programming and gave you a memory map of the hardware, important PEEKs and POKEs.

    Ataris came with trade secrets.

    • plefebvre 2 days ago

      True, the Atari 8-bits did not come with developer docs, and in the early years little information was available. This certainly hurt their initial adoption.

      But starting with De Re Atari by Chris Crawford in 1982, a lot of development material became available. Compute! had a great line of books, including Mapping the Atari.

      It was a shame it took so long for that material to appear, because the Atari 8-bits have a rather elegant OS, especially compared to their contemporaries.

    • mst 2 days ago

      My first Archimedes came with a ring bound user's manual that was, IIRC, about 1/3 a guide to using the RISC OS GUI, and then 2/3 a programming guide to the version of BBC BASIC that shipped with it.

      (I remember reading it end to end as a child, lying on my parents' bed because the light was better in there than in my room, shortly followed by developing the programming addiction that has stuck with me the rest of my life ;)

      It didn't cover ARM2 assembly, but my parents bought me an extremely good book that did - and described the chip architecture itself in detail as well.

      I only touched a Commodore at a friend's place to play games on it, but it sounds like they also understood hobbyists :D

    • ajross 2 days ago

      > The problem with all the Atari's in my mind was that they were not dev-friendly machines.

      That was true in the early days of the 400/800, but by 1982 when the 1200XL was released (a few months ahead of the C64) they'd corrected themselves. The board schematics and the assembly source for the ROM were available as a book you could buy at the dealer, and sources like De Re Atari and Compute! magazine had collated all the relevant details of the handful of ASICs such that people could start playing weird tricks.

      It wasn't Woz's Red Book (neither was Commodore's documentation), but it told essentially the whole story of the devices down to the MMIO level.

    • Mountain_Skies 2 days ago

      Only a few friends had Atari computers when I was a kid. The one thing that stuck out to me was it had a Help key but most programs told you to press 'H' or some other key for help instead of the Help key, which makes me wonder if knowledge of how to detect that key wasn't in the manual? Atari owners were passionate about their computers and seemed happy with them but at least in my little town, there just weren't very many of them.

      • bluGill 2 days ago

        The first 400 and 800 did not have a Help key. As such, if you used the Help key you either had to refuse to work with the large installed base of those earlier systems, or you had to provide an alternative. The alternative won in most cases, and then it wasn't seen as worth also supporting the Help key (remember, bytes mattered).

  • rbanffy 2 days ago

    The big pro of the Ataris was their graphics. Replacing a frame buffer with a display list and a dedicated processor that keeps banging out pixels based on its "program" is brilliant. It's an interesting maximalist counterpoint to the Apple II's minimalist approach to color graphics.
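
    Roughly the idea, as a toy sketch (the mode names, sizes and addresses here are made up for illustration; this is not ANTIC's actual instruction encoding): the video chip walks a little "program" that says which mode and which memory to fetch for each band of scan lines, so a mixed-mode screen needs no per-frame CPU work.

        # Toy display-list walker: each entry = (mode, scan lines, data address).
        display_list = [
            ("TEXT",    16, 0x4000),  # a couple of rows of text fetched from 0x4000
            ("BITMAP", 160, 0x5000),  # a bitmap play-field fetched from 0x5000
            ("TEXT",    16, 0x4400),  # a status area, back in text mode
            ("JUMP_VBLANK", 0, 0),    # restart the list on the next frame
        ]

        def render_frame(dlist):
            scan_line = 0
            for mode, count, address in dlist:
                if mode == "JUMP_VBLANK":
                    break
                print(f"lines {scan_line:3}-{scan_line + count - 1:3}: {mode} from {address:#06x}")
                scan_line += count

        render_frame(display_list)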

    • karmakaze 2 days ago

      The C64 definitely had better sprites and music, no contest. But there's a certain elegance to the way the Atari did its graphics in particular. It did so much more with simple building blocks; you could immediately see the power of it and then spend a long time extracting value from it.

      The SIO of the Atari is another standout design which flies under the radar. It enabled a much cheaper diskette drive than the C64's. The designer of the SIO, Joe Decuir, went on to work on USB and credits his work on SIO as the basis of USB[0].

      Even the use of letters for devices was already ahead of DOS, with D: being the diskette; that was shorthand for D1:, with D2: being another drive, and any other letter could be an installable device with numbered instances. Keyboard/screen I/O was addressable as E:.

      [0] https://en.wikipedia.org/wiki/Atari_SIO
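
      As a rough illustration of that naming convention (a sketch based only on the description above, not Atari's actual CIO code): a spec is a device letter, an optional unit number defaulting to 1, a colon, and an optional filename.

          import re

          # Illustrative parser for Atari-style device specs like "D2:GAME.BAS" or "E:".
          def parse_spec(spec):
              m = re.fullmatch(r"([A-Z])(\d?):(.*)", spec)
              if not m:
                  raise ValueError(f"bad device spec: {spec}")
              device, unit, name = m.groups()
              return device, int(unit) if unit else 1, name

          print(parse_spec("D:AUTORUN.SYS"))  # ('D', 1, 'AUTORUN.SYS') - bare D: means D1:
          print(parse_spec("D2:GAME.BAS"))    # ('D', 2, 'GAME.BAS')
          print(parse_spec("E:"))             # ('E', 1, '') - the screen editor device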

    • pavlov 2 days ago

      It’s a bit like an early GPU.

      • rbanffy 2 days ago

        Very much. It was a direct ancestor of the Amiga graphics architecture, and certainly inspired many other lesser-known architectures.

  • userbinator 2 days ago

    MS has unfortunately now fallen greatly from that, and gotten into the same aggressive and hostile trend-chasing practices as the rest of Big Tech. Only their legacy keeps them from totally losing the market at this point.

    • pjmlp 2 days ago

      As someone that bought into WinRT, saw it as .NET 1.0 done right, and went through all the technology reboots between Windows 8 and WinUI 3.0/WinAppSDK, I can only double down on that remark.

      It got so bad that most Windows developers, me included, advise focusing on Win32/Windows Forms/WPF. At BUILD 2024, WPF got back into the spotlight as the official Windows GUI framework (WinUI 3.0 keeps being years away from feature parity with UWP, let alone WPF), and even the Office and Xbox teams would rather reach for React Native than bother to make XAML C++ in WinUI 3.0 work.

      Check the agenda for BUILD 2025: hardly anything related to WinUI 3.0 / WinAppSDK.

      And if you want to have some fun, come around to the GitHub discussions and issues in all the related repos.

      • Mountain_Skies 2 days ago

        It doesn't seem like Microsoft and Anders Hejlsberg understand what they did when they picked Go over any dotnet language for the TypeScript compiler. Anders and his fans insist it was merely picking the right tool for the job, but when even the father of C# prefers using Go, combined with Microsoft's tendency to let old technology rot instead of officially cancelling it, it sends a very bad message about the future of dotnet. No one wants the shitshow that has been Microsoft's desktop UI over the past decade to spread into the rest of the dotnet ecosystem, but most are wary of it happening. Anders was the very last person in the company who should have been the face of the TypeScript compiler project using Go.

        Many have pointed this out but just get shouted down by those who haven't had to endure the UI framework pain that the company has put developers through over the past ten to fifteen years. Microsoft officially is completely behind dotnet and is committed to its continued success. Same message they've given for all their UI frameworks. The only difference is that dotnet still gets lots of resources, but so did all of the frameworks before they were left to rot from resource and leadership starvation.

        • pjmlp 2 days ago

          Indeed, it is one of my favourite ecosystems. However, I have always been a generalist; I can't get stuck being an XYZ Developer for too long.

          As such I get what it means to be in a Microsoft shop, in a UNIX shop, in shops that don't care, in those that use a mix of stacks, whatever.

          The .NET team has made great achievements turning the ship around from Windows only into a cross platform product.

          However, occasionally when they complain on social media about why, despite all this effort, there are still issues getting .NET adopted over Go, Rust, Java, Python, nodejs, you name it, they should start looking inside Redmond's own buildings.

          DevDiv nowadays is no longer just .NET and C++. Regardless of how .NET came to be, Microsoft has seen it needs to be back in the Java game and even has its own OpenJDK distro.

          The initial implementation of VSCode support for Go was done by Microsoft, and nowadays they have their own Go distro.

          While the .NET team makes great developments to ease cloud native development, the projects Azure works on and contributes to CNCF are a mix of Go and Rust for the most part.

          As for the UI side, from what I can tell, having been on the receiving end, most of the key developers are gone. The new blood are all millennials that grew up with macOS, Linux and Chromebooks, naturally with no background in the Windows developer ecosystem and a strong focus on Web development.

          Naturally they aren't to blame, they know what they know. What apparently is missing is proper management, resources and guidance, so that they can deliver to what used to be "Developers, Developers, Developers".

          • exceptione 2 days ago

            The smartest thing Microsoft could do, imho, is cancel all their billion UI variants, bless Avalonia, and throw all the saved millions of dollars at them.

            Then in one instant they get a real cross-platform UI framework, with a competent team as a free bonus.

            It is actually incredibly compelling, the cross-platform .NET offering.

            If you look at all the Go, Python and Rust UI toolkits, none hits the bar. They invariably write their own toy framework with 2 components, or they duct-tape a leaky C++ toolkit on top with bad interop.

            • mixmastamyk 2 days ago

              Linux distros are rewriting their installers and partition tools in web technologies these days. Hard to believe, but the direction is clear: no one cares about local UI toolkits.

              • exceptione 2 days ago

                I didn't know that. That is hitting a new low.

                But maybe it should not surprise me: C/C++ went out of fashion (with good reasons), but the devs lost the UI toolkits with it. The average dev does not command a language with an ecosystem that is up to the task. So it is some slow Python + some crapshoot UI.

                Linux finally has an option for good, performant UI, written in a sensible high level language with a vast standard library. Look at HN: people hate the Electron stuff. There is just a shortage of competent programmers.

                Java + JavaFX might be another option next to .NET Core, but Java is verboten as well. People would rather fix runtime crashes in the untyped language they learnt in their first tutorials than broaden their horizons.

                • bsder 15 hours ago

                  > Look at HN, people hate the electron stuff. There is just a shortage of competent programmers.

                  The problem is that, to first order, the web is the only thing that exists. To second order, the web and Windows are the only things that exist. So, any GUI toolkit needs to get traction there first--which is exactly what Electron does.

                  HN readership, of course, doesn't care about either of those, so anything they develop is doomed to fail from the very start.

      • neonsunset 2 days ago

        FWIW WinRT / WinUI 3 was ported onto NativeAOT (9). There are greater ongoing efforts[0] to make the two play nice together from the people working in Windows. But I think it's just effort from specific individuals who care, and the outcome solely relies on their motivation and continued employment; it is not facilitated by the org or its culture in any way. If anything, it happens despite it.

        [0]: For example https://github.com/dotnet/runtime/issues/114024

        • pjmlp 2 days ago

          Yes, those individuals have been great contributors and kudos to them, but it is visible, especially from the last community call, how things have been going.

  • ack_complete a day ago

    I would say Apple had been doing a fair amount of it prior to Microsoft, if you look at the way they carefully patched the Monitor II ROM in later models. This is something that Atari didn't get when they first revised the OS in the 1200XL and shifted entire sections of code, before reverting major sections back to match the original OS.

    Problem is, there were always programs that abused the OS so much that it would have been impractical to accommodate them, since bundling extra ROMs is costly. The worst case I know of is a program that used an entire section of the OS ROM as an XOR encryption key for sectors on disk.

    As for Atari vs. C64, I love the Atari, but it needed an update to the custom chips to compete with the C64 and other newer systems. Instead, Atari was looking at adding a 300 baud modem and a speech synthesizer to the computer.

  • ndrake 2 days ago

    Atari 800 XL <3

ilamont 2 days ago

> Reports of other software incompatibilities due to the ROM changes would start to come out once the 1200XL was actually released and got into user’s hands, hurting its reputation.

That wasn’t as big a deal in the 80s as it is now. Reputation was limited to real life friends and maybe a few homegrown newsletters or computer clubs.

Very few people were using the Internet to share opinions in the early 1980s, so “reputation” could be very effectively managed by Atari and other companies through advertising and leaning on trade media to suppress negative reviews and angry letters to the editor.

That is, unless the problems were too big to ignore and customer anger became too great, as was the case with many late era Atari 2600 games.

A bigger issue for the 1200XL was price as well as something not addressed in the article: competition. By this point there were other platforms to consider, often at better price points with attractive features and software.

  • os2warpman 2 days ago

    >That wasn’t as big a deal in the 80s as it is now.

    It was a big deal for me. Software expenses were a huge portion of the cost of owning a computer.

    Almost always the price of the computer was less than the cost of buying software to run on the thing.

    Letter Perfect was around $300. If it didn't run on the 1200XL I'm not shelling out $800 for the computer and another $300 for a compatible word processor.

    I am convinced that cross-vendor incompatibility was THE reason for CP/M's failure. Not anti-competitive behavior, not shenanigans, but the fact that if you spent $495 on the Kaypro version of Wordstar and then bought an Osborne, it wouldn't work. Same Z80, same CP/M, wouldn't work.

    Even today PC manufacturers are only starting to remove the BIOS compatibility layers that allow you to boot >30-year-old versions of DOS on modern hardware, and Apple has provided binary translators since the 1994 PowerPC transition and supported them for years after breaking native compatibility.

    • guenthert an hour ago

      "I am convinced that cross-vendor incompatibility was THE reason for CP/M's failure. Not anti-competitive behavior, not shenanigans, but the fact that if you spent $495 on the Kaypro version of Wordstar and then bought an Osborne, it wouldn't work. Same Z80, same CP/M, wouldn't work."

      But CP/M had a well-defined API. Compliant programs would work on different vendors' computers, much like in the MS-DOS domain. The key difference was that CP/M had no well-defined disk format, i.e. you couldn't just swap disks (for evaluation purposes only, of course) with your buddy if he didn't happen to own the same type of computer; you first had to transfer the software via other means, e.g. a serial interface (perhaps using Kermit). A bridge too far for most casual computer users.

  • bluGill 2 days ago

    BBSes were a thing back then, and while it wasn't the internet, you did have large discussions. If you could afford CompuServe (which charged by the minute!) you had a nationwide audience on a platform that was bigger than the internet of the time. A few people also had access to the internet (via their university), or at least usenet (via their work or the internet), and so there was discussion that way - but CompuServe was where it was at.

    • DrillShopper 2 days ago

      It'd be interesting to see the FidoNet echomail traffic about the 1200XL as that's likely the most widely available forum accessible to people with a modem since any local BBS could carry it.

      • bluGill 2 days ago

        I forgot about FidoNet, yes that was an option. A few others along those lines existed as well.

alamortsubite 2 days ago

Ali Baba and the 40 Thieves and Atari Basketball are two additional games that made use of the 400/800's extra joystick ports. Ali Baba was turn-based, IIRC, so not as exciting a use case, but playing basketball with three other kids simultaneously was a riot. Very special for the time.

runjake 2 days ago

I had an 800XL and a 520ST, but I don't recall ever seeing or hearing about the 1200XL. I feel like I just entered some bizarro universe. But wow, I really love its physical design.

  • forinti 2 days ago

    There was an Apple II clone in Brazil that used the same design. It was called TK2000.

    I never found out why they copied the design of a completely different machine. I guess they just liked it.

glonq 2 days ago

I jumped from an 8-bit Atari to a CGA PC/XT clone, and it was surprising how far forward I jumped in power and capability and storage, yet how far -backwards- I jumped in graphics and sound. Those were great lil' machines and even got me online (to BBSes at 300 baud) back in the day.

bentt 2 days ago

My best friend’s parents bought this and kept it in their bedroom for some reason. They allowed us to play with it, but it had no storage; they didn’t buy anything to go with it.

So we would just type programs in from magazines, and the worst part about it was that sometimes we wouldn’t finish, and then his parents would shut the computer off at night and we’d have to start over.