- Excellent recounting of the history and industry context along with an objectively fair assessment of the probabilities the rumors were true.
- Indeed. The level of dedication, time and skill it took for this guy to analyze and fix the dozen-plus incredibly arcane, minuscule differences between the emulator and actual hardware, just so he could remove the small mod that detected this one demo, is simply unbelievable. All just for this one demo, which does truly insane, deeply unnatural things no other demo (or game) ever attempted. Plus the existing mod already let the emulator play the demo perfectly, down to the pixel and millisecond. It just bothered him that a special-case mod was needed, so he spent well over a year on research, debugging and analysis, including writing several quite sophisticated visualization tools specifically to identify these incredibly minor divergences.
It's doubly amazing because the demo itself is incredible, pulling off insanely challenging tricks and getting them to run on primitive hardware no one thought could display that level of color or animation, AND the software emulation of the vintage IBM 5150 PC (and monitor) is so precise it can recreate this one-off insanity in cycle-accurate real time with NO case-specific hacks. It blows my mind that either one of these exists at all. Both together is the stuff of heroic retro legend. I'm confident none of the 1970s designers of the 6845 chip would ever have believed their chip was capable of generating these graphics.
- Looking at that page took me back to the days of MODEMing to BBSes (and later running a BBS). The release of the ZModem protocol by Chuck Forsberg in 1986 was one of those rare leaps that instantly improved the daily experience of so many people doing so many different things.
- I remember the big thing about Zmodem was that it gave a transfer status that was relatable and updated frequently. I remember Ymodem-G being technically faster (before the Moby Turbo update), but because it didn't update the screen as often, it felt as if I was watching paint dry.
- Going this deep in pursuit of such esoteric, exacting precision for a purely hobby project is pretty remarkable.
- That is absolute... MADNESS! And I mean that in the very best possible way. Wildly impressive.
- I remember seeing one of these in a Walmart entry lobby back in the early 90s. It was eye-catching to a video game nerd and looked sort of cool but after playing it a couple times I stopped because, well, it just wasn't a very fun game.
Like so many other high-concept or 'gimmick' games, it managed to get attention but failed on basic gameplay.
- [ I posted a question (below) over at HN on the thread for this article and got the reply (further down) ]
> "In July 2024, a new company called Tengen Games released its first game, “Zed and Zee,” for the Nintendo Entertainment System (NES) ... Tengen and its parent company, Atari Games, had disappeared 30 years ago after being crushed in court by Nintendo for doing exactly the same thing: manufacturing unauthorized cartridges for the NES."
The article doesn't address how Tengen is now able to produce unauthorized NES-compatible cartridges. Is Tengen paying Nintendo for a license? Did the patents expire? Did relevant legal precedents change? Another possibility might be that, while Atari's '80s legal actions established that intermediate infringement during reverse engineering could be fair use, Atari itself was precluded from relying on that fair use because its lawyers did naughty things. Maybe "new" Tengen reverse-engineered it again from scratch, without naughty lawyers?
[ Reply ]
> The patent on the lockout mechanism has expired and clean software implementations of the algorithm have been created. So the old legal protections no longer apply.
And while Nintendo still aggressively enforces their copyright on their old games, they probably don't care very much about unlicensed games being created for their very old hardware. It's just not commercially relevant to them.
- I have a 1200XL in my collection (signed on the bottom by one of its hardware designers!). Compared to the 800XL, 600XL, 800 and 400, which I also have, it does indeed have the nicest keyboard. Since I wasn't following the Atari market closely in early 80s, I was interested to learn from the article that the 1200XL was announced and shipped before the 800XL and 600XL and then quickly discontinued.
- > code that destroys chips or other components by overheating/stressing them
I agree with you that, just on general principles, I don't know of any reason writing to a masked ROM chip would have any negative impact. While I didn't have a C64 back in the day (I do now though), I did have a Radio Shack Coco, which had 16K of masked ROM for the BASIC interpreter (and another 8K of masked ROM if the optional disk controller cartridge was present). And the Coco never had anything like what Dave describes ("Although it’s impossible to write to ROM, Commodore left out the circuitry in the 1541"). The CPU could write to any address whether it held ROM, RAM, control registers or nothing at all. A masked ROM doesn't even have a write-enable pin. EPROMs do have a programming mode, but engaging it requires a higher programming voltage, etc.

I used a lot of EPROMs back in the day because I worked at a company that leased hundreds of complete Coco systems to corporate customers, each with its own unique software on a custom cartridge. Each EPROM was burned by hand because it had proprietary customer data on it. The cost was no problem because one month's lease paid for the whole computer. :-)
Since I wrote the EPROM bank-switching assembly language routines that drove the custom ROM cartridge hardware, I hammered EPROMs with writes all the time and it never hurt them (and we had hundreds of systems in all-day use). So that part doesn't make much sense to me unless there was something very unusual about the Commodore 1541 controller hardware (and, to be fair, I understand the 1541 was weirdly complicated). EEPROMs could maybe have been affected, but those were expensive and I can't imagine Commodore shipped electrically erasable chips in volume when much cheaper masked ROMs would suffice. So I suspect whatever Dave is talking about perhaps got garbled or conflated (as 30+ year-old memories do).
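To make that concrete, here's a rough C sketch of the general memory-mapped bank-switching idea. The addresses and latch layout below are hypothetical (not the actual Coco cartridge design, which was done in 6809 assembly anyway); the key point is that the ROM itself never "sees" the write, because only the decoder's latch responds to it:

```c
#include <stdint.h>

/* Hypothetical embedded-style sketch: on many 8-bit systems a write into
 * ROM address space is simply ignored by the ROM (it has no write-enable
 * pin to notice it with), but the board's address decoder can latch the
 * write to switch banks. Both addresses here are made up for illustration. */

#define BANK_LATCH ((volatile uint8_t *)0xFF40u)  /* write-only latch (assumed) */
#define ROM_WINDOW ((volatile uint8_t *)0xC000u)  /* ROM window in CPU space (assumed) */

/* Select an EPROM bank by writing the bank number into ROM-decoded space;
 * the write is just an address-decode event as far as the ROM is concerned. */
static void select_bank(uint8_t bank) {
    *BANK_LATCH = bank;          /* only the latch responds to this write */
}

static uint8_t read_from_bank(uint8_t bank, uint16_t offset) {
    select_bank(bank);
    return ROM_WINDOW[offset];   /* subsequent reads come from the new bank */
}
```

You can hammer that latch all day: the write never reaches the ROM's silicon, which is why writes into ROM space were harmless on our hardware.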
If it's garbled or conflated, it could be based on the legendary (but real) undocumented HCF instruction (Halt and Catch Fire). And I know all about that because the Coco's 6809 was the original 8-bit home computer CPU that had that instruction: https://en.wikipedia.org/wiki/Halt_and_Catch_Fire_(computing). But even HCF wouldn't actually damage your processor, although it could certainly warm it up if you left it running!
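For the curious, here's a toy C model of what HCF does on the bus, per the commonly described behavior: the CPU stops fetching instructions and just reads sequential addresses forever, toggling the address lines at full speed (which is exactly why it warms the chip up). This is an illustrative sketch, not a cycle-accurate 6809 core, and it stops after a few iterations where the real hardware would run until reset:

```c
#include <stdint.h>
#include <stdio.h>

/* Stand-in for a real emulator's memory hook. */
static uint8_t bus_read(uint16_t addr) {
    return (uint8_t)(addr & 0xFFu);
}

int main(void) {
    uint16_t addr = 0x0000;
    /* Real hardware loops until a hardware reset; we stop after 16 cycles. */
    for (int cycle = 0; cycle < 16; cycle++) {
        uint8_t data = bus_read(addr);
        printf("cycle %2d: read $%04X -> $%02X\n",
               cycle, (unsigned)addr, (unsigned)data);
        addr++;   /* the address bus free-runs upward, wrapping at $FFFF */
    }
    return 0;
}
```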
Further grasping at straws here... I guess every CPU does have some lifespan limit based on cycles and heat, but it's really long. Unless something's very wrong with the chip or system design, that lifespan limit isn't usually a factor for a mass-market computer. Another thing which might lead to confusion is that lots of computers over the years have had designs that were "thermally challenged," whether through poor design, manufacturing errors or excess cost-cutting. In those specific cases, it was possible to run really tight loops on the CPU which would, given some time, warm up the processor more than normal and cause a crash from exceeding the T-limit (max operating temp) for too long.

Some early computers also had RF design issues in how the traces on the motherboard were laid out. On those systems, if the RF shield wasn't grounded and you ran code hammering the address lines in certain ways, it could cause enough ringing to turn traces into little antennas spewing out noise, and that could crash the computer by corrupting signals on the adjacent data lines. Once again, that was just a software crash, not permanent damage, and I never personally saw it happen except on prototypes and wire-wrap boards.
> I call BS on this claim
Unless you're Dave's drinking buddy and there's beer on the table, that specific wording may be just a little bit harsh. I mean, Dave has generated a huge volume of retro writing over a lot of years... and the dude definitely lived it first-hand. Mistakes happen, and I've certainly conflated or garbled some things from 30+ years ago, but I doubt he's just making stuff up. I think he's writing from personal experience and relating the truth as he remembers it. That said, I think it's entirely reasonable to ask him for clarification whenever something doesn't make sense. As retro-obsessive as he obviously is (like me), I'm sure he'd love to find out that something he thought he knew is actually different.
- > Unless you're Dave's drinking buddy and there's beer on the table, that specific wording may be just a little bit harsh
Yeah, maybe, sorry if it came across like that. We use the term "I call BS on that!" very colloquially and loosely here, so I didn't think of it as being offensive. I could have worded that better, I agree.
> "Although it’s impossible to write to ROM, Commodore left out the circuitry in the 1541"
There is no "circuitry" to disable writing to ROM. ROM chips have no R/W pin, so there is nothing for such circuitry to attach to. The only thing I could imagine is that they "forgot" the circuitry to disable the ROM's outputs when a write was issued. In that case, the CPU and the ROM would both drive the data bus at the same time. That would totally garble whatever is on the bus (which doesn't matter, since the write would be lost anyway) and maybe push a few extra milliamps through the processor's (or the ROM's) data lines, but I doubt that would be much more than what those pins are designed to handle in the first place.
One fact, though, is that the RAM chips they used back then were often very low quality (because they had trouble sourcing the quantities they needed to keep up with demand), and those RAM chips just died at some point. Watch any YouTube video about a C64 repair and you'll notice that everyone complains about those chips. But that's a different issue and wouldn't explain the ROM chips breaking, or why the problem would be triggered by "writing to ROM"...
- A little sad to see the venerable Motorola 6809 dismissed with only "This was possibly the most sophisticated 8-bit architecture but had much more limited adoption than its competitors."
If we're using emulation anyway, does the installed base from over 40 years ago really matter? Of course, I'm biased on this point because, by happenstance, I ended up with the 6809-based Radio Shack Color Computer. Basically, my parents weren't going to pay much for a home computer for my late-teen self in 1980 (because what would you even do with such a thing?). Even the minimal 4K RAM version was $600, but via a tiny ad in the back of a magazine I found a non-corporate franchise Radio Shack store out of state selling them for about $450 delivered (no sales tax!). I mailed the check off hoping I wouldn't get scammed.
The computer showed up, and it turns out random luck paid off because the 6809 was a fantastic CPU to learn assembler on. It was the most powerful 8-bit CPU because it was really a hybrid 8-bit/16-bit CPU, much like its later big brother the 68000 (a 16-bit/32-bit hybrid). It had a bunch of 16-bit registers, indexed and program-counter-relative addressing modes (enabling position-independent, re-entrant code suitable for pre-emptive multi-tasking), separate system and user stacks, layered interrupts and a beautifully orthogonal instruction set (definitely inspired by the PDP-11 with some nods to the IBM 360). And, damn, was it powerful! With the Unix-like, multi-tasking OS-9 operating system you could service up to 8 simultaneous users on serial terminals in real-time on a literal "toy computer" with a 1.8 MHz 6809 and 64K of RAM.
For the first several years I was learning assembler, I had no idea how lucky I was. Much later, when I eventually looked at the Z80 and 6502, I was shocked at how primitive they were. Apple even initially chose the 6809 for the Macintosh, and early Mac prototypes were 6809-based before they migrated to the 68000. Even better, Radio Shack sold an assembler in a ROM cartridge, so low RAM wasn't a problem. A television for display and an audio cassette recorder to save your programs completed your "software development environment" :-).
Okay, it's true not a lot of people know much about the 6809 today, but back in the day all the cool kids definitely knew it was a powerhouse compared to the CPUs in the Commodore, Apple and Atari 8-bits. However, I think more people today might object to dismissing the 68000 family. The 68K is still legendary for being fun to program in assembler. Sure, it was CISC, but it was CISC in perhaps its most mature, pure and idealized form. And while RISC eventually replaced CISC architectures because RISC scaled better once Moore's Law delivered ever more gates in the 90s, the very-CISC 68K was designed to be human-legible, expressive and even joyful to program in assembler by hand. Sure, RISC architectures can be hand-programmed, but they were clearly conceived for compiler-written code and higher-level languages.
Learning assembler today is anachronistic anyway, so why not go for the gusto and relive an ISA that legendary OG giants of software seriously described as 'elegant' and 'beautiful'?