Two Stop Bits
  • Darkstar 10 days ago | parent | on: ARCNET: The Sleeping Giant
    In the 90s, when Ethernet cards still cost like 50-100 bucks (more than us kids could afford), I got a large box of around 10 ISA ARCNet cards from a friend who had just migrated a small office to Ethernet. There were no hubs or anything.

    A "quick" internet search later (this was before Google, so it took a few hours total), I found the schematics for a passive ArcNet hub and built one using spare parts and a cheap plastic box.

    It worked extremely well, and it was very flexible since you could mix and match star and bus topologies as you liked. So if someone only had a short cable, they could be connected to the computer next to them with a T connector and go to the central hub that way.

    We used this setup for LAN parties for about 2 or 3 years, until everyone had gotten a 100 Mbit Ethernet card.

  • Darkstar 13 days ago | parent | on: Second Reality by Future Crew ported to Windows
    My favorite was the "Real Reality" spoof by Never a few years later for Mekka & Symposium. They re-made it entirely with shaky VHS home cameras and papercraft models... Extremely creative, for a time when video editing was still very much out of reach for most people.
  • Darkstar 81 days ago | parent | on: Baldur's Gate: The Original Saga – Guide and Walkt...
    Every time someone tells me how great and extensive Baldur's Gate 3's story and game world is, I refer them to one of the BG1 or BG2 walkthroughs or playthroughs.

    Compared to BG1 and BG2, the world of BG3 looks terribly small and cramped, and the story, while still quite good (don't get me wrong), just doesn't have the ... vastness and all-encompassing breadth of the story of the old BG games.

  • Darkstar 124 days ago | parent | on: Commodore hardware viruses–yes, they were possible
    I call BS on this claim:

    > But if your program tried to write to ROM and did it often enough, you stressed both the CPU and ROM chip and could cause one or the other to overheat and fail.

    I was very much into the C64 scene back in the early 90s, and while I heard claims similar to that one (code that destroys chips or other components by overheating/stressing them), there was never any legitimate source for them. It was all just urban legends.

    • markran 124 days ago
      > code that destroys chips or other components by overheating/stressing them

      I agree with you that, just on general principles, I don't know of any reason writing to a masked ROM chip would have any negative impact. While I didn't have a C64 back in the day (I do now, though), I did have a Radio Shack Coco, which had 16K of masked ROM for the BASIC interpreter (and another 8K of masked ROM if the optional disk controller cartridge was present). And the Coco never had anything like what Dave describes ("Although it’s impossible to write to ROM, Commodore left out the circuitry in the 1541"). The CPU could write to any address, whether it held ROM, RAM, control registers or nothing at all. A masked ROM doesn't even have a write select pin. Some EPROMs have a write select, but that requires a different voltage, etc.

      I used a lot of EPROMs back in the day because I worked at a company that leased hundreds of complete Coco systems to corporate customers, each with its own unique software on a custom cartridge. Each EPROM was burned by hand because it had proprietary customer data on it. The cost was no problem because one month's lease paid for the whole computer. :-)

      Since I wrote the EPROM bank switching assembly language routines that drove the custom ROM cartridge hardware, I hammered EPROMs with writes all the time and it never hurt them (and we had hundreds of systems in all-day use). So that part doesn't make much sense to me unless there was something very unusual about the Commodore 1541 controller hardware (and to be fair, I understand the 1541 was weirdly complicated). EEPROMs could maybe have been affected, but those were expensive and I can't imagine Commodore shipped electronically erasable chips in volume when much cheaper masked ROMs would suffice. So I suspect whatever Dave is talking about perhaps got garbled or conflated (as 30+ year-old memories do).
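
      (For illustration, here is roughly what such a bank-switch access looks like written as C rather than the original 6809 assembly. This is only a sketch of the general technique; the latch address, cartridge window and function name are invented for the example, not the real hardware's:)

          /* Sketch of ROM bank switching: the "write" into cartridge address space
             is decoded by a latch on the cartridge board, not by the EPROM itself
             (which has no write input), so no amount of writing stresses the ROM.
             All addresses below are made up for illustration. */
          #include <stdint.h>

          #define BANK_LATCH   ((volatile uint8_t *) 0xFF40u)  /* made-up bank-select latch    */
          #define CART_WINDOW  ((volatile uint8_t *) 0xC000u)  /* made-up cartridge ROM window */

          static uint8_t read_banked_byte(uint8_t bank, uint16_t offset)
          {
              *BANK_LATCH = bank;          /* the latch captures the value; the EPROM ignores the write */
              return CART_WINDOW[offset];  /* subsequent reads come from the selected bank */
          }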

      If it's garbled or conflated, it could be based on the legendary (but real) undocumented HCF instruction (Halt and Catch Fire). And I know all about that because the Coco's 6809 was the original 8-bit home computer CPU that had that instruction. https://en.wikipedia.org/wiki/Halt_and_Catch_Fire_(computing.... But even HCF wouldn't actually damage your processor, although it could certainly warm it up if you left it running!

      Further grasping at straws here... I guess every CPU does have some lifespan limit based on cycles and heat, but it's really long. Unless something's very wrong with the chip or system design, that lifespan limit isn't usually a factor for a mass-market computer.

      Another thing which might lead to confusion is that lots of computers over the years have had designs that were "thermally challenged", either through poor design, manufacturing errors or excess cost cutting. In those specific cases, it was possible to run really tight loops on the CPU which would, given some time, warm up the processor more than normal and cause a crash due to exceeding the T-limit (max operating temp) for too long.

      Some early computers also had RF design issues in how the traces on the motherboard were laid out. On these systems, if the RF shield wasn't grounded and you ran code hammering the address lines in certain ways, it could cause enough ringing to turn traces into little antennas spewing out noise, and that could crash the computer due to corrupted signals on the adjacent data lines. Once again, that was just a software crash, not permanent damage, and I never personally saw it happen except on prototypes and wire-wrap boards.

      > I call BS on this claim

      Unless you're Dave's drinking buddy and there's beer on the table, that specific wording may be just a little bit harsh. I mean, Dave has generated a huge volume of retro writing over a lot of years... and the dude definitely lived it first hand. Mistakes happen and I've certainly conflated or garbled some things from 30+ years ago but I doubt he's just making stuff up. I think he's writing from personal experience and relating the truth as he remembers it. That said, I think it's entirely reasonable to ask him for more clarification whenever something doesn't make sense. As retro-obsessive as he obviously is, like me, I'm sure he'd love to find out something he thought he knew is actually different.

      • Darkstar 123 days ago
        > Unless you're Dave's drinking buddy and there's beer on the table, that specific wording may be just a little bit harsh

        Yeah, maybe, sorry if it came across like that. We use the term "I call BS on that!" very colloquially and loosely here, so I didn't think of it as being offensive. I could have worded that better, I agree.

        > "Although it’s impossible to write to ROM, Commodore left out the circuitry in the 1541"

        There is no "circuitry" to disable writing to ROM. ROM chips have no r/W pin, so no circuitry could attach to that. The only thing I could imagine is that they "forgot" the circuitry to disable the ROM's outputs when a write was issued. In that case, the CPU and the ROM write to the data bus at the same time. Which would totally garble whatever it is that is on the bus (which doesn't matter, since the write would be lost anyway), and maybe send a few more milliamps through the processor's (or the ROM's) data lines, but I doubt that this would be much more than what those pins are designed to handle in the first place.

        One fact, though, is that the RAM chips they used back then were often very low quality (because they had trouble sourcing the amount they needed to keep up with demand), and these RAM chips just broke at some point... Watch any YouTube video about a C64 repair, and you will notice that everyone just complains about those chips. But that is a different issue and wouldn't explain the ROM chips breaking, or why the issue happens because of "writing to ROM"...

    • bmonkey325 124 days ago
      Facts. The only attacks I ever saw were physical, where the code would repeatedly seek or strobe the drive head on Atari 810s, or attempt to force XT drives in and out of the landing zone to similar effect. Obviously, over time this is not good for the mechanism.

      I don't remember CPU thermals being an issue until the mid-to-late 90s, and then it was Athlons. I could be wrong, but I don't remember seeing CPU fans until the Pentium II cartridge, though that is probably nostalgic misremembering.

      The 80s were just thermally robust; heck, Ataris had a giant aluminium shield over the mobo.

      • Darkstar 123 days ago
        Yeah, there definitely were hardware viruses; stepping the drive past its maximum cylinder was one... I remember there were even hard drives that didn't have a physical stop, so the head just dropped down onto the platter at some point. But to exploit that, you had to make sure you were running on the exact (vulnerable) disk drive model, which was already very unlikely.

        I also heard stories of programming graphics card registers in a fancy way to trigger high frequencies in the CRT coils that could, again if the CRT was vulnerable, potentially destroy the coil. But this also relied on very specific hardware to pull it off.

        A generic attack on such a high-volume home computer or floppy drive like the C1541 would definitely have made the rounds in the computer magazines back then.

        And the myth that developers deliberately put in code to damage or even destroy the pirates' computers can also be ruled out almost entirely, as (at least in Europe) even back then there was strong legal protection against deliberately damaging other people's property. I distinctly remember reading about this being debunked in the largest German C64 magazine (64'er) by a lawyer...

  • Darkstar 160 days ago | parent | on: The 90s Gamer Experience: Handwritten Notes and Ma...
    Haha I also have a huge binder full of things like that, and two or three small notebooks...

    I couldn't bring myself to throw them all out...

  • Darkstar 168 days ago | parent | on: Why We Didn’t Take Screen Grabs in the 1980s
    This is not entirely true. There were definitely TSRs that would, on a keypress, save a copy of the screen to a graphics file (or a text file for text screens).

    I remember using one of these to take screenshots of my favorite games as a kid and save them as GIFs on a floppy disk... Ah, fun times :)
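
    For the text-screen case, the basic idea is tiny. A minimal, non-resident sketch, assuming real-mode DOS and a Borland/Turbo C-style compiler (the output file name is my own choice, and the real tools did this from inside a keyboard-interrupt hook rather than from main):

        /* Dump the 80x25 colour text screen (character + attribute bytes)
           at segment B800h to a raw file. Converting that to a readable
           text file or a GIF is the part the real grabbers added on top. */
        #include <stdio.h>
        #include <dos.h>   /* MK_FP and far pointers (Borland/Turbo C style) */

        int main(void)
        {
            unsigned char far *screen = (unsigned char far *) MK_FP(0xB800, 0);
            FILE *out = fopen("SCREEN.BIN", "wb");
            unsigned i;

            if (out == NULL)
                return 1;

            for (i = 0; i < 80 * 25 * 2; i++)   /* 80 cols x 25 rows x 2 bytes each */
                fputc(screen[i], out);

            fclose(out);
            return 0;
        }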

  • Darkstar 187 days ago | parent | on: Archiving hardware projects
    Interesting take, but probably overkill in this case:

    * the USB-to-serial drivers are standard and included in every OS. Even if 10 years from now they weren't, it's easy to fire up a VM and get it working

    * Serial cables are a standard thing for anyone working with old hardware. And even if someone, 10 years from now, doesn't have one, it's not hard to rig something up, maybe even using jumper cables

    * That leaves the USB-to-serial adapter itself. While these might seem more at risk at first glance, the chips they are based on are still produced by the millions, and I can't envision a future where those things would cost more than a few bucks, whether new or from places like eBay.

    I mean, I totally get this, and I have done similar things in the past (e.g. I have a box with a PCI SCSI controller and an ISA SCSI controller, together with a dozen or so different SCSI cables and terminators, in case I need to dump yet another SCSI device...), but doing this for serial cables just seems... unnecessary?
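
    For what it's worth, even the software side is completely generic. A minimal sketch, assuming Linux and that the adapter enumerates as /dev/ttyUSB0 (the device name and the 9600 8N1 settings are my assumptions), just to show that nothing beyond the stock POSIX serial API is involved:

        /* Open a USB-to-serial adapter, set 9600 baud raw mode, send a probe
           string and echo back whatever arrives. Device path is an assumption. */
        #include <stdio.h>
        #include <fcntl.h>
        #include <unistd.h>
        #include <termios.h>

        int main(void)
        {
            int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
            struct termios tio;
            char buf[64];
            ssize_t n;

            if (fd < 0) {
                perror("open");
                return 1;
            }

            tcgetattr(fd, &tio);
            cfmakeraw(&tio);              /* raw 8-bit mode, no echo or translation */
            cfsetispeed(&tio, B9600);
            cfsetospeed(&tio, B9600);
            tcsetattr(fd, TCSANOW, &tio);

            write(fd, "hello\r\n", 7);    /* poke whatever sits on the other end */

            n = read(fd, buf, sizeof buf);
            if (n > 0)
                fwrite(buf, 1, (size_t) n, stdout);

            close(fd);
            return 0;
        }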

    • jgrahamc 186 days ago
      More than anything this is so I don't have to go find all the right connectors the next time I want to use these devices. That drives me crazy.
  • Darkstar 198 days ago | parent | on: Xerox Alto Source Code
    This should get a "(2014)" appended to the title, as it is rather old news.
    • bmonkey325 198 days ago
      Does this really add value? I assume that everything on this site is more or less "old" by definition.

      If this is truly something our community values, I will strive to add it when possible.

  • Darkstar 243 days ago | parent | on: Donut shop in Indiana still using a Commodore 64.
    I wonder how they do things like tax filings etc. Doesn't that have to be in some standard data format? Probably a post-process done on a regular PC with the data somehow grabbed from the C64...?
  • Darkstar 272 days ago | parent | on: This GameCube Mini Is Downright Adorable
    It's not a GameCube; it's just a regular PC in a GameCube-lookalike case.
    • bmonkey325 272 days ago
      The replica case is pretty good. Anyone know what emulator is being used? I didn’t see it mentioned in the article.
      • Screwtapello 271 days ago
        Is there any Gamecube emulator of note besides Dolphin?
Two Stop Bits is a discussion web site about retro computing and gaming.