Two Stop Bits
  • markran 156 days ago | parent | on: Commodore hardware viruses–yes, they were possible
    > he’d seen protection schemes that, if they detected you had tampered with them, would try to break your disk drive in retaliation. The most common way to do this was to send the drive a command to try to move the drive’s stepper motor beyond its physical range. The drive would oblige and try to do the impossible, so it was possible to command the drive to permanently damage its own drive mechanism.

    I didn't have a C64 and my Radio Shack Coco had a less complex disk drive mechanism based around a standard Western Digital disk controller chip, but there were similar copy protections on some software titles.

    While I'm sure someone did tell Dave this story, and that person may have believed it themselves, I suspect it's based on a mistaken extrapolation of a more innocent behavior. Way back in the day, we heard a similar report at my local user's group but a techie friend of mine looked into the offending software title and discovered a reality that was more benign. Basically, during manufacturing disk protections tend to put some non-standard formatting someplace on the original disk and then the software tries to read back the non-standard stuff to verify it's the original disk. These could be extra, missing or mis-numbered tracks or sectors. Some protections also put data on an extra track added beyond the last track. Coco disks had 35 "official" tracks in the specification but users quickly learned that these drives were manufactured as 40 track drives, of which some didn't pass QA tests seeking all the way to track 40 and were sold cheaper to Radio Shack. But I never saw a Radio Shack Coco drive that wouldn't seek to track 36, 37 and usually more. I eventually had four drives and all of them would reliably seek to track 41 or 42. In fact, hobbyists made mods for the disk operating system to add extra tracks to the official count. So, at least on the Coco, there were multiple disk protections that would seek the head "beyond the last track", not to damage the drive but because they knew the original disk had data there which all drives could read but no normal disk copy command would write.

    The other thing to know is that all these floppy drives were inexpensive, mass-manufactured mechanical devices that had varying tolerances between individual units at the factory which only grew with wear over time, temperature, shipping and handling. Also the diskettes themselves weren't exactly made to exacting mil-spec standards. So, to read the disk the controller software would seek to the desired track and try to read the requested sector. It wasn't terribly unusual for a read to fail and time out due to the head moving a bit too slow or perhaps initially undershooting or overshooting the target track. So all controller software would move the head back (usually to track zero) and then try to step back to the desired track and do the read again. If it didn't work, it would repeat this several times hoping to get a good read before eventually failing with an error. When these rapid head resets and retries happened, the drive would make a loud and unusual "gronking" sound that was quite noticeable. And that was just with normal disks and no oddball disk formatting or trick-play head seeking.

    When disk protections would fail to find the expected oddball tracks or sectors, they'd do the same reset/retry behavior with the same furious gronking. Except in the case of disk protections half the sectors on a track could be "special" (on the Coco there were 18 sectors on a track). At 3 or 5 retries each, that's a lot of loud head gronking for a long time as each sector is attempted and fails out in turn. Such was the case with the protected software title my friend disassembled. The erstwhile failed pirate at our user's group meeting (a middle schooler) was trying to start a copy of the game which had none of the "special" sectors present. While I doubt all that gronking was good for the disk mechanism, it wasn't intentionally malicious on the part of the software title. But you can see how the loud gronking sounds which only happened on a failed attempt to pirate a copy of a protected disk could cause people to make assumptions and leap to nefarious conclusions which would then be further embellished through the retelling.

    Of course, I don't doubt that some hobbyist hacker or maybe solo software dev had nefarious thoughts and maybe even played around with how to do it and showed their friends a demo. But I never saw or heard any credible claims a commercial software title sold at scale ever shipped to consumers with the intent to destroy user hardware. Even in those days software was sold by publishers to distributors who then sold to retail stores, who sold to end users. A national wave of failed hardware reports associated with one title could mean blame and perhaps even legal liability for any and all of those parties. And disassembling the software sufficiently to prove it was doing this intentionally would have been much easier than making working pirated copies. To be so reckless, not only would the author have to be really dumb, so would the publisher and anyone who knew about it in advance.

    The kicker is that, in those days, bulk duplication of diskettes (especially funkily formatted diskettes) wasn't all that reliable - meaning there was a pretty high probability that some non-zero percentage of your legit copies sometimes wouldn't read correctly for a paying customer due to varying manufacturing tolerances (or stray magnetic fields in shipping). And, of course, this failure to read could cause the copy protection to detect the legit disk as "pirated". Back in the 80s and 90s I worked for a successful software manufacturer and one of our products was a large, professional tool which eventually grew to occupy well over a dozen 3.5 inch floppy disks. When a disk wouldn't read for a customer it was a costly warranty issue to ship them a new disk set (and there was no consumer internet). As our software and disk count grew, we saw increasing disk failures. So we analyzed it and despite using the top disk duplicator in the U.S. and legit top-notch, direct-from-the-Sony-factory media - once our install was over a dozen diskettes, the statistical best case was almost every fourth customer would have at least one disk from their set fail to read. And this is without any funky formatting! Fortunately, CD-ROM became a thing shortly thereafter but the point is, the top disk duplicator in the country confirmed that "Yep, we do this better than anyone and your media is the best money can buy - and you're getting the expected field failure rate." So, selling hardware destroying time bombs would have been incredibly stupid, because statistically inevitable failures would certainly harm the hardware of more than one legitimate paying customer by mistake, and that would result in a very fast (but quite spectacular) fireball of infamy for any company dumb enough to try it.

  • markran 157 days ago | parent | on: Compute!’s Gazette Magazine returns to print (and ...
    Wow. Doing a print magazine for C64 is so oddly unexpected that I think I'll buy a copy even though I've never really been into C64 (my 8-bit was a 6809-based Radio Shack Coco before my first Amiga).
    • classichasclass 155 days ago
      Except it's unlikely to be exclusively for Commodore systems - they seem to be making it retro-general. Which is fine, but that's not Gazette, either.
  • markran 160 days ago | parent | on: Windows 1.0 with Steve Ballmer (HQ, 60FPS)
    Excellent find! It's almost hard to remember how primitive Windows 1 was at launch. Frankly, GEM, Desqview and several other competitors already on the market were somewhat better.

    To be fair to Ballmer, that looks like it may be one of his internal videos to hype up the field sales force. Ballmer was known for doing those and he cultivated an over-the-top persona in contrast to Gates' nerdy engineer at Microsoft internal and developer events. The schtick eventually devolved into self-parody which Ballmer was happy to go along with as long as the sales force kept making the numbers. And we should never forget the legendary cringe that is "Developers! Developers! Developers!" https://www.youtube.com/watch?v=8fcSviC7cRM.

  • markran 160 days ago | parent | on: New Update 3 for AmigaOS 3.2 Available for Downloa...
    Nice to see OG 68k AmigaOS still getting a little love. I'm one of those who used Amigas daily from 1985 to 1995 and loved the platform. I still love it as a venerable retro platform because it was the most unique and interesting of the retro-era computer platforms.

    Sadly, I've never been able to muster much interest in the subsequent post-Commodore Power PC or retargetable graphics-based Amiga derivatives. Relative to their post-1995 peer platforms none of those Amiga derivatives were compelling. They had all the downsides of being a low-adoption hobby platform with none of the unique upsides the OG Amiga offered vs its peers between 85 and 95 (better graphics, sound, color, multi-tasking). Post 1995-ish most peer platforms had approximately similar resolution, color depth, graphics speed and processing performance to anything derivative Amiga add-ons or upgrades were offering - and usually with more support and better prices. Worse, they didn't even offer much nostalgic appeal because new apps and OS were required - essentially making it little different than transitioning to an entirely new platform anyway.

    As Commodore disappeared into bankruptcy, the era when unique platforms could carve out a market was ending and fundamentally nothing Commodore (or its successors) could do would have done more than delay the inevitable. Platforms like the Amiga had shown the way to the future but eventually the baseline tide was catching up. The age of CISC CPUs and 15khz displays was at an end. Commodore didn't survive long enough to take a solid swing at a RISC machine with >31khz graphics and none of its descendants had anywhere near the resources to even make a serious attempt at anything which might have been uniquely better than current peers. Frankly, even Commodore didn't have the resources to spin a truly competitive new hardware platform with a bespoke OS ready to exploit it.

    Even Intel and Microsoft combined barely managed to eventually make the transition. Maintaining x86 ISA compatibility with microcode translation on top of RISC was an ugly and risky hack that almost didn't work (requiring heroic effort to salvage). After trying to do essentially the same with the 68060, Motorola gave up (perhaps wisely as they didn't even have Intel's process fabrication savvy to help hide the inevitable performance gap of emulating a CISC ISA on a RISC CPU). And on the OS side, it took Microsoft 8 years of iteration to eventually improve Windows to the point where it was really usable as a multi-tasking GUI OS.

    There was simply no way a vertically integrated computer company like a Commodore, Atari, Sinclair, etc could compete against a platform made by separate companies each specializing on one aspect: the CPU & chipset, OS, graphics or sound and then assembled by a manufacturing integrator. Leading edge desktop computers had grown incredibly complex and the 90s was peak Moore's Law acceleration, enabling immense gains for those able to move fast enough. No single company could compete. Keeping up required an ecosystem of companies. And if Commodore (or successors) had shifted to outsourcing all the components, they'd just be yet another low margin integrator like an Acer, Dell, Gateway, etc. And to be fair to Commodore, no one else made the transition either. Even giants like IBM, DEC, HP, SGI, Sun, Next all either gave up on desktop PCs or became low margin integrators (usually as a loss leader for their higher end hardware). Apple barely survived (and wouldn't have without acquiring a new OS from Next, Steve Jobs returning, a last minute $400M lifeline loan from Microsoft and quite a bit of luck).

    • KODust 159 days ago
      I would only add that the iPod is what saved Apple, not the Mac. NeXT’s software didn’t really have anything to do with it. Most Mac users stayed with Mac OS 9 until 2003-2004, because OS X wasn’t ready for prime time until then.

      In terms of Apple’s long term survival, everything Apple did between 1997 and the iPod introduction was just treading water. Steve had stopped the bleeding, but the Mac was still not setting the world on fire in terms of market share or profitability. It was on shaky ground until the Intel transition, coupled with the iPod halo effect, allowed people to feel safe buying Macs again.

      • markran 159 days ago
        > I would only add that the iPod is what saved Apple

        Yes, I agree. I just didn't want to get into a lot of detail in what was basically an aside to address that Apple's Mac platform did (sort of) survive. But as you observed, it was on very shaky ground and hardly a resounding success. Even today, the Mac business isn't as significant to Apple as iOS or services.

      • bmonkey325 159 days ago
        iPod was a halo effect. it gave people a reason to buy a mac. Just like Microsoft sold things to bundle together. Outlook is best when you buy it and use it with Exchange.
  • markran 162 days ago | parent | on: Novasaur CP/M TTL Retrocomputer
    It looks cool but I was confused by this line:

    > "Bitmapped Graphics: - Hi-res mode up to 416x240 with 8 colors and 4 dithering patterns"

    I'm familiar with dithering as a multi-pixel pattern usually done by software to create the appearance of shades in between the colors that individual pixels can display - at the cost of spatial resolution (because it requires multiple pixels for the pattern). So I was thrown by what "dithering patterns" might mean in the context of a line discussing hardware specs like individual pixel resolutions and colors. My guess would be maybe some kind of hardware mode that addresses more than single pixels but that sounds like tiles. Does anyone know?

    • ddingus 161 days ago
      We have a small expansion on that bit of detail:

      >The GPU also supports a text mode where the bytes of video memory alternate between a color byte and a code point representing a text character. The color byte is used with the second video DAC to represent two 8 color values for foreground and background. The text mode can also support a high res graphics mode with two pixels per byte of video memory.

      That explains a bit more. 2 pixels per byte and then here comes 416! That number is odd, meaning they have done some thing unique, IMHO.

      So we get 4 bits per pixel. 8 colors uses up three of them, so that last bit must combine pixel data in some novel way.

      4 dither patterns goes into 8 colors nicely, 2 colors per pattern.

      It could mean each pixel can be one of 8 colors with no constraints.

      Or, each pair of pixels is assigned a color, which would come from the 8 already defined. And maybe the order is toggled in some regular way.

      Oh, wait! Maybe he did what Woz did on the Apple 2!

            One byte has a data form like this:
            
            DD_CCC_CCC
      
            CCC = colors 0 through 7 for the even and odd pixel
      
            DD = dither patterns 0 through 3 which make use of the scanline counter and pixel order in the byte
      
            00 = pixels are assigned colors as given by their respective CCC fields.
      
            01 = pixel color order swapped on odd lines
            
            10 = pixel color order swapped on odd lines
            
            11= both!
      
      Looks like this:

            DD = 00
      
            Pixels go by color at all times
            
            0123456701234567 line 0
            0123456701234567 line 1
      
            A line looks like this:
           00_CCC_CCC 00_CCC_CCC 00_CCC_CCC 00_CCC_CCC 00_CCC_CCC 00_CCC_CCC 00_CCC_CCC 00_CCC_CCC 
      
      ...where the two zeroes are bits and the CCC fields are for the odd and even pixel in that byte.

      For brevity and clarity I am just going to write pixel color numbers 0 through 7 rather than all those bits.

            00
            01_23_45_67_01_23_45_67
      
      And I am on Mobile, so I will also ditch delimiters. I think it remains clear enough. Hope so!

            00
            0123456701234567
      
      Ok, here we go!

            DD = 01 -- Horizontal Dither Odd
            
            0123456701234567 line 0
            1032547610325476 line 1
            .
            .
            .
            
            DD = 10 -- horizontal dither even
            
            1032547610325476 line 0
            0123456701234567 line 1
            .
            .
            .
      
            DD = 11 Both odd and even dither!
            
            1032547610325476 line 0
            1032547610325476 line 1
            .
            .
            .
      
            Using just black and white pixels:
            
            00 
            🟈0🟃0🟈0🟃0
            🟈0🟃0🟈0🟃0
            🟈0🟃0🟈0🟃0
            🟈0🟃0🟈0🟃0
            
            01
            🟈0🟃0🟈0🟃0
            0🟃0🟈0🟃0🟃
            🟈0🟃0🟈0🟃0
            0🟃0🟈0🟃0🟃
            
            10
            0🟃0🟈0🟃0🟃
            🟈0🟃0🟈0🟃0
            0🟃0🟈0🟃0🟃
            🟈0🟃0🟈0🟃0
            
            11
            0🟃0🟈0🟃0🟃
            0🟃0🟈0🟃0🟃
            0🟃0🟈0🟃0🟃
            0🟃0🟈0🟃0🟃
      
      Frankly, if this is what they did, It is very expressive. I would get a lot out of a mode like this. Dithers can be pretty expensive either to compute dynamically, or in terms of stored images.

      These patterns can be applied on a byte basis! It is close to the next best thing if we can't have 16 colors.

      What I meant by "he did what Woz did" was put those high bits to good use. On the Apple, Woz shifted pixels a bit to deliver a 6 color high resolution graphics screen. I am sure you all have seen how expressive Apple artifact graphics can be.

      It is way more than what one can do on a 4 color screen, particularly given the pixels get fatter. One trades a lot of detail.

      Now, this scheme has those same attributes! Resolution potential remains high, just like the Apple high resolution screen. Nice, small pixels.

      But now the number of apparent screen colors goes way up! Those pattern variations will yield tons of fairly automated, consistent color impressions.

      Instead of a 8 color screen, or a 16 color one, it is as if more like 24 or even maybe 32 colors are available.

      This is all a big guess of course. But it is a somewhat informed one that takes TTL possibilities into account as I understand them.

      BTW, if this ran at TV NTSC frequencies, and offered 320 pixels per line, or offered some combination of pixels that repeats evenly into the NTSC color cycles, given that odd number of pixels, it would be gorgeous! Just saying.

      • markran 160 days ago
        Thanks for the detailed explanation. If that's what it is doing then it would indeed be interesting and pretty unusual for a modern day homebrew system. I too wondered if maybe the dither pattern referred to something like the Apple II's artifact colors but dismissed it because artifact colors require composite video output to work and there was no explicit mention of composite.
      • bmonkey325 159 days ago
        Here is a sample image from the project site. I can't tell if this jives with your explanation. Also, I am not sure if dithering is the same as artifcating which is what I used to see on Apple II and Atari Hi-Res games sometimes to get extra "colours" out of monitors and tv sets.

        https://hackaday.io/project/164212/gallery#0f87e94323e101952...

  • markran 162 days ago | parent | on: Atari ST turns 40 today
    Yeah, I tried to use Windows prior to 3.1 a few different times and never made it past the first five minutes. While Windows 3.1 was significantly better, I still bailed out after an hour. By the early 90s I just wasn't willing to slip that far backward compared to the more mature, complete and useful options I was familiar with. It wasn't until Windows 95 and the Pentium that I could adopt the PC as one of my main daily drivers.
  • markran 162 days ago | parent | on: Atari ST turns 40 today
    I too was an Amiga fan, moving from an 8-bit Radio Shack Color Computer to the Amiga 1000 in late 85. The price of the Amiga was significantly beyond my early 20s budget so I was only able to finagle one by finding someone who needed some software written on the Amiga and getting them to buy me one in exchange for writing what they needed. But such was the siren song appeal of Amiga's promise in mid-85. I spent months poring over every page, image and word in the "Launch" issue of AmigaWorld Magazine, which was in reality a cleverly disguised extended sales brochure that came out months before the computer itself was available for retail purchase.

    No teenager ever inhaled every inch of a Penthouse magazine in the detail I memorized that issue of AmigaWorld. It was truly computer porn in every sense, an airbrushed fantasy which significantly surpassed the reality of the computer that actually shipped for at least its first year on the market. There wasn't much you could do with a $2000 Amiga 1000 system in the first months other than run Boing, RoboCity and other demos ($2000 including the "optional" chip RAM upgrade (which was in reality required), RGB monitor and external 2nd floppy drive).

    The early launch applications like Graphicraft and Musicraft weren't quite complete enough to be useful for much real production work, largely because when the Amiga 1000 first shipped the paint was still quite wet on the operating system itself. Worse, Addison Wesley the publisher of the official developer docs for the Amiga took their sweet time actually printing and shipping the damn books, despite the fact it was only a re-layout of the docs Amiga supplied in Xeroxed form to early developers. Unable to wait any longer as my Amiga-purchasing benefactor needed their software, I drove three hours away late at night to the house of a developer who had the original Amiga docs and took them to an all-night Kinko's and spent the hours between midnight and dawn copying every page by hand (at night because the developer needed the docs by day because he was late on his own application for his employer). But... strangely, we loved the early Amiga 1000 anyway. We thought we were buying a Penthouse Pet but what showed up was an infant that did little other than cry all night and need its diapers changed :-). Thankfully, a year or so later the OS had matured enough and sort-of real tooling, apps, docs and source code examples (in the form of Fish Disks) started appearing enough that the Amiga's fantasy potential slowly started to become real.

    Dave's right that most Amiga-centric people from back in the day may still simply be incapable of assessing the Atari ST in a completely fair and balanced way. That's probably because we didn't assess the early Amiga in a fair and balanced way either (but in the other direction). So, he did a great job setting aside his own ingrained worldview to put the ST in it's rightfully deserved historical place.

    The ST really was a hell of a deal, and in all honesty, I could never have afforded an Amiga 1000 if I'd actually had to pay cash for it instead of labor. While the early ST certainly didn't deliver on its own promise either (due to its own serious teething pains), it did hold the promise of being the super cute, fun girl next door who could become your best friend and, if you were lucky, actually marry. It paled only when compared to the Penthouse Pet-fantasy promise of the Amiga, which the early Amiga certainly didn't live up to (and by a much larger margin than the ST), but which, somehow... eventually, the Amiga mostly managed to do - although largely in the form of the later A500 and A2000.

    • larsbrinkhoff 161 days ago
      To the teenage me, the Amiga 1000 looked gorgeous, but was an impossible dream due to its entry price. The Atari ST520 was affordable and still looked very nice.
  • markran 168 days ago | parent | on: Working on twostopbits' code
    Love it. And love this site. Never a day when I don't visit!
    • jgrahamc 166 days ago
      That's great. Glad you like it.
  • markran 190 days ago | parent | on: Leningrad-1: a 44-IC Soviet style ZX-Spectrum clon...
    The link above (https://alex-j-lowry.github.io/leningrd.htmlI%E2%80%99m) didn't work for me on Firefox x64 but taking the last few characters off does work: https://alex-j-lowry.github.io/leningrd.html
  • markran 198 days ago | parent | on: Why I went back to using a ThinkPad from 2012
    I'm still running a Samsung Note 20 Ultra which is coming up on five years old. I specifically bought the phone brand new over a year after it came out. I actually had to hunt to find a new-in-box unit. The reason is it's the last high-end Galaxy that has removable storage in the form of a micro SD card. I've replaced the battery but otherwise it looks and works great. I've looked at new flagship phones but they don't have any features I care about. They don't run apps noticeably faster, the battery life isn't noticeably better and the camera doesn't take meaningfully better photos. Yet I'd have to spend a few hours getting it all configured and then learning and dealing with the new model's inevitable quirks.

    One reason people who could buy anything are choosing older tech over the newest releases isn't nostalgia or to save money, it's because a lot of new tech products are regressing as useful features get removed to increase profit margins, enable some trendy style or new business model. Hell, it's getting hard to even buy a TV without built in "smart" features and advertising that can't be disabled.

  • More
lists | rss | source
Search:
Two Stop Bits is a discussion web site about retro computing and gaming.