- Cool video showing this off : https://www.youtube.com/watch?v=gvT9-ZfW1Iw
- I feel your pain. Allegedly Canada is metric too. But some things hold over from the dark times. Housing. Food. Personal weight. Inside temperature is imperial. Outside temperature is metric.
- My first Unix was BSD/386 (as a product of the University of California) but my first job out of college was HP-UX on a K250. Later on they got an L-class. I still miss the PA-RISC.
There's a 9000/350 here running 8.x, a couple Galaxy 1100 systems running 10.10, a 425t with a PA-RISC upgrade that needs something to do, an RDI PrecisionBook 160 (B160L) running 11 and a C8000 also with 11. And a beat-up K260 I/O card on the wall as a memento. :)
- A link to the original page with the actual explanation would've been nice; all the osnews 'commentary' adds is cringe.
I mean, I'm in a metric country, but this is the sort of thing that makes me want to start quoting disk sizes in Roman stadia or maybe Egyptian royal cubits, just to watch this real-life 'Ackchyually' Guy froth about it.
- I can relate. I certainly wasted money indirectly by missing lectures playing trade wars in university.
- Native is always best, but developing natively means your dev env - assembler/compiler/debugger - must live in the same address space as your target program. Things were super tight when your assembler/monitor and your code had to live in 8-16K total.
- I spent several hundred dollars of my saved up birthday and summer job money to play a war or two when I was a late teenager.
- Yeah, I enjoyed watching that more than maybe I should have.
- Thanks. I reposted with the fix. All I can do is delete this bad post.
- That was fun. Had that old training film feel. I remember megawars on compuserve but with the meter running there was no chance for teenage me to play. Just too much money.
- The URL in the link has a typo, i.e. a spurious trailing double quote " character. The correct URL is https://yeokhengmeng.com/2024/08/dos-on-thinkpad-x13-gen1
- TIFF was ahead of its time. Future proof. First kind of container format I ever encountered. Which was both good and bad: if you didn't have the code to handle a particular extension, you got nothing. I remember attempting to write a decoder - sometimes I would get images and sometimes I got nothing. Was it a bug, or was my code just too primitive to be useful? I gave up. In the old days it wasn't like you could just google for a new compression algorithm to learn about it. If your library or your personal collection of books and magazines didn't have a lead, you were out of luck.
- HP-UX was my first Unix, back in the early 1990s. Unfortunately I didn't have access to any fancy workstation, just a dumb terminal, and it looked really primitive compared to the Atari ST I had at home - until I learned to appreciate it.
- > A lot of this software was all developed on a timeshare and cross compiled to a target machine.
True. And sort of by definition, the 1.0 version of most new computer platforms' operating systems had to be initially brought up on earlier computer systems. For example, the earliest versions of Amiga system code, including the famous Boing ball demo, were created on text terminal-based computers from the little-known, short-lived Sage Computer Corp, due to the Sage being 68000-based and already having a C compiler and 68K assembler ported over to it. So it wasn't chosen for cross-compiling per se; it was chosen simply because it was already working. When you're writing the low-level code to read the keyboard and write to the display and disk, it's much easier to do with a working keyboard, display and disk drive. As for how the ENIAC, UNIVAC and other OG computers of the late 40s and early 50s were brought up without any predecessors to lean on, I think those devs were simply made of sterner stuff and bootstrapped their code into the first hardware by plugging tubes, swapping wires and flipping switches. :-)
While cross-compiling from existing, big iron systems was a common pattern in the earliest days of new platforms, after a computer was available a lesser-known pattern occurred around third-party and hobbyist software development before reasonably priced native development tools became available to end consumers (which could take 12 or 18 months). For example, my first assembly language code was written using a tiny monitor/debugger tool I ordered from a classified listing in a small magazine for $10 and received on audio cassette tape. It wasn't an assembler, because it didn't read or write files and couldn't handle labels or macros, but it could turn bytes in memory into assembler mnemonics and vice versa - and for a poor teenager, that was enough to get started.
Fortunately, most 8-bit home computers were quite capable of running a simple assembler from ROM or floppy. My first native assembler was written in the computer's ROM BASIC, and it was every bit as primitive as it sounds. As bad as it was, it could still be a tempting option for cash-strapped hobbyists and students because mini-computers and cross-compilers were very expensive and the early "professional grade" native assemblers could cost more than the whole computer. So us poor 'platform-native' hobby devs worked with cheap, minimalist tools cobbled together by other platform-native devs. It was slow and painful but it (mostly) worked. On low-res 80s home computers like the Atari 800, C-64 and Radio Shack Color Computer, some developers attached an external terminal via serial cable just to get 80 x 24 text and a better keyboard, since that made code editing less painful than using the 32 or 40 column native text display on a television.
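The byte-to-mnemonic half of a monitor/debugger like the one described above is essentially a table lookup. This is just an illustrative sketch (not any specific historical tool), using a few real 6502 opcodes; a real monitor covered the full opcode map and every addressing mode:

```python
# Each opcode maps to a printf-style mnemonic template and an operand count.
# 0xA9 = LDA immediate, 0x8D = STA absolute, 0x60 = RTS (real 6502 opcodes).
OPCODES = {
    0xA9: ("LDA #$%02X", 1),
    0x8D: ("STA $%02X%02X", 2),
    0x60: ("RTS", 0),
}

def disassemble(mem, pc=0):
    """Turn raw bytes in `mem` into one mnemonic line per instruction."""
    lines = []
    while pc < len(mem):
        fmt, nops = OPCODES[mem[pc]]
        operands = mem[pc + 1 : pc + 1 + nops]
        # The 6502 stores addresses little-endian, so print the high byte first.
        lines.append(fmt % tuple(reversed(operands)))
        pc += 1 + nops
    return lines

# LDA #$41 / STA $0400 / RTS
print(disassemble([0xA9, 0x41, 0x8D, 0x00, 0x04, 0x60]))
```

The "vice versa" direction (mnemonics back to bytes) is the same table walked in reverse, which is why these tools fit in a couple of kilobytes.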
- > So, if that’s 2K, what’s the real, DCI 2K, then?
Exactly! It was so bizarre when they started doing that. It seems like they went from the "#K" being based on the vertical resolution (sort of) to it being (vaguely) based on the horizontal resolution - except continuing to use both inconsistently. And 4K became widely used to refer to 3840, so it doesn't even mean DCI 4K (which is actually 4096).
- Nice! Brings back great memories of the first time I saw a Lisa. It was in a regular computer store in early 1983, shortly after the Lisa went into wide retail release. Playing with the Lisa for a few minutes was one of those powerful epiphanies that had tremendous impact on me. It happened to arrive at the perfect time in my development because I was ready to understand what it meant.
I was just around two years into owning and using my first computer, which had a sub-1 MHz, 8-bit CPU and 4K of RAM (though I'd upgraded it to 16K by this time). I'd taught myself to program in BASIC from the manuals and by modifying program listings I'd typed in from magazines. Then I taught myself assembly language and was just starting to run into the harsh limits of color, resolution and speed when I saw the Lisa.
The Lisa was not only my first experience with a WIMP interface (Windows, Icons, Menus, Pointer), it was my first awareness of the concept - and it instantly expanded my horizons of what a computer could be, unleashing a torrent of new ideas, possibilities and questions. That night I went home and started writing a simple windowing graphical user interface on my 8-bit computer. For several months I spent all my spare time working on it. And as my first extensive 'from scratch' assembly language program that actually tried to do hard things (instead of just being a quick demo or code test), I learned an enormous amount - from making my own bitmap font and text renderer to optimized line drawing and synchronized screen refresh. I was determined to recreate a passable version of the 'zooming' window outline and eventually nailed it. Interestingly, the few minutes I spent with the Lisa were apparently enough to "get it", because I never went back to see it again in the following months, despite the store being quite close by.
The other thing I remember is the $8,000 price tag being essentially incomprehensible to me. It was by far the most expensive computer I'd ever even heard of, much less seen. That was as much as a fairly nice new car! The computer I had was the cheapest available at $400. Computers I wished I could get (the Atari 800 and C64) were $600-$700. The most expensive computers I was specifically aware of were $1500-$2000 (Apple II / IBM PC). $8000 was hard for my teenaged self to even process.
- 4K, like in resolution? I would argue that 2K is the worst offender. In a move of historical revisionism, 2560x1440 was rebranded by manufacturers as 2K, a decade after the fact, or so.
So, if that’s 2K, what’s the real, DCI 2K, then?
- Sad to see Stephen go. We could perhaps have convinced him to share more of his story, if we had time 💔
- Yes, it was quite different here in the U.S. Most consumer Amiga users in the U.S. didn't really know it was different over in Europe but I did - and I was jealous!
As a regular advertiser I got copies of the larger European Amiga magazines in the late 80s and early 90s which weren't available here in the U.S. outside of a small number of specialized places in major cities. A typical copy of Amiga Format Magazine wasn't just much thicker than U.S. Amiga publications, the look, feel and tone was also dramatically different. Being so consumer and games focused made it incredibly vibrant and energetic. You can feel the difference just comparing online PDFs of early 90s AmigaWorld and Amiga Format magazines. Sadly, I never got to see the UK/Euro scene first-hand back in the day. By the time I began regularly visiting Europe, Commodore was circling the drain and the Amiga scene was winding down.
In the U.S. the most active (and profitable) parts of the Amiga community were focused on the 2000, 3000 and 4000 models and tended to be more "creative professionals", prosumers, serious high-end enthusiasts and even academics using it as a lower-cost workstation alternative to Sun/Apollo/SGI. My Amiga users group had members from CalTech, NASA/JPL, and even a few "can't talk about how we use it" defense contractors like Lockheed.
While the U.S. consumers who used the Amiga purely as a home computer certainly played games enthusiastically, they were often equally interested in digital art, graphics, desktop publishing and/or computer programming. Thanks to products like the Video Toaster, Amigas were used for film and video production, computer graphics and 3D rendering by TV stations and movie studios. Steven Spielberg's production company even had an Amiga rendering farm producing all the visual effects for a prime-time network TV show starring top celebrities. Today, quite a few of the 'gray beards' around Hollywood visual effects and TV production have Amiga roots because in the late 80s they were the disruptive 'young turks' pushing newfangled desktop production techniques.
- Eh, I think this one’s just a mistake on my part, but you can’t edit the URL after posting. Sorry about that!
- I'm generally okay with calling out AI posts but it has to be really clear and obvious. Especially in a niche topic forum like this. AI slop farms hunting for page views have little incentive to post in such a low-traffic forum. Plus we know our revered TSB overlord BMonkey is a legit retro enthusiast.
As someone else mentioned, AI proofing for non-native English speakers is a thing which can be necessary. If you're, say, a native Farsi speaker who's super into vintage retro computing, you probably don't have a lot of Farsi-language options (and if you are a Farsi-speaker, I'd love a post on 70s/80s computers in Persia - and I won't mind if it's AI-assisted). Here on TSB, I'd say even if it clearly looks AI written, it needs to also be inaccurate, trite or off-point to be called out. Your previous AI call out was spot on, this one... not so much.
- The gist of this article is something overlooked about the dawn of microcomputers: a lot of this software was developed on a timeshare and cross-compiled to a target machine. Atari CoinOp ran on a VAX 11/780, and the home division used a DG MV8000 or MV10000 to cross-compile on. Far from my teenage vision of game devs in cubicles on Atari 800s writing the next awesome game.
Thanks @starac for sharing.
- Grok, ChatGPT and now Gemini say no. LinkedIn is legit - maybe helped by LLM, sure. As an ESL speaker I use LLMs to help my writing. When I drop articles or use idioms incorrectly, I'm not helping our community. My Quebec keyboard often leads to humorous tags that must make our site overlord do a facepalm.
- Absolutely nothing there was written with AI. Unless you count spell checks as AI.
- This is quite the achievement. With more time you can always do a better job. It also helps to have a prototype, so you can see the technique used previously and take a better path.
Ports of games were often fixed-price with a hard deadline to deliver. Missed deadlines often brought penalties or nonpayment altogether, causing perfect to be the enemy of the good.
- 90 mm is ~3.543 inches; round that and you get 3.5, which seems fine for discussion in the vernacular. Certainly not as ridiculous as 4K (resolution).
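For anyone who wants to check the rounding claim, 25.4 mm per inch is exact by definition, so the arithmetic is trivial:

```python
# 90 mm "3.5-inch" floppy width check; 25.4 mm/inch is exact by definition.
mm = 90
inches = mm / 25.4
print(round(inches, 3))  # 3.543
print(round(inches, 1))  # 3.5
```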
- Released yesterday, this video explains how the breakthrough sprite scaling hardware in Sega's 1986 arcade driving game OutRun was fully emulated on an Amiga in highly optimized assembly language running at 30 fps in 256 colors. The author released his finished OutRun arcade port a couple months ago for free download, along with commented source code. While there was an officially licensed Sega OutRun Amiga port back in the 80s, it was extremely disappointing compared to the arcade original. This homebrew port of the game is vastly superior and nearly identical to the arcade - which is quite the trick.
This is the third video in a series explaining how this remarkable performance optimization was achieved. Each video stands alone but the first two are also linked along with the game, source and info on how to run it via emulation if you don't have your Amiga handy. The videos are quite accessible to non-programmers and they also show how the original arcade hardware worked, which is interesting as it was the first in Sega's line of "Super Scaler" hardware which soon became legendary in top arcade titles of the late-80s and early 90s (https://sega.fandom.com/wiki/Category:Super_Scaler_games).
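None of this is the port's actual Amiga assembly, but the core idea behind hardware sprite scaling is simple to sketch: for each destination pixel, sample the source pixel at `dst / scale` (nearest-neighbor row/column duplication). A minimal Python illustration, assuming a sprite as a 2D list of palette indices:

```python
def scale_sprite(sprite, factor):
    """Nearest-neighbor scale: each destination pixel samples src[dy/f][dx/f]."""
    h, w = len(sprite), len(sprite[0])
    out_h, out_w = int(h * factor), int(w * factor)
    return [
        [sprite[int(dy / factor)][int(dx / factor)] for dx in range(out_w)]
        for dy in range(out_h)
    ]

sprite = [[1, 2],
          [3, 4]]
for row in scale_sprite(sprite, 2):
    print(row)
# [1, 1, 2, 2] / [1, 1, 2, 2] / [3, 3, 4, 4] / [3, 3, 4, 4]
```

The hard part on real hardware (and in the Amiga port) isn't the mapping itself but doing it fast enough for dozens of sprites per frame; Sega's boards did it with dedicated line-stepping logic, while the Amiga version relies on heavily optimized assembly.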
- Holy. That’s inane. But completely on brand for IBM.
- Thanks for posting that direct link. The original post's link took me to other older posts by that profile's owner and I couldn't find the relevant one. Maybe something to do with needing to be logged in to that site or having different post sorting settings than default.
The issue of "what I see at a link, isn't what you see at the same link" seems to be getting worse, especially on social media. And a lot of sites no longer even care enough to provide an explicit Permalink.
- More:

https://products.vmssoftware.com/vmsxde