- Fun thread from Tube Time on Bluesky: https://bsky.app/profile/tubetime.bsky.social/post/3lxxn23kn...
- This was interesting and really well written.
- Thanks! I somehow missed this being released. Looks like a good read.
- I'm not usually someone to knee-jerk hate on an AI-generated pic, but, really, what's the point of the one here? Why not just use a pic from IBM marketing materials that accurately depicts the device being discussed? Or not use a pic at all? It just seems lazy and pointless.
- Looks to be from 1987 (most likely); certainly not 1984.
- Even as a teenager in 1995, I was baffled by VRML. The web barely had interactivity, barely had commerce, barely had halfway-decent page layout! And yet VRML was supposed to be the "next big thing"??
It felt the way it would have felt if there'd been a massive industry push for smell-o-vision hardware and browser support for an odor-describing markup language. Like, how did so many people settle on that as the next thing to tackle?!
- I'd enjoy hearing from someone knowledgeable about Linux kernel evolution what they consider the big contributors to modern Linux taking up many times more resources than it did back when I ran early Slackware on a similar 486 with 12 MB of RAM. NOT so I can rant about how "modern software is bLoATeD because programmers are LaZy" (ugh), but I'm truly curious what has changed, because we usually tend to think of software growing over time for several reasons, some of which don't seem very applicable here:
1. Using a higher-level language/runtime that's better for programmer productivity but produces less efficient code — nope, still using C.
2. Decreased effort on optimization. I'm sure this is a factor, but I also wonder how applicable it is here. In my day job, I may carelessly use an exponential algorithm or an inefficient data structure because it's quicker to just get the job done, but my impression is that the kernel is still going to pay a lot of attention to such things. Not quite as much as 30 years ago, probably, but kernel engineers aren't just using bubble sort all over the place now.
3. Increased functionality. The modularity of Linux, especially the kernel, mitigates this somewhat: sure, a 2025 kernel supports thousands more devices, file systems, etc. than a 1995 one did, but you can compile most of those out. Still, my impression is that we end up with a kernel that is not just somewhat larger but many times the size of the old ones.
4. Tunables (or things that potentially could be tunable). This would actually be my best guess for where big easy wins could be had: code paths that allocate a 1 MB buffer when they could get by—and would have, years ago—with a few KB. On a system with GBs of RAM, larger allocations like that are probably advantageous, but they can really crush an old system (see the sketch below).
Likewise in userland: `ls` surely has more options now, but not _that_ many more, right?
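To make point 4 concrete, here's a toy C sketch — not real kernel code; the divide-by-256 scaling and the 1 MB floor are made-up numbers — of a default that's sized as "a fraction of RAM, but never less than some floor":

```c
#include <stdio.h>

/* Toy sketch, NOT real kernel code: the general pattern of sizing a
 * cache or buffer as a fraction of physical RAM, with a minimum floor.
 * The divisor (256) and the 1 MB floor are invented numbers, just to
 * show the shape of the problem. */
static unsigned long long default_buffer_size(unsigned long long total_ram)
{
    const unsigned long long min_size = 1ULL * 1024 * 1024; /* hypothetical 1 MB floor */
    unsigned long long size = total_ram / 256;               /* ~0.4% of physical RAM  */
    return size < min_size ? min_size : size;
}

int main(void)
{
    unsigned long long systems[] = {
        12ULL * 1024 * 1024,          /* a 486 with 12 MB of RAM */
        16ULL * 1024 * 1024 * 1024,   /* a modern box with 16 GB */
    };

    for (int i = 0; i < 2; i++) {
        unsigned long long ram = systems[i];
        unsigned long long buf = default_buffer_size(ram);
        printf("RAM %10llu KB -> default buffer %8llu KB (%.2f%% of RAM)\n",
               ram / 1024, buf / 1024, 100.0 * buf / ram);
    }
    return 0;
}
```

On the 16 GB machine that floor is invisible; on the 12 MB machine it's roughly 8% of all memory, and there are a lot of defaults shaped like that scattered around a modern system.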
- Define "Linux"? Kernel? Userland? Both?
The kernel has really only gotten bigger for security, the scheduler, file systems, and networking.
Command-line, server-only installs on a Raspberry Pi are surprisingly light on disk and RAM consumption. We expect IPv6 and a good file system that doesn't blow over in a power outage. I don't think twice about pulling USB-C power when things go sideways. In the 486 PC era? No chance. That was a moment to reacquaint myself with a deity.
Userland wants more. Some standards are followed. Some code is shared (good and bad). I want 4K HiDPI and fonts and colours on a Raspberry freaking Pi. That convenience costs in terms of storage, compute, and RAM. Compare to Mac or Windows and Linux looks anorexic.
- As the last line says: Where there's a bill, there's a key!
- I was surprised to read this part: "For now, FreeDOS 1.4 can't run Windows for Workgroups in enhanced mode, but can run Windows 3.1 in standard mode."
I know very little about the project, but I'd have guessed that ~100% DOS compatibility would've been achieved early on. There's just not very much to DOS! I'm sure there are reasons, of course; I'd be interested if anyone knows.
- Nice. Thank you for what you do making this site a fun place to visit.
- TBH, the image looks more kitsch than something an AI can generate.
Some days, I can be a couple of bytes short of a file.
Also, is it just me, or is that an illustration style you'd rather connect with the Apple II? Which may well be why the keyboard is placed like it is, and may explain the overall cutesy appearance.
The Atari art style is equally unmistakeable. I have a coffee table book devoted to their art...
Art of Atari book - I have the hardback and the iTunes digital edition: https://a.co/d/3X4mHWX
Could be wrong, though. AI is getting nearly impossible to detect in most tasks.
It is both a blessing and a curse that most LLMs and generative AI were not adequately trained on vintage computing materials.
(But, I have to admit, I still don't know what people were thinking, only how a few industry leaders reacted to this. – I was just a teenager, but, as I remember it, everything was coming to a standstill as potential customers were waiting for what IBM would come up with. Then there was some disappointment, but more importantly, after a small pause: "well, it's IBM, this is the industry standard, we'll go with this." At least, this is the impression I got from reading the magazines. I specifically remember arguments like: this is not a great architecture for an office machine, but it may make sense for things like process control; it's meant to be a general machine.)
Once the XT shipped and Lotus 1-2-3 became available, the latter became a killer app that was hard to beat.
It was more of a base you could build an actual system on, and, as it even lacked a serial port, you couldn't run even the simplest control tasks out of the box. It was more of what could be called a "smart backplane". (This is probably fine if your IBM salesperson comes to your office to make a custom bundle for your needs – and while no dime will be spent on an unused component, it will probably still be expensive – but it makes it particularly hard to sell this in any other way. Which gave rise to all those local PC bundling & packaging shops around the corner.)
I'm sure much consideration had gone into the concept, and there's probably some prehistory to this (IBM had several projects for a home or personal computer in the second half of the 1970s, none of which saw the light of day). Or was it just about showing the flag, like, "well, theoretically, we have a machine that could do all this, so please shut up (and we'll be happy to sell 1,000 units of this PoC)"? But the IBM PC is so taken for granted that it is (un)surprisingly hard to come by anything on its background.
(Of course, an LLM won't tell us anything about what may actually be interesting about this.)
But I'm not entirely convinced that IBM just didn't know how to do this. E.g., there's the Aquarius concept (1977)*, which had progressed to working production prototypes, and which would have come attractively packaged and with an app store based on bubble memory cards. (Apparently, this was canceled over concerns regarding the reliability of those bubble memory cards.) As a marketing concept, this would have been about 25–30 years ahead of its time – and it would have separated this neatly from any other IBM business. However, any such elaborate concept would probably have struggled in an organisation like this, where any move may endanger what has already been secured.
Maybe, the remarkable lack of context of the IBM 5150 was its internal selling point?
(Imaginary internal sales pitch: "See, this machine has no specs. We won't even say what it's for. We'll just tell them, the Little Tramp likes it, so you like it, too. No, it won't eat into mainframes.")
*) More about the IBM Aquarius (including photos) can be found in Paul Atkinson's book "Delete. A Design History of Computer Vapourware", Bloomsbury, 2013.
The RPi, I think, lives this ethos today (though not on cost): you can get an SBC which is just a basic compute unit and add on custom stuff to get to a solution - Wi-Fi, NVMe storage, solder and smoke and wiring goodness. Despite the Linux complexity, it's about as close to a retro experience as you can get today.
PS: One of the things I kind of don't get is this entire approach to acquiring an office PC, of going around like, "no, this person doesn't need a floppy drive; this person doesn't need a printer either, so no parallel port for them; this person may need 256 KB more, so give them at least 128 KB; well, this one requires at least a screen; etc." I don't think this was what customers expected deploying PCs to the office would look like, involving an entire requirements committee, fearful of spending either too little or too much. It would have been much easier, probably also cheaper (for all parties involved), and certainly more attractive to come up with just a few standard configurations and offload these onto everybody's desks, like an actual product. (Much like it was with the PS/2. But, then, IBM wasn't really into selling products.) – On the other hand, admittedly, it made the IBM PC a moving target, specs-wise, when it came to any competition.