- KODust (16 days ago): In casual conversation, yeah. I'll forgive it in classified ads and such; better to be precise with money on the line to prevent misunderstandings.
- bmonkey325 (18 days ago): Model M and Model F shipped with lots of IBM gear. Searching for docs on specific model numbers is probably more effective than searching by name. Another example is "ThinkPad": that covers generations of models, documentation, and features, some with Big Blue, some with Lenovo.
- Fun thread from Tube Time on Bluesky: https://bsky.app/profile/tubetime.bsky.social/post/3lxxn23kn...
- This was interesting and really well written.
- Thanks! I somehow missed this being released. Looks like a good read.
- I'm not usually someone to knee-jerk hate on an AI-generated pic, but, really, what's the point of the one here? Why not just use a pic from IBM marketing materials that accurately depicts the device being discussed? Or not use a pic at all? It just seems lazy and pointless.
- Interesting. I asked Grok and Chad Jippity, and those models believe it's human. The details and footnotes look human to me.
TBH, the image looks more kitsch than something an AI can generate.
- I think the keyboard labeling makes it pretty clear that it's AI.
- OOOF. Completely missed it. Yeah, that's AI. I also completely missed the comment about so many ways to enter a "6".
Some days, I can be a couple of bytes short of a file.
- I really learned from this that "6" seems to be the default character. (Interestingly, there is no 6 where you'd expect it, like in the number row or on the numeric keypad, indicating that "6" is not a number.)
Also, is it just me, or is that an illustration style you'd rather connect with the Apple II? Which may well be why the keyboard is placed like it is, and may explain the overall cutesy appearance.
- Agree. Looking at the old Byte archive(s) we got up on the site, the older Apple ads had a sketch-like quality to some of the cutaway drawings. It's why I thought it was period / familiar.
The Atari art style is equally unmistakable. I have a coffee table book devoted to their art...
The Art of Atari book. I have the hardback and the iTunes digital version: https://a.co/d/3X4mHWX
- The intro paragraph suggests AI writing too, so maybe the weirdo pic is a warning to stay away.
- I don't think the article is AI. I use AI to write a lot of research documents, with footnotes, and it never feels as casual as this writing, even if you prompt it that way. And I'm using all the SOTA tools.
Could be wrong, though. AI is getting nearly impossible to detect in most tasks.
- That was exactly my reaction. Feels like the entire article was just a listicle generated by an LLM with Deep Research.
It is both a blessing and a curse that most LLMs and generative AI were not adequately trained on vintage computing materials.
- What's not to like about a keyboard with so many ways to enter a "6"? ;-)
(But, I have to admit, I still don't know what people were thinking, just how a few industry leaders reacted to this. I was just a teenager, but, as I remember it, everything was coming to a standstill as potential customers were waiting for what IBM would come up with. Then there was some disappointment, but more importantly, after a small pause: "well, it's IBM, this is the industry standard, we'll go with this." At least, this is the impression I got from reading the magazines. I specifically remember arguments like: this is not a great architecture for an office machine, but it may make sense for things like process control; it's meant to be a general machine.)
- The original 5150 shipped with 16K and no floppy, just a cassette port. Lots of people upgraded with at least a single 160K floppy; 64K was possible, with a max of 256K. Apple had nothing like that at the time for any price.
Once the XT shipped and Lotus 1-2-3 became available, that became a killer app that was hard to beat.
- I think it was really an odd choice to do this: out of the box, it was really just a home computer with BASIC in ROM and a cassette port. But, clearly, it was also way too expensive for that. (The original sales flyer had an image of a happy family with the kid playing a space game on the home TV. It may be interesting to know how many tens of units IBM actually sold in this configuration, and for that particular use, in total.)
It was more of a base you could build an actual system on, and, as it even lacked a serial port, you couldn't run even the simplest control tasks out of the box. It was more of what could be called a "smart backplane". (This is probably fine if your IBM salesperson comes to your office to make a custom bundle for your needs, and while no dime will be spent on an unused component, it will probably still be expensive. But it makes it particularly hard to sell this in any other way, which gave rise to all those local PC bundling & packaging shops on the corner.)
I'm sure much consideration had gone into the concept, and there's probably some prehistory to this (IBM had several projects for a home or personal computer in the second half of the 1970s, none of which saw the light of day). Or was it just about showing the flag, like, "well, theoretically, we have a machine that could do all this, so please shut up (and we'll be happy to sell 1000 units of this PoC)"? But the IBM PC is so taken for granted that it is (un)surprisingly hard to come by anything on its background.
(Of course, an LLM won't tell us anything about what may be actually interesting about this.)
- I imagine it's some combination of IBM having no experience marketing to home users and wanting to be able to have a low "starting price".
- There may be something to this.
But I'm not entirely convinced that IBM just didn't know how to do this. E.g., there's the Aquarius concept (1977)*, which had progressed to working production prototypes and would have come attractively packaged, with an app store based on bubble memory cards. (Apparently, this was canceled over concerns regarding the reliability of those bubble memory cards.) As a marketing concept, this would have been about 25–30 years ahead of its time, and it would have separated this neatly from any other IBM business. However, any such elaborate concept would probably have struggled in an organisation like this, where any move may endanger what has already been secured.
Maybe the remarkable lack of context of the IBM 5150 was its internal selling point?
(Imaginary internal sales pitch: "See, this machine has no specs. We won't even say what it's for. We'll just tell them, the Little Tramp likes it, so you like it, too. No, it won't eat into mainframes.")
*) More about the IBM Aquarius (including photos) can be found in Paul Atkinson's book "Delete. A Design History of Computer Vapourware", Bloomsbury, 2013.
- Small world. A LOL for you: I used a PS/2 Model 50 for RT data acquisition in the late 80s, doing RT DSP signal analysis off a bio-amp. Northgate and Gateway had non-conforming DMA that the IBM implemented correctly.
- I think what Jerry Pournelle always said in his Chaos Manor column was true then and is still true today: “The computer you want always costs $5000.”
The RPi, I think, lives this ethos today (not by cost): you can get an SBC which is just a basic compute unit and add on custom stuff to get to a solution: WiFi, NVMe storage, solder and smoke and wiring goodness. Despite the Linux complexity, it's about as close to a retro experience as you can get today.
- I'm still curious what this machine was meant to be. One version is that it was about something that would enable small local tasks, like data entry, editing, etc., but would require some kind of IBM mainframe for any serious task, like actually processing or managing this data. Much like the IBM 3270 PC. (So, really, a front door to renting out mid-sized machines?) But in this conception, the PC would have soon been superseded by the XT, with its storage capabilities, running Lotus 1-2-3, and finally made obsolete by the advent of the 386 machines, which were perfectly able to run all of this locally. Are the latter even PCs, conceptually, or were they something new, but still something people could envision in the original PC (and maybe had expected from it all along)?
PS: One of the things I kind of don't get is this entire approach to acquiring an office PC, of going around like, "no, this person doesn't need a floppy drive, this person doesn't need a printer either, no parallel port for them, this person may need 256 KB more, so give them at least 128 KB, well, this one requires at least a screen, etc." I don't think this was what customers expected deploying PCs to the office to look like, involving an entire requirements committee fearful of spending either too little or too much. It would have been much easier, probably also cheaper (for all parties involved), and certainly more attractive to come up with just a few standard configurations and offload these onto everybody's desks, like an actual product. (Much like it was with the PS/2. But, then, IBM wasn't really into selling products.) On the other hand, admittedly, it made the IBM PC specs-wise a moving target when it came to any competition.
- Looks to be from 1987 (most likely); certainly not 1984.
- Even as a teenager in 1995, I was baffled by VRML. The web barely had interactivity, barely had commerce, barely had halfway-decent page layout! And yet VRML was supposed to be the "next big thing"??
It felt like how it would have if there'd been a massive industry push for smell-o-vision hardware and browser support for an odor-describing markup language. Like, how did that get settled on by so many as the next thing to tackle?!
- I'd enjoy hearing from someone knowledgeable about Linux kernel evolution what they consider the big contributors to modern Linux taking up many times more resources than it did back when I ran early Slackware on a similar 486 with 12 MB of RAM. NOT so I can rant about how "modern software is bLoATeD because programmers are LaZy" (ugh), but I'm truly curious what has changed, because we usually tend to think of software growing over time for several reasons, some of which don't seem very applicable here:
1. Using a higher-level language/runtime that's more efficient for programmer productivity but that outputs less efficient code — nope, still using C.
2. Decreased effort on optimization. I'm sure this is a factor, but I also wonder how applicable it is here. In my day job, I may carelessly use an exponential algorithm or an inefficient data structure because it's quicker to just get the job done, but my impression is that the kernel is still going to pay a lot of attention to such things. Not quite as much as 30 years ago, probably, but kernel engineers aren't just using bubble sort all over the place now.
3. Increased functionality. The modularity of Linux, especially the kernel, mitigates this somewhat: sure, a 2025 kernel supports thousands more devices, file systems, etc. than a 1995 one did, but you can compile most of those out (a rough sketch for putting a number on this follows after this comment). Still, my impression is that we end up with a kernel not just somewhat larger but many times the size of the old ones.
4. Tunables (or things that potentially could be tunable). This would actually be my best guess for where big easy wins could be had: code paths that allocate a 1 MB buffer when they could get by—and would have, years ago—with a few KB. On a system with GBs of RAM, larger allocations like that are probably advantageous but they can really crush an old system.
Likewise in userland: `ls` surely has more options now, but not _that_ many more, right?
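For point 3, if anyone wants to put a rough number on it, here's a minimal Python sketch that just counts enabled `CONFIG_` options in two kernel `.config` files. The file names are hypothetical placeholders; point it at whatever old and new configs you have lying around.

```python
#!/usr/bin/env python3
# Rough sketch: compare how many CONFIG_ options are enabled (built-in or
# module) in two kernel .config files. The file names below are hypothetical.

def enabled_options(path):
    opts = set()
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Enabled options look like "CONFIG_FOO=y" (built-in) or "CONFIG_FOO=m" (module)
            if line.startswith("CONFIG_") and (line.endswith("=y") or line.endswith("=m")):
                opts.add(line.split("=", 1)[0])
    return opts

old = enabled_options("config-2.0.36")  # e.g. a mid-90s Slackware kernel config
new = enabled_options("config-6.8.0")   # e.g. a current distro kernel config

print(f"old config: {len(old)} options enabled")
print(f"new config: {len(new)} options enabled")
print(f"enabled only in the new config: {len(new - old)}")
```

It only measures option counts, not binary size, but it gives a feel for how much more is switched on by default in a distro kernel versus what you'd actually need.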
- Define Linux? Kernel? Userland?
The kernel has really only gotten bigger for security, the scheduler, file systems, and networking.
Command-line, server-only installs on a Raspberry Pi are surprisingly light on disk and RAM consumption (a quick RAM check is sketched after this comment). We expect IPv6 and a good file system that doesn't blow over in a power outage. I don't think twice about pulling USB-C power when things go sideways. In the 486 PC era? No chance. That was a moment to reacquaint myself with a deity.
Userland wants more. Some standards are followed. Some code is shared (good and bad). I want 4K HiDPI and fonts and colours on a Raspberry freaking Pi. That convenience costs in terms of storage, compute, and RAM. Compared to Mac or Windows, Linux still looks anorexic.
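For the RAM side of that claim, here's a quick Python sketch: it just reads the standard fields from /proc/meminfo on a running box (Pi or otherwise), so the "in use" figure is only approximate, not a precise accounting.

```python
#!/usr/bin/env python3
# Quick check of RAM use on a running Linux box, e.g. a headless Raspberry Pi.
# Reads the standard /proc/meminfo fields; "in use" here is only approximate.

def meminfo_kb(field):
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1])  # values are reported in kB
    raise KeyError(field)

total = meminfo_kb("MemTotal")
available = meminfo_kb("MemAvailable")

print(f"total RAM:       {total / 1024:.0f} MiB")
print(f"still available: {available / 1024:.0f} MiB")
print(f"roughly in use:  {(total - available) / 1024:.0f} MiB")
```

On a server-only install the "roughly in use" number tends to stay small; add a desktop stack and it climbs fast, which is the userland point above.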
- As the last line says: Here there's a bill there's a key!
- I was surprised to read this part: "For now, FreeDOS 1.4 can't run Windows for Workgroups in enhanced mode, but can run Windows 3.1 in standard mode."
I know very little about the project but I'd have guessed that ~100% DOS compatibility would've been achieved early on. There's just not very much to DOS! I'm sure there are reasons, of course, would be interested if anyone knows.
- For example, I never heard anybody back in the day refer to the "5160"; it was always "PC XT" or just "XT". (The 8514[/A] being the big exception!) But I think newcomers to the scene these days would think the opposite. It seems to me that this is a quite recent trend, too. I wonder how that happened!
Anyway, this was a really odd little machine. Such a glimpse of the PS/2 design language that was soon to come. Doubly so if you had the CRT for it: https://www.globalgaragesale.net/item/156173/vtg-ibm-5140-pc...
Playing King’s Quest on its squat little internal screen was … an experience. I imagine it was better at running 1-2-3 but I can’t say I did much of that back in those days!