- Great! Maybe add the controls and gameplay description to the hosted page, so that users may learn how to interact?
- You can play it in online emulation here:
1) POLF, PET 2001N with graphics keyboard (turn left/right is actually "," and ";"):
https://www.masswerk.at/pet/?prg=polf&rom=4&ram=16k&...
2) POLF, PET 2001/B with business keyboard:
- bmonkey325: Thanks for this. After playing this for a bit, it feels 3D. I thought my mind was filling in from Wolfenstein while watching the video. But live: different story. Legit. Cool. 😎
- The link "Punched-Card Typography" at the bottom really should lead to the original site, which, besides interactive animations [1][2], includes a font editor [3].
[1] https://www.masswerk.at/misc/card-punch-typography/
[2] https://www.masswerk.at/misc/card-punch-typography/part2-ibm...
[3] https://www.masswerk.at/misc/card-punch-typography/editor.ht...
- Notably, this was already much the same situation for the early "professional GUI", like the Xerox Star or PERQ (and, to a certain extent, the Lisa, aiming at the low end of that market and centering on general productivity). In the early 1980s it was really too early for this, without major investments in opening up that market in the first place. (In other words, it was really about the imaginary of what an organization or business was and how it should operate. As it turned out, nothing in this was self-explanatory.) At this stage, Jobs is clearly hoping for that market to develop its own momentum, as organizations were becoming aware of the technology. Nevertheless, NeXT still failed, like its predecessors.
I guess, in the end, nothing substantial really became of that market, with evolving capabilities from the general productivity market eventually swallowing that segment. Which is also pretty much the story of NeXTSTEP's afterlife in OS X. Notably, and not entirely without irony, the professional GUI was a major prerequisite for this development kicking off, for having commercial GUIs and powerful general productivity appliances at all, but it was still pretty much doomed to fail, right from the beginning.
- Just adding to your statements …
It was too damned expensive and didn't do colour. I saw the demo and thought holy #@$&! But even a Mac IIfx was about half the cost, had colour, and had "pro" software. PCs were half of that and ran Word and WordPerfect (and those janky keyboard templates) or 1-2-3 and did what business wanted. AutoCAD and Pro/ENGINEER were about the most pro apps on PCs that anyone ran.
- > My father had switched from Microsoft GW BASIC to Microsoft QuickBASIC around this time on the Olivetti M24. This did away with the need for line numbers and introduced named locations you could GOTO or GOSUB from.
This inspired me to add a QB preprocessor to the PET 2001 emulator (https://masswerk.at/pet/).
Since I don't want to be greedy or overly partial to the PET, here is, for all friends of other systems, a tiny stand-alone QBASIC-to-BASIC transformer (it should be agnostic of any particular dialect):
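The transformer itself isn't reproduced above. As an illustrative stand-in (not masswerk's actual code; the function name, line-number scheme, and label syntax are my own assumptions), here is a minimal sketch of the core idea: collect `label:` lines, assign line numbers, then rewrite `GOTO`/`GOSUB label` targets to those numbers.

```python
import re

def qb_to_basic(src, start=100, step=10):
    """Translate labeled QuickBASIC-style source to numbered BASIC:
    a line consisting of 'name:' marks the next statement as a jump
    target; GOTO/GOSUB name is then rewritten to that line number."""
    lines, labels = [], {}
    n = start
    for raw in src.strip().splitlines():
        line = raw.strip()
        m = re.match(r'^(\w+):$', line)
        if m:                               # label: remember next line number
            labels[m.group(1)] = n
            continue
        lines.append((n, line))
        n += step
    out = []
    for num, line in lines:
        # replace symbolic targets (assumes every target label exists)
        line = re.sub(r'\b(GOTO|GOSUB)\s+([A-Za-z]\w*)',
                      lambda m: f"{m.group(1)} {labels[m.group(2)]}",
                      line)
        out.append(f"{num} {line}")
    return "\n".join(out)

print(qb_to_basic('''PRINT "HELLO"
loop:
PRINT "WORLD"
GOTO loop'''))
# 100 PRINT "HELLO"
# 110 PRINT "WORLD"
# 120 GOTO 110
```

A real transformer would also need to handle labels inside compound (colon-separated) statements and numeric targets, which this sketch deliberately skips.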
- Neat stuff!
I like your PET emulator and tried it in Firefox, but had trouble (the keyboard buttons didn't seem to work), just in case that's news to you.
- Hum, I'm actually developing and testing this in Firefox first. – Maybe a case of bit gremlins in the wire? (Try a force reload: CTRL/Command+SHIFT+R.)
Also mind that there are certain differences when using "Edit mode" and "Games mode". (E.g., in edit mode the shift keys are sticky, in games mode they are not.) Moreover, in games mode, the CAPS LOCK key of your physical keyboard acts as a toggle for virtual joystick and numeric key block mappings.
- That's mysterious, as it doesn't work in Firefox at all for me. In Chrome, after it says "READY", I can use the keyboard below it to type things.
On Firefox I get the "READY" but am then unable to use the keyboard or the drop-downs. When I click on a drop-down and wait for half a minute it might finally show it, but it's unusable. Firefox also regularly comes up with a warning that the page is slowing down Firefox, and the browser indeed seems to be hogging a single core.
This is in Firefox 126.0 on Fedora 39.
[update: I tried with safe-mode to disable any extensions but this doesn't make a difference]
- Hum, this sounds like some kind of memory pressure.
(This is hard to overcome: the emulator has to render at a 60 Hz duty cycle while emulating a CPU at 1 MHz. If sound is enabled, we also have to sample sound at 1 MHz and resample this to 48 kHz digital audio.)
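To illustrate the kind of work this implies (this is not the emulator's actual code; the averaging approach and all names are my own assumptions), here is a naive sketch of decimating a high-rate sample stream to a lower audio rate:

```python
def resample(samples, src_rate=1_000_000, dst_rate=48_000):
    """Naive decimation: average each src/dst-ratio window of the
    high-rate stream into one low-rate output sample. The window
    boundary is tracked as a float, since 1 MHz / 48 kHz is not an
    integer ratio (about 20.83 input samples per output sample)."""
    ratio = src_rate / dst_rate
    out, i = [], 0.0
    while int(i + ratio) <= len(samples):
        window = samples[int(i):int(i + ratio)]
        out.append(sum(window) / len(window))
        i += ratio
    return out

# toy example: a square wave at a 10 kHz "source" rate, decimated
# to 1 kHz -- one output sample per ten input samples
wave = ([1.0] * 5 + [-1.0] * 5) * 1000
print(len(resample(wave, src_rate=10_000, dst_rate=1_000)))  # 1000
```

Doing this for a full 1 MHz stream every frame, on top of the CPU and video emulation, is where the sustained load comes from; a production emulator would use a proper low-pass filter rather than plain window averaging.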
There seems to be some serious misconfiguration at play, though. For me, this runs quite perfectly on the oldest modern machine I have around, a 2008 Mac Pro with the original ATI 256 MB graphics card, under Firefox 78.15.0esr. This is probably puny in comparison to your setup.
I guess inadequate hardware acceleration may cause similar symptoms. I've seen this in the past (and in a different context) with Chrome on older machines, where the frame rate drops to something like 2 or 4 fps, while it keeps up perfectly with hardware acceleration disabled. It may be worth a try; maybe FF and your GPU driver do not play well together. Compare: https://support.mozilla.org/en-US/kb/performance-settings
- I have to admit, I don't see computers and Andy Warhol as the perfect match I have seen it suggested to be. Warhol's approach to serialization built on reproduction as an industrial process, and on it being an industrial process almost exclusively. Graphical computers suddenly made this accessible to individuals. From then on, there was really no difference between a single copy and serial reproduction, as exposed by what became known as the Desktop Publishing revolution. Which pretty much led to Warhol falling out of favor.
(My impression of that Amiga demo is really more that of Warhol being lost, and realizing that he was going to lose his claim to the machine. Like any great artist, Warhol was reflecting the cultural and productive conditions of his time, and this time was coming to an end, as the gap between the kind of imagery corporations could produce and what was available to individuals was closing.)
- I liken it to the “bicycle for the mind”, it enables more people to do great things - but in the hands of a skilled master it can be quite something.
At one time artists could only make one of something. Later, with 4-colour processes, you could make a lot of something, so that more people could have and enjoy the creation. With digital, even more could enjoy something, created by a master or not.
- Idk, given that Warhol's mastery was in reproducing – and elevating – the kind of blunt impression that serial corporate imagery was dumping ubiquitously onto the public, the fact that everyone could now produce a false-color image in solarized aesthetics (previously a complex process, involving various crafts and a certain amount of sloppiness) by three clicks anywhere on the screen, and send it off to serial reproduction by another click, was not great news.
To me, the demo has something morbid about it. It's a bit like demoing a generative AI interface with Hayao Miyazaki, showing how everybody can now impersonate his style – and everything that had been sacred to him, like the combination of craftsmanship and imagination to capture a personal expression of a fragile world – by typing a few careless words into a chat prompt. ("Totoro in a vintage seaplane, Miyazaki, award winning, high resolution.")
- I see it in a way, but you still had to have an eye for composition and style. There are still videos of people wielding MS Paint to produce art that awes me, something I could never do.
In comparison, the era of DALL-E or Midjourney feels like a cheat code.
- I guess it's still somewhat similar. For example, producing a posterized image previously involved separating and clamping brightness ranges using orthographic film and expertly judging exposure time, and masking these masks by other masks, or tracing images on masking film, before you could even think of preparing the actual printing process. Notably, both methods involved an interpretation of the image, aesthetic judgement, and expertise, guided by experience of how this would behave robustly in a printing process. Now there is a "posterize image" dialog, which does the same by a single click. – The modern approach is hard to control, though, which may be one of the reasons we don't see that aesthetic much anymore. It has practically vanished from common experience. The impact of "easy to use" technology on our culture and shared imagination is not to be underrated, and may not work out the way we may have expected.
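For what it's worth, the single-click version reduces to almost nothing. A minimal sketch (my own illustration, not any particular image editor's algorithm; the function name and four-level default are invented) of what a "posterize" dialog does to a channel of 0-255 brightness values:

```python
def posterize(pixels, levels=4):
    """Quantize 0-255 brightness values into `levels` discrete bands --
    the one-click equivalent of separating and clamping brightness
    ranges on orthographic film."""
    step = 256 // levels
    return [(p // step) * step for p in pixels]

print(posterize([0, 60, 130, 200, 255]))  # [0, 0, 128, 192, 192]
```

All the interpretation, judgement of exposure, and masking that the darkroom process demanded has collapsed into a single integer division.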
- Author here. – If you know any PET 2001 programs written for the Japanese character ROM, please let me know. (It may be nice to have a demonstration program for the emulator.)
- Interesting! I always wondered what happened with really constrained 8-bit platforms like this when they tried to make them work for complex logographic languages with thousands of characters.
Also interesting that they chose katakana over hiragana, but perhaps the choice was because katakana is more readable at pixel resolutions? I don't know enough Japanese to know if it was based on a technicality of the language.
I don't think you mention it, but three of those characters are technically kanji -- the ones for day/month/year that are included.
Did you get a copy of the actual Japanese ROM or did you have to retro-convert a Western ROM?
- Regarding the character ROM, this can be found at Bob Zimmers' great Commodore 8-bit archive, see: https://www.zimmers.net/anonftp/pub/cbm/firmware/computers/p...
Regarding kana/kanji – well, my terminology may be fuzzy. But the emulator has "kanji" in the menu…
Regarding katakana vs hiragana: I guess the simpler strokes were one of the reasons. Moreover, it seems that in daily use it can be distilled down to more or less the size of the Western alphabet. While there are 46 syllabograms in use, 8-bit computers managed to get away with about half of that. (The Sharp MZ-80 is another example.)
For fun, have a look at my own attempt at squeezing hiragana into an 8-bit character generator: https://www.masswerk.at/char8/#U3040 (A rendering demo can be found here: https://www.masswerk.at/rterm/ )
- One of the most interesting stories around bubble memory may be that of the IBM Aquarius. Contrary to the common notion that IBM had "totally missed the personal computer revolution of the 1970s", IBM had indeed several personal and/or home computer projects in the works, some internal projects, and there are also sketches of designs commissioned to Eliot Noyes Associates.
The most complete of these projects must have been the IBM Aquarius, as of 1977, which had progressed to fully designed production prototypes, developed under the lead of Bill Lowe and featuring a quite gorgeous, dark-red product design by Tom Hardy, which can be seen in "Delete" by Paul Atkinson (Bloomsbury Academic, London, New York, 2013). The most intriguing aspect of this project was probably that it would come with an entire ecosystem for the retail of its software (think IBM going full Apple App Store in the 1970s), facilitated by a card slot on the right side of the computer, where software on cards was to be inserted and could expose special functionality via a (sensor) keypad that would show up in key-size cutouts in the case. And, as may be guessed from context, the software on these cards was to be delivered by the wonders of bubble memory. However, while bubble memory was often touted by IBM as a future technology, it eventually proved to be the Achilles heel of the project, when the completed prototype was demonstrated to the executive board and concerns were raised regarding the reliability of the unproven technology.
Tom Hardy, as quoted in "Delete": "The Aquarius would have blown the socks off of everybody. I felt, and a lot of people felt this was going to be a big deal and make IBM believe in this whole business." And, regarding the demise by bubble memory: "Because it was relatively new for these kind of applications and whereas the team had gotten this thing to work, the company just didn't want to take a chance and push it. IBM should have been able to do it. If they would have pushed that technology and put all the resources behind it like they'd done with other things in the past, a lot of folks thought that it would have been successful and would have just blown the whole thing wide open."
While the IBM of the mid-to-late 1970s was evidently not the company to do this (and I can see the risks that might have been posed by this to the reputation of what was still the core product), it's one of the more fascinating what-if scenarios.
[Edit] The IBM Aquarius prototype can be seen here, side-by-side to the earlier "Yellow Bird" (which seems to have been more of a design study than a full-fledged project): https://saccade.com/blog/2019/04/delete-a-design-history-of-.../
- I have questions. E.g., how is the IBM 5100 a microcomputer? (It doesn't feature an MPU.) Also, the Sharp MZ-80K (1978) is missing (arguably from both the kits and the complete-systems lists)…
- I read the description of the PALM processor inside the 5100 and it seems like a legit MPU for a computer: https://en.wikipedia.org/wiki/IBM_PALM_processor
What about this processor excludes it from being considered? Asking for a friend…
- The PALM processor is an entire board. "Microcode" does not imply microprocessors, it was actually thought to be one of the major properties of 2nd generation machines, long before this.
A PALM processor can be seen here: https://hackaday.com/2023/12/19/bringing-apl-to-the-masses-t.../
[Edit] To be fair, the Wikipedia article on the PALM processor does mention this: "The PALM processor was a circuit board containing 13 bipolar gate arrays packaged in square metal cans, 3 conventional transistor–transistor logic (TTL) ICs in dual in-line packages, and 1 round metal can part."
- Sorry to be thick, but I am still not following what prevents this from being an MPU/CPU. Does it not execute stored programs and access memory?
Compare that to Pong, which is clearly just circuitry.
Also: the Pentium II comes on a board / cartridge. Is it not an MPU? https://en.wikipedia.org/wiki/Pentium_II
- The general idea is that microcomputers are about microprocessors (MPUs), which combine a CPU, consisting of an ALU, internal registers and bus logic (and maybe some port registers) in a single chip. Generally, the 4-bit Intel 4004 is considered to be the first commercial microprocessor, soon followed by the 8-bit Intel 8008.
The Intel 8008 was actually begun before the 4004 and implemented the design of an earlier, discrete-component processor (that of the Datapoint 2200 from 1969) as a microprocessor. (This project was initiated by the Computer Terminal Corporation, AKA Datapoint, who commissioned microprocessor implementations of their processor from Intel, Fairchild, and Texas Instruments. The Intel project was the only one to succeed, but the implementation was rejected for being already outdated and too slow at the time, leaving Intel with the rights to the design. The rest is history…)
Generally, early coin-op video games, like Computer Space, Gotcha, Space Race, or Pong are not considered microprocessor based, they even lack a CPU in the stricter sense. They are really just somewhat more complex video pattern generators, at least, this is what Nolan Bushnell called them – and he ought to know ;-).
(These early games lack any stepping logic, apart from single-directionally progressing counter chains, which drive discrete TTL logic, which in turn produces the video signal. Movement is exclusively effected by presets to these counter chains and gameplay, like object collisions, is determined by simple logic gates, based on the state of these counter chains. There is nothing Turing-complete in this logic. Commonly, Tank is considered the first coin-op game to use any large scale integrated logic at all (masked ROMs for sprite data instead of discrete diode matrices) and Western Gun, AKA Gun Fight (US-release in 1975) to be the first arcade video game to be based on a microprocessor.)
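To make the counter-chain idea concrete, here is a toy model (all dimensions, names, and object positions are invented for illustration; no actual schematic is being reproduced): the beam position is a pair of free-running counters, each object is a comparator window on those counters, and a collision is simply the AND of two objects' video signals.

```python
# Toy model of a CPU-less TTL game: H/V counters sweep the raster,
# comparator windows gate each object's video signal, and collision
# detection is the AND of two such signals over one frame.

H_TOTAL, V_TOTAL = 64, 48  # toy raster, far smaller than real hardware

def object_signal(h, v, h0, v0, w, ht):
    """High while the beam counters fall inside the object's window."""
    return h0 <= h < h0 + w and v0 <= v < v0 + ht

def frame_collision(obj_a, obj_b):
    """Step the counter chains over one frame; a collision is latched
    whenever both video signals are high at the same beam position."""
    return any(object_signal(h, v, *obj_a) and object_signal(h, v, *obj_b)
               for v in range(V_TOTAL) for h in range(H_TOTAL))

ball = (10, 10, 2, 2)    # (h0, v0, width, height)
paddle = (11, 8, 2, 8)
print(frame_collision(ball, paddle))  # True: the windows overlap
```

Note that nothing here branches on stored state: movement would be effected purely by changing the preset window coordinates between frames, which is exactly why such a machine is a pattern generator rather than a computer.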
- Regarding the Pentium II – well, that's a tricky question. There are a few contenders for the title of the first microprocessor, even predating the i4004 (e.g., the Four-Phase Systems AL1 from 1969), whose claims are generally rejected for not containing the entire processor in a single chip (but in two or three of them). Following this logic and tradition, we would have to reject the Pentium II as well. However, due to another, completely unrelated marketing tradition, we commonly do consider the Pentium II to be an MPU.
- As a lesser known fact, the second parameter in "WAIT 6502,1" works as a multiplier. So "WAIT 6502,2" will print the message twice, with 3 it will print it thrice, and so on. Even 0 works, filling the entire screen.