- BASIC on the early 8-bit, and more generally 16-bit, machines was quite a bit more empowering than pop media tends to acknowledge.
My uncle Bob (seriously, I have the generic uncle "Bob") developed real estate contracts using a combination of C64 BASIC and a word processor that allowed for conditional and parametric document assembly, almost WordPerfect style!
He built up quite a business with those efforts!
A bit later a friend wrote an entire trucking business on the PC running GWBASIC.
I myself started out on a beat up Atari 400 with the Atari BASIC cartridge and the cassette storage peripheral I struggle to recall the name of right now.... 410! That was it.
I wrote TV test and alignment programs. Learned all that working at a TV repair shop as a kid. The Atari had just a couple capabilities that made a huge difference too!
One of those was at least 8 grey shades. I know GTIA could deliver 16, and I ended up using them once I made enough to get a newer 800XL machine.
Another feature was full overscan graphics. 48 bytes per line instead of 40. That made it possible to draw the full frame patterns and properly identify the safe area for viewers wanting the factory setup, and expand viewing for others without showing blank non raster regions on their screen.
Side bar:
Older sets would often underscan by quite a bit! Correcting that often meant a lot to those viewers.
End Side bar
Another feature was enough colors to calibrate a TV for good color more than close enough. I could get purity tests, set color delay phase and some other items pretty well!
The last feature was 320 pixels in the NTSC safe area. That is two pixels per color clock cycle. When set to monochrome, those pixels were just right for focus, convergence, linearity and the whole test pattern.
All this was some percent off the pro gear, but I found out most people do not care. And I mostly didn't either.
As a famous YouTuber I love says, "Good enough for the girls I go out with" (AvE)
BASIC with a few PEEK and POKE commands and the occasional bit of machine language was enough to do a lot!
COMPUTE! published a nice assembler and disassembler too. For some work, a guy could get set up well enough to produce good programs.
Getting back to XP...
I wrote the above for perspective. Of course XP can make sense. So can DOS, an Amiga, and Windows 3.11; just ask Southwest Airlines.
Fact is many of us here can probably work magic with whatever gets put into our hands. I can.
And all these skills couple with microcontrollers too.
Perhaps that warrants discussion here one day too. The skills are a great match, and when one can build hardware feature-matched to the use case?
Boom goes the Dynamite!
- I just read through this writeup and my compliments to the author!
From a semi regular XP user, this writeup is exemplary. Well done.
- I would use it. And in fact I do have an XP machine set up for embedded development. It runs great and the tools are stable and robust.
When I need to use those devices, I can just fire it up and go.
Risks are low too. Keep online use by the book and focused, and there just is not that much to worry about.
- The year is 1998. I was tasked with making a movie to communicate some tech at a trade show. Two minutes of video on a loop.
I had set up the animation and kicked off the renderer expecting to come back to work the following day to see all my frames done and ready to be assembled into the movie, which would be written to a few VHS tapes for overnight transport to the show floor.
I go home, life is good, sleep, awake, show up at work, and oh shit!
The renderer script had died a few hundred frames in, meaning I was screwed.
I was doing this work on SGI IRIX, which was a futuristic OS at the time. I was about to experience just what X11 can do!
Some quick math told me the movie could still be done! But, I was going to need to render on damn near every machine in the building.
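That quick math can be reconstructed as a back-of-the-envelope sketch. The two-minute runtime is from the story; the frame rate, per-frame render time, and machine count are assumptions for illustration only.

```python
# Hedged reconstruction of the "quick math"; only the two-minute
# runtime comes from the story, everything else is assumed.
FRAMES = 2 * 60 * 30       # two minutes of NTSC video at ~30 fps
MACHINES = 10              # assumed number of boxes available
MIN_PER_FRAME = 1.0        # assumed average minutes to render one frame

machine_hours = FRAMES * MIN_PER_FRAME / 60
wall_hours = machine_hours / MACHINES
print(f"{FRAMES} frames = {machine_hours:.0f} machine-hours, "
      f"about {wall_hours:.0f} wall-clock hours on {MACHINES} machines")
```

With those assumed numbers, the job that would have buried one workstation for days becomes a single working day across the building.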
No problem. One of the managers mooched some temporary render licenses, meaning I was set! All I had to do was install the software on the machines, hand each of them a bunch of frames, kick it all off, and build the movie as the data got computed.
I did the whole thing from my desk using X11 to run applications, including the amazing SGI software manager, and setup the renders.
At one point, I had windows open to about 10 machines and each of them was in various stages of software installation too. Some could take the package, others needed space freed, still others had a dependency, and on it goes.
I pushed all the boxes hard, even rendering on some other users' machines without them even knowing!
For 6 hours straight, X11 and I pushed software around, moved render frames to my primary machine, assembled bits into the movie, and inch by inch it got done.
Our sysadmin came by to tell me he had never seen system loads hit these levels. I had many of the machines buried doing frames as fast as they could.
On that day, X11 and a fine UNIX with great tools shined! Got the movie done and written to tape just an hour before the transport person's deadline to make it overseas to the show.
That was multiuser graphical computing in action.
I had been learning UNIX under high pressure to give it up and go all Windows. After that experience, no way. Not gonna happen.
These days I still use Linux everywhere I can. And one last thing:
X11 works great. Being able to remote display is powerful. I was very impressed when I used X to render on a user's box, or a few.
At any given time, I have forgotten some UNIX, but what I remember always gets me through whatever the challenge of the day is.
Go X11! Multi user graphical computing can be amazingly powerful. Would be a shame to lose out on that capability.
- I found the author's comment interesting:
"I use my retro machines in exceedingly boring ways"
And then he says he will live in the Apple //e more.
Yes! The Apple is just good enough to get most things done and the cards and slots make it a great 8 bit workstation.
- The best thing about Apple 2 graphics is one can do just about anything reasonably on a 6 color graphics screen.
4 colors just is not quite enough. And on many systems we have sprites to add that little bit of extra color needed to make everything stand out well enough to not be a problem. The Apple lacks sprites, but does have 6 colors to use instead.
2 colors is a special case, and so long as the pixels are roughly square and are of a decent size, 2 colors is also enough to do most anything minus color.
The second best thing about Apple 2 graphics is that it is artifact graphics! Normally, this kind of thing is not desirable. "Real" colors driven by some graphics chip offer a real color signal of some kind. And with that, offer more overall colors in more hues and often intensities. But, unless the video system is really fast, that color capability often comes at a price, namely pixels that are twice as wide as they are tall. The pixel art aspect ratio is a bit funky, and is often a poor match for many images.
IBM CGA, TANDY Color Computer 3 and some others do offer 320 pixel or greater 4 or more color displays. These individual pixels are nearly square.
The Apple, due to how artifact graphics tend to work, offers a 280 pixel color display that does allow a single pixel to be a color pixel. And it is a nearly square pixel! (Some TVs do mangle it all up into twice as wide as tall anyway.)
In fact, it takes two adjacent pixels with the same value to get white or black; otherwise each pixel will be a color pixel.
This little detail is what makes Apple 2 graphics stand out, despite the machine's age and very limited video system.
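The adjacency rule above can be sketched in a few lines. This is a simplified model: the even/odd column to violet/green mapping assumes the palette (high) bit is clear, edge pixels are treated as if their missing neighbor matched them, and real hardware is messier than this.

```python
# Simplified model of the Apple 2 hi-res artifact-color rule: two
# adjacent pixels with the same value read as white or black, while an
# isolated pixel reads as a color chosen by column parity. The
# violet/green mapping assumes the palette bit is clear (an assumption).
def perceived_colors(bits):
    out = []
    for x, b in enumerate(bits):
        left = bits[x - 1] if x > 0 else b
        right = bits[x + 1] if x + 1 < len(bits) else b
        if b == left or b == right:
            # matches a neighbor: solid white (on) or black (off)
            out.append("white" if b else "black")
        else:
            # isolated pixel: becomes a color pixel by column parity
            out.append("violet" if x % 2 == 0 else "green")
    return out

print(perceived_colors([1, 1, 0, 0, 1, 0]))
```

So the same bit pattern yields white where pixels pair up and color where they stand alone, which is exactly why careful pixel placement matters so much on this machine.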
And the last thing about Apple graphics I really like is how using patterns in ways very similar to what one would do on a monochrome display yields a very plausible perception of more colors!
Really great pixel art is possible and we have seen a ton of it over the years.
- Yes. Super Breakout added to that with a bit better brick resolution, intriguing game variations, such as advancing walls, and sounds.
Breakout was the first title I played on a VCS and it seemed fantastic at the time!
- That is basically how it works.
For anyone interested in exploring this idea without having to go full on assembly warrior, give Batari Basic a try. It's actually quite well done and it exposes the limited resources of the VCS in an easy to understand way.
Your BASIC program is compiled down to 6502 machine language and it all runs at basically native speed.
I found it extremely productive. One can knock out a simple game concept in an hour.
And yes, the sharp limits really do boil down to less is more. You won't find yourself looking through tons of options and/or making one of a very large number of possible choices either.
- Yes! The VS Code extension Atari Dev Studio has batari Basic and 7800basic available, as well as full-on assembly coding. What a great tool!
https://marketplace.visualstudio.com/items?itemName=chunkypi...
- Hope so!
- Wasn't mentioned. The reverence for Jobs is legendary, but for Woz, they don't seem to know what to do.
- It probably has something to do with Woz choosing to become irrelevant to Apple after approximately 1980. His last interesting project for Apple was the Disk II controller, as far as I can tell.
If it were me, I’d still celebrate him & the Apple II, but it’s Tim Cook’s call.
- I guess we will need to wait until 2027 to see what Apple is really made of.
- What are you using for video output?
Just curious:
TTL, like an Apple?
Custom chip from say, TI?
Microcontroller?
FPGA?
- I tried with an FPGA, but it is difficult to get the needed features and performance at a reasonable price. I also tried with a RP2040, and this seems to be easier, although the performance is not quite there yet.
- Have you looked at the Propeller chips from Parallax?
The P1 can do retro-grade graphics, from very solid TV output through higher resolution VGA.
The P2 can drive any display, has 640x480 HDMI capability built in, and analog video of all kinds is very well supported.
It would work in both of the classic ways for you. One way is to have it read the actual video RAM when the CPU allows that, and the other would be for it to have its own video RAM and the CPU communicates to it in some fashion.
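The second arrangement can be sketched as a tiny command mailbox: the video chip owns its own framebuffer and the CPU just posts commands to it. The class, register layout, and command set here are invented for illustration, not the Propeller's actual API.

```python
# Hedged sketch of a video coprocessor with private video RAM. The CPU
# never touches the framebuffer directly; it posts commands through a
# small "mailbox" interface. All names and commands here are invented.
class VideoCoprocessor:
    WIDTH, HEIGHT = 320, 200  # assumed display geometry

    def __init__(self):
        # Private video RAM, owned entirely by the coprocessor.
        self.vram = bytearray(self.WIDTH * self.HEIGHT)

    def command(self, op, *args):
        """CPU-side entry point: post one command to the mailbox."""
        if op == "FILL":
            (color,) = args
            self.vram[:] = bytes([color]) * len(self.vram)
        elif op == "PLOT":
            x, y, color = args
            self.vram[y * self.WIDTH + x] = color

gpu = VideoCoprocessor()
gpu.command("FILL", 0)         # clear the screen
gpu.command("PLOT", 10, 5, 3)  # set one pixel
```

The first arrangement, sharing the CPU's actual video RAM, trades this command protocol for bus arbitration instead: the video chip reads memory only when the CPU allows it.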
- Unfortunately, dark clouds are coming. Firefox has become so bloated that it is starting to work badly with X11 over the network.