Linux again

I recently did some experiments with “old” hardware – a Skylake i5-6500T mini PC running Debian 12 with KDE Plasma, configured so that I can use it either as a stand-in replacement for my home server, or as a fully set up Linux desktop for myself in case I need it for something – I don’t know, say both Microsoft and Apple make their operating systems non-functional at the same time for some reason. I intentionally left the machine with 8GB of RAM just to see if it’s enough, and it seems to be more than enough for the server role, where it uses up 1.4GB, and barely sufficient for the desktop, where it uses up almost all the RAM when I run everything I normally do. It’s all quite snappy, but I did notice one thing: when I play YouTube videos in full screen, or even when I’m using one of the high-bandwidth modes such as 1080p@60, it drops frames like crazy and is as smooth as a country road in Siberia during the melt season. My first guess was that the Skylake iGPU doesn’t support the modern codecs YouTube uses for those high-bandwidth modes, but then I thought about it some more and decided it might be a Linux issue. I didn’t feel like installing Windows on that machine just to test my hypothesis, so I took out the second device I recently got on eBay, a ThinkPad T14 with an i5-10310U, a Comet Lake CPU with support for all the modern codecs. I played the same 4K video test on Win11, with perfect results, zero frame drops. Then I rebooted into Ubuntu 24.04, ran the same test, and it dropped frames almost as badly as the Skylake machine.

I did all the recommended stuff on Linux: tried different browsers, tried toggling GPU acceleration on and off, and the only thing I managed to do was make it behave worse, not better.
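
For anyone who wants to check the same thing, the first question is whether the iGPU even advertises hardware decode for the codecs YouTube serves in those high-bandwidth modes (VP9, and AV1 on newer hardware), and the second is whether the browser actually uses it. A rough sketch of the first check, assuming the libva-utils package is installed so that vainfo is available; the profile strings are what Intel’s driver typically reports, so treat them as an assumption:

    # decode_check.py - rough check of which codecs the iGPU can decode in hardware,
    # by parsing `vainfo` output (requires the libva-utils package).
    import subprocess

    # VA-API profile names as vainfo usually prints them (assumed; may vary by driver).
    CODECS = {
        "H.264": "VAProfileH264",
        "VP9":   "VAProfileVP9Profile0",
        "AV1":   "VAProfileAV1Profile0",
    }

    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout

    for name, profile in CODECS.items():
        # A VLD entry point means fixed-function hardware decode is exposed.
        hw = any(profile in line and "VLD" in line for line in out.splitlines())
        print(f"{name}: {'hardware decode' if hw else 'software only'}")

Even when the hardware side is fine, the browser part is its own adventure; Firefox, for instance, has a media.ffmpeg.vaapi.enabled switch in about:config, and whether flipping it actually helps seems to depend on the distro and the compositor, which is rather the point of this rant.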

Switch to Linux, they say; you’ll solve all your Microsoft problems, they say. Well, it’s true, you’ll solve your Microsoft problems, and instead of Microsoft you’ll have problems caused by thousands of pimply masturbators with attitude issues who can’t agree on the colour of shit, which is why there are hundreds of Linux distros and they all have the same issues, because polishing the GPU drivers and the window manager is hard. But the important thing is that the Linux community is getting rid of “nazis” who think there are only two genders, while at the same time getting rid of Russian developers because “stand with Ukraine”.

Yeah. The frustrating thing about Linux is that so many things work well, and then you run into something important like this. Maybe Huawei will rework Linux into something that actually works well, give it wider hardware support and localisation, and do for the Linux desktop what Android did for mobile. Maybe. But then America will probably block it, because they won’t be able to install their spyware.

PC lineage

I was watching some YouTube videos about old computers and thinking: which ones are the predecessors of our current machines, and which are merely extinct technology, blind alleys that led nowhere?

It’s an interesting question. I used to assume that the home computers of the 1980s were the predecessors of our current machines, but then I saw someone work on an old minicomputer running UNIX, a PDP-10 or something, and that thing felt instantly familiar, unlike the ZX Spectrum, Apple II or Commodore 64, which feel nothing like what we have today. Is it possible that I had it wrong? When I look at the first Macintosh, it feels very much like the user interface we use today, but the Macintosh was a technological demonstration that didn’t actually do anything useful yet, because the hardware was too weak. And where did the Macintosh come from? Lisa, of course. And Lisa was an attempt to make the Xerox Alto streamlined and commercially viable. All three were failures; the idea was good, but the technology wasn’t there yet. The first computers that feel exactly like what we are using now were the graphical workstations from Silicon Graphics and Sun, because they were basically minicomputers with a graphical console and a 3D rendering engine.

It’s almost as if home computers were a parallel branch of technology, related more to Atari arcade machines than to the minicomputers and mainframes of the day: attempts to do something useful with inferior but cheap technology. That branch evolved from the Altair to the Apple II to the IBM PC, which in turn evolved from the 8088 to the 80286 to the 80386, at which point Microsoft copied the Macintosh interface and made it into a mass-market OS as the technology became viable, and Windows evolved from 3.0 to 95 to 98… and then this entire technological blind alley went extinct, because the hardware became advanced enough to erase the difference between UNIX graphical workstations and personal computers. Microsoft started running a mainframe kernel on a PC, called NT; at version 4 it became a viable competitor to Windows 95, Windows 2000 ran the NT kernel, and the 95/98/ME kernel was retired completely, ending the playground phase of PC technology and making everything a graphical workstation. Parallel to that, Steve Jobs, exiled from Apple, was tinkering with his NeXT graphical workstation project, which became quite good but didn’t sell, and when Apple begged him to come back and save them from themselves, he brought NeXTSTEP along and it became OS X on the new generation of Macintosh computers. So, basically, the PC architecture spent its infancy playing with cheap but inferior hardware until hardware prices came down so much that the stuff that used to be reserved for high-cost graphical workstations became inexpensive, the graphical workstation stopped being a niche thing and went mainstream, and it drove the personal computers as they used to be into extinction.

Just think about it – today’s computer has a 2D/3D graphics accelerator, integrated on the CPU or dedicated; it runs UNIX (Mac OS and Linux) or something very similar derived from the mainframe NT kernel (Windows); it’s a multi-user, seamlessly multitasking system; and it all runs on hardware that’s been so integrated it fits in a phone.

So, the actual evolution of personal computers goes from an IBM mainframe to a DEC minicomputer to a UNIX graphical workstation to Windows NT 4 and Mac OS X, to iPhone and Android.

The home computer evolution goes from the Altair 8800 to the Apple I and II to the IBM PC, then from MS-DOS to Windows 3.0, 95, 98, ME… and goes extinct. The attempt to make a personal computer with a graphical user interface goes from the Xerox Alto to the Apple Lisa to the Macintosh, then to the Macintosh II, with the OS upgraded through version 9… and then goes extinct as well, replaced by a successor to NeXT repackaged as the new generation of Macintosh, with an OS built around UNIX. Then at some point the tech got so miniaturised that we now have phones running UNIX, which is a mainframe/minicomputer OS, directly descended from the graphical workstations.

Which is why you could take an SGI Indigo2 workstation today and recognise it as a normal computer, slow but functional, while the first IBM PC or an Apple II would feel like absolutely nothing you are accustomed to. That’s because your PC isn’t descended from the IBM PC; it’s descended from a mainframe that mimicked the general look of a PC and learned to be backwards compatible with one.

Tariffs

Trump introduced those super high tariffs on every country America has a trade deficit with, which, essentially, means every country.

As a result, those countries are going to introduce reciprocal tariffs on America, which translates into a trade war.

What’s going to happen next is a global disentanglement of supply chains along lines of political toxicity. But more directly, some things are going to get more expensive. Expecting that, people will quickly buy up the existing stock, and manufacturers are going to halt supply until the pricing is figured out. This means both scarcity and high prices.

So, obviously, during the weekend I looked into the stuff I will have to buy in the next six months to a year, overlapped it with the stuff that’s going to get more expensive within that timeframe, and as a result bought two Apple laptops. Biljana needs a replacement for her Intel 16″ MacBook Pro from 2019, so she’s getting a 16″ M4 Max MacBook Pro. My 13″ M1 MacBook Air is also due for replacement, because I broke the tab key and my eyes aren’t what they used to be, so I got myself the 15″ M4 Air, this time with 24GB of RAM because 8GB was limiting. Essentially, I just flushed my shopping list for the year, because I see no benefit in waiting.

So, yeah, my prognosis is that the economy will go down, prices will go up, availability of things will go down, and the general standard of living will degrade across the West. Also, I expect wars to get much worse, and quickly. Buying laptops is not what you would normally do in those conditions, but I’d rather replace failing hardware now, when it’s merely preventive maintenance, than later, when it might be a serious problem.

Linux (in)security

This just came out:

Basically, 9.9/10 severity is a nightmare. RCE means people can execute code on your machine remotely, and 9.9/10 probably means root permissions. This is as bad as it gets. Even worse, the security analyst reporting this says the developers were not interested in fixing it and instead spent their time explaining why their code is great and he’s stupid, which is absolutely typical for Linux people. Canonical and Red Hat have confirmed the vulnerability and its severity rating.

So, when Linux people tell you Linux is better than Windows and Mac, and that everybody should switch to it, just keep in mind that an open source project was just caught with its pants down, having had a 9.9/10 severity remote code execution bug FOR A DECADE without anyone noticing until now.

Edit: It turns out it’s not super terrible. The vulnerability is in CUPS, and the machine needs to be connected to the Internet without a firewall for the attack to work, which is not a normal condition; however, the CUPS code has more holes than Emmentaler cheese, and uninstalling cups-browsed is recommended.
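
If you just want to know whether your own machine is exposed, the relevant part is cups-browsed listening on UDP port 631. A minimal sketch of that check, assuming a Linux box with the usual /proc/net layout; the systemctl command in the output is a suggestion, not something the script runs:

    # cups_check.py - rough check for whether something (most likely cups-browsed)
    # is listening on UDP 631, the port used by the reported attack chain.
    PORT_HEX = format(631, "04X")          # 631 -> "0277"; /proc/net shows ports in hex

    def udp631_listening() -> bool:
        for path in ("/proc/net/udp", "/proc/net/udp6"):
            try:
                with open(path) as f:
                    next(f)                            # skip the header line
                    for line in f:
                        local = line.split()[1]        # e.g. "00000000:0277"
                        if local.endswith(":" + PORT_HEX):
                            return True
            except FileNotFoundError:
                pass
        return False

    if udp631_listening():
        print("UDP 631 is open, likely cups-browsed; consider:")
        print("  sudo systemctl disable --now cups-browsed")
    else:
        print("UDP 631 is not open; cups-browsed exposure looks unlikely.")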

Old computers

I’ve been watching some YouTube videos about restoring old computers, because I’m trying to understand the motivation behind it.

Sure, nostalgia; people doing it usually have some childhood or youth memories associated with old computers, and restoring them and running ancient software probably brings back the memories. However, I’ve seen cases where an expert in restoring ancient hardware was asked to recover actual scientific data recorded on old 8″ IBM floppy disks, stored in a data format readable only by an ancient computer that no longer exists, and it was a real problem that had to be solved by connecting an old floppy drive to an old but still reasonably modern computer running a modern OS, talking to the drive at a low enough level to access the files, and then copying them to modern storage. They also recovered data from the Apollo era by using a restored Apollo guidance computer to read old core memories and copy them to modern storage for historical purposes; essentially, they recovered data from various museum pieces and established what was used for what purpose. They also brought various old but historically interesting computers, such as the Xerox Alto, to working condition, so that all their software could be demonstrated in a museum. So, there’s a “computer archaeology” aspect to it that I do understand, and that’s perfectly fine. However, it’s obvious that most restored old computers end up being used once or twice and then moved to a shelf, because they are not really useful for anything today.

The interesting part is that there are some very old machines still in active use today, and they do the job so well that there is no reason to replace them with new equipment: they obviously do what they were designed to do perfectly (for instance, supervising a power plant or running a missile silo), and since modern hardware doesn’t run the old software, you can’t just swap the computer for a new, faster model and plug it into the rest of the system. No; the interfaces are different now, everything is different. You can’t just put a modern workstation PC in place of a PDP-11. You’d need to move all the data off the tape drives, 8″ floppies and old hard drives first. Then you’d have to replace the printers and somehow connect to the old peripherals, for instance the sensors and solenoids. And then you’d have to rewrite all the old software and make it so reliable that it never breaks or crashes. And the only benefit of all that would be more reliable hardware, because the stuff from the 1970s is 50 years old and breaks down.

It’s no wonder the industry solved the problem by simply making a modern replacement computer with all the old interfaces: modern hardware running an emulation of the old computer, which runs all the ancient software perfectly, so it keeps doing what it was designed to do but without old capacitors and diodes exploding. There are examples of this approach in consumer electronics – modern HP 50G or HP 12C calculators have an ARM CPU emulating the obsolete proprietary HP Voyager and Saturn processors and running all the software written for the original platform, because rewriting all the mathematical code in C and building it for a modern microcontroller platform would be prohibitively expensive, since there’s no money in it. Simply using modern hardware, writing an emulator for the old platform and running all the legacy software works perfectly fine, and nobody really cares whether it’s “optimal” or not.

Now that I think about it, there must be tons of legacy hardware embedded in old airplanes and similar technological marvels of the time that are still in use today, and maintaining the aging electronics must be a nightmare that can’t be solved by merely replacing it with new stuff. In all such cases, emulating the old hardware on an ARM, or building a gate-accurate FPGA replica, and just connecting all the old peripherals to it to buy time until the entire machine is retired, is quite a reasonable solution to the problem. There must be a whole hidden industry that makes good money by solving the problem of “just make a new and reliable computer for it and leave everything else as it is, because it works”.
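
The emulation part is conceptually simple, which is why it’s such an attractive solution: at its core it’s just a loop that fetches, decodes and executes the old machine’s instructions on the new CPU. A toy sketch of the idea, using an invented three-instruction machine rather than any real HP or PDP instruction set:

    # toy_emu.py - a deliberately tiny illustration of the emulation approach:
    # modern hardware runs a loop that interprets the old machine's instructions,
    # so the legacy software never notices the original CPU is gone.
    # This is an invented 3-instruction machine, not a real instruction set.

    def run(program, memory):
        acc, pc = 0, 0                       # accumulator and program counter
        while pc < len(program):
            op, arg = program[pc]
            pc += 1
            if op == "LOAD":                 # acc <- memory[arg]
                acc = memory[arg]
            elif op == "ADD":                # acc <- acc + memory[arg]
                acc += memory[arg]
            elif op == "STORE":              # memory[arg] <- acc
                memory[arg] = acc
            else:
                raise ValueError(f"unknown opcode {op}")
        return memory

    # "Legacy software": add the numbers at addresses 0 and 1, store the result at 2.
    mem = {0: 40, 1: 2, 2: 0}
    print(run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], mem))   # {0: 40, 1: 2, 2: 42}

The real products obviously add cycle timing, interrupts and the original I/O interfaces on top of this, but the principle is the same.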

So, I can imagine perfectly well why one would keep a PDP-10, a VAX-11 or an IBM 360 running today, if the conversion to a modern platform is cost-prohibitive. However, that got me thinking: what’s the oldest computer I could actually use today, for any purpose?

The answer is quite interesting. For instance, if I had a spare serial terminal, a VT100 or something, and had a workshop with a Raspberry Pi or some other Linux server, I could connect the ancient terminal to it to display logs and issue commands. It could just sit there doing that single job, and in a very filthy environment it might even be more convenient than connecting to the Linux server with my modern laptop. However, I don’t really know what I would do with a much more modern machine, such as an original IBM PC or the first Macintosh. They are merely museum pieces today, and I can’t find any practical use for them. So, what’s the next usable generation? It would need connectivity to modern hardware so I could exchange data; for instance, I could use a very old laptop as a typewriter, as long as I can pull the text I wrote out of it and use it on a modern machine later on. Ideally, it would have network connectivity and be able to save data to a shared directory. Alternatively, it should have USB so I can save things to a thumb drive. Worst case, I would use a floppy disk, and I say worst case because the 3.5″ 1.44MB ones were notoriously unreliable and I used to have all kinds of problems with them. It would have to be something really interesting to be worth the hassle, and I’d probably have to already own it in order to bother finding a use for it. For instance, an old Compaq, Toshiba or IBM laptop running DOS, where I would use character-graphics tools, exclusively for writing text.
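
The standard way to do the VT100 thing would be to enable a login getty on the serial port, but for the pure log-display case, here is a minimal sketch of what I have in mind, assuming pyserial and a USB-to-RS232 adapter that shows up as /dev/ttyUSB0 (the device path, baud rate and log file are all just example assumptions):

    # vt100_logs.py - tail a log file out to an old serial terminal.
    # Assumes `pip install pyserial` and an adapter at /dev/ttyUSB0.
    import time
    import serial

    PORT, BAUD = "/dev/ttyUSB0", 9600      # VT100-era terminals are comfortable at 9600 8N1
    LOGFILE = "/var/log/syslog"            # whichever log should go to the old screen

    with serial.Serial(PORT, BAUD, timeout=1) as term, open(LOGFILE) as log:
        log.seek(0, 2)                     # start at the end of the file, like tail -f
        while True:
            line = log.readline()
            if line:
                # real terminals expect plain ASCII and CR+LF line endings
                term.write(line.rstrip("\n").encode("ascii", "replace") + b"\r\n")
            else:
                time.sleep(0.5)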

But what’s the oldest computer I could actually use today, for literally everything I do, only slower? The answer is easy: it’s the 15″ mid-2015 MacBook Pro (i7-4770HQ CPU). That’s the oldest machine I still have in use, in the sense that it’s retired but maintained as a “hot spare”, OS updated and everything, so I can take it out of a drawer and bring it to some secondary location where I want a fully functional computer already present, without having to assume I’ll have a laptop with me. When I say “fully functional”, I don’t mean just writing text, surfing the web or playing movies; I mean editing photos in Lightroom as well. The only drawback is that it doesn’t have USB-C, but my external SSDs with the photo archive can be plugged into USB-A with a mere cable swap, so that would all work, albeit with a speed reduction compared to my modern systems. So, basically, a 4th-generation Intel CPU, released in 2014, is something I can still use for all my current workloads, but it’s significantly slower, already has port compatibility issues with modern hardware (Thunderbolt 2 with mini-DP connectors is a hassle to connect to anything today, needing special cables or adapters), and is retired, to be used only in emergencies or for specific use cases.

I must admit that I suffer from no nostalgia regarding old computers. Sure, I remember aspiring to get the stuff that was hot at the time, but it’s all useless junk now, and I have a very good memory and remember how limited it all was. What I use today used to be beyond everybody’s dreams back then – for instance, a display with resolution that rivals text printed on a high-resolution laser printer, the ability to display a photograph at a quality that rivals or exceeds a photographic print, and the ability to reproduce video in full quality, exceeding what a TV could do back then. I actually use my computer as a HiFi component for playing music to the NAD in CD quality. Today, this stuff actually does everything I always wanted it to do; back then, the computers were vehicles for fantasy rather than tools that could actually make it happen. I can take pictures with my 35mm camera at a quality that exceeds everything I could do on 35mm film, and edit the raw photos on the computer, with no loss of quality and no dependence on labs, chemicals or other people who would leave fingerprints on my film. So, when I think about old computers, I can understand the nostalgia, but the biggest part, for me, is remembering what I always wanted computers to do, and feeling gratitude that it’s now a reality. The only thing that’s still a fantasy is a strong AI, but I’m afraid that the kind of AI I would like to talk to would have very little use for humans.