Linux (in)security

This just came out:

Basically, 9.9/10 severity is a nightmare. RCE means people can execute code on your machine remotely, and 9.9/10 probably means it runs with root permissions. This is as bad as it gets. Even worse, the security analyst reporting this says the developers were not interested in fixing it and instead spent their time explaining why their code is great and he's stupid, which is absolutely typical for Linux people.
Canonical and Red Hat confirm the vulnerability and its severity rating.
So, when Linux people tell you Linux is better than Windows and Mac, and that everybody should switch to it, just keep in mind that an open source project was just caught with its pants down, having had a 9.9/10 severity remote code execution bug FOR A DECADE without anyone noticing until now.

Edit: It turned out it's not super terrible. The vulnerability is in CUPS, and the machine needs to be exposed to the Internet without a firewall for the attack to work, which is not a normal condition; however, the CUPS code has more holes than Emmentaler cheese, and uninstalling cups-browsed is recommended.
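If you want to check whether your machine is even a candidate for this before uninstalling anything, here is a minimal sketch of one way to do it. It assumes Linux (it reads /proc/net/udp) and only tells you whether something is bound to UDP port 631, which is where cups-browsed listens for printer announcements; confirming that it is actually cups-browsed, and removing it, is still your package manager's job.

    # Minimal sketch, Linux only: scan /proc/net/udp and /proc/net/udp6 for a
    # socket bound to port 631, where cups-browsed listens. A hit only means
    # *something* owns the port, not that the machine is exploitable.
    from pathlib import Path

    PORT_HEX = f"{631:04X}"  # 631 decimal -> "0277", as shown in /proc

    def udp_631_bound() -> bool:
        for table in ("/proc/net/udp", "/proc/net/udp6"):
            path = Path(table)
            if not path.exists():
                continue
            for entry in path.read_text().splitlines()[1:]:  # skip header row
                local_address = entry.split()[1]              # e.g. "00000000:0277"
                if local_address.endswith(":" + PORT_HEX):
                    return True
        return False

    if __name__ == "__main__":
        if udp_631_bound():
            print("UDP 631 is bound; check whether cups-browsed is running")
        else:
            print("nothing is listening on UDP 631")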

Old computers

I’ve been watching some YouTube videos about restoring old computers, because I’m trying to understand the motivation behind it.

Sure, nostalgia; people doing it usually have childhood or youth memories associated with old computers, and restoring them and running ancient software probably brings those memories back. However, I've seen cases where an expert in restoring ancient hardware was asked to recover actual scientific data recorded on old 8" IBM floppy disks, stored in a data format readable only by an ancient computer that no longer exists. That was an actual problem, and it was solved by connecting an old floppy drive to an old but still reasonably modern computer running a modern OS, talking to the drive at a low enough level to access the files, and copying them to modern storage. They also recovered data from the Apollo era by using a restored Apollo guidance computer to read old core memories and copy their contents to modern storage for historical purposes. Essentially, they recovered data from various museum pieces and established what was used for what purpose, and they brought various old but historically interesting computers, such as the Xerox Alto, back to working condition so that all their software could be demonstrated in a museum. So there's the "computer archaeology" aspect of it, which I do understand, and that's perfectly fine. However, it's obvious that most restored old computers end up being used once or twice and then moved to some shelf, because they are not really useful for anything today.

The interesting part is that some very old machines are still actively used today, and they do the job so well that there is no reason to replace them with new equipment: they do what they were designed to do perfectly (for instance, supervising a power plant or running a missile silo), and since modern hardware doesn't run the old software, you can't just swap in a new, faster model and plug it into the rest of the system. No; the interfaces are different now, everything is different. You can't just put a modern workstation PC in place of a PDP-11. You'd need to move all the data from tape drives, 8" floppies and old hard drives first. Then you'd have to replace the printers and somehow connect to the old peripherals, for instance the sensors and solenoids. And then you'd have to rewrite all the old software and make it so reliable that it never breaks or crashes. And the only benefit of all that would be more reliable hardware, because the stuff from the 1970s is 50 years old and breaks down.

It's no wonder the industry solved the problem by simply building a modern replacement computer with all the old interfaces: modern hardware running an emulation of the old machine, which runs all the ancient software perfectly, so it keeps doing what it was designed to do, but without old capacitors and diodes exploding. There are examples of this approach that made their way into consumer electronics. For instance, the modern HP 50G and HP 12C calculators have an ARM CPU running an emulation of the obsolete proprietary HP Voyager and Saturn processors, running all the software written for the original obsolete platform, because rewriting all the mathematical code in C and building it for a modern microcontroller platform would be prohibitively expensive, since there's no money in it. Simply using modern hardware, writing an emulator for the old platform, and keeping all the legacy software works perfectly fine, and nobody really cares whether it's "optimal" or not.
Now that I think about it, there must be tons of legacy hardware embedded in old airplanes and similar technological marvels of their time that are still in use today, and maintaining the aging electronics must be a nightmare that can't be solved by merely swapping in new stuff. In all such cases, emulating the old hardware on an ARM, or building a gate-accurate FPGA replica, and just connecting all the old peripherals to it to buy time until the entire machine is retired, is quite a reasonable solution to the problem. There must be a whole hidden industry that makes good money solving the problem of "just make a new and reliable computer for it and leave everything else as it is, because it works".
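To illustrate what "emulating the old hardware" means in the simplest possible terms, here is a toy sketch: a made-up three-instruction accumulator machine interpreted on a modern host. The instruction set, the program and the memory layout are all invented for the example; a real emulator of a Saturn or a PDP-11 is the same loop with vastly more opcodes, plus timing and I/O handling.

    # Toy fetch-decode-execute loop; the "ISA" here is invented for illustration.
    LOAD, ADD, HALT = 0x01, 0x02, 0xFF

    def run(memory):
        acc, pc = 0, 0                       # accumulator and program counter
        while True:
            op = memory[pc]                  # fetch
            if op == LOAD:                   # decode + execute
                acc = memory[memory[pc + 1]]
                pc += 2
            elif op == ADD:
                acc += memory[memory[pc + 1]]
                pc += 2
            elif op == HALT:
                return acc
            else:
                raise ValueError(f"unknown opcode {op:#x} at address {pc}")

    # "ROM" image: load mem[6], add mem[7], halt  ->  30 + 12 = 42
    program = [LOAD, 6, ADD, 7, HALT, 0, 30, 12]
    print(run(program))  # prints 42

The point is that the old software never knows the difference: as long as the loop behaves like the original silicon, the legacy code keeps running on whatever hardware happens to be underneath.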

So, I can imagine perfectly well why one would keep a PDP-10, VAX-11 or IBM 360 running today, if the conversion to a modern platform is cost-prohibitive. However, that got me thinking: what's the oldest computer I could actually use today, for any purpose?

The answer is quite interesting. For instance, if I had a free serial terminal, a VT100 or something, and had a workshop with a Raspberry Pi or some other Linux server, I could connect the ancient terminal to it to display logs and issue commands. It could just sit there and work for that single purpose, and perhaps be more convenient than connecting to the Linux server with my modern laptop in a very filthy environment. However, I don't really know what I would do with a more modern machine, such as an original IBM PC or the first Macintosh. They are merely museum pieces today, and I can't find any practical use for them.

So, what's the next usable generation? It would need connectivity to modern hardware so I could exchange data; for instance, I could use a very old laptop as a typewriter, as long as I can pull the text I wrote out of it and use it on a modern machine later on. Ideally, it would have network connectivity and be able to save data to a shared directory. Alternatively, it should have USB so I can save things to a thumb drive. Worst case, I would use a floppy disk, and I say worst case because the 3.5" 1.44MB ones were notoriously unreliable and I used to have all kinds of problems with them. It would have to be something really interesting for it to be worth the hassle, and I'd probably have to already own it in order to bother finding a use for it. For instance, an old Compaq, Toshiba or IBM laptop running DOS, where I would use character-graphics tools, exclusively for writing text.
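Going back to the serial terminal idea: a minimal sketch of mirroring a log file to an attached dumb terminal could look something like this. It assumes a Raspberry Pi with a USB-to-RS232 adapter on /dev/ttyUSB0, the pyserial package, and /var/log/syslog as the file to follow; all three are placeholders for the example, not anything I actually have wired up.

    # Minimal sketch: tail a log file and mirror new lines to a serial terminal.
    # Requires pyserial (pip install pyserial); port, baud rate and log path
    # below are assumptions for the example.
    import time
    import serial

    PORT = "/dev/ttyUSB0"        # USB-to-RS232 adapter the terminal is wired to
    BAUD = 9600                  # a rate an old terminal can keep up with
    LOGFILE = "/var/log/syslog"

    def follow(path):
        """Yield lines appended to a file, roughly like `tail -f`."""
        with open(path, "r", errors="replace") as f:
            f.seek(0, 2)                      # start at the end of the file
            while True:
                line = f.readline()
                if line:
                    yield line
                else:
                    time.sleep(0.5)

    with serial.Serial(PORT, BAUD) as tty:
        for line in follow(LOGFILE):
            # Old terminals expect CR+LF and plain ASCII.
            tty.write(line.rstrip("\n").encode("ascii", "replace") + b"\r\n")

Issuing commands is the other half of the job; for that you would normally just let the system spawn a getty on the serial port and log in from the terminal, rather than write anything yourself.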

But what's the oldest computer I could actually use today, for literally everything I do, only slower? The answer is easy: it's the 15" mid-2015 Macbook Pro (i7-4770HQ CPU). That's the oldest machine that I have in use, in the sense that it is retired, but I maintain it as a "hot spare", with an updated OS and everything, so I can take it out of a drawer and bring it to some secondary location where I want a fully functional computer already present, without having to assume I'll have a laptop with me. When I say "fully functional", I don't mean just writing text, surfing the web or playing movies; I mean editing photos in Lightroom as well. The only drawback is that it doesn't have USB-C, but my external SSDs with the photo archive can be plugged into USB-A with a mere cable swap, so that would all work, albeit with a speed reduction compared to my modern systems. So, basically, a 4th-generation Intel, released in 2014, is something I can still use for all my current workloads, but it's significantly slower, already has port compatibility issues with modern hardware (Thunderbolt 2 with Mini DisplayPort connectors is a hassle to connect to anything today, as it needs special cables or adapters), and is retired, to be used only in emergencies or for specific use cases.

I must admit that I suffer from no nostalgia regarding old computers. Sure, I remember aspiring to get the stuff that was hot at the time, but it's all useless junk now, and I have a very good memory and remember how limited it all was. What I use today used to be beyond everybody's dreams back then: for instance, a display with resolution that rivals text printed on a high-res laser printer, the ability to display a photograph in quality that rivals or exceeds a photographic print, and the ability to reproduce video in full quality, exceeding what a TV could do back then. I actually use my computer as a HiFi component, playing music to the NAD in CD quality. Today, this stuff actually does everything I always wanted it to do; back then, the computers were vehicles for fantasy rather than tools that could actually make it happen. I can take pictures with my 35mm camera in quality that exceeds everything I could do on 35mm film, and edit the raw photos on the computer, with no loss of quality, and with no dependence on labs, chemicals or other people who would leave fingerprints on my film. So, when I think about the old computers, I can understand the nostalgia, but the biggest part, for me, is remembering what I always wanted computers to do, and feeling gratitude that it's now a reality. The only thing that's still a fantasy is strong AI, but I'm afraid that the kind of AI I would like to talk to would have very little use for humans.

Dependence on computers

I started writing about something in the comment section, but I decided it’s relevant enough to make it an article.

The CrowdStrike event looks like a very mild example of something I've been worrying about for years: a widespread, systemic, persistent IT outage that puts payment systems worldwide out of commission.

Basically, everybody is using digital payment for everything these days, so what happens if it all goes out for some reason? Oh, you'll use cash. You mean, the ATM is going to work? No, it isn't. You mean, you have cash and will just use it? You mean, the cash register computer will not be afflicted, and the cashier will be willing to take your money without the ability to print a receipt and register the transaction? Or will all the stores close until this is dealt with? In which case you will have to rely on whatever food and hygiene/medical supplies you have at your place, because you've been prepping? Oh wait, you've been prepping, but since nothing happened you consumed all the stuff and there isn't any now? Yeah, that.

I mean, the first level of preparing for an IT outage is to have an air-gapped spare laptop stashed in some drawer, with a Linux/Windows dual boot in case one of those two is the cause of the failure. But the next question is: what do you connect to, if the cause of the problem is general? The telecoms are down, banks are down, online services are down, AWS/Azure can't process your credit card so it locks you out of your servers, GoDaddy is down so you can't transfer your domains out of the afflicted area, DNS is down so you can't reach anything, or the satellites are down so Starlink doesn't work. And let's say it's something really major, so the consequences take so long to clear that there's a serious breakdown of services everywhere.

The first answer everybody has to this is something along the lines of "it's unlikely that all the computer systems will go out at once". True, it's unlikely, but it was also previously unheard of for all the enterprise Windows 10 machines to go out at once and half the world to get instantly paralyzed. Those machines aren't independent. Microsoft enforces push updates, and the big corporations have unified IT policies, which means they enforce updates on all their machines. Also, everybody seems to run Windows, which means an attack vector or a blunder no longer needs to target billions of computers independently: a single failure can propagate from a single point and instantly take down enough of the network that the rest have nothing to connect to.

Also, there have recently been revelations that OpenSSL had severe vulnerabilities. The vast majority of Internet infrastructure uses OpenSSL. A systemic vulnerability that can be targeted everywhere means… you tell me.

Someone will say that people would adapt, and my answer is: what does that even mean? Every single store I've been in for the last decade or so uses barcode readers to scan items, and the computer then pulls the item data, most notably the price, from the database, so that the cashier can charge you. More recently, all those computers are required to connect to the state tax service, where every bill needs to be "fiscalised" for taxation purposes. If the Internet fails, the cash register can't "fiscalise" bills, and that's going to be a problem. If the cash register itself is out (it's always a Windows machine, and you saw what can happen to those, and it has to be connected to the Internet or the "fiscalisation" won't work), the cashier won't be able to tell how much the item you want to purchase costs, and thus won't be able to charge you. They don't put prices on items anymore, like they did in the '80s. Everything is in the database.

Some say: run Linux, or buy a Mac. Great, but it doesn't actually solve anything, because if every enterprise and most smaller companies run everything on Windows, and those computers all bluescreen, what are you going to connect to with your Linux PC? How does your computer even matter if you go to a store and can't buy anything, and how does it matter if you try to go online and most of everything is down, because OpenSSL has been attacked by something that gets root permissions on your computer and encrypts its filesystem?

I've recently been thinking that the Internet isn't so much a framework for connecting computers as a separate plane of existence. When I'm using my computer, I'm not really on an island in Croatia, I'm on the Internet. Imagine all the beings that exist in the physical world but without an Internet connection, like trees, birds, cats and so on. In order to interact with them, or even perceive them, you need to switch planes of existence, between the physical world and the Internet. However, some aspects of the physical world, like our civilization for instance, have been abstracted into the Internet to such a degree that you can't even use them anymore if you don't have access to all kinds of Internet-based infrastructure, which is not currently perceived as a problem, but might become one really fast if something fundamental about the Internet breaks down.

Also, if a nefarious government or a corporation wants to lock you out of the Internet for “non-compliance”, you are really fucked, which makes it a really big sword of Damocles hanging over our heads, forcing everybody to be good and obedient slaves.

Intel SNAFU

Regarding the Intel CPU issues, I must say I expected that; I couldn't tell which manufacturer would have the issue first, but with the arms race of who'll be the first to make the 0 nm node die that draws a megawatt of power and needs to be cooled with liquid helium, it's a perfectly logical outcome. If I had to guess, I'd say they made transistors out of too little material at some critical part of the die, and with thermal migration within the material at operating temperatures, the transistors basically stopped working properly.

So, basically, the real question is: who actually needs a CPU of that power in a single-CPU client machine? We're talking 24 cores, 32 threads, 5.8 GHz boost clock, 219 W maximum power at full turbo boost. This is sheer insanity, and it's obvious that my ridiculous exaggeration about megawatts of power isn't even that much more ridiculous than the actual specs of that thing. So, who even needs that? I certainly don't. Gamers? They probably think they do, and they are likely the ones buying it. Developers? Video renderers? Graphics designers? I don't know. But putting that many cores on one CPU, and clocking them this high, is sheer madness reminiscent of the Pentium 4 era, where they clocked them so high, and with such a die layout, that a piece of malware appeared that deliberately iterated along the central path until the die overheated so much it cracked, destroying the CPU.

I'm writing this on a Mac Studio M2 Max, which has more power than I realistically need, even for Lightroom, which is extremely demanding, and it idles at 38°C during the summer. It never makes a sound, it never overheats, and it's absolutely, perfectly stable. I actually had the option of getting the model with twice the power for twice the money, which is an excellent deal, and I didn't bother, because why would I? At some point, more power becomes a pointless exercise in dick-measuring. Sure, bigger is better, but is it, when it's already half a meter long and as thick as a fire extinguisher? So this is the obvious consequence: Intel tried to win a dick-measuring contest with AMD and gave itself a meter-long appendage, because bigger is better, and now it turns out it's not fit for purpose? How surprising.

Thoughts about computers

I’ve been thinking about computers lately, for multiple reasons, so I’ll share a few thoughts.

It's interesting how people tend to have weird prejudices about things based on the label. For instance, "gaming" computers are supposedly not "serious"; you shouldn't buy that stuff if you're doing serious work with your computer. You should get a "business" or a "workstation" machine.

That's such incredible nonsense, because what does "gaming" even mean at this point? It's basically "workstation" with RGB lights. If a PC or a laptop is "gaming", it usually means a powerful graphics card, an overbuilt power supply, a high-performance cooling system, and generally high-end components designed to run at 100% load indefinitely. What's a workstation PC? Well, it has a powerful GPU, an overbuilt power supply, high-performance cooling, and high-spec components designed to run at 100% load indefinitely, only the graphics card has drivers for CAD, which means it supports double-precision floating-point arithmetic, and it's designed more "seriously", which means no RGB and the box looks normal, not like an alien spaceship with glowing vents. It's also more expensive, in order to milk the "serious" customers for money, which means that if you want the greatest performance for your money, get gaming components, get a normal-looking case, turn off the RGB nonsense, and there you go. The only situation where you would actually want a real workstation machine is if you're running server workloads and actually need a server CPU. Otherwise, "gaming" translates into "great for photo and video editing and programming".

It's interesting that everything labelled as "business" tends to be shit. That's because a computer for "business" is the cheapest possible piece of crap that will still work, so the penny-pinchers who buy equipment for the general staff slaving their lives away in cubicles for a meager wage can feel good knowing they spent the least possible amount of money on the "workforce". That's never the computer the boss is getting for himself. He's getting a 16" Macbook Pro. He's certainly not getting the "business" model. You don't get a business-grade computer if you're doing business; you get it if you are considered equipment required for doing business, and you need to be cheap.

There's also the question of whether to buy a Mac or a "PC", as if somehow a Mac is not a PC. Let me write down my own experience. I used to write books on IBM T41 laptops running Linux. The keyboards were great, and the screen had lots of vertical space, being 4:3 ratio, so that worked fine, but they tended to die on me, because I used them on my lap and I didn't have air conditioning in the room, so they overheated during the summer; the motherboards would die, so I would just get another used T41 (they were several generations obsolete and dirt cheap at that point), put in my hard drive and continue writing. At some point, after I went through two IBMs in two years, I decided I'd had enough of that shit and got a 13" Macbook Air. Now, that thing was indestructible, a proverbial cockroach that would survive a nuclear war. I had to retire it because it had only 2GB of RAM and became unbearably slow after five or so years of use, and I traded it in for some new piece of hardware, but there's obviously a reason why so many people buy Macbooks, and it's not because they are stupid and buy "overpriced junk". If anything, the old Thinkpads were junk compared to the Macbook.

I replaced the Air with a mid-2015 15" Pro, which is also a cockroach: I'm using it to write this, it's 8 years old, I more or less retired it a few years ago, and it still works just fine. The screen, touchpad and keyboard are still great, but it's significantly slower than the modern machines, so I wouldn't do serious heavy lifting on it; for all the normal tasks it's just fine. The only interventions I did on it were to replace a bloated battery when it was 5 years old and to swap the 256GB SSD for a 1TB Samsung. So, my answer to "why would you buy a Mac" is "because I want it to work reliably and well until it's a million generations obsolete and I want to replace it anyway". It doesn't just die, the user interface is great, it's usually among the fastest machines you can get, and considering how well it works and how long it lasts, it's dirt cheap. The only exceptions are the generations with the touchbar and the butterfly keyboard. They were shit, and everybody who got one regrets the decision.

It's not that I have some general recommendation, such as "just get a Mac" or "just get a gaming machine". In fact, it is my experience that computers today are so good that you really have many good options, but only if you avoid the "economy" and "business" stuff, which is what they call junk made of obsolete components and sold to businesses at clearance prices.