Useless technologies

Maybe I’m getting old, but as someone who’s been living on the bleeding edge of technology since 1984 or so, early-adopting all kinds of gadgets, I find that more and more things either leave me indifferent or actively annoy me. Let me cite some examples.

  • Smart watches. I saw how they work, said “meh”, never got one. Instead, I use a very classic mechanical watch. It is a very elegant piece of technology that is powered by hand motions, is easily serviceable and lasts for a very, very long time. Also, it isn’t landfill fodder.
  • Social media. Strictly speaking not a gadget or a tech artifact, but I see the entire phenomenon as extremely worrisome, and it does function as the spirit behind lots of gadgets. My main problem with social media is that the platforms are corporate-owned, and as such have a chokepoint of censorship: a political group can take control of the company, which is relatively easy, and then basically control the behavior of billions. They also promote groupthink and mobbing, and, by design, reduce people to very simplified patterns of thought and emotion. Absolutely none of that is a good thing. I certainly don’t object to people communicating online, and it is definitely possible to create very constructive forums, but the general trend is that Facebook, Google, Twitter and others are implementing political censorship and reducing the level of human mental diversity and complexity.
  • Virtual/hybrid reality devices. The only place where I can see use for them are driving/flight simulators. For everything else, they are just a great way of getting motion sickness.
  • Mobile apps that are essentially just single-website browsers. Just use the web browser to go to that site and there’s your “app”. Pointless. Also, dedicated media library apps. A better version of this is called a file system and a media player.
  • Appliances with Li-ion batteries built in, with no apparent thought given to servicing or replacing the battery, which means the device’s useful life is limited by that of the battery. It’s an obvious example of wastefulness and planned obsolescence. I’m talking about electric shavers, vacuum cleaners, toothbrushes, smartwatches, digital styluses, earphones/headphones, phones, tablets, laptops etc. It’s not that I object to things not being tethered to a wall with a power cord; just make the batteries standard, modular and easy to replace and recycle, thank you.
  • Electric cars, but a shortage of electricity. It’s not that we don’t have designs for safe and efficient nuclear power plants. Thorium molten-salt technology, which uses liquid nuclear fuel, is not only safe (it’s trivially easy to make it fail in a safe manner without causing meltdowns), it also burns the radioactive waste from our solid-fuel plants as fuel, basically transmuting everything into either fissile or inert form. The technology is absolutely awesome, but you don’t hear the “eco” leftists talking about it, because all they care about is stupid and toxic shit such as windmills and solar panels. It’s all weak, inefficient and unreliable garbage. The only things that are actually great are geothermal and nuclear plants, and you can define geothermal plants as nuclear plants using unconcentrated nuclear fuel in situ, because the Earth’s heat is nuclear in origin. Electric cars are not “clean”, they are toxic garbage, and they don’t use a “clean” or “abundant” resource, because electricity can be both dirty and scarce, and, thanks to the leftists, it is increasingly so. Power the entire civilization with liquid-fuel fission reactors, and power the cars with modern isotope sources instead of those huge, heavy and fragile Li-ion batteries, and then we can talk about electric cars. Safety-wise, I would rather have a nuclear reactor than a Li-ion battery in my car. And as for the solar panels and windmills, has anyone given any thought to their lifespan, recycling requirements and ecological impact? Thought so.
  • Hipster tech. Here I’m thinking about film cameras that intentionally produce inferior pictures because it’s “retro”, or using vinyl records mastered from digital sources because it’s “analog”. That’s just affectation.
  • Podcasts and video. I don’t object to those as such, but when a guy online makes a video in which he basically shows stock footage and reads from a script, it might as well be an article in textual form. Reading is a thing.
  • Nuclear fusion. Stop trying to make it work; it works only when gravity provides containment for free, i.e. in the stars. When you have to create the containment yourself, it’s extremely expensive, which makes the whole idea ineffective, dirty, cumbersome and fragile. It’s also not “clean”: it produces as much radiation as nuclear fission; it’s just that the fuel is gaseous rather than solid or liquid. Liquid nuclear fuel used for fission solves all the problems of solid-fuel fission reactors, and removes all the complexity, difficulty and cost associated with fusion.

Basically, I find it annoying when people who discard a technology that consumes nuclear waste to produce clean electricity, in favor of a technology that consumes rare elements to produce chemical waste and dirty electricity, feel some incredible urge to lecture everybody about environmental impact. I also find it annoying when people who can’t write a web application to save their lives think they are tech savvy because they have all the social media apps on their phones. And I find it annoying when people who virtue-signal online about the evils of “capitalism” and “consumerism” prefer to buy a locked-in, unserviceable device because it’s “more elegant”, and bully people who still use an older device for being “poor”. They use hipster tech, but ridicule “boomers” without actually knowing what a “boomer” is, other than someone older than they are, assuming theirs is the smartest generation because they have smartphones and stuff. But the old guys who made them their fucking smartphones, they are stupid and “out of touch with the modern things”.

It is also my impression that we are witnessing a historically unique phenomenon where the younger generations are significantly less tech-savvy than the ones before them. The WW2 generation went to the Moon, the baby boomers invented IT, my generation invented all the IT infrastructure that runs on top of that, and the younger generations basically just use it all to exchange memes online. The level of intellectual degradation is visible in the prevalence of conspiracy theories that cast doubt on the existence of all kinds of things that were obvious to the generations before. I’ve seen all kinds of nonsense: Moon landings aren’t real, space tech isn’t real, the Earth is flat, nuclear weapons aren’t real.

All I can say to this is, idiocracy is real.

Hardware upgrades

Every time I write an article about how I don’t need a new computer because my old one works just fine, you know what’s going to happen next. 🙂

I basically concluded that I was spending too much mental energy arguing with myself that I didn’t need to spend money on new computers. Most of my gear is 4-5 years old, and old computers are neither fast nor reliable. For someone like me, who does basically everything with computers, it’s not the best idea to keep old equipment for too long, because it increases the probability of random failure. Preventative maintenance, in the form of replacing the workhorse machines every 5 years at most, is simply a reasonable thing to do.

So, I got both a new laptop and a desktop upgrade kit; I upgraded the desktop to a Ryzen 5900X with 64GB of RAM, and I got a new M1 Macbook Air. The desktop part is obvious: I replaced the motherboard, which had been producing issues with the sound going mute for a second every now and then. That must have been a USB issue, since my sound goes through a Schiit Modi 3 DAC connected via USB, not the built-in audio, and the issue went away with the motherboard change. I also got rid of the DDR3 RAM, which was running at 1600MHz, and got a high-end CPU. In most things I do, it conformed to my predictions: it’s no faster in normal desktop use than my old system, but Lightroom runs significantly faster, and I actually use 44GB of RAM under serious load, so 64GB is not overkill.

As for the laptop, the 15” Mac was not the best match for my use case (too big and awkwardly shaped for writing books/articles in my lap), so I bought an Asus Zenbook ultralight a couple of years ago, because the Air of that time had a shit keyboard, a shit CPU, and cost significantly more. However, the screen and battery on the Zenbook were sub-par, and I prefer Mac OS on a laptop, so I got the new Apple Silicon Air now. It’s fast, has a great keyboard, touchpad and screen, and the battery life is better than anything I could imagine existing on a laptop. I managed to compile the GNU tools I needed in Macports, and unless I need a dedicated portable Lightroom machine later on, I’m done with hardware now and can go back to blissfully ignoring it for the foreseeable future. Basically, I replaced both my old laptops with a single new one, except for Lightroom on vacation, where the 15” Macbook Pro will continue to be used; its bigger screen and more RAM make it a superior photo editing machine. The new Air has sufficient resources for everything other than Lightroom, which is a specialized task that requires so many more resources that it makes no sense to try and cram it all into an ultraportable, especially since screen size is one of the main limitations. For everything else that I do, the Air is as fast as the new Ryzen desktop.

About hardware reviews

I’ve been watching the hardware reviews on YouTube pretty much regularly for the last couple of years, and I see a clear trend there.

Remember those commercials on TV where the host is super-excited about the set of knives he’s trying to sell you, or some kitchen appliance, or some additive for automotive oil? Yeah, that’s those supposedly freelance PC hardware reviewers. They are the new kitchen appliance salesmen.

That’s not to say they are completely useless. If you’re interested in what they are trying to sell you, they are quite informative (otherwise I wouldn’t be watching them), but never in a million years should you forget that they are basically an arm of the advertising sector. Their revenue stream comes from generating interest in what they are presenting, and interest, for the product manufacturers, means increased sales. So there is a clear motive for the manufacturers to funnel their advertising budget to the hardware reviewers who can generate the most useful form of customer interest, which can be described as “enthusiasm for the product and the brand”.

One of the ways of creating enthusiasm for what they are trying to sell you is to create the perception that we live in a parallel universe where your current computer is slow and you really need an upgrade. An excellent example is a video I watched just now, which basically says that the new thing Apple released is so good that everything else is a steaming pile of garbage, and you should get the shiny new gadget in order for your life to have meaning and purpose. Yawn.

Let’s put things into perspective, shall we? Computers have been absolute overkill for the last several years. Laptops and desktops with an Intel Haswell CPU, released in 2013, are blazingly fast if they are equipped with a good SSD and GPU. I have a Haswell Macbook Pro 15” which I use to edit photos when I’m away from home, and the stuff I’m doing with it isn’t trivial; some of it is creating panoramas from more than 10 RAW files from a 24MP camera in Lightroom. And guess what: it’s fast and great. I have absolutely no complaints about its performance, unless we’re talking about the GPU. Sure, I’ve seen and worked with nominally faster computers, but for the most part “fast” is a number in a benchmark, not something you actually feel. If you’re running a task that really takes a while, such as re-indexing an entire catalog in Lightroom, it’s going to take hours no matter what hardware you use. Whether it takes two or four hours is quite irrelevant, because that’s an operation I do once every few years, and when I do it, I just leave the computer overnight to do its thing, and it’s done in the morning. I’m certainly not going to sit behind it for hours with a stopwatch, and no, upgrading hardware isn’t going to make it exponentially faster: it needs to load tens of thousands of RAW files and create JPEG previews for all of them, and that’s such a massive workload that it doesn’t really matter if your computer is twice as fast, because it’s still taking hours. In normal operation, the difference between fast and faster is measured in tenths of a second. There’s no feeling of “OMG this new one is so fast, my old one is garbage”. It’s “oh, nice, it’s somewhat snappier, you almost kinda feel the difference if you really try”.
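To put numbers on the re-indexing example: here’s a back-of-envelope sketch (the catalog size and per-file cost are my assumptions for illustration, not measured Lightroom figures) showing why a machine twice as fast still leaves you with an overnight job.

```python
# Back-of-envelope: why a CPU twice as fast doesn't change the overnight
# nature of a full catalog re-index. All figures are illustrative assumptions.

def reindex_hours(n_files: int, seconds_per_file: float) -> float:
    """Total time to decode RAW files and render previews, in hours."""
    return n_files * seconds_per_file / 3600

catalog = 40_000              # "tens of thousands" of RAW files
old_rate = 0.9                # assumed seconds per file (decode + preview)
new_rate = old_rate / 2       # machine twice as fast

print(f"old machine: {reindex_hours(catalog, old_rate):.1f} h")  # 10.0 h
print(f"new machine: {reindex_hours(catalog, new_rate):.1f} h")  # 5.0 h
# Either way you leave it running overnight; the 2x speedup changes
# nothing in practice.
```

Change the assumed per-file cost all you like; as long as the file count is in the tens of thousands, the answer stays in the “hours” range.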
I’ve seen a double-blind test of SSD speed between SATA, NVMe and the new gen-4 NVMe, where people who work with those things professionally all day all guessed wrong when trying to tell which was which, because they’re all fast enough for whatever you have to do, and a 10x difference in benchmarks is imperceptible to the user. How can that be, you might ask. Well, as I said, computers have been very good for the last ten years or so, and once you have an SSD and a GPU competent enough for the resolution and refresh rate of your monitor, you’re not going to perceive any lag in normal use. Sure, the benchmarks are going to show bigger numbers, but it’s like speed and acceleration: you don’t perceive speed, only acceleration. It also depends on the task you’re using the machine for. For instance, I use a Raspberry Pi 3B+ as a local backup and development server, and I don’t perceive it as “slow” for what it does; it runs a development copy of this server, and I use it to test code before I deploy. It doesn’t even have an SSD, just a microSD card and a USB thumb drive. Why don’t I use something faster, like a Raspberry Pi 4? It uses more power, so I would be hesitant to leave it always on, which would make it worse for the purpose. The same goes for the NUC: it’s faster, it’s better in every conceivable way, but that doesn’t matter for the intended purpose. If something is fast enough, faster doesn’t mean better, it means “meh”.
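The same arithmetic explains the blind-test result. Here’s a hedged sketch of why a roughly 13x gap in sequential-read benchmarks barely registers in daily use: interactive tasks like launching an app are dominated by CPU and software overhead, and the figures below (startup CPU time, bytes read, drive throughputs) are illustrative assumptions, not measurements.

```python
# Why a ~13x faster SSD can be imperceptible: interactive tasks are dominated
# by CPU and software overhead, not raw disk throughput.
# All figures below are illustrative assumptions, not measurements.

def launch_time(cpu_seconds: float, bytes_read: int, drive_mb_s: float) -> float:
    """App launch = fixed CPU/software work + time to read the app's files."""
    return cpu_seconds + bytes_read / (drive_mb_s * 1_000_000)

app_bytes = 300_000_000   # assume an app touches ~300 MB on launch
cpu_work = 1.2            # assumed seconds of CPU-bound startup work

for name, mb_s in [("SATA", 550), ("NVMe gen3", 3500), ("NVMe gen4", 7000)]:
    print(f"{name:9}: {launch_time(cpu_work, app_bytes, mb_s):.2f} s")
# SATA     : 1.75 s
# NVMe gen3: 1.29 s
# NVMe gen4: 1.24 s
# A ~13x throughput gap collapses to about half a second of wall-clock time.
```

Once the drive is fast enough that its share of the wait is small, further multiplying its benchmark numbers moves the total by tenths of a second, exactly the “you almost kinda feel it” territory.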

I’m in a weird position where most of my computers are more than 4 years old, all the YouTube salesmen are trying to sell me expensive new and shiny hardware, and if I listened to them and replaced all my “garbage” hardware, it would cost me enough to buy a car, and it would produce exactly zero perceivable difference in practical use for the things I do. One reason for that is that I actually did upgrade the things that matter to me: the SSD in both my Macbook Pro and my desktop PC is a 1TB Samsung 970 EVO NVMe. If you know tech, you know how fast that thing is. I have 32 GB of RAM in the desktop, the monitor is a huge 43” 4K thing, and the GPU is powerful enough to run games at the monitor’s refresh rate in native resolution; and yes, it’s completely silent in normal operation. That machine is essentially un-upgradable, because whatever you change, you don’t really get a better result, you just waste money. The same goes for the Raspberry Pi server: I have a NUC here I could replace it with, so it’s not even a matter of money, it’s a matter of why the fuck would I do that, when it would do the same tasks equally well. At some point, upgrading feels like changing your dishwasher just because there’s a new model. No wonder the YouTube salesmen are trying so hard to create enthusiasm for the whole thing, because it’s really, really hard to give even the slightest amount of a fuck at this point.

Apple M1 chip

Here’s my take on it, based on what we know so far.

It’s a modified mobile APU, which means it has both the strengths and drawbacks of Apple’s other mobile chips: it delivers great computational power and uses very little energy. That much is obvious from the reviews.

The drawbacks are that the peripheral connections seem to be an afterthought. It appears to have two Thunderbolt ports, but if you read carefully, it turns out that when you connect one monitor to a USB-C port, the other USB-C port can’t drive a second monitor, and it’s questionable how much you can connect to it at all: although they call it Thunderbolt, it doesn’t work with eGPUs, and it’s questionable how many PCIe lanes it exposes, if any. Also, the connected monitors seem to mostly work at 30Hz, with 60Hz support being very… let’s say “selective”. Basically, it’s an iPad Pro chip slightly tweaked to allow for peripherals, but the “tweak” in question wasn’t serious enough to support anything even remotely resembling the level of connectivity we normally expect from desktop computers.

Also, they modified the recovery boot menu, completely removing the option that previously allowed us to boot from an external drive. This means two things. First, if your SSD dies, you’ll need to replace the motherboard; you can’t just plug in a USB or Thunderbolt SSD, install the OS there, and continue using the computer. Second, no more installing Linux on a Mac. That there’ll be no Boot Camp Windows is already known. They have completely locked the hardware in. If they lock the software in as well, a Mac will become a very nicely decorated prison cell.

Also, since the RAM is on the chip itself, there’s no RAM expansion. This is a step beyond soldering the RAM onto the motherboard; we’ve only seen this level of non-expandability on smartphones and tablets. One would expect it from an ultrabook, but on a Mac Mini or a Macbook Pro, the inability to change the SSD or upgrade the RAM is terrible news. These systems are so closed off they feel claustrophobic: no RAM upgrade, no SSD upgrade, peripherals reduced to one monitor, with the other USB-C port switching to a low-capability mode when that happens, which means the bus is quite constrained and, for lack of a better word, anemic.

With all that in mind, it’s questionable what one can do with all that computational power. It reminds me of my iPhone, which has as much power as a desktop computer, but you are so constrained by the limitations of the form factor, the lack of peripherals, and the limitations of the OS, that it remains just a phone that does trivial phone tasks really, really quickly. For professional use, where you have to consider connecting two external monitors, a fast external storage drive, LAN and similar things (which is what you would actually need to do non-trivial work), the Intel machines are vastly superior, and my recommendation would be to look into the 16″ Macbook Pro and the new 2020 iMac, which are both excellent. The new machines are good for applications where battery life is the absolute priority and where you need extreme computational power, but with little or no demand for peripheral connectivity.

Technical issues

I’ve been busy with technical stuff lately. Not only did I have to transcode the forum’s database to make it usable by the new software, and write/adapt parts of the software to fit the needs of the community using it, but I also had a disproportionate number of hardware failures this year. Most of them were bloated or weakened Li-ion batteries in phones and laptops, but we also had two NVMe drive failures, including one in recent days, which was hard to diagnose: I initially suspected some driver or a faulty Windows update, but as the evidence allowed me to narrow it down, I started to suspect the Samsung system drive, and my confidence in that assessment grew to the point where I preventatively replaced it, without waiting for it to fail completely and force me to rebuild the system from the ground up. And yes, since I cloned and replaced the drive, I’ve had no more system freezes. As with the two failed drives before (Mihael’s Adata SATA drive, and Biljana’s Samsung PCI-E drive in the 13″ Macbook Pro), it was a controller failure, which produces symptoms so similar that this time I was able to diagnose it before complete failure. All in all, I’ve had an increased number of drive failures since we moved from HDD to SSD technology, and literally none of them were due to NAND wear, which everybody feared initially; it’s always the controller that goes, and that’s the worst-case scenario, because if you don’t catch it in time, it means complete and irrecoverable data loss. However, only Mihael’s drive went all the way, because we were late in reacting to it malfunctioning for days, and likely weeks. With Biljana’s laptop I already had some experience with SSD controller failure, so I replaced her drive preventatively and all the symptoms ceased, and I did the same with my own system drive on the main computer. Basically, the symptoms look very much as if the system bus is clogging up and system events are not going through.
When that happens, I now immediately suspect SSD controller failure. This, of course, puts Apple’s practice of soldering SSD components directly onto the motherboard, so that the SSD can’t be replaced, into perspective. That’s just asking for trouble, because it turns what can be a simple and straightforward process of “back the old drive up, replace it with a new one, and restore from backup” into a motherboard write-off and replacement, and those are expensive. Sure, it can be a welcome excuse for replacing your obsolete computer with a new, more modern one, but of the three cases of SSD failure I had recently, only one computer was obsolete and ready to be replaced; the other two were perfectly good machines that required only a system drive replacement. I am seriously not a fan of having the SSD and RAM soldered onto motherboards, because those are the two main things that have to be either upgraded or replaced due to failure, and not allowing for that just screams “planned obsolescence”. It’s like not allowing the GPU to be replaced in a gaming PC, knowing that it’s the first thing that will need to be upgraded to keep the machine functional. Sure, I have a habit of keeping old hardware in use until it becomes completely useless, which means I could occasionally use some sort of a push to buy a new, more capable system; but on the other hand, if I see nothing wrong with the system I’m using, in the sense that it does everything instantly and is not causing me any trouble, why would I be forced by the manufacturer to throw it away just because some component failed prematurely? The system I’m using plays Witcher 3 in 4K at 60 FPS, on ultra. It’s not a slow and decrepit system by any stretch of the imagination.
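Since catching a failing drive before total data loss is the whole game here, a quick SMART check is a cheap first step. Below is a sketch that interprets the JSON output of smartmontools’ `smartctl` (assuming version 7 or newer, which supports `--json`; the field names follow its NVMe health-log schema, and the thresholds are my own conservative picks). One caveat: controller failures of the kind described above often show no SMART warnings at all, so treat a clean report as “no evidence”, not “no problem”.

```python
# Sketch: interpret NVMe SMART data as produced by `smartctl --json -a <dev>`.
# Field names follow the smartmontools JSON schema (smartctl 7+); thresholds
# are my own conservative picks, not vendor recommendations.
import json
import subprocess

def nvme_warnings(smart: dict) -> list[str]:
    """Return human-readable warnings extracted from smartctl JSON output."""
    log = smart.get("nvme_smart_health_information_log", {})
    warnings = []
    if log.get("critical_warning", 0) != 0:
        warnings.append("controller reports a critical warning")
    if log.get("media_errors", 0) > 0:
        warnings.append(f"{log['media_errors']} media errors logged")
    if log.get("percentage_used", 0) >= 90:
        warnings.append("NAND endurance nearly exhausted")
    if log.get("available_spare", 100) < log.get("available_spare_threshold", 10):
        warnings.append("spare blocks below threshold")
    return warnings

def check_drive(device: str = "/dev/nvme0") -> list[str]:
    """Invoke smartctl on a device and return any warnings (needs root)."""
    raw = subprocess.run(
        ["smartctl", "--json", "-a", device],
        capture_output=True, text=True, check=True,
    ).stdout
    return nvme_warnings(json.loads(raw))
```

Usage would be `check_drive()` as root on a Linux box; the default device path `/dev/nvme0` is an assumption, so adjust it for your system. Even so, the freezes and bus stalls described above are still the more reliable early-warning sign.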
If I had to replace a whole computer just because the system drive failed, I would be really pissed, and that’s exactly what would have happened with Apple, if I used one of their fancy-looking machines with the SSD soldered on. The only one of their current machines that’s actually designed properly is the new Mac Pro, but that one is so expensive it makes no sense to buy it, unless you hate your money and want to see it burn. Someone will say that you have to pay for quality, but that’s really bullshit, since they use the same Samsung NVMe drives I buy off the shelf to build my own systems, and in my experience the drives they use are exactly as likely to fail as any other Samsung drive. So, sure, you can solder it onto the motherboard, but then I want a 5-year warranty on the motherboard with instant replacement in case of component failure, no weaseling out.