About hardware reviews

I’ve been watching hardware reviews on YouTube fairly regularly for the last couple of years, and I see a clear trend there.

Remember those commercials on TV where the host is super-excited about the set of knives he’s trying to sell you, or some kitchen appliance, or some additive for automotive oil? Yeah, that’s those supposedly independent PC hardware reviewers. They are the new kitchen appliance salesmen.

That’s not to say they are completely useless. If you’re interested in what they are trying to sell you, they are quite informative – otherwise I wouldn’t be watching them – but never in a million years should you forget that they are basically an arm of the advertising sector. Their revenue stream comes from generating interest in what they are presenting, and interest, for the product manufacturers, means increased sales. So there is a clear motive for the manufacturers to funnel their advertising budget to the hardware reviewers who can generate the most useful form of customer interest, which can be described as “enthusiasm for the product and the brand”.

One of the ways of creating that enthusiasm is to foster the perception that we live in a parallel universe where your current computer is slow and you really need an upgrade. An excellent example of this is a video I watched just now, basically saying that the new thing Apple released is so good, everything else is a steaming pile of garbage, and you should get the shiny new gadget in order for your life to have meaning and purpose. Yawn.

Let’s put things into perspective, shall we? Computers have been absolute overkill for the last several years. Laptops and desktops with Intel Haswell CPUs, released in 2013, are blazingly fast if they are equipped with a good SSD and GPU. I have a Haswell 15″ Macbook Pro and I use it to edit photos when I’m away from home, and the stuff I’m doing with it isn’t trivial – some of it is creating panoramas of more than 10 RAW files from a 24MP camera in Lightroom – and guess what, it’s fast and great. I have absolutely no complaints about its performance, unless we’re talking about the GPU.

Sure, I’ve seen and worked with nominally faster computers, but for the most part “fast” is a number in a benchmark, not something you actually feel. If I’m running a task that really takes a while, such as re-indexing an entire catalog in Lightroom, it’s going to take hours no matter what hardware I use. Whether it takes two or four hours is quite irrelevant, because that’s an operation I do once in a few years, and when I do it I just leave the computer overnight to do its thing, and it’s done in the morning. I’m certainly not going to sit behind it for hours with a stopwatch, and no, upgrading hardware isn’t going to make it exponentially faster, because it needs to load tens of thousands of RAW files and create JPEG previews for all of them, and that’s such a massive workload that it doesn’t really matter if your computer is twice as fast – it’s still taking hours.

In normal operation, the difference between fast and faster is measured in tenths of a second. There’s no feeling of “OMG this new one is so fast, my old one is garbage”. It’s “oh, nice, it’s somewhat snappier, you almost kinda feel the difference if you really try”. I’ve seen a double-blind test of SSD speed between SATA, NVMe and the new gen-4 NVMe, where people who work with those things professionally all day all guessed wrong about which was which, because it’s all fast enough for whatever you have to do, and a 10x difference in benchmarks is imperceptible to the user. How can that be, you might ask. Well, as I said, computers have been very good for the last ten years or so, and once you have an SSD and a GPU competent enough for the resolution and refresh rate of your monitor, you’re not going to perceive any lag in normal use. Sure, the benchmarks are going to show bigger numbers, but it’s like speed and acceleration – you don’t perceive speed, you only perceive acceleration.

It also depends on the task you’re using it for. For instance, I use a Raspberry Pi 3B+ as a local backup and development server, and I don’t perceive it as “slow” for what it does; it runs a development copy of this server, and I use it to test code before I deploy. It doesn’t even have an SSD, just a microSD memory card and a USB thumb drive. Why don’t I use something faster, like a Raspberry Pi 4? It uses more power, so I would be hesitant to leave it always on, which would make it worse for the purpose. The same goes for the NUC – it’s faster, it’s better in every conceivable way, but it doesn’t matter for the intended purpose. If something is fast enough, faster doesn’t mean better, it means “meh”.
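To give you an idea of how undemanding that backup workload actually is, it boils down to something like the sketch below. To be clear, the paths and the script itself are made up for illustration – I’m not documenting my actual setup here – the point is that this is I/O-bound work a Pi handles trivially:

```python
#!/usr/bin/env python3
# A hypothetical nightly mirror job of the sort a Raspberry Pi 3B+ handles
# with ease. The paths are made up for illustration; the real setup isn't
# described here. Requires rsync to be installed.
import datetime
import subprocess
import sys

SOURCES = ["/var/www/site", "/home/user/db-dumps"]  # hypothetical paths
DEST = "/mnt/usb-backup"  # the USB thumb drive

def mirror(source: str, dest: str) -> int:
    """Mirror `source` into `dest` with rsync; returns rsync's exit code."""
    # -a preserves permissions/times; --delete drops files removed at source
    result = subprocess.run(["rsync", "-a", "--delete", source, dest],
                            capture_output=True, text=True)
    if result.returncode != 0:
        print(f"rsync failed for {source}: {result.stderr.strip()}",
              file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    print(f"[{stamp}] starting backup")
    failed = [s for s in SOURCES if mirror(s, DEST) != 0]
    sys.exit(1 if failed else 0)
```

Dropped into a nightly cron job, something like this never comes close to stressing the CPU – the bottleneck is the USB storage, which is exactly why a faster board wouldn’t improve anything.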

I’m in a weird position where most of my computers are more than 4 years old, all the YouTube salesmen are trying to sell me expensive new and shiny hardware, and if I listened to them and replaced all my “garbage” hardware, it would cost me enough to buy a car, and it would produce exactly zero perceivable difference in practical use for the things I do. One reason for that is that I actually did upgrade the things that matter to me – the SSD in both my Macbook Pro and my desktop PC is a 1TB Samsung 970 EVO NVMe. If you know tech, you know how fast that thing is. I have 32 GB of RAM in the desktop, the monitor is a huge 43″ 4K thing, and the GPU is powerful enough to run games at the monitor’s refresh rate in its native resolution; and yes, it’s completely silent in normal operation. That machine is essentially un-upgradable, because whatever you change, you don’t really get a better result, you just waste money. The same goes for the Raspberry Pi server: I have a NUC here I could replace it with, so it’s not even a matter of money, it’s a matter of why the fuck would I do that, because it would do the same tasks equally well. At some point, upgrading feels like changing your dishwasher just because there’s a new model. No wonder the YouTube salesmen are trying so hard to create enthusiasm for the whole thing, because it’s really, really hard to give even the slightest amount of fuck at this point.

Apple M1 chip

Here’s my take on it, based on what we know so far.

It’s a modified mobile APU, which means it has both the strengths and the drawbacks of Apple’s other mobile chips: it delivers great computational power and uses very little energy. That much is obvious from the reviews.

The drawback is that the peripheral connections seem to be an afterthought. It appears to have two Thunderbolt ports, but if you read carefully, it turns out that when you connect one monitor to a USB-C port, the other USB-C port can’t drive a second monitor, and it’s questionable how much you can connect to it at all: although they call it Thunderbolt, it doesn’t work with an eGPU, and it’s questionable how many PCIe lanes it exposes, if any. Also, the connected monitors seem to mostly work at 30Hz, with 60Hz support being very… let’s say “selective”. Basically, it’s an iPad Pro chip slightly tweaked to allow for peripherals, but the “tweak” in question wasn’t serious enough to support anything even remotely resembling the level of connectivity we normally expect on desktop computers.

Also, they modified the recovery boot menu, completely removing the option that previously allowed us to boot from an external drive. This means two things. First, if your SSD dies, you’ll need to replace the motherboard; you can’t just plug in a USB or Thunderbolt SSD, install the OS there, and continue using the computer. Second, no more installing Linux on a Mac. That there’ll be no Boot Camp Windows is already known. They completely locked the hardware in. If they lock the software in as well, a Mac will become a very nicely decorated prison cell.

Also, since the RAM is in the chip package itself, there’s no RAM expansion. This is a step beyond soldering the RAM onto the motherboard, and we’ve only seen this level of non-expandability on smartphones and tablets. One would expect it from an ultrabook, but on a Mac Mini or a Macbook Pro, the inability to change the SSD or upgrade the RAM is terrible news. Those systems are so closed off they feel claustrophobic – no RAM upgrade, no SSD upgrade, peripherals reduced to one monitor, with the other USB-C port switching to low-capacity mode when that happens, which means the bus is quite constrained and, for lack of a better word, anemic.

Having all that in mind, it’s questionable what one can do with all that computational power. It reminds me of my iPhone, which has as much power as a desktop computer, but you are so constrained by the limitations of the form factor, the lack of peripherals, and the limitations of the OS that it remains just a phone that does trivial phone tasks really, really quickly. For professional use, where you have to consider connecting two external monitors, a fast external storage drive, LAN and similar things – which is what you would actually use to do non-trivial work – the Intel machines are vastly superior, and my recommendation would be to look into the 16″ Macbook Pro and the new 2020 iMac, which are both excellent. The new machines are good for applications where battery life is the absolute priority and where you need extreme computational power but have little or no need for peripheral connectivity.

Language peculiarities

I’ve been asked many times what the difference between a crow and a raven is, and my answer used to be that the crow is the species in general and the raven is a male specimen, something like sheep and ram. However, I never felt perfectly satisfied with this answer, until one day I found out that the Icelandic word for raven is “hrafn”, and then it clicked – the two words come from two completely different languages: raven is the Norse “hrafn”, and crow is the Latin “corvus”.

I apologize in advance if you learned about this in kindergarten or elementary school, but it’s actually new to me. 🙂 The same goes for the names of the days of the week, where some are obvious, but I didn’t really get the etymology of the others until recently:

Sunday – the Sun
Monday – the Moon
Tuesday – Tíw (Old English for the Norse Týr), the one-handed god of war
Wednesday – Wōden (Old English for Odin; German Wotan)
Thursday – Thor
Friday – Freya (or Frigg; the two are often conflated)
Saturday – Saturn (“dies Saturni” in Latin)

Technical issues

I’ve been busy with the technical stuff lately; not only did I have to transcode the forum’s database so it would be usable by the new software, and write/adapt parts of the software to fit the needs of the community using it, but I also had a disproportionate number of hardware failures this year. Most of them were bloated or weakened Li-Ion batteries in phones and laptops, but we also had two NVMe drive failures, including one in recent days, which was hard to diagnose: I initially suspected some driver or a faulty Windows update, but as the evidence allowed me to narrow it down, I started to suspect the Samsung system drive, and my confidence in that assessment grew to the point where I preventatively replaced it without waiting for it to fail completely and force me to rebuild the system from the ground up. And yes, since I cloned and replaced the drive, I’ve had no more system freezes.

As with the two drives that failed before (Mihael’s Adata SATA drive, and Biljana’s Samsung PCI-E drive in the 13″ Macbook Pro), it was a controller failure, which produces symptoms so consistent that this time I was able to diagnose it before the complete failure. All in all, I’ve had an increased number of drive failures since we moved from HDD to SSD technology, and literally none of them were due to NAND wear, which everybody feared initially; it’s always the controller that goes, and that’s the worst-case scenario, because if you don’t catch it in time, it’s complete and irrecoverable data loss. However, only Mihael’s drive went all the way, because we were late in reacting to it malfunctioning for days, and likely weeks. With Biljana’s laptop, I already had some experience with SSD controller failure, so I replaced her drive preventatively and all the symptoms ceased, and I did the same with my own system drive on the main computer. Basically, the symptoms look very much as if the system bus is clogging up and system events are not going through; when I see that, I immediately suspect the system SSD’s controller.

This, of course, puts Apple’s practice of soldering SSD components directly onto the motherboard, so that the SSD can’t be replaced, into perspective. That’s just asking for trouble, because it turns what can be a simple and straightforward process of “back the old drive up, replace it with a new one, and restore from backup” into a motherboard write-off and replacement, and those are expensive. Sure, it can be a welcome excuse for replacing your obsolete computer with a new, more modern one, but of the three SSD failures I had recently, only one computer was obsolete and ready to be replaced; the other two were perfectly good machines that required only a system drive replacement. I am seriously not a fan of having the SSD and RAM soldered onto motherboards, because those are the two main things that have to be either upgraded or replaced due to failure, and not allowing for that is just screaming “planned obsolescence”. It’s like not allowing the GPU to be replaced in a gaming PC, knowing that it’s the first thing that will need to be upgraded in order to keep the machine functional.
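Since catching a dying drive early is what saves the data, it’s worth automating at least a basic health check. Below is a minimal sketch of one, using smartctl from the smartmontools package (version 7.0 or later, for the JSON output). One caveat: a failing controller doesn’t necessarily announce itself in SMART data – mine mostly didn’t – so treat this as a first line of defense, not a guarantee, and the device path is obviously something you’d adjust:

```python
#!/usr/bin/env python3
# A sketch of a basic NVMe health check using smartctl (smartmontools).
# Caveat: a dying controller doesn't always show up in SMART attributes,
# so this catches the obvious warning signs, nothing more.
import json
import subprocess
import sys

DEVICE = "/dev/nvme0"  # adjust to your system drive

def smart_report(device: str) -> dict:
    """Return smartctl's JSON report for `device`."""
    out = subprocess.run(["smartctl", "-a", "-j", device],
                         capture_output=True, text=True)
    return json.loads(out.stdout)

def main() -> int:
    report = smart_report(DEVICE)
    health = report.get("nvme_smart_health_information_log", {})
    warnings = []
    if health.get("critical_warning", 0) != 0:
        warnings.append(f"critical_warning = {health['critical_warning']}")
    if health.get("media_errors", 0) != 0:
        warnings.append(f"media_errors = {health['media_errors']}")
    if not report.get("smart_status", {}).get("passed", True):
        warnings.append("overall SMART status: FAILED")
    for w in warnings:
        print(f"WARNING ({DEVICE}): {w}", file=sys.stderr)
    return 1 if warnings else 0

if __name__ == "__main__":
    sys.exit(main())
```

Run from a daily cron job, anything on stderr is the cue to clone the drive to a replacement while it still reads – which is exactly the “back up, replace, restore” process that soldered-in SSDs take away from you.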
Sure, I have a habit of keeping old hardware in use until it becomes completely useless, which means I could occasionally use some sort of a push to buy a new, more capable system, but on the other hand, if I see nothing wrong with the system I’m using, in the sense that it does everything instantly and is not causing me any trouble, why would I have to be forced by the manufacturer to throw it away just because some component failed prematurely? The system I’m using plays Witcher 3 in 4K at 60 FPS, on ultra. It’s not a slow and decrepit system by any stretch of the imagination. If I had to replace the whole computer just because the system drive failed, I would be really pissed, and that’s exactly what would have happened with Apple, if I used one of their fancy-looking machines with the SSD soldered on. The only one of their current machines that’s actually designed properly is the new Mac Pro, but that one is so expensive it makes no sense to buy it, unless you hate your money and want to see it burn. Someone will say that you have to pay for quality, but that’s really bullshit, since they use the same Samsung NVMe drives I buy off the shelf to build my own systems, and in my experience the drives they use are exactly as likely to fail as any other Samsung drive. So, sure, you can solder it onto the motherboard, but then I want a 5-year warranty on the motherboard with instant replacement in case of component failure, no weaseling out.