About hardware reviews

I’ve been watching hardware reviews on YouTube fairly regularly for the last couple of years, and I see a clear trend there.

Remember those TV commercials where the host is super-excited about the set of knives he’s trying to sell you, or some kitchen appliance, or some additive for automotive oil? Yeah, that’s what those supposedly independent PC hardware reviewers have become. They are the new kitchen appliance salesmen.

That doesn’t mean they are completely useless. If you’re interested in what they’re selling, they can be quite informative (otherwise I wouldn’t be watching them), but never in a million years should you forget that they are basically an arm of the advertising sector. Their revenue comes from generating interest in what they present, and interest, for the product manufacturers, means increased sales. So there is a clear motive for the manufacturers to funnel their advertising budget to the hardware reviewers who can generate the most useful form of customer interest, which can be described as “enthusiasm for the product and the brand”.

One of the ways of creating enthusiasm for what they are trying to sell you is to create a perception that we live in a parallel universe where your current computer is slow, and you really need an upgrade. An excellent example of this is a video I watched just now, basically saying that the new thing Apple released is so good, everything else is a steaming pile of garbage and you should get the shiny new gadget in order for your life to have meaning and purpose. Yawn.

Let’s put things into perspective, shall we? Computers have been absolute overkill for the last several years. Laptops and desktops with Intel Haswell CPUs, released in 2013, are blazingly fast if they’re equipped with a good SSD and GPU. I have a Haswell MacBook Pro 15” and I use it to edit photos when I’m away from home, and the stuff I do with it isn’t trivial – some of it is stitching panoramas from more than 10 RAW files from a 24MP camera in Lightroom – and guess what, it’s fast and great. I have absolutely no complaints about its performance, unless we’re talking about the GPU.

Sure, I’ve seen and worked with nominally faster computers, but for the most part “fast” is a number in a benchmark, not something you actually feel. If you’re running a task that really takes a while, such as re-indexing an entire catalog in Lightroom, it’s going to take hours no matter what hardware you use. Whether it takes two hours or four is quite irrelevant, because that’s an operation I do once every few years, and when I do it I just leave the computer running overnight, and it’s done in the morning. I’m certainly not going to sit behind it for hours with a stopwatch, and no, upgrading the hardware isn’t going to make it dramatically faster, because it needs to load tens of thousands of RAW files and create JPEG previews for all of them, and that’s such a massive workload that it doesn’t really matter if your computer is twice as fast; it’s still going to take hours. In normal operation, the difference between fast and faster is measured in tenths of a second. There’s no feeling of “OMG this new one is so fast, my old one is garbage”. It’s “oh, nice, it’s somewhat snappier, you almost kinda feel the difference if you really try”.
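The Lightroom re-indexing example is just arithmetic. Here’s a quick sketch of it; the file count and per-file time are my assumptions for illustration, not measurements from my machine:

```python
# Rough estimate of why a machine that is twice as fast still leaves you
# waiting overnight. All numbers below are assumptions for illustration.

def reindex_hours(num_files, seconds_per_file):
    """Total time to rebuild previews for a catalog, in hours."""
    return num_files * seconds_per_file / 3600

old_machine = reindex_hours(40_000, 0.9)   # assume ~0.9 s per RAW file
new_machine = reindex_hours(40_000, 0.45)  # twice as fast per file

print(f"old: {old_machine:.1f} h, new: {new_machine:.1f} h")
```

Halving the per-file time halves the total, but an overnight job stays an overnight job either way.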
I’ve seen a double-blind test of SSD speed between SATA, NVMe and the new gen-4 NVMe, where people who work with those things professionally all day all guessed wrong about which was which, because it’s all fast enough for whatever you have to do, and a 10x difference in benchmarks is imperceptible to the user. How can that be, you might ask. Well, as I said, computers have been very good for the last ten years or so, and once you have an SSD and a GPU competent enough for the resolution and refresh rate of your monitor, you’re not going to perceive any lag in normal use. Sure, the benchmarks are going to show bigger numbers, but it’s like speed and acceleration: you don’t perceive speed, you perceive only acceleration.

It also depends on the task. For instance, I use a Raspberry Pi 3B+ as a local backup and development server, and I don’t perceive it as “slow” for what it does; it runs a development copy of this server, and I use it to test code before I deploy. It doesn’t even have an SSD, just a microSD card and a USB thumb drive. Why don’t I use something faster, like a Raspberry Pi 4? It uses more power, so I would be hesitant to leave it always on, which would make it worse for the purpose. The same goes for the NUC: it’s faster, it’s better in every conceivable way, but that doesn’t matter for the intended purpose. If something is fast enough, faster doesn’t mean better; it means “meh”.

I’m in a weird position where most of my computers are more than 4 years old, all the YouTube salesmen are trying to sell me expensive new and shiny hardware, and if I listened to them and replaced all my “garbage” hardware, it would cost me enough to buy a car, and it would produce exactly zero perceivable difference in practical use for the things I do. One reason for that is that I actually did upgrade the things that matter to me: the SSD in both my MacBook Pro and my desktop PC is a 1TB Samsung 970 EVO NVMe. If you know tech, you know how fast that thing is. I have 32 GB of RAM in the desktop, the monitor is a huge 43” 4K thing, and the GPU is powerful enough to run games at the monitor’s refresh rate in its native resolution; and yes, the machine is completely silent in normal operation. That thing is essentially un-upgradable, because whatever you change, you don’t really get a better result, you just waste money. The same goes for the Raspberry Pi server: I have a NUC here I could replace it with, so it’s not even a matter of money, it’s a matter of why the fuck would I do that, because it would do the same tasks equally well. At some point, upgrading feels like changing your dishwasher just because there’s a new model. No wonder the YouTube salesmen are trying so hard to create enthusiasm for the whole thing, because it’s really, really hard to give even the slightest amount of fuck at this point.

Apple M1 chip

Here’s my take on it, based on what we know so far.

It’s a modified mobile APU, which means it has both the strengths and drawbacks of Apple’s other mobile chips: it delivers great computational power while using very little energy. That much is obvious from the reviews.

The drawback is that the peripheral connections seem to be an afterthought. It appears to have two Thunderbolt ports, but if you read carefully, it turns out that when you connect one monitor to a USB-C port, the other USB-C port can’t drive a second monitor, and it’s questionable how much you can connect to it at all, because although they call it Thunderbolt, it doesn’t work with eGPUs, and it’s questionable how many PCIe lanes it exposes, if any. Also, the connected monitors seem to mostly work at 30Hz, with 60Hz support being very… let’s say “selective”. Basically, it’s an iPad Pro chip slightly tweaked to allow for peripherals, but the tweak in question wasn’t serious enough to support anything even remotely resembling the level of connectivity we normally expect from desktop computers.

Also, they modified the recovery boot menu, completely removing the option that previously allowed us to boot from an external drive. This means two things. First, if your SSD dies, you’ll need to replace the motherboard; you can’t just plug in a USB or Thunderbolt SSD, install the OS there, and continue using the computer. Second, no more installing Linux on a Mac. We already know there will be no Boot Camp Windows. They have completely locked the hardware in. If they lock the software in as well, a Mac will become a very nicely decorated prison cell.

Also, since the RAM is on the chip itself, there is no RAM expansion. This is a step beyond soldering the RAM onto the motherboard, and we’ve only seen this level of non-expandability on smartphones and tablets. One would expect it from an ultrabook, but on a Mac Mini or a MacBook Pro, the inability to change the SSD or upgrade the RAM is terrible news. Those systems are so closed off they feel claustrophobic: no RAM upgrade, no SSD upgrade, peripherals reduced to one monitor, with the other USB-C port switching to low-capacity mode when that happens, which means the bus is quite constrained and, for lack of a better word, anemic.

With all that in mind, it’s questionable what one can do with all that computational power. It reminds me of my iPhone, which has as much power as a desktop computer, but you are so constrained by the limitations of the form factor, the lack of peripherals, and the limitations of the OS that it remains just a phone that does trivial phone tasks really, really quickly. For professional use, where you have to consider connecting two external monitors, a fast external storage drive, LAN and similar things, which is what you would actually use to do non-trivial work, the Intel machines are vastly superior, and my recommendation would be to look into the 16″ MacBook Pro and the new 2020 iMac, which are both excellent. The new machines are good for applications where battery life is the absolute priority and you need extreme computational power, but with little or no demand for peripheral connectivity.

Technical issues

I’ve been busy with technical stuff lately; not only did I have to transcode the forum database to be usable by the new software, and write/adapt parts of the software to fit the needs of the community using it, but I also had a disproportionate number of hardware failures this year. Most of them were bloated or weakened Li-ion batteries in phones and laptops, but we also had two NVMe drive failures, including one in recent days, which was hard to diagnose; I initially suspected some driver or a faulty Windows update, but as the evidence allowed me to narrow it down, I started to suspect the Samsung system drive, and my confidence in that assessment grew to the point where I preventatively replaced it without waiting for it to fail completely and force me to rebuild the system from the ground up. And yes, since I cloned and replaced the drive, I’ve had no more system freezes.

As with the two failed drives before (Mihael’s Adata SATA drive, and Biljana’s Samsung PCIe drive in the 13″ MacBook Pro), it was a controller failure, which produces symptoms so similar that I was able to diagnose it before complete failure this time. All in all, I’ve had an increased number of drive failures since we moved from HDD to SSD technology, and literally none of them were due to NAND wear, which everybody feared initially; it’s always the controller that goes, and that’s the worst-case scenario, because if you don’t catch it in time, it’s complete and irrecoverable data loss. However, only Mihael’s drive went all the way, because we were late in reacting to it malfunctioning for days, and likely weeks. With Biljana’s laptop, I already had some experience with SSD controller failure, so I replaced her drive preventatively and all the symptoms ceased, and I did the same with my own system drive on the main computer. Basically, the symptoms look very much as if the system bus is clogging up and system events are not going through.
When that happens, I immediately suspect system SSD controller failure. This, of course, puts Apple’s practice of soldering SSD components directly onto the motherboard, so that the SSD can’t be replaced, into perspective. That’s just asking for trouble, because it turns what can be a simple and straightforward process of “back the old drive up, replace it with a new one, and restore from backup” into a motherboard write-off and replacement, and those are expensive. Sure, it can be a welcome excuse for replacing your obsolete computer with a new, more modern one, but of the three cases of SSD failure I had recently, only one computer was obsolete and ready to be replaced; the other two were perfectly good machines that required only a system drive replacement.

I am seriously not a fan of having the SSD and RAM soldered onto motherboards, because those are the two main things that have to be either upgraded or replaced due to failure, and not allowing for that just screams “planned obsolescence”. It’s like not allowing the GPU to be replaced in a gaming PC, knowing that it’s the first thing that will need to be upgraded to keep the machine functional. Sure, I have a habit of keeping old hardware in use until it becomes completely useless, which means I could occasionally use some sort of push to buy a new, more capable system, but on the other hand, if I see nothing wrong with the system I’m using, in the sense that it does everything instantly and isn’t causing me any trouble, why should I be forced by the manufacturer to throw it away just because some component failed prematurely? The system I’m using plays Witcher 3 at 4K and 60 FPS, on ultra. It’s not a slow and decrepit system by any stretch of the imagination.
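For what it’s worth, when I suspect a drive, the first thing I check is its SMART status, even though controller failures often don’t show up there at all, which is exactly why the freeze symptoms matter more than the report. A minimal sketch of reading the verdict; the sample text below is shaped like `smartctl -H` output, not captured from a real failure:

```python
# Sketch: pull the overall-health verdict out of smartctl-style output.
# The sample text is illustrative, not from a real device; a dying
# controller can freeze the whole bus while SMART still says PASSED.

def smart_overall_health(smartctl_output):
    """Return the overall-health verdict from `smartctl -H`-style text,
    or None if no verdict line is present."""
    for line in smartctl_output.splitlines():
        if "overall-health self-assessment test result" in line:
            return line.split(":", 1)[1].strip()
    return None

sample = """\
=== START OF SMART DATA SECTION ===
SMART overall-health self-assessment test result: PASSED
"""

print(smart_overall_health(sample))  # PASSED
```

A PASSED here proves very little; a FAILED, or no verdict at all because the drive stopped answering, is when you clone it immediately.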
If I had to replace a whole computer just because the system drive failed, I would be really pissed, and that’s exactly what would have happened with Apple, if I used one of their fancy-looking machines with the SSD soldered on. The only one of their current machines that’s actually designed properly is the new Mac Pro, but that one is so expensive it makes no sense to buy it, unless you hate your money and want to see it burn. Someone will say that you have to pay for quality, but that’s really bullshit, since they use the same Samsung NVMe drives I buy off the shelf to build my own systems, and in my experience the drives they use are exactly as likely to fail as any other Samsung drive. So, sure, you can solder it onto the motherboard, but then I want a 5-year warranty on the motherboard with instant replacement in case of component failure, no weaseling out.


Minimalism

I’ve been intrigued by the concept of minimalism, which I see mentioned occasionally.

It’s not a simple issue: although the first thing that comes to mind is an uncluttered living/working space and a “the fewer the better” approach to things, it’s not necessarily about “less is more” when you dig deeper. If anything, it’s about reducing dependency, reducing resource drain, and reducing clutter.

However, I’ve seen people who purport to live a minimalist lifestyle, often in either a very small apartment or even a van, and if there is a trend there, it’s that they compensate for the lack of space and assets with a greater investment of time and work, basically having to re-arrange things all the time just to retain a functional environment, and resorting to extensive workarounds to get things done. When I watch people who just prefer to get things done, there’s another trend: they tend to have a large number of specialized tools, and they don’t care that one could probably do with less; no, they have just the right type of hammer or axe or chainsaw for just that particular type of job, and it’s better.

Also, there’s always an inevitable amount of clutter around them, because that’s what happens when you actually do things, but there’s always order to the madness; things are normally stored in very specific, often labelled places, and after you’re done with them, they are returned to their specific place. It’s just that you don’t return everything immediately; you leave things around you while you work, and you clean up afterwards. A certain amount of chaos obviously has to accompany the creative process, because you can either focus on what you’re doing, or you can focus on cleaning up, but not both at the same time. Sure, you can do both, but the quality of what you’re doing will suffer. For instance, when I’m writing, I couldn’t care less about the empty coffee cup or the bag of peanuts on my desk, or whether everything is aligned perpendicularly to create the illusion of order. I care about what the keyboard feels like, what the monitor feels like and what the mouse feels like, because that’s what I’m using to create.
If there’s a problem with those things, it interferes with my concentration and impedes my creative process, but a certain amount of chaos might actually help, because it doesn’t impose the subconscious stress of trying to keep things orderly all the time.

Also, I could have only one computer, and that would be the minimalist way of doing things, but I don’t; I have multiple computers, specialized for what I use them for. The desktop machine is completely silent, comfortable to use, and powerful. It’s something that just gets things done, and it can cool itself easily even if I push it at 100% for days. Then there’s the 15” laptop, which becomes the primary computer when I’m on vacation. It can do everything the desktop machine can do, except gaming, and you could ask why I don’t just use that for everything, because I could plug peripherals into it and it would do just fine as a desktop machine; but it would overheat, it would be noisier, and it wouldn’t last. So, I’m already at two computers, just for the convenience of not killing the laptop with a 16h/day regime. Then there’s the ultralight laptop, which I use in bed, because the 15” is too heavy and cumbersome, or for reading or doing things from the couch or in some weird position. It’s an awfully specialized thing to have a dedicated machine for, but I do, and I find it incredibly convenient, for the same reason I have several types of pocket knives and several different types of shoes. Sure, you can do everything with just one pair of jeans and one pair of shoes, but I find that awfully inconvenient; although it seems simple and elegant, it forces you to constantly adapt to the inadequacies of the equipment you’re using, and it introduces stress and hassle and breaks your concentration away from the things you actually want to do. Sometimes less is indeed more, but if you’ve ever tried to fix something that unexpectedly broke, you’ll know how convenient it can be to have a certain amount of junk somewhere that can be adapted to fix something.
If you don’t have your small personal junkyard, you’ll be forced to drive to a store every time you need a SATA cable, or a screw for an SSD, or a nail to hang a painting, or a wood screw to fix something that got loose. No, it doesn’t look elegant, and having the capability to create or fix things will not make your place look like an Apple Store, but at some point you need to ask yourself whether minimalism is actually contributing to or detracting from your productivity. So, no, less is not more if you need that spare SATA cable, and it’s definitely not more if your one and only computer unexpectedly died and you don’t have a secondary one on which to look for possible solutions on the Internet.

That’s where I departed from the conventional interpretation of minimalism, and started thinking about defining it as something more akin to self-reliance: not depending on others to solve your problems. A minimalist approach in that sense doesn’t consist of having only one computer, and that being an elegant iMac or a MacBook Pro. It consists of using generic components you can source locally to build your own computer, building it in ways that make it easy for you to repaste the CPU, change noisy fans, clean out dust, install and set up the OS yourself, and maintain the whole thing without anyone’s help. Sure, such a box doesn’t look elegant, but it becomes very elegant when you need to take off the CPU cooler and change the paste, because the whole thing isn’t glued in behind a screen. It’s two big screws to remove the side panel, a few more to remove the cooler, and everything is big enough to work on comfortably and quickly. Essentially, the more elegant and “clean” things look, the more of a pain in the arse they can be to maintain if something goes wrong with them, and sometimes you can’t even fix them at all; you’re expected to just throw them away and get another one, because that’s also “elegant” and “clean”.

The same goes for software. The older I get, the more I tend to use the most user-unfriendly, basic tools imaginable: I connect to a local server via ssh, connect to the database via a shell tool where I type all the commands manually, with no fancy GUI tools, I write code in the pico editor and rsync it to the production server, and it all works the same regardless of what computer I actually use to do it; I couldn’t care less whether it’s a fancy and elegant Mac, or a Raspberry Pi board connected to other shit with wires hanging. What is minimalistic and elegant in this approach is that I don’t rely on having lots of secondary shit installed on my computers, and I don’t try to maintain a super-complex software system that is supposed to make things “easier” by complicating everything to the point of a bloated mess. No: make things simple by learning a few tools that work everywhere, and reduce the number of intermediary steps you have to take to get things done.

You may think that a nice fancy GUI with icons is a more elegant way of getting files across than rsync, but it’s only elegant if it works, and those things have a tendency to break in various creative ways just when you have to do something quickly, and you can spend a whole day fixing some environment instead of writing your code, on something that is really not essential to your primary task. So, yes, compared to some “elegant” thing such as an iPhone, with a user interface chimps and cats can be taught to operate, my ideas of simplicity and elegance can seem counter-intuitive, but guess what? I maintain my own mail server, web server, blog and forum without anyone’s help. If something goes wrong with any of it, I fix it myself. If something goes wrong with my computer, I fix it myself, whether it’s a software or a hardware problem.
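The deploy step in that workflow is essentially a single rsync invocation. As a sketch, with the host and paths being made-up placeholders:

```python
# Sketch of the rsync deploy step described above. Host and paths are
# hypothetical placeholders; the point is that the entire "deployment
# pipeline" is one well-understood command with a handful of flags.

def build_deploy_command(src, host, dest, dry_run=False):
    """Assemble the rsync argument list for pushing code to production."""
    cmd = ["rsync", "-avz", "--delete"]
    if dry_run:
        cmd.append("--dry-run")  # preview what would change, touch nothing
    # Trailing slash on the source: sync the directory's contents.
    cmd += [src.rstrip("/") + "/", f"{host}:{dest}"]
    return cmd

cmd = build_deploy_command("~/site", "prod.example.com", "/var/www/site",
                           dry_run=True)
print(" ".join(cmd))
# rsync -avz --delete --dry-run ~/site/ prod.example.com:/var/www/site
```

That’s the whole thing: no agents, no build pipeline, no environment to babysit, and it behaves identically from any machine that has ssh keys set up.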
If I have to choose between elegance and self-reliance, I pick self-reliance, because “clean” solutions have a nasty tendency to just displace the messy parts of life somewhere else. Also, if I have to choose between practicality and productivity on one side, and simplicity and elegance on the other, I prefer to just get things done and not let minimalism get in the way. That is how I personally see the desirable kind of minimalism: it’s minimizing the number of things that interfere with the creative process.


Hackintosh

I recently experimented with a Hackintosh (essentially, a normal PC with Mac OS installed), and the whole process is intimidating, because everybody seems to give you “cookbook”-type instructions where you just follow steps without actually understanding what’s going on, and when it works, you end up none the wiser. So, I decided to add the part that’s usually missing.

Basically, it works like this: a Mac has specific hardware, such as the SMC (System Management Controller), that makes it quite different from a PC, and Mac OS gets its basic sensor info and other data from the SMC. On a PC, those things are done differently, but if you add a software layer that tricks the OS into thinking it’s talking to Mac hardware, while in fact translating the commands and data between the OS and the PC hardware, everything works. There are also kernel extensions that trick the OS into thinking some piece of hardware is compatible. This is the complicated part where everybody’s eyes glaze over and they say something along the lines of “fuck all”. However, the good part is that you don’t need to know much about this for things to work. You just need to find the “recipe” someone else made for your hardware, copy it to the right place, possibly make adjustments, and it will work.

The basic principle is this: there’s a piece of software called Clover, which takes the place of your normal bootloader, but also serves as the intermediary layer that tricks Mac OS. It scans for all bootable drives on your system and exposes them as a list, from which you pick the drive you want to boot. This means that for basic booting into Mac OS, you need a drive with Clover installed, and a Mac OS bootable drive. Everybody tells you to download the Mac OS installation file on a real Mac, enter a few commands to make a bootable USB drive, and then they suffocate you with technobabble. I have a simpler explanation. Get a Clover ISO somewhere and burn it onto a USB stick. Get a pre-cooked EFI for your hardware. Copy this EFI onto the Clover boot drive. At that point, connect both the Clover USB stick and a drive that would boot into Mac OS, such as a Time Machine backup drive, boot into the Clover stick, wait for Clover to give you the list of bootable drives, and boot into the Time Machine system recovery partition, or whatever it’s called. It will give you the option to install Mac OS on an empty drive. I assume you already have one, so format it in Disk Utility, exit Disk Utility, choose to either install a fresh copy of the OS or restore from backup, go through the steps, and when it reboots, again boot into Clover and pick the right partition to boot into, and after a few steps you’ll have a working system.

Theoretically, if your Mac has a standard SATA drive, you could just pull it out of the Mac, plug it into a PC, boot into Clover, select the Mac drive, boot into it, and you’d have a working Hackintosh. There’s just one more step, and that’s transferring Clover onto your Mac drive, so that you can dispense with the Clover USB stick. Boot into the Hackintosh, install a tool called MultiBeast, and it will transfer Clover onto your Mac OS system drive, after which point this drive is no longer safely bootable in a real Mac.
Then use the Clover configuration tool to mount the EFI partition, and copy the EFI cookbook specific to your hardware from the Clover stick to the EFI on the Mac OS drive. Unmount, reboot, pull the Clover stick out, go into the BIOS and select the Mac OS drive as the first boot option, and you should boot into the Clover menu; you know what to do from there.
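The “burn the Clover ISO onto a USB stick” step, by the way, is nothing more than a raw block copy, which is what dd does. A sketch of the mechanics in Python; the target path would be a raw device like /dev/sdX, and pointed at the wrong device this destroys its contents, so here it is shown as a plain file copy:

```python
# Sketch of the raw image-write step (what `dd if=clover.iso of=/dev/sdX`
# does): read the image in fixed-size blocks and write them to the target.
# A real run targets an unmounted block device; get the device name wrong
# and you overwrite a disk you wanted to keep.

def write_image(image_path, device_path, block_size=1024 * 1024):
    """Copy an image file onto a target path in fixed-size blocks.
    Returns the number of bytes written."""
    written = 0
    with open(image_path, "rb") as src, open(device_path, "wb") as dst:
        while True:
            block = src.read(block_size)
            if not block:
                break
            dst.write(block)
            written += len(block)
    return written
```

After the raw write, the stick’s EFI partition is where the pre-cooked EFI folder for your hardware goes, exactly as described above.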

I’m starting to sound as complicated as the guys making the Hackintosh instructions, but what I wanted to say is that you need two things: a drive that would boot into Mac OS on Mac hardware, and a Clover bootable stick with an EFI cookbook for your hardware. After that point, everything starts making sense. The only thing to avoid is putting a drive with Clover EFI into a real Mac. That will make your Mac unbootable until you do an NVRAM/SMC reset, and even that might not work; I haven’t tried.

There’s a reason why it’s called Hackintosh: it’s janky as fuck. The only thing I can think of that’s as unintuitive, creates as many problems without solving any, and wastes as much time, is trying to install Windows 95 or something similar onto modern hardware. Try it once, you won’t try it again. In comparison, Linux is the most intuitive and user-friendly thing ever, and there’s a much better chance you’ll get all your hardware working in Linux. I’m not kidding. Stuff like Bluetooth and wifi will almost certainly not work, and you’d better not have an Nvidia GPU; you can get it to work, but you will almost certainly suffer stability issues. Also, on a major OS update, everything will break.

The reason you would want to do it is not to get a normal Mac desktop on PC hardware; it’s to get a basic, barely-working Mac desktop on PC hardware where you can run things such as the Xcode toolchain needed to build iOS and Mac executables, and where you won’t mind much if you don’t have AirDrop or Bluetooth, or if sound doesn’t work. Essentially, it’s a way to get a very fast Mac OS platform for running some obscure piece of Mac OS software that you need for a specific task, do whatever you have to do with it, and then boot back into a normal OS where everything works properly.