I recently experimented with Hackintosh (essentially, a normal PC that has Mac OS installed), and the whole process is intimidating, because everybody seems to give you “cookbook”-type instructions where you just follow steps without actually understanding what’s going on, and when it works, you end up no smarter. So, I decided to add the part that’s usually missing.

Basically, it works like this: a Mac has specific hardware, such as the SMC, that makes it quite different from a PC, and Mac OS gets its basic sensor readings and other low-level data from the SMC. On a PC, those things are done differently, but if you add a software layer that tricks the OS into thinking it’s talking to Mac hardware, while in fact translating the commands and data between the OS and the PC hardware, everything will work. There are also kernel extensions (kexts) that trick the OS into thinking some piece of hardware is compatible. This is the complicated part where everybody’s eyes glaze over and they say something along the lines of “fuck all”. The good part, however, is that you don’t need to know much about this in order for things to work. You just need to find the “recipe” someone else made for your hardware, copy it to the right place, possibly make a few adjustments, and it will work.

The basic principle is this: there’s a piece of software called Clover, which takes the place of your normal bootloader, but also serves as the intermediary layer that tricks Mac OS. It scans for all bootable drives on your system and exposes them in the form of a list, from which you pick the drive you want to boot. This means that for basic booting into Mac OS, you need a drive with Clover installed and a Mac OS bootable drive. Everybody tells you to download the Mac OS installer on a real Mac, enter a few commands to make a bootable USB drive, and then suffocates you with technobabble. I have a simpler explanation. Get a Clover ISO somewhere and burn it onto a USB stick. Get a pre-cooked EFI for your hardware, and copy this EFI onto the Clover boot drive. Then connect both the Clover USB stick and a drive that would boot into Mac OS, such as a Time Machine backup drive, boot into the Clover stick, wait for Clover to give you the list of bootable drives, and boot into the Time Machine system recovery partition or whatever it’s called. It will give you the option to install Mac OS on an empty drive. I assume you already have one, so format it in Disk Utility, exit Disk Utility, choose to either install a fresh copy of the OS or restore from backup, go through the steps, and when the machine reboots, again boot into Clover, pick the right partition to boot into, and after a few steps you’ll have a working system. Theoretically, if your Mac has a standard SATA drive, you could just pull it out of the Mac, plug it into a PC, boot into Clover, select the Mac drive, and you’d have a working Hackintosh. There’s just one more step, and that’s transferring Clover onto your Mac drive, so that you can dispense with the Clover USB stick. Boot into the Hackintosh and install a tool called Multibeast; it will transfer Clover onto your Mac OS system drive, after which point that drive is no longer safely bootable in a real Mac.
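The “pre-cooked EFI” mentioned above is just a folder tree sitting on the FAT-formatted EFI partition. As a rough sketch (the file names below are the usual Clover conventions, not something taken from a specific build, and details vary between Clover versions):

```
EFI/
├── BOOT/
│   └── BOOTX64.efi        <- fallback loader the firmware looks for
└── CLOVER/
    ├── CLOVERX64.efi      <- Clover itself
    ├── config.plist       <- the board-specific settings and patches
    ├── drivers64UEFI/     <- EFI drivers (filesystem, memory fixes)
    └── kexts/
        └── Other/         <- FakeSMC and other kexts injected at boot
```

The “cookbook” someone made for your hardware is essentially a config.plist plus a set of kexts matching your motherboard, CPU and GPU.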
Then use the Clover configuration tool to mount the EFI partition, and copy the EFI cookbook specific to your hardware from the Clover stick to the EFI partition on the Mac OS drive. Unmount, reboot, pull the Clover stick out, go into the BIOS, select the Mac OS drive as the first boot option, and you should then boot into the Clover menu, and you know what to do from there.
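Since the mount-and-copy step is where people usually get lost, here’s what it boils down to, as a dry run using temp directories as stand-ins. On a real system you’d first mount the actual EFI partitions, e.g. with `diskutil list` and `sudo diskutil mount disk0s1` (the disk identifier is just an example):

```shell
CLOVER_USB=$(mktemp -d)   # stand-in for the Clover stick's EFI partition
MAC_EFI=$(mktemp -d)      # stand-in for the EFI partition on the Mac OS drive

# A minimal "cookbook" on the stick: the recipe plus a kext folder.
mkdir -p "$CLOVER_USB/EFI/CLOVER/kexts/Other"
touch "$CLOVER_USB/EFI/CLOVER/config.plist"

# The actual copy step: move the whole EFI tree across.
cp -R "$CLOVER_USB/EFI" "$MAC_EFI/"

ls "$MAC_EFI/EFI/CLOVER"   # should list config.plist and kexts
```

The copy is really all there is to it; everything else is just getting the right partitions mounted.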

I’m starting to sound as complicated as the guys who write the Hackintosh instructions, but what I wanted to say is that you need two things: a drive that would boot into Mac OS on Mac hardware, and a Clover bootable stick with an EFI cookbook for your hardware. After that point, everything starts making sense. The only thing to avoid is putting a drive with a Clover EFI into a real Mac. That will make your Mac unbootable until you do an NVRAM/SMC reset, and even that might not work, because I haven’t tried it.

There’s a reason why it’s called Hackintosh: it’s janky as fuck. The only thing I can think of that’s as unintuitive, creates as many problems without solving any, and wastes as much time, is trying to install Windows 95 or something similar onto modern hardware. Try it once and you won’t try it again. In comparison, Linux is the most intuitive and user-friendly thing ever. Also, there’s a much better chance you’ll get all your hardware working in Linux. I’m not kidding. On a Hackintosh, stuff like Bluetooth/wifi will almost certainly not work, and you’d better not have an Nvidia GPU, because you can get it to work, but you will almost certainly suffer stability issues. Also, on every major OS update, everything will break.

The reason why you would want to do this is not to get a normal Mac desktop on PC hardware; it’s to get a basic, barely-working Mac desktop on PC hardware where you can run things such as the Xcode compiler needed to build iOS and Mac executables, and where you won’t mind much if you don’t have AirDrop or Bluetooth, or if sound doesn’t work. Essentially, it’s a way to get a very fast Mac OS platform for running some obscure piece of Mac OS software that you need for some specific task, do whatever you have to do with it, and then boot back into a normal OS where everything works properly.

About computer upgrades

I’ve been thinking about computer hardware recently, since I had issues with two 2015 Macbook Pros – Biljana’s 13” had a defective SSD and a bloated battery, and my 15” had an even more bloated battery and a 256GB SSD with only 20-30 GB of free space. Biljana’s laptop was already retired earlier this year, but I had to figure out what I wanted to do with mine, and I ended up finding a very cheap upgrade path: I had a cheap but good replacement battery installed, and replaced the SSD with an adapter and a standard 1TB Samsung 970 EVO NVMe drive. I decided to upgrade because, unlike Biljana’s 13”, which was a 2-core, 8GB RAM machine, mine is a 4-core i7 with 16GB RAM and a 15” screen that’s perfectly good for photo editing, and I had no issues with it other than the battery and the small SSD.

Those being fixed now, I’m quite happy with it, which brings me to the main issue: is there still a need to upgrade computer hardware regularly, or has technology peaked? Right now I’m using several computers, and none of them are exactly new. My desktop machine is still a Skylake i7-6700K, my laptop is a Haswell i7-4770HQ, my phone is an iPhone 8 Plus, the tablet is an iPad Mini 4, and the machine I’m writing this on, an ultralight hybrid Asus UX370UAR, is actually the newest and uses a Kaby Lake R i5-8250U. Why am I using technology that’s basically 5 years old? Because it’s not upgradable, in the sense that upgrades don’t make it faster. Sure, you can replace it with something newer, but you don’t gain anything other than bigger numbers on multi-core benchmarks; the actual speed and functionality are the same. I tested the new 16” Macbook Pro when I bought it for Biljana, and guess what, it felt almost identical to my 15”, which means I could replace mine with an expensive new thing and it would feel exactly the same. Sure, the touchpad is bigger, and the screen is bigger and a bit better, but it doesn’t feel like a big difference.

I also came to an interesting conclusion when I plugged different machines into my desktop peripherals to see if anything was faster than my desktop: it turned out that the CPU is the least important thing, because I have several machines with similar CPU/RAM/SSD performance, and they all felt laggy compared to my desktop when used for normal desktopy things such as watching Youtube, switching between many apps, and resizing windows to fit the big 4K screen. Guess why that was? Because today everything is strongly GPU-accelerated, and driving a big 4K display is very speed-sensitive, partially because of the resolution, but mostly because of the physical screen size (43”), which visually magnifies all the problems. The only two GPUs that worked fast enough not to cause visual lag were my 1080 Ti and my son’s 2070. Basically, it’s the GPU that makes all the difference, and as far as CPU power goes, the Haswell i7 in my Macbook Pro or the i5-8250U in the ultralight Zenbook are perfectly sufficient for everything I do, provided they are equipped with enough RAM and fast storage. It’s not that I didn’t test the new 6-core machines; it’s just that I stress all the cores so rarely that it doesn’t make a difference. If someone tells you that the GPU doesn’t matter as long as you don’t play games, and that you’re fine with integrated graphics, that’s probably true if you run a 1080p display, but on a big 4K display there’s a big difference. Integrated graphics works in a pinch, but it’s visibly stunted and creates the impression that the machine is much slower than it actually is. Even something like the AMD 270X was too slow for the 4K display, and I’m not really sure what’s enough and what’s overkill. I do know that the 1080 Ti and the 2070 are perceived as equally fast, and both are great.
I don’t know what the cheapest sufficient GPU would be, because I didn’t have many to test, but I would theorize that if something can’t run the Valley benchmark smoothly at 4K, it might be too slow for the demands of window manager acceleration as well. Interestingly, the same doesn’t apply to lower resolutions: my 15” Macbook drives its own retina display perfectly fine with Intel graphics, and when I plug it into a 1080p monitor it’s blazingly fast, and yet it can’t run the Valley benchmark at those resolutions to save its life. At 4K, however, the only GPUs that are actually fast in Windows are the ones that are also fast for gaming at high resolutions. Years ago, my recommendation would have been to get the worst GPU that can still run your screen at the desired resolution and color depth, because the GPU was not important unless you wanted to play games. Today, my recommendation is the complete opposite: if you want to drive a 4K display or bigger, the GPU is the most important part of your system, and you should get a strong gaming PC as your desktop machine, regardless of how many games you intend to play, simply because your display will require powerful GPU acceleration to run smoothly in everything from web browsing and scrolling to window resizing. However, if you don’t run 4K or 5K displays, you can greatly relax the GPU requirements: integrated graphics, such as the Intel 620, will be perfectly snappy at 1080p, and you should only get dedicated graphics for gaming, or for GPU-accelerated tasks such as video editing.

So, regarding upgrades, it’s all good news: basically, if you have anything Haswell or newer, with at least 16GB RAM and a fast SSD, your machine will run all normal tasks as quickly as a modern machine, provided that your GPU is modern enough for your display resolution. If you have specific tasks that require more power than that, well, then these general guidelines don’t apply to you, but all in all, unless your PC is really ancient, you will only need to upgrade when it finally dies, not before. If your machine actually is ancient, though, you should definitely try the new generations, because they are awesome. I bought an 8th-gen i5 Intel NUC for testing, and that thing is absolutely awesome as a desktop machine if you’re running it at 1080p. At 4K it’s marginal: it sucks in Linux and Mac OS, and it’s much better in Win10, but still nowhere near the brutal speed of my 1080 Ti. At this point, Win10 has superior window manager acceleration and driver optimization, and will extract the maximum from marginal GPUs.

Someone will say that the NUC is overpriced and that you can get a Raspberry Pi 4 for much less money, at which point I’ll just roll my eyes. Yes, you can, but the difference in speed is so great it’s not even funny. The NUC runs NVMe and SATA drives, it has an immensely superior GPU, and it has socketed RAM which can go up to 32GB; I tested both, so I actually know. The Raspberry Pi 4 is fine for web browsing and document editing, and it’s great as a console for accessing other Unix systems, or as a small home UNIX server (I actually have a 3B+ plugged into my home LAN as a server for rsyncing remote backups and hosting my e-mail database), but it absolutely sucks for anything video-related. It has some kind of GPU-accelerated video playback, but software support for it is sketchy or outright missing, so it works with some specific video modes and codecs and completely fails with others; generally, it’s rubbish for video. The NUC, on the other hand, is better at 4K than the Pi 4 is at 1080p, and that really tells you something. The NUC runs photo editing in Lightroom perfectly fine, and that’s a professional-grade task. My assessment is that its speed is identical to that of the 15” Macbook Pro retina in Mac OS (Hackintosh), and the benchmarks confirm it. So, that’s one type of modern machine you can get today: it fits in your palm, it’s as quiet as a Macbook Pro, it doesn’t draw much power, it’s blazingly fast, and its only drawback is that you can’t add a dedicated GPU later on if you decide you need one; for those cases, a “normal” desktop PC would be better. So, basically, this is the best time ever to buy a PC, because they are for the most part incredibly good. On the other hand, they’ve been just as incredibly good for the last 5 years or so. As for phones, they also peaked long ago: today they are all the same; pick your OS, pick a higher price level to avoid outright garbage, and you’re set.
I can’t even force myself to think about them seriously anymore; they are like washing machines. If yours dies and it’s not economical to repair, you just go to the store and pick a new one: if you avoid the cheapest garbage, they are all the same and will work great.

Electric cars

In recent years we’ve been bombarded by propaganda trying to shove electric cars down our throats, regardless of the fact that nobody really wants them, so I’ll write a few things about that.

First of all, I have to say that I actually like electric cars as a concept. The electric motor is much more reliable and easier to maintain than the internal combustion engine, and it has an excellent performance curve. My problem is with other things. First, the Li-ion battery is simply unfit for purpose. It decays after a few years, which is a problem since it’s the most expensive part of the car. It uses up lithium, which is a very rare element that has to be mined and transported across huge distances, and a much more limited finite resource than petroleum. The batteries pose an inherent fire hazard which increases with age, use and mechanical damage. In order to power a car, a battery needs to have huge capacity, and in order to charge such a huge battery, you need either lots of time, or you need to shove an incredible amount of current into the battery in a very short period of time, in your garage, during the night. This makes me uneasy, because if something goes wrong, you have an incredibly deadly mixture of high current, dangerous chemistry and fire. Some of the battery issues may be resolved in the future, but we are not there yet. Right now, towing companies outright refuse to deal with wrecked electric cars because they are such a hazard.
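To put numbers on the charging problem, here’s a rough back-of-the-envelope calculation (the 75 kWh pack size and the voltages are my illustrative assumptions, not figures from any specific car):

```latex
% Charging a 75 kWh pack overnight (8 h) from single-phase 230 V:
P = \frac{75\,\mathrm{kWh}}{8\,\mathrm{h}} \approx 9.4\,\mathrm{kW},
\qquad
I = \frac{P}{U} = \frac{9400\,\mathrm{W}}{230\,\mathrm{V}} \approx 41\,\mathrm{A}
% Fast-charging the same pack in 30 minutes needs 150 kW; even at a
% 400 V charging voltage, that is already 375 A through cable and battery.
```

In other words, overnight charging already wants a dedicated high-current circuit, and fast charging moves currents you otherwise only see in industrial installations.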

Also, electric cars are incredibly uneconomical. They cost more and do less. Apparently, most people agree with me, since the adoption of electric cars has not been widespread, outside the circle of rich hipsters at least. You see, there’s a much more ecological and economical option: get a diesel with a modern particle filter, and you get something that goes fast, sips fuel, is so low-emission it actually beats most sources of electricity and certainly beats the environmental impact of Li-ion batteries, and is comparatively a bargain. Which is why the eco-nutcases are now working to badmouth and eventually ban diesel. Don’t get me wrong, I’m actually in favour of outlawing the old diesel shitboxes that leave black clouds of suffocation behind them, but those are the cars that don’t have particle filters. Euro 5 and Euro 6 diesels are not only not a problem, they are actually the best solution available. If everybody drove those, the ecological impact of cars would drop to the level of background noise. Also, natural gas makes great sense as a fuel, since it’s abundant and cheap, and you can easily convert gasoline engines to run on it and reduce their environmental impact, not on the combustion side, but on the oil-refinery side of the equation.

But when we get to the energy supply side of things, it’s not like electricity is actually an abundant resource. In fact, I don’t see additional nuclear power plants being built to offset the expected increase in power consumption caused by electric cars. Everybody talks about those stupid windmills, which are the most useless and dirty power source of all time, and about solar panels, which are basically toxic waste, don’t work in most places for the majority of the time, and work only at the time of day when people don’t charge their electric cars; the peak expected consumption would be overnight. And oh yeah, neither the windmills nor the solar panels are recyclable. They are just terrible landfill fodder, which makes “ecological” electricity sources terrible for the environment.

Which is why I expect the following to take place.

Now it’s “diesel is nasty, has to be banned, electric cars are pure and clean and wonderful”. Let’s say everybody switches to electric cars. Then it will be “electric cars consume more power in a day than a normal household consumes in a year, how dare you drive those pigs!”, “electric cars use components sourced from poor countries, made with child labour, consuming finite resources, you should all feel guilty and kill yourselves for driving them”, and so on, ad nauseam. The next step would obviously be to restrict private vehicle ownership and push us into public transportation, which would of course be as crowded as those Japanese bullet trains where people are packed like sardines in a can. Which brings us to the final step, where the eco-freaks simply kill us all, because that solves all the problems.

New Apple stuff

Apple recently released the new 16″ laptop, and it’s a significant improvement over the fail-fest that’s been going on since 2016. There’s still a reduced selection of ports, and still cooling that any plasticky gaming laptop would easily surpass, but the ESC key is back, and the keyboard is no longer that terrible breakable thing.

However, I lost a whole day of vacation fixing the mess the Catalina update made on my mid-2015 15″ retinabook. The update disabled 32-bit code, which broke VirtualBox; MacPorts initially didn’t run, so I had to wait for an update; GPG didn’t run, Synergy didn’t run, and half of the other stuff either didn’t work or had to be updated. Stuff that had no updates was a problem, but I replaced most of it with open-source alternatives compiled from the MacPorts library, which left me wondering what would happen if Apple did another “upgrade” and simply blocked MacPorts, or at least its access to the Xcode compiler. That would render the computer completely unusable to me. Also, something seems to be broken with the VirtualBox guest VGA driver, and so much of the functionality I relied upon was broken by that single update (including my mail archive no longer syncing, due to Apple making “improvements” to the Mail application, forcing me to get a paid upgrade for the archiving software) that I got incredibly pissed.

Also, after having used it for years, I’ve decided that the 15″ laptop is too big for what I normally use it for, which is to put it in my lap and write articles. The keyboard is too far away, the whole thing is too big and unwieldy, and the only pluses are the screen and the speed. In every other respect, I was much more comfortable with my old 13″ Air, so I don’t think I’ll be getting another big laptop, especially since I’ve been using an ultralight Asus (Zenbook Flip UX370UA) for half a year or so, and I happen to be using it much more frequently than the big Macbook, mostly due to its much more practical size, at least for what I use it for. Also, Microsoft integrated Linux into Win10, so now I have full access to the CLI tools I normally use even without OS X or virtualization, which makes Windows machines as usable to me as Macs. Sure, I can write text messages or pick up a phone call from the Macbook, which makes it very convenient at times, and the Macbook screen and touchpad are still significantly superior to anything non-Apple, but the gap is narrowing, due to Apple screwing up more and more, and others getting more and more things right.

The Windows laptops still have significant problems. First, almost everything has a 16:9 screen ratio, which is terrible for small laptop screens; it starts making sense from 17″ upwards. Second, touchpads on Windows laptops range from significantly worse than Apple’s to absolute garbage. Third, Windows has a nasty habit of not completing the suspend command if some process refuses to respond, which leads to closing a laptop that is actually still on, merrily overheating and draining its battery in your bag. This makes Windows behave terribly on laptops. Also, Win10 updates constantly, which becomes incredibly annoying after a while. Honestly, I don’t want to see anything updating other than the antivirus; the OS can update itself twice a year and that’s it. The only improvement introduced by those updates was the WSL; everything else was cosmetics and had no business rebooting my system. Microsoft should seriously reduce the frequency of updates, because this is getting on everybody’s nerves. Other than that, however, Win10 is fine. It’s fast, it’s elegant, it’s comfortable to use, and for the most part it’s as reliable as Mac OS, and much more reliable than any Linux desktop. Essentially, Apple is one serious fuckup away from me switching completely to a Windows/Linux combo. On the other hand, Windows was always one serious fuckup away from me switching to a Mac/Linux combo, so things are quite evenly matched now.