Earthquake

Another powerful earthquake here in Croatia. Epicentre near Petrinja, magnitude initially reported as 6.3, revised down to 6.2 on the moment magnitude scale; EMSC reports 6.4. There are reports of severe damage at the epicentre. Seven people have been reported killed so far, with many severe and light injuries, and widespread damage in Petrinja – apparently half the buildings in the city centre are destroyed.

A day later (30/12), there were multiple aftershocks, some very powerful, causing further damage and more casualty reports. There were utility disruptions in Zagreb yesterday, now mostly restored. The Sisak/Petrinja region sustained extreme damage; almost the entire population is left homeless in the middle of winter.

Damage has been reported on the Sava river dikes during high water levels, which creates a danger of flooding. There is damage on the roads, and deep cracks have newly formed in the ground.

Pictures (C) Jutarnji

Felt very strongly here in Zagreb; my estimate is 6–7 on the Mercalli–Cancani–Sieberg scale. My condo suffered slight additional damage, judging by my visual inspection. Plaster and pieces of concrete on the floor everywhere, things flew all over the place, but nothing really bad.

About hardware reviews

I’ve been watching hardware reviews on YouTube fairly regularly for the last couple of years, and I see a clear trend there.

Remember those TV commercials where the host is super-excited about the set of knives he’s trying to sell you, or some kitchen appliance, or some additive for automotive oil? Yeah, that’s the supposedly independent PC hardware reviewers. They are the new kitchen appliance salesmen.

That doesn’t mean they are completely useless. If you’re interested in what they are trying to sell you, they are quite informative – otherwise I wouldn’t be watching them – but never in a million years should you forget that they are basically an arm of the advertising sector. Their revenue comes from generating interest in what they are presenting, and interest, for the product manufacturers, means increased sales. So there is a clear motive for the manufacturers to funnel their advertising budget to the hardware reviewers who can generate the most useful form of customer interest, which can be described as “enthusiasm for the product and the brand”.

One of the ways of creating enthusiasm for what they are trying to sell you is to create the perception that we live in a parallel universe where your current computer is slow and you really need an upgrade. An excellent example of this is a video I watched just now, basically saying that the new thing Apple released is so good that everything else is a steaming pile of garbage, and you should get the shiny new gadget in order for your life to have meaning and purpose. Yawn.

Let’s put things into perspective, shall we? Computers have been absolute overkill for the last several years. Laptops and desktops with an Intel Haswell CPU, released in 2013, are blazingly fast if they’re equipped with a good SSD and GPU. I have a Haswell MacBook Pro 15” that I use to edit photos when I’m away from home, and the stuff I’m doing with it isn’t trivial – some of it is creating panoramas of more than 10 RAW files from a 24MP camera in Lightroom – and guess what, it’s fast and great. I have absolutely no complaints about its performance, unless we’re talking about the GPU.

Sure, I’ve seen and worked with nominally faster computers, but for the most part “fast” is a number in a benchmark, not something you actually feel. If you’re running a task that really takes a while, such as re-indexing an entire catalog in Lightroom, it’s going to take hours no matter what hardware you use. Whether it takes two or four hours is quite irrelevant, because that’s an operation I do once every few years, and when I do it I just leave the computer overnight to do its thing, and it’s done in the morning. I’m certainly not going to sit behind it for hours with a stopwatch, and no, upgrading hardware isn’t going to make it dramatically faster, because it needs to load tens of thousands of RAW files and create JPEG previews for all of them, and that’s such a massive workload that it doesn’t really matter if your computer is twice as fast – it’s still taking hours. In normal operation, the difference between fast and faster is measured in tenths of a second. There’s no feeling of “OMG this new one is so fast, my old one is garbage”. It’s “oh, nice, it’s somewhat snappier, you almost kinda feel the difference if you really try”.
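The “twice as fast still takes hours” point is simple arithmetic; here’s a toy back-of-envelope sketch (the file count and per-file time are made-up illustrative numbers, not measurements of Lightroom):

```python
# Back-of-envelope: why a 2x faster machine doesn't change the workflow.
# The numbers below are illustrative assumptions, not benchmark results.
files = 40_000       # hypothetical number of RAW files in the catalog
per_file_s = 0.5     # hypothetical seconds per JPEG preview on the old machine

old_hours = files * per_file_s / 3600
new_hours = old_hours / 2  # a machine twice as fast

print(f"old machine: {old_hours:.1f} h, new machine: {new_hours:.1f} h")
```

Either way it’s an overnight job, so in practice the upgrade changes nothing about when the previews are ready in the morning.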
I’ve seen a double-blind test of SSD speed between SATA, NVMe and the new gen-4 NVMe drives, where people who work with those things professionally all day all guessed wrong when trying to tell which was which, because it’s all fast enough for whatever you have to do, and a 10x difference in benchmarks is imperceptible to the user. How can that be, you might ask. Well, as I said, computers have been very good for the last ten years or so, and once you have an SSD and a GPU competent enough for the resolution and refresh rate of your monitor, you’re not going to perceive any lag in normal use. Sure, the benchmarks are going to show bigger numbers, but it’s like speed and acceleration – you don’t perceive speed, you only perceive acceleration.

It also depends on the task you’re using the machine for. For instance, I use a Raspberry Pi 3B+ as a local backup and development server, and I don’t perceive it as “slow” for what it does; it runs a development copy of this server, and I use it to test code before I deploy. It doesn’t even have an SSD, just a microSD card and a USB thumb drive. Why don’t I use something faster, like a Raspberry Pi 4? It uses more power, so I’d be hesitant to leave it always on, which would make it worse for the purpose. The same goes for the NUC – it’s faster, it’s better in every conceivable way, but that doesn’t matter for the intended purpose. If something is fast enough, faster doesn’t mean better, it means “meh”.

I’m in a weird position where most of my computers are more than four years old, all the YouTube salesmen are trying to sell me expensive new and shiny hardware, and if I listened to them and replaced all my “garbage” hardware, it would cost me enough to buy a car and produce exactly zero perceivable difference in practical use for the things I do. One reason for that is that I actually did upgrade the things that matter to me – the SSD in both my MacBook Pro and my desktop PC is a 1TB Samsung 970 EVO NVMe. If you know tech, you know how fast that thing is. I have 32 GB of RAM in the desktop, the monitor is a huge 43” 4K thing, and the GPU is powerful enough to run games at the monitor’s refresh rate in its native resolution; and yes, it’s completely silent in normal operation. That machine is essentially un-upgradable, because whatever you change, you don’t really get a better result, you just waste money. The same goes for the Raspberry Pi server: I have a NUC here I could replace it with, so it’s not even a matter of money, it’s a matter of why the fuck I would do that, because it would do the same tasks equally well. At some point, upgrading feels like changing your dishwasher just because there’s a new model. No wonder the YouTube salesmen are trying so hard to create enthusiasm for the whole thing, because it’s really, really hard to give even the slightest amount of fuck at this point.

Apple M1 chip

Here’s my take on it, based on what we know so far.

It’s a modified mobile APU, which means it has both the strengths and drawbacks of Apple’s other mobile chips: it delivers great computational power and uses very little energy. That much is obvious from the reviews.

The drawback is that the peripheral connections seem to be an afterthought. It appears to have two Thunderbolt ports, but if you read carefully, it turns out that once you connect one monitor to a USB-C port, the other one can’t drive a second monitor, and it’s questionable how much you can connect to it at all: although they call it Thunderbolt, it doesn’t work with an eGPU, and it’s unclear how many PCIe lanes it exposes, if any. Also, the connected monitors seem to mostly work at 30Hz, with 60Hz support being very… let’s say “selective”. Basically, it’s an iPad Pro chip slightly tweaked to allow for peripherals, but the “tweak” in question wasn’t serious enough to support anything even remotely resembling the level of connectivity we normally expect from desktop computers.

Also, they modified the recovery boot menu, completely removing the option that previously allowed us to boot from an external drive. This means two things. First, if your SSD dies, you’ll need to replace the motherboard; you can’t just plug in a USB or Thunderbolt SSD, install the OS there, and continue using the computer. Second, no more installing Linux on a Mac. It’s already known that there’ll be no Boot Camp Windows. They’ve completely locked the hardware in. If they lock the software in as well, a Mac will become a very nicely decorated prison cell.

Also, since the RAM is on the chip itself, there’s no RAM expansion. This is a step beyond soldering the RAM onto the motherboard, and we’ve only seen this level of non-expandability on smartphones and tablets. One would expect it from an ultrabook, but on a Mac Mini or a MacBook Pro, the inability to replace the SSD or upgrade the RAM is terrible news. These systems are so closed off they feel claustrophobic – no RAM upgrade, no SSD upgrade, peripherals reduced to one monitor, with the other USB-C port switching to low-capacity mode when that happens, which means the bus is quite constrained and, for lack of a better word, anemic.

With all that in mind, it’s questionable what one can do with all that computational power. It reminds me of my iPhone, which has as much power as a desktop computer, but you are so constrained by the limitations of the form factor, the lack of peripherals, and the limitations of the OS that it remains just a phone that does the trivial phone tasks really, really quickly. For professional use – where you have to consider connecting two external monitors, a fast external storage drive, LAN and similar things, which is what you would actually use to do non-trivial work – the Intel machines are vastly superior, and my recommendation would be to look into the 16″ MacBook Pro and the new 2020 iMac, which are both excellent. The new machines are good for applications where battery life is the absolute priority and where you need extreme computational power, but with little or no demand for peripheral connectivity.

Language peculiarities

I’ve been asked many times what the difference between a crow and a raven is, and my answer was that “crow” is the species in general and “raven” is a male specimen, something like sheep and ram. However, I never felt perfectly satisfied with this answer, until one day I found out that the Icelandic word for raven is “hrafn”, and then it clicked – they come from two completely different languages: raven is the Norse “hrafn”, and crow is the Latin “corvus”.

I apologize in advance if you learned about this in kindergarten or elementary school, but it was actually new to me. 🙂 The same goes for the names of the days of the week, where some are obvious, but I didn’t really get the etymology of others until recently:

Sunday – Sun
Monday – Moon
Tuesday – Tīw (Old English for the Norse Týr), the one-handed god of war
Wednesday – Wōden (Old English for Odin; German Wotan)
Thursday – Thor
Friday – Freya
Saturday – Saturn (Latin dies Saturni)