ARM Must Die

June 15th, 2021 at 3:55 PM by Kugee
Category: Tech/Hardware

[Image: Arm with middle finger extended]

Many abominations have strangled the minds of plenty of modern tech users - UEFI, TPM, Windows 10 (and now its "successor"), Python, and Google-based solutions for everything. Few are as menacing as a shitty obscure CPU that has more recently become the subject of heavy shilling: ARM.

You've probably heard it from a bunch of big names before: "x86 is flawed! ARM is the future!" The overwhelming bias in favor of ARM in most of the tech press might have you believing that it has every justification in hand to take over the PC market. But as someone who doesn't make money off this kind of insufferable shilling for bad ideas, I see it as my duty to step in on behalf of those who helped explain how awful ARM is... THESE SHILLS ARE LYING TO YOU. THEY DO NOT HAVE PRINCIPLES.

ARM is Not Faster

At best, part of ARM's speed is misleading. In order for ARM to seem faster at certain things, it has to cheat by having various ASICs integrated into the CPU, dedicated to specific tasks like Java, artificial intelligence, x86 emulation, and shit like that... everything the x86 platform could very easily do on its own.

ARM's website will be all wishy-washy about how ASICs can save space and improve energy efficiency, but if the latter is what CPU designers are after, maybe the application in question just sucks? Come on, the real purpose is exactly what their glossary plainly states: deterring reverse engineering efforts! [1]
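To spell out what these accelerator blocks actually compute, here's a rough little C sketch of my own; the function and numbers are made up purely for illustration and aren't taken from any real NPU. The core of most "AI" offload is just an integer multiply-accumulate, which any general-purpose CPU can chew through in plain software (and a decent compiler will happily vectorize the loop).

    /* Illustrative sketch only: the heart of quantized neural-network
     * inference is a multiply-accumulate over small integers. Nothing here
     * is specific to any real accelerator; it's ordinary portable C that
     * runs on any general-purpose CPU. */
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* 8-bit dot product with a 32-bit accumulator */
    static int32_t dot_q8(const int8_t *a, const int8_t *b, size_t n)
    {
        int32_t acc = 0;
        for (size_t i = 0; i < n; i++)
            acc += (int32_t)a[i] * (int32_t)b[i];
        return acc;
    }

    int main(void)
    {
        int8_t weights[4] = { 1, -2, 3, -4 };   /* made-up values */
        int8_t inputs[4]  = { 5, 6, -7, 8 };

        printf("dot product: %d\n", dot_q8(weights, inputs, 4)); /* prints -60 */
        return 0;
    }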

On top of that, x86 products created by those willing to switch over to ARM are destined to be sabotaged so as to make x86 look bad. That's exactly what Apple's pulling right now, as their new "macOS 12" operating system is going to have almost all of its features be ARM-exclusive, all because it has some ASIC in there and they pretend the same shit can't be done on a software level with raw x86 power.

When you take away those ASIC steroids and the undisclosed optimizations for ARM in macOS (I can never get used to writing it that way), it's hard to be convinced that ARM is so capable on its own. Even with so much brute force funding from the iPhone fortune, M1 isn't really earth-splitting; it's roughly on par with other high end CPUs. [2] There's no denying that the 8088 was a slug, but it wasn't designed for low-end systems the way ARM originally was. Remember those British microcomputers they called "Acorn"? Basically some kind of Amiga knockoff I guess...

It raises the question: can you really put so much faith in a RISC-based architecture that you won't end up at a roadblock where you find yourself needing to switch to x86? While Apple held PowerPC's performance in high regard, they definitely weren't willing to gamble their whole stake in it.

That's why Apple continued compiling Mac OS X for x86 the whole time it was publicly a PowerPC-exclusive operating system, before they came forth and said "aw shit, G5 has serious thermal issues, Intel's next generation lineup beats it on performance per watt so that's where we're going now!" When Apple goes full ARM-exclusive, I see no reason not to believe they'll continue compiling for x86 long after that in another secret campus. Big money and brute force can only get you so far.

ARM is Not Needed for Low TDP

A low thermal design power is one thing that's frequently touted by ARM shills. Certainly, lower TDPs are what CPU manufacturers ought to strive for, but it's not as if x86 isn't capable of accomplishing the same low power consumption as ARM. Intel's Atom CPUs are prized for that very thing; while netbooks and nettops are pretty much a dead fad (big fucks to telephones for causing this), Atom enjoys continued usage in low power servers, industrial/embedded applications, and some consumer laptops.

Super high performance at such a small TDP is still largely a fantasy, which is pretty much inevitable, as higher-TDP CPUs trend towards much greater potential. Much of the talk about ARM CPUs delivering such high performance with low power consumption rests on at least one of the following fallacies:

  • The latest ARM CPUs are being compared against previous generation x86 CPUs
  • Factors like ASICs handling certain workloads are not acknowledged
  • The numbers are pulled out from one's ass

Also, TDP isn't everything. Actual power consumption can vary greatly in practice. Wattage is usually lower during normal usage and idling, but can also exceed the rating for a short period of time when a core's frequency is dynamically raised within official specifications, something Intel calls Turbo Boost Technology. TDP is rated for a sustained, high-complexity workload at base frequency, not an absolute power ceiling. [3]
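Don't just take my word (or the sticker) for it. Below is a rough sketch of my own that samples the cumulative package energy counter Linux exposes through the RAPL powercap interface and converts it to watts; it assumes the intel_rapl driver is loaded and the package domain sits at the path shown, which can differ between systems and may need root. Run it at idle and again under a heavy load, and compare the numbers against the rated TDP yourself.

    /* Rough sketch: measure actual CPU package power on Linux via RAPL.
     * Assumption: the intel_rapl powercap driver exposes the package
     * domain at the path below; adjust it for your system. */
    #include <stdio.h>
    #include <unistd.h>

    #define RAPL_PATH "/sys/class/powercap/intel-rapl:0/energy_uj"

    /* read the cumulative energy counter, in microjoules */
    static long long read_energy_uj(void)
    {
        long long uj = -1;
        FILE *f = fopen(RAPL_PATH, "r");
        if (!f)
            return -1;
        if (fscanf(f, "%lld", &uj) != 1)
            uj = -1;
        fclose(f);
        return uj;
    }

    int main(void)
    {
        long long before = read_energy_uj();
        if (before < 0) {
            fprintf(stderr, "couldn't read %s\n", RAPL_PATH);
            return 1;
        }
        sleep(1); /* sample over one second */
        long long after = read_energy_uj();

        /* microjoules per second -> watts (ignores the rare counter wrap) */
        printf("package power: %.2f W\n", (after - before) / 1e6);
        return 0;
    }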

ARM Won't Solve Anything

To say the least, this new war between CISC and RISC is probably going to turn out the same way it did by the end of the 90's, so long as Intel doesn't slip up. An entire range of different RISC CPUs from different designers came up with some pretty impressive specs to brag about; MIPS, for instance, was powerful enough to allow Jurassic Park to happen. Windows NT gladly supported the emerging RISC architectures in its infancy, but ditched them all by the time Windows 2000 was about to reach the market. Why so soon?

[Image: 3D Studio MAX rendering an extreme closeup of a bad 3D axolotl using two CPUs]

Well, by 1996, x86-based computers had grown so powerful that they were receiving a lot of new software previously reserved for expensive workstations. Have you ever dreamed of having a new FMV intro for your hot new CD-ROM game comprised of computer generated 3D images? With a fast Pentium or Pentium Pro computer and a copy of 3D Studio MAX, you can make that a reality, and then you can make a very boring tax return spreadsheet on the same computer! So the case for mainstream RISC was invalidated, and RISC architectures were left to the enterprise niche. They continued to grow and do their jobs well, but obviously they had nowhere near the level of support they could have had.

Trying yet again to shift from one architecture to another isn't going to help with anything; it's only going to create more of a mess for everyone who has invested heavily in x86 hardware and software. I've already mentioned how that talk about low TDP means nothing; of course a 5nm CPU design is generally expected to be more energy efficient than a 7nm or 10nm design. It's not RISC that enables this; CPU designs are very complex, and are certain to vary depending on whether you want high performance or low power consumption. Fundamentally speaking, you cannot get both extreme ends no matter what architecture you use. [4]
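If you want the back-of-the-envelope reason why, here's a tiny sketch using the classic first-order approximation for dynamic switching power (capacitance times voltage squared times frequency). Every number in it is made up purely for illustration, not a measurement of any real chip; the takeaway is that chasing higher clocks means chasing higher voltage, and that squared term punishes you no matter whose instruction set is underneath.

    /* Back-of-the-envelope sketch, not real chip data: dynamic switching
     * power scales roughly with C * V^2 * f, and higher clocks generally
     * demand higher voltage, regardless of the instruction set. */
    #include <stdio.h>

    static double dynamic_power(double capacitance_f, double voltage_v, double freq_hz)
    {
        return capacitance_f * voltage_v * voltage_v * freq_hz;
    }

    int main(void)
    {
        double c = 1e-9; /* hypothetical effective switched capacitance, in farads */

        /* low-power operating point: a lower clock lets you drop the voltage */
        double p_low  = dynamic_power(c, 0.8, 1.5e9);
        /* high-performance operating point: a higher clock needs more voltage */
        double p_high = dynamic_power(c, 1.2, 4.0e9);

        printf("low-power point:        %.2f W\n", p_low);  /* ~0.96 W */
        printf("high-performance point: %.2f W\n", p_high); /* ~5.76 W */
        return 0;
    }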

In that case, it's really better not to bother dragging on this pointless debate. The only way RISC can stand a chance in mainstream desktop computing is through aggressive marketing and having Apple around to push RISC-based products onto mainstream consumers. But when x86 can already do so many of the things that Apple now insists require ARM, RISC is even more useless than it was in the 90's, and throwing out everything x86 has built up for this fad will only create more problems with no guarantee of solving anything. Ironically, the same can be said if we tried getting everyone to migrate from commonplace ARM-based phones to x86 phones; I suppose it's not surprising the latter hasn't done well in that market.

ARM is Not New

One thing many ARM shills like to babble on about is how ARM is this super new thing that's gonna replace the old, outdated x86 platform. They're convinced that backwards compatibility, a necessity in enterprise environments, is somehow holding x86 back, and they love bringing up the fact that all x86 CPUs are derived from one model introduced in 1978. Eeeww, so antiquated!!

Get this... other manufacturers of RISC-based products tried throwing the exact same pitch in the 90's, including this Macintosh or Windows video tape from 1996. Not much was done to explain why RISC had its advantages over CISC apart from a few vague benchmarks. It's worth noting that this video was made when Apple seemed to be on its last legs, and had to acquire NeXT to get back on track. So much for the claim that a new Power Macintosh would be an investment in the future, for everything prior to G3 was left to die!

Even then, I'd still take a 1996 Power Macintosh over a 2021 M1-based Mac any day. System 7 was a good era for Apple; it brought forth a number of important innovations to the mainstream userbase, but a legitimately archaic kernel with no memory protection made it pale in comparison to something like Windows 95.

I almost shouldn't even be bringing this up, because making a case for ARM on the premise that it's newer than x86 is hilarious. But, when everyone keeps hammering on it this much, well... I have to break it to them. The ARM architecture was introduced in 1985, only seven years after x86. x86 went on to dominate mainstream desktop computing in the decades to come, while ARM remained a nobody for the longest time. It wasn't until Apple started using it in its smaller devices with their own custom-designed CPUs that it would start to gain ground. Doesn't change that PowerPC is newer than ARM!

Any talk about ARM being newer than x86 is stupid and pointless; it doesn't mean anything. x86 is not "outdated", it is mature. It has a much stronger backbone, propped up by numerous credible companies all working in collaboration to deliver as much as any user could possibly ask for. This brings me to my next point...

ARM Contributed Nothing

When the IBM PC was introduced, it was largely comprised of off-the-shelf components that most anyone could acquire, a far cry from IBM's other "larger" products. This, combined with reverse engineering efforts from much smaller companies, first spawned third-party expansion cards compatible with the PC's expansion bus. Cards like the SixPakPlus from AST Research were often better than IBM's own, since they could integrate more essential components like I/O ports, memory, and timekeeping into a single slot, saving room for other expansions in the same unit.

Then, fully compatible clones of the IBM PC sprung up. With no royalties to worry about, x86 became anyone's game. New innovations came forward like tighter integration of essential components to reduce manufacturing costs, and implementations of newer standards like SCSI allowed many fast hard disks to be connected to a single computer - up to 28 in the earliest incarnation.

When it comes to x86 computers, it was all about standards. Anything that was a widely accepted standard was a surefire winner and a good investment to last plenty of years for customers of all sizes. Countless companies sold x86 computers and components, can I even name them all here? Compaq, HP, AST, Tandy, NEC, WYSE, Zenith Data Systems, Olivetti, Epson, Everex, Intel, 3Com, Adaptec, Cirrus Logic, Creative, Ad Lib, HeadStart, Dell, Realtek, Digital, DTK, Asus, Shuttle, AMD, Cyrix, Tseng Labs, Acer, and even Apple to some degree (FireWire comes to mind)... at this rate you may as well make it into a ninth-generation Pokémon rap!!

Many of these companies gave back to the x86 pool one way or another. Although not all of them were involved in the development of new standards for computers, they often helped support such standards by creating devices which worked with them. As for IBM, well... they wanted everything for themselves. After Compaq beat them to the creation of a 32-bit 386-based PC, IBM seemed irate enough that they wanted nothing to do with the growing acceptance of highly cloned data buses and aftermarket expansion cards.

When their PS/2 lineup of computers was introduced in 1987, it was not held in high regard due to how far off it was from conventional PC clones, despite some initial success among business users. One of the PS/2's most notorious features was the Micro Channel bus. Technically speaking, it was a decent expansion bus; its 32-bit interface was better suited for high bandwidth applications. However, the complications of implementing it on both 16-bit and 32-bit computers, combined with exorbitant licensing royalties per system sold, made it nothing short of a joke in the PC industry at large. You'd think they'd learn from the miserable failure that was the PCjr, but they sure as hell didn't.

Several companies, which would become known as the Gang of Nine, got involved in the creation of a new, technically superior 32-bit expansion bus better known as EISA. While it never found a place in the home PC market due to its high cost of implementation (despite only a minuscule licensing fee), it was much more widely adopted than MCA on workstations and servers from 1989 to around 1996. MCA died with the PS/2 in 1993, and IBM accepted that it would have to implement EISA and PCI, standards it had no involvement in developing, in its later systems. Plenty more standards we enjoy today, like USB, PCI Express, and dynamic energy-saving power management, all came from multiple companies cooperatively working to create solutions that work for everyone.

So where does Arm Ltd. factor into any of this? Nowhere. ABSOLUTELY FUCKING NOWHERE. The whole time that all these other tech companies were developing new standards that inevitably empowered consumers, ARM was nowhere to be found! Now all of a sudden it wants to overwrite 40 years of a monumental legacy that's stood the test of time? What are they even trying to gain from this? FUCK OFF!!!

It's About Control...

Oh, of course. That's what this ARM movement has really been all about from the beginning; it's all just political shit. Even with questionable moves like ACPI and UEFI, x86 has remained an open and modular platform. Those things may have added trouble for developers of open source operating systems, but they're still standards in any regard, not some absurd proprietary scheme owned by a single company. It's not perfect, and it has been straying further from that lately, but that doesn't justify throwing it all out for something that will, at best, be a short-lived fad, and at worst, the start of a new dystopia in desktop computing.

Compared to x86, where even the more walled-garden systems like Intel Macs can run just about whatever you want with about as much knowledge as it takes to use a screwdriver (metaphorically speaking), ARM is designed from the ground up to be locked down, holding a strangling grip on the will of the user. It's often very difficult to change the software being used or how it's used, especially in sandboxed devices like modern touchscreen telephones. For example, I have not been able to root my Verizon-branded Samsung Galaxy S5, and I don't even use that for phone calls! It only served as a temporary solution for a high resolution camcorder. Every time I power it on, it still desperately wants me to update my version of Android and absolutely refuses to let me say "no, I WILL NEVER update!"

This sort of locked down bullshit is just one reason why I despise mobile platforms. Can't I just have a phone I can trust anymore? Guess not, as things have gotten much worse since the S5: no more replaceable batteries, and pretentious screen notches on top of that. Fuck that shit! I should be able to use a phone exactly how I want to; I bought the damn thing. ARM is not good for anything. Give me an x86 phone and then I'll start being more open to these telephones. I know it can be done.

AMD, Guilty as Charged

Everyone just LOVES to trash Intel for whatever they can think of. For sure, Intel has pulled a significant number of anti-consumer moves: adopting an "I take all" mentality much like IBM's from the introduction of their 80386 processor onwards, abusing their strong brand recognition to keep OEMs from branding non-Intel PCs with "AMD K6-2 3DNow!", locking non-Intel CPUs out of Intel motherboards, and generally just trying to stifle AMD and similar companies wherever they could.

But to get all pumped up about AMD rising from the ashes without considering that they might be just as willing to be anti-consumer is pretty naive. I say this as someone whose early childhood was powered by AMD, who got big into retro computing through an AMD processor, and who's using an AMD platform right now. Have you heard about Platform Secure Boot? What it basically does is lock enterprise-class CPUs like Epyc, Ryzen Pro, and Threadripper Pro to OEM boards in the name of "security". Sure, it may not be a problem for you now, but by the time these servers and workstations become affordable on second-hand markets, they may not be practical anyway, since it's bound to become really difficult to get the matching board and CPU needed for a working system. [5] Not much different from ARM, I suppose.

Ryzen also has super annoying supply shortages and severe 16-bit virtualization bugs.

Make Mine Free86!

The current state of computing as a whole scares me. Even within x86 space, Windows 10 has proven to be a control freak with its forced updates and telemetry. Outside of that, though, "mobile" has already overtaken conventional desktop/laptop computers as the majority holder of the market, and guess who holds a large stake in that... Apple, of course. [6] It's only natural that their roaring success in their quest to have touchscreen telephones replace proper desktop computers would trickle back to their Macs, now being based on their own ARM CPUs.

To retain the x86 platform and the freedoms it provides, you need Intel. Ideally, Intel and AMD could still be keeping each other in check, but as the latter appears to be going mad with its bizarre ARM-like ideas, everything rests on the upcoming Alder Lake lineup not sucking. If it fails, x86 may very well die in the face of brute force marketing for ARM, at least in mainstream areas. I cannot overstate how harmful this is going to be for computer users of all types, including those who don't pay attention to any of this.

If there is a perfect parallel to the ongoing ideological war of capitalism vs. communism, or American values vs. revolutionary propositions, and how each of them is so often perceived, this conflict between x86 and ARM is it. The former is a flawed product in dire need of improvement, it has a dark history of bad actors, and it's showing signs of crumbling, but it has a proven track record of going to great lengths to benefit consumers with the widest range of compatibility with both old and new hardware and software. Its undisputed power in the microcomputer field has enabled countless individuals and organizations to grow and fulfill themselves and others with maximum efficiency.

The latter, ARM, is a highly disputable platform that doesn't have much to prove itself with beyond telephones, and we know how fucked up those are. It promises fantastic computing power to transcend x86, but comes riddled with malicious intentions to take away what you love about computers and replace it with the broken philosophies of smartass telephones. It comes in the night, and attacks during the worst hours of x86.

This all sounds like salty pro-American propaganda, doesn't it? Perhaps it is no coincidence, given there happens to be one such entity which is very interested in creating ARM-based PCs... China!

Yes, there are state-owned electronics companies putting their stake in ARM. [7] Everyone should know just how malicious the CCP is, what with their tendency to remove organs from living political dissidents... that's a whole 'nother story in itself, but you kind of have to wonder if ARM's gonna end up being used to further fuel totalitarian control over citizens in China, much like how Epic Games being owned by Tencent creates an American pipeline for funding their creepy ass "Sesame Credit" program. Every time you buy a Fortnite costume, another individual loses their individuality...

I don't have my mind too well-organized on the state of things there, so I won't get too deep into that. The point is that ARM is a lot more dangerous than you think, so it would be wise not to shill for it like these large YouTube tech channels do.

There is another glimmer of hope in all of this. nVidia's still in the process of acquiring ARM, and knowing them, they might just fuck it up bigtime if they do get ahold of it, potentially giving x86 another shot, if not also throwing more people off of those touchscreen telephones that everything seems to require these days. This won't affect Apple since they basically have a very special perpetual license to do whatever the fuck they want with ARM, but maybe it'll be enough to get us back on the right track. Until then, one can only hope Alder Lake will be as good as it can be. I'd be quite eager to switch back to Intel if it does excel.

The iPhone really was one of humanity's biggest mistakes. One night in early 2007, Steve Jobs crept into my room and whispered into my ear, "prepare for unforeseen consequences"...

Footnotes

[1] ARM: What Is ASIC?

[2] Intel Core i7 1195G7 ‘Tiger Lake U’ Refresh Mobility CPU Benchmarks Leaked

[3] Thermal Design Power (TDP) in Intel® Processors

[4] University of Wisconsin: Power Struggles: Revisiting the RISC vs. CISC Debate on Contemporary ARM and x86 Architectures

[5] AMD PSB Vendor Locks EPYC CPUs for Enhanced Security at a Cost

[6] StatCounter: Operating System Market Share Worldwide - May 2021 (CSV)

[7] TechRadar: New Arm-based gaming PC set to take on Intel and AMD rigs

Special thanks to community members for helping get this article polished up.