Apple Silicon and Intel's Demise

At WWDC 2020, Apple announced plans to transition Mac computers from Intel to its custom-made Arm processors, dubbed Apple Silicon. There had been rumors about this for quite some time, but now it's official - the first Arm-based Macs will hit the shelves in Q4 2020. To a casual observer, this news probably doesn't amount to much. A Mac is still a Mac as long as the software that users interact with stays the same. But to the tech community, this announcement carries much heavier weight and will have a long-lasting impact on the industry for years to come. Let me explain why.

Arm vs Intel

To appreciate the impact of Arm Macs, it's important to have a bit of historical context about Intel and the semiconductor industry.


Intel is the world's largest integrated device manufacturer - that is, the company designs and manufactures its own chips. Intel rose to power through the invention of the x86 chip architecture, which later became the de facto standard architecture for computers and servers. Pretty much every laptop, desktop, or server in the world runs on x86 chips manufactured by either Intel or AMD (which got the x86 license from Intel itself). The invention of x86 and the early-mover advantage helped Intel build a very strong moat around its semiconductor business: the software ecosystem. Developers have been writing software for x86 for decades, and it's not easy to simply move those applications to another platform. They have to be rewritten or, at the very least, recompiled to be compatible with non-x86 machines. The more software is built on top of x86, the more difficult it is for computer manufacturers to stray from Intel's influence.
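This lock-in is visible even from a scripting language: compiled artifacts carry a CPU-architecture tag, and a binary built for one architecture won't load on another. A minimal sketch using only Python's standard library (the output depends on the machine it runs on, so none is shown):

```python
import platform
import sysconfig

# The CPU architecture this interpreter was built for,
# e.g. 'x86_64' on an Intel machine or 'arm64' on Apple Silicon.
print(platform.machine())

# The platform tag baked into compiled extension modules; an extension
# built for one tag cannot be loaded under another, which is why
# native software has to be rebuilt to move off x86.
print(sysconfig.get_platform())
```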


But Intel's dominant position was threatened when Apple introduced the iPhone. iPhones, and mobile devices in general, have very different needs from servers and computers. They don't need to run complex software, but their small batteries have to last as long as possible. x86 chips were never designed to be power-efficient. For decades, Intel had been improving x86 performance for the computer market by introducing complex instructions at the expense of power efficiency. Apple did ask Intel for a chip more suitable for mobile devices, but Intel passed on the opportunity: the cost of designing and manufacturing chips for this relatively unknown market segment was not something Intel could see as profitable [1]. This drove Apple into the arms of Arm (pun intended).


The term Arm usually refers to the reduced instruction set computer (RISC) architecture designed by Arm Holdings. Unlike Intel, Arm Holdings is not a chip manufacturer. It only designs IP cores (reusable design building blocks) for chip architectures and licenses them to other companies, which customize and manufacture their own chips. The Arm instruction set is simpler, meaning it takes more instructions to perform the same task as x86, but in return, less hardware support is needed. This allows Arm to optimize for power consumption at the expense of raw speed - exactly what Apple was looking for. Apple licensed Arm, built its own chips for the iPhone, and the smartphone market took off like wildfire. Other device manufacturers soon followed in its footsteps. Today, Arm chips power 95% of the world's smartphones, be it Android, iOS, Windows, or anything else.


What makes Arm attractive to device manufacturers is not only the technology but also the business model. With Arm, device manufacturers can design and produce their own chips at a fraction of the cost using pure-play foundry services like TSMC or GlobalFoundries. They can also move much faster than Intel, whose speed of innovation has slowed significantly over the years. This is essentially what made Apple pursue Arm-based Macs.

Why Arm in Macs matters


Arm in Macs by itself is not particularly interesting. Sure, Apple will be able to improve its bottom line by dumping Intel, but the industry won't benefit much from it since Apple likes to keep its ecosystem closed. The technology is also not much of a talking point. Arm chips will most likely not match x86 in terms of raw performance, and long battery life is merely a nice-to-have in the computer market. What turns heads about this move is the impact it can have on the server space.


Apple is not the only company that wants to ditch x86 for Arm. Cloud providers have been working on this for a few years now. AWS officially initiated the process with its acquisition of Israel-based chip maker Annapurna Labs in 2015. The product of that acquisition was AWS Graviton, a fully custom-made Arm-based processor that powers the A1 instance family. A1 instances are 40% less expensive than their C5 or M5 counterparts. This means that if businesses running on AWS switch to A1, they can significantly reduce their infrastructure costs and pass the savings on to consumers (or investors). It's no longer just Apple's bottom line we're talking about - it's the bottom line of the entire tech industry. Graviton has achieved decent benchmark results compared to Intel and AMD, making it usable for common workloads like web servers. However, adoption of A1 has been slow. Part of this is because the industry is waiting for server-side Arm processors to mature, but a larger part is due to the lack of Arm-based development machines.
You see, most software developers write and test code on their personal computers. Software developed on x86 machines is more likely to be deployed to x86 servers. Cross-platform development exists, but it's still relatively painful - developers have to constantly deploy code to a remote machine for testing, which slows down development. In the early stages of a software product, convenience and speed matter more than hosting cost, unless the savings are 10x or 100x rather than 40-50%. For Arm-based servers to go mainstream, Arm needs to be present on developer machines and capture the end-to-end development cycle.
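To make the cost argument concrete, here is a hedged back-of-the-envelope sketch. The hourly price and fleet size are hypothetical placeholders, not actual AWS rates; only the 40% discount comes from the figure above:

```python
# Hypothetical numbers for illustration only - not real AWS pricing.
x86_hourly = 0.10               # placeholder price for an x86 (M5-class) instance
arm_hourly = x86_hourly * 0.60  # A1 is roughly 40% cheaper, per the figure above
fleet_size = 100                # hypothetical number of instances
hours_per_month = 730

x86_monthly = x86_hourly * fleet_size * hours_per_month
arm_monthly = arm_hourly * fleet_size * hours_per_month
savings = x86_monthly - arm_monthly

print(f"x86 fleet: ${x86_monthly:,.0f}/month")
print(f"Arm fleet: ${arm_monthly:,.0f}/month")
print(f"savings:   ${savings:,.0f}/month ({savings / x86_monthly:.0%})")
```

At this scale the saving is real but modest in absolute terms, which is precisely the point: a 40-50% discount alone rarely outweighs developer convenience early on.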


This is where Arm Macs come in. Macs are already popular in the software developer community because macOS is closer to Linux than Windows is (and Linux is the operating system of choice for servers). If developers start using Arm Macs, it'd be natural to see server-side Arm usage pick up. As Arm processors get faster and faster, there will be very little reason for SaaS developers to stick with x86. Soon enough, what has been a cash cow for Intel will slowly shrink as Arm starts invading the server market.

What's next?

Some may say it's still too early to tell whether Apple will succeed with Arm Macs and whether Arm servers will become mainstream. But to me, the question is not "if" but "when". Apple has a good track record of rallying developers to accommodate breaking changes in iOS and macOS. They will almost certainly be able to do the same for Arm Macs. Once that happens, the cost advantage of Arm servers will be too enticing for developers to pass up.


The Arm march is a classic example of the Innovator's Dilemma. Intel pioneered the silicon industry and ruled it for a very long time. The moat it built is very strong, but the industry simply went around it. Arm chips focused on a different competitive dimension and slowly moved up-market. Intel chips will most likely always be superior in terms of raw performance, but beyond a certain point, raw performance yields diminishing returns. The industry will eventually care more about cost, control, speed of innovation, and power efficiency. Intel's semiconductor business will still be relevant in performance-demanding segments like high-performance computing, but that market may be too small to justify staying in such a capital-intensive industry. One thing Intel can do is drop or spin off its manufacturing division (similar to what AMD did with GlobalFoundries) to focus on chip design and regain the innovation speed it has lost over the years. Regardless of which path the company chooses, the road ahead will surely be bumpy.


[1] This was mentioned in Paul Otellini's interview with The Atlantic: https://www.theatlantic.com/technology/archive/2013/05/paul-otellinis-intel-can-the-company-that-built-the-future-survive-it/275825/.

"We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we'd done it," Otellini told me in a two-hour conversation during his last month at Intel. "The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."