Hacker News
Intel and the Danger of Integration (stratechery.com)
225 points by ingve on June 25, 2018 | 138 comments


The thing is, though, that tying design and manufacturing together has been really good for Intel. Historically, Intel's performance in terms of drive current and so forth at a given process node has been better than its competitors', and it has reached those nodes more quickly. Intel manufacturing paid for that lead partly by accepting more restrictive design rules for transistor layout than a merchant silicon shop could ever get away with, because they knew they had only one customer and one design they had to make work. With Intel's numerous missteps at 10nm that lead is over, and the manufacturing debacle is taking down the design side too, but there's no reason to think Intel would be starting from such a dominant place if they had split up. If I were to go back to 2013 I'd tell them "10nm is too early for cobalt interconnects!" rather than "Split up your design and manufacturing".

And really, you could also look at the problem as the design team being too wedded to x86 and unwilling to consider other ISAs, just as much as a manufacturing problem. Intel could have done an ARM processor for the iPhone, but they wanted to make a go of an x86 mobile chip instead. And while the x86 tax is small potatoes when you're paying for a deep out-of-order pipeline, it's pretty noticeable in the world of in-order processors or graphics processing units.


That's exactly what Ben concludes with.

> To demand that Intel apologize for its integrated model is satisfying in 2018, but all too dismissive of the 35 years of success and profits that preceded it.

Many business failure stories are like that. "It worked great until it suddenly didn't."


I disagree with a few of the points made by this article.

"By 1986... Intel was already the best microprocessor design company in the world. They just needed to accept and embrace their destiny."

...they most certainly did not have the best design. SPARC was just about ready to murder the VAX at this point, and more importantly, both the first MIPS and ARM samples were delivered in 1985. Either would have trounced an i386.

The iPhone was also discussed. We should remember that Intel got StrongARM from Digital, later selling the Xscale architecture to Marvell. The residual ARM expertise within Intel was likely substantial enough to have delivered whatever Steve Jobs wanted.

However, Apple would not be happy with 32-bit ARM for long, and Intel should have realized that. If Intel had presented the equivalent of AArch64 to Apple a year before the iPhone, meeting power and code-density requirements with newly-designed 16- and 32-bit equivalents of THUMB within a 64-bit architecture, Apple might have abandoned ARM.

Intel and HP shared Itanic - had Intel and Apple shared something like the A7, history would be quite different.


Reviewing the history, the Xscale/StrongARM sale to Marvell went through in June 2006.

The original iPhone was released in June 2007 with a 620 MHz Samsung ARM.

That was awfully close. I wonder if Apple was consulted on the sale, and what else was privately said between the two corporations.

Intel certainly had the iPhone in their grasp. The Wiki says that the Blackberry Bold used an Xscale PXA930, the Torch used a PXA940, and the Xscale IOP348 was hitting 1.2GHz. Unbelievable that Intel walked away, after pushing several mobile products with the leading smartphone manufacturer of the time.


Right. It's really hard to predict these kinds of shifts in processes, or how processes that you've relied on for a long time have been made obsolete. When that happens, big businesses have an especially hard time because they're usually not agile enough to reorient themselves or adopt new processes quickly enough.

A peer commenter mentioned IBM. As an ex-IBM'er, I certainly think that is true. The team I worked on adopted agile methodologies (which I think have been largely adopted in the emerging tech teams). But you could still see remnants of old processes and incentives, like a weird emphasis on writing patents. That was kinda shocking to me as an SE, to see patent writing as part of my job, but I realized it was a holdover from earlier times. Ultimately, while I enjoyed my time there, broken processes for recognizing contributions were among the most important reasons I jumped ship.


It's funny, though, that it's often very easy for an outsider to predict these kinds of shifts in processes, because they have no vested interest in the status quo and are evaluating everything with fresh eyes.

As a kid growing up with computers in the 90's, I knew IBM was doomed. Why? Because buying an IBM-brand PC when a clone cost 1/2 the price and did exactly the same stuff was ridiculous, and Microsoft owned the software you actually used (and had to make sure all your apps were compatible with). Older folks would tell me "Well, IBM makes most of its money selling to enterprises, and they can charge millions of dollars for a computer" and I'd be like "Why would you do that when you can buy a PC for a couple thousand and hire a high-school kid to program it for you?" and they'd mutter something about transaction processing and customer support and how nobody got fired for buying IBM, and I'd be like "That's stupid. Their customers are all going to go bankrupt, and then IBM will too." I was a pretty insufferable teenager, but I was right about a lot of that.

I'm 37 now, I've lived through four generations of technology (PC, web, mobile, and cloud) and retrained for each, and I find I need to make a conscious effort to try and see everything through college-student eyes. If I were a new grad just coming out of school, how would I evaluate the world? And I try to keep that in mind whenever I read eg. a thread on crypto here. The total volume of all data on the Ethereum blockchain is roughly what I would process in 10 minutes with a small MapReduce when I was working at Google. But it's very likely a college student today would be like "So? What's the point, when nobody trusts the answers you give them because you're just a giant corporation that keeps all your data secret?"
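For scale, a rough back-of-envelope on that comparison (the chain size and cluster size below are illustrative assumptions, not measurements):

    # toy back-of-envelope: how hard is "the whole Ethereum chain in 10 minutes"?
    chain_gb = 200.0     # assumed on-disk size of a full node's chain data, circa 2018
    minutes = 10.0
    workers = 50         # a "small" MapReduce by big-cluster standards
    aggregate_gb_s = chain_gb / (minutes * 60)          # ~0.33 GB/s across the cluster
    per_worker_mb_s = aggregate_gb_s * 1024 / workers   # ~7 MB/s per worker
    print(f"{aggregate_gb_s:.2f} GB/s aggregate, {per_worker_mb_s:.1f} MB/s per worker")

At those rates each worker is barely breaking a sweat, which is the point.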


I used to ask my father why he bothered using a desktop email client. It seemed like useless overhead, and webmail was the future. Well, webmail is indeed everywhere, but I now use a desktop email client daily.

Outsiders don't tend to talk about the predictions they got wrong.


Heh, that's very true, but I think that the predictions I've gotten wrong actually tend to err on the side of being too conservative about the future rather than too radical, or of betting on the wrong horse between radical alternatives. I was skeptical of the WWW at first ("Gopher is better organized"), continued to use Eudora and then Outlook right up until GMail came out ("you have to be online to read your e-mail?"), and still don't really get what's so great about Facebook (LiveJournal was better in just about every way, and basically every feature that Google+ forced Facebook into implementing was actually cribbed from LJ a decade before).

I think there's a cognitive bias somewhere about people preferring the mental model they already hold over some unknown future. Realistically, pretty much the only thing you can be certain of is that the future will not be the same as today, it's just that picking between many different futures gives you terrible odds of selecting the right one. It seems like your options are assuming that the status quo will continue, which is guaranteed wrong, vs. choosing between multiple competing alternate futures, where you are most likely wrong.


If you predict the weather tomorrow will be the same as today, you will be right most of the time. It's easy to do worse with a sophisticated model. That's probably a significant reason why evolution leads to humans and other organisms generally not using too much logical analysis, but simply doing what works until it results in a major disaster.


Well, that is because you get a copy of your email on your machine, so you don't have to worry about losing the account as much, and you have offline access.


"That was kinda shocking to me as a SE to see a patent writing as a part of my job but I realized it was a holdover from earlier times."

Useful activity for current times. Patent licensing and suits are a big money-maker and competition blocker for companies like IBM. For instance, Microsoft has made over a billion dollars on patent royalties for Android. All that distracting paperwork paid off nicely. IBM used their legal team to crush a company selling commercial versions of the Hercules emulator. Oracle vs Google. Apple vs Samsung. Intel recently threatened patent suits if anyone clones x86 to get off Intel chips.

This stuff has a real, financial benefit to these companies. So, they're justified in investing in it.


You make an excellent point. I never thought of it that way and it totally makes sense. I guess I was looking at it from the perspective of an individual: I would rather my work be used to build amazing stuff than as a way to extort others.


This is what the moral of the story of Blockbuster was supposed to be, but everyone wanted a story of avoidable technological disruption, so they told themselves that one instead.


Ahhh, but that’s the thing. They’re the same story


It's possible, of course, that Intel is just doomed, but I don't see that we have any reason to think Intel would be better off taking Ben's advice than not. Integration has hurt them in the last few years, but again, they could quite plausibly have ended up in a worse place now if they'd gone the foundry route years ago. I'm pretty certain that Intel's manufacturing arm would do much worse without being tied to the design arm, and I think it's a tossup, going forward, whether the design arm will benefit from or be hurt by being tied to the manufacturing arm.


> but I don't see that we have any reason to think Intel would be better off taking Ben's advice than not

Ben doesn't offer any advice that I can see, and he makes it clear that integration has worked very well for Intel up to this point.

This seems like basic analysis, even though people seem determined to take it as advocacy.


Relentlessly reinvent itself, like Amazon did.


IBM comes to mind


It's because integration and disintegration are cyclical, as per Clayton Christensen (of Innovator's Dilemma fame). When you have a new type of product you need integration to squeeze all of the performance out of it. When the industry is mature, you need to disintegrate so you can benefit from outsourcing, economies of scale, customization features for customers, etc.

Of course the devil is in the details, so you have to figure out when to do that exactly. Do it too early and it could be your downfall. Do it too late, and the same could happen.


Jim Barksdale:

“Gentlemen, there’s only two ways I know of to make money: bundling and unbundling.”


I understand the point that Ben Thompson is making: that a vertically integrated Intel lost out in the mobile revolution because it wouldn't unbundle its manufacturing from its processor design. But its current crisis (not referring to the CEO's affair) has nothing to do with vertical integration. It has everything to do with Moore's law scaling coming to an end. This gives fabs like TSMC, and increasingly SMIC from China, the ability to catch up. Unless there is a breakthrough in technology (molecular transistors or some such), Intel's ability to dominate will end soon.


From the quoted tweet in the article: "I said that ICL should be taken to 14nm++, everybody looked at me like I was the craziest guy on the block, it was just in case".

Imagine if Intel were two companies, Intel Design and Intel Manufacturing. This wouldn't have arisen - Intel Design would have produced the new ICL design and could have worked with Intel Manufacturing and TSMC on how to build it, ultimately using TSMC if IM couldn't make it.


I suspect you may have hit on Intel's eventual fate. The company may indeed split up into a design unit and a manufacturing unit. That might unlock significant shareholder value.


It would take many years after the spinoff before significant players would be comfortable enough to manufacture their designs with "Intel Foundry", since they'd know damn well there would be several hot lines between Intel Foundry and Intel Design.


Was that true of AMD/Global Foundries, when they split?


The same was said of Microsoft, as Office was held back by the need to always be in lock step with Windows.

Microsoft got past that problem, and I’m sure Intel will too. Like 1990-2010 Intel, the foundry companies have been able to hide a lot of their warts through growth. IMO, the smartphone revolution will peak much more quickly than PCs did, and Intel's competitors will have their own life-threatening challenges.


I agree. Performance in cellphone processors will peak even faster than it did in the PC. And the semiconductor industry as a whole either becomes a commodity industry (like electricity) or has to discover new technologies.


> But its current crisis (not referring to the CEO's affair) has nothing to do with vertical integration. It has everything to do with Moore's law scaling coming to an end.

Why not both? Their current business model is threatened by a technical challenge, but the company's exposure to that threat could've been mitigated by making small changes to the business model — by unbundling and diversifying.


"small changes" would seem to be a significant understatement, however. The innovator's dilemma.


Unbundling and diversifying are both viable strategies. I just don't know if Intel can pull off both simultaneously while maintaining market share, and current levels of profitability.


Intel lost out on the mobile revolution because the mobile revolution was about radio technology and not processor technology. Outside of buying Qualcomm for patents and radio designs, anything Intel could have tried was doomed to fail.


I'm not sure if this was required for low-power mobile, but it certainly helped:

https://www.theregister.co.uk/Print/2012/05/03/unsung_heroes...

When the first test chips came back from the lab on the 26 April 1985, Furber plugged one into a development board, and was happy to see it working perfectly first time.

Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.

As Wilson tells it: “The development board plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident."

Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a Watt.


Just take a look at a Qualcomm product page [1]. The CPU gets 3 bullet points, while the cellular modem takes up a third of the entire page. Also, here is a nice picture that shows the irrelevance of the CPU compared to the rest of the SoC [2]. If Intel wants to compete they not only need the best CPU, they also have to be a champion at everything else.

[1] https://www.qualcomm.com/products/sdm845

[2] https://www.qualcomm.com/sites/ember/files/styles/optimize/p...


Good article, but it misses plenty.

One miss has been marketing that damages its own brand, particularly the "Atom" phenomenon. Intel had such an obsession with phones that it just had to make x86 processors that were too weak for phones, never mind anything else, full of bugs, etc.

It might be a classic case of "boiled frog" because few have realized that ever since Intel switched to PCIe, Intel chips have been starved for PCIe lanes. There has been no point to SLI in a consumer configuration because there just is not enough I/O bandwidth to support two graphics cards. In fact, Intel hopes you will settle on "integrated graphics" (the same as on a phone) so you won't have a reason to buy a computer instead of a phone.

Optane is a story like 10 nm. Just add tone-deaf marketing that makes a potentially revolutionary product seem like nothing.

Probably the thing that has hurt the PC industry the most is the slow transition to SSDs. Given a choice between a brand-new computer with an i7 chip and an HDD, which will show you a spinning cursor most of the time, and an eight-year-old computer with a new SSD, which can boot before you get old, the choice is obvious.

If Intel had said something like "an i5 chip has to come with an SSD", or if Microsoft had required SSDs for the Windows 8 launch, people would be like "Wow, this is much better than what I had before." As it is, Intel has mandated a Meh experience for a long time and the tech press has let them get away with it. (It's murdered the tech press too, since now all Tom's Hardware talks about is RGB-lit fans and the really cool monitors that might get released someday...)


  > There has been no point to SLI in a consumer configuration because there just is not enough I/O bandwidth to support two graphics cards.
For exactly what use case wasn't there enough I/O? In the case of gaming you can find dozens of tests showing there is almost zero difference between x8 and x16 on PCIe 2.0, and in many games you can run at x4 without any performance loss.
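Rough numbers, assuming PCIe 2.0's roughly 500 MB/s of usable bandwidth per lane per direction:

    # PCIe 2.0 back-of-envelope: usable bandwidth per link width
    LANE_MB_S = 500  # ~5 GT/s with 8b/10b encoding leaves ~500 MB/s per lane
    for lanes in (4, 8, 16):
        print(f"x{lanes}: ~{lanes * LANE_MB_S / 1024:.1f} GB/s per direction")

Even x4 (~2 GB/s) is enough to stream draw calls and textures for most games, which is why x8 vs x16 rarely shows up in benchmarks.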

SLI / CrossFireX failed because they were both vendor lock-in technologies that depend on the GPU vendors making driver adjustments for specific games, since game developers wouldn't spend the time to support 1% of the PC user base.


I don’t believe vendor lock-in has much to do with the failure of SLI/CrossFire adoption. Lack of support in most games and issues with games that don’t support the protocol are the main problems.


Though Vulkan might finally unlock those powers in a cross-vendor, easy to use way, just in time for both technologies to die out anyway.


I don’t think that will change the math for most game developers, since the SLI market will still be tiny. Every serious engine already optimizes for specific GPUs, and Vulkan won’t change that need.


I meant that Vulkan can make it nearly transparent: a couple of function calls to enable a second GPU for higher frame rates. It's a significant API improvement and is cross-platform.


It's pretty clear the PC market went into decline as the smartphone market ate up its sales, and I find it highly doubtful a faster hard drive was going to overcome the convenience factor of the smartphone.


Maybe, but it was still stupid marketing too. Demanding that all Core i* CPUs be bundled with an SSD would have been a brilliant move. Not enough to turn the ship around, but steering in the right direction.


Sure it would. Where I work, we’ve done piecemeal (15%) replacement of PCs since 2008. That saved us something like $100M.

Why? There was no reason to move past Windows 7 and almost no performance reason to get new gear. SSDs everywhere would have pushed down the component costs earlier and cost-justified the move 18-24 months earlier.

The justification there for me wouldn’t have been performance either — it would have been servicing costs on the PCs, as SSDs have a lower failure rate, and the power consumption is better on new PCs.

PCs are like the work trucks that contractors or facility guys use. If it can tow a mower and doesn’t break, you give zero shits about it, other than the $$$.


The smart phone has an SSD in it and that is why the smartphone has many convenience factors.

For instance, a netbook with an SSD can ride in your backpack and play music to bluetooth headphones, just like a phone. If either of those had an HDD you would have trouble with the music skipping caused by the vibrations of your footsteps. Probably you'd trash the HDD before long.

The fast responsiveness you expect from a phone also comes from the use of an SSD. Computers with HDDs frequently "go out to lunch", and that's a convenience negative for a computer.


>> One has been marketing that damages its own brand, particularly the "Atom" phenomenon. Intel had such an obsession with phones that it just had to make x86 processors that were too weak for phones, never mind anything else, full of bugs, etc.

I'm not sure the low-power offerings from Intel were the problem. They kept wanting to enter the mobile market with bloated software from Microsoft. It's not enough to make low-power hardware; the software has to be there as well.


> As it is, Intel has mandated a Meh experience for a long time and the tech press has let them get away with it

What are you trying to say with that? Did Intel force OEMs to bundle i3s or netbooks with HDDs?


They did either directly or via/in cooperation with Microsoft.

Wintel is a thing, and Microsoft delivered a turd in Windows 8 that killed everyone. Netbooks were there to counter the iPad and Chromebooks.

Even today, Microsoft pays HP $70 or something like that to make those shitty “Stream” laptops.


Netbooks were there first, though. The first wave of them is probably what spawned the Atom product line. (The very first of them had weird little Celerons that downclocked to like 660MHz for thermal reasons, because there wasn't anything explicitly designed for the purpose.)

The fact Microsoft is subsidizing them to ensure they run Windows doesn't mean that a netbook as we know it wouldn't exist today without them. They'd still be on the market, likely running some poorly skinned Linux distribution like those "computers for seniors" they advertise in Parade magazine; or maybe Android-x86 would have gotten more support from OEMs and evolved into something usable as a daily driver.

The story has already been established for consumers: if you don't need games or performance, someone will sell you a portable device which will run an office suite and a web browser for USD200 or less. It's interesting to discuss how much "it runs Windows/uses an x86 CPU" is part of the narrative-- would a $150 ARM laptop running a nicely packaged Linux suite with LibreOffice sell?

I suspect there's no way out of the crappy laptop sector though. Even after the Vista Basic fiasco, I can't see Microsoft outright antagonizing OEMs by saying "Windows Next won't run on anything below some minimum spec that would provide an actually good experience, so goodbye $79 tablets with a gig of memory and 16 GB of flash"


Agreed, they definitely were. But those early devices like the Eee PC were exploding the market segmentation that made thin/light a price premium. (Fujitsu and Toshiba were the players there.) The customers were nerds and the OEMs were mostly bit players.

The iPad and Chromebook are where the market really exploded. Companies like Comcast were buying 40k iPads to displace laptops. That’s when shit got real.


Intel gutted their budget mobile Celeron/Pentium lines by replacing them with Atom based models.

This strategy makes sense for smaller laptops/netbooks, but it makes for a horrible user experience at 15 inches. It is like putting an engine meant for a European city car into an F-150.

That is, if Acer/Asus/etc. want to build a budget laptop, they have to choose the crappy Atom-based chip.

It is a crime that a budget 15-inch laptop from 2012 can run circles around a budget 15-inch laptop from 2018.


The problem with Intel, as I see it, is an inability to get new designs to production, and a low tolerance for market growth versus market sustainment. They’ve been a disaster at acquisitions, being totally fine with wasting billions on failed outside investments without any program for internal ventures. The latest focus on “Intel is now a data company” is just complete denial of the reality.

But the margins are awesome, and I’d reject any notion that they should erode that with fab services. They need more of the high-volume, high-margin Intel-designed chips that have been their hallmark. Everything else is a distraction.

Some of that depends on identifying new forms of computing. Therefore, it makes sense to do technology experiments, but lately Intel has focused on market experiments, where they are not really developing new technology, but building internal organizations with sales, marketing, and product management that dwarf engineering.

This is a problem not just because it diverts resources, but because nobody really knows what Intel is anymore. Engineers don’t feel like they are working in an engineering company anymore.

Intel’s margins are great, and despite the negative prognostication, their market position is still dominant. Even the engineering talent at Intel is still some of the best, although the management is somewhat deliberately not their best representation. They just need to focus on being an engineering company again, ignore sales, ignore acquisitions, and invest in their own people to do the experiments that will lead to the next ubiquitous high-margin chip.


I really think this article fails to make its point about what exactly Intel stands to gain from, say, manufacturing processors for Apple. It trades in its major competitive advantage, advanced processes (despite the propaganda, Intel is far from out of the game process-advantage-wise) coupled tightly with its processor designs that ensure margins that would make most companies blush, for low-margin, hypercompetitive contract work.

Intel is under absolutely no real threat of being unseated as the company that powers virtually every server in existence; throwing that away to chase contract manufacturing work doesn't make any sense, even if some other companies do it well. The argument it makes would be more like Intel going back into DRAM production and abandoning x86.


> throwing that away to chase contract manufacturing work doesn't make any sense

It wouldn't be throwing anything away. Intel Fabs could still make Intel Design's Xeons same as always. It could just make everyone else's stuff as well as capacity and economics permitted. Why would you leave that on the table?

The best example is perhaps Samsung. Not only do Samsung's fabs make all the chips for Samsung phones, they make them for others as well, including, up until recently, even their deadly competitor Apple! If you've got fabs, why wouldn't you do that? Even if Ryzen totally destroys Intel on the desktop it's heads I win, tails you lose.


Because it gives their competitors a pretty substantial leg up.


Intel’s advantage was built on scale. They could build more which meant they could invest more which meant they could build better and build more. By leaving scaling opportunity on the table, TSMC can now build more which means they’re investing more and oh no they just passed Intel.


You've got your eyes on the wrong ball, Intel the fab company will be competing for those contracts, but Intel the chip design company will be free to innovate without the burden of manufacturing.


> Intel is under absolutely no real threat of being unseated as the company that powers virtually every server in existence; throwing that away to chase contract manufacturing work doesn't make any sense, even if some other companies do it well.

The future of their networking division might be brighter than the server one, because decentralization (abstraction layers on top of interconnected devices) runs counter to the demand for servers and the hardware powering current cloud tech.

The servers will move into people's pockets or IoT devices with increased decentralization.


Intel had the capability to be the only game in town when it comes to latest-gen chips. If Intel were an open shop where anyone could order anything, no one would bother with second and third players, because the facility costs are so astronomical. Chip manufacturing is by its nature a natural monopoly.


It is not like Intel has not repeatedly tried.

Atom is their attempt. As are more advanced MCP51 micros.

Intel's problem is that their CPUs were performance-focused and as such didn't scale down as well. Plus, vendors didn't have to care, as OSes were portable enough and software has to be rewritten anyway, so going Intel brought no compatibility advantage. This is the same reason Microsoft has trouble penetrating the mobile market.


> Intel's problem is that their CPUs were performance focused and as such didn't scale down as well

Don't know the first thing about electronics, but I am interested to learn more about what you meant here. Highly optimized code (usually) isn't too readable or extendable; are you saying Intel has the same problem with their processor's design?


There are a lot of instances in a design, both macro and micro, where you can trade off greater performance at the expense of something else - either area or power or both.

Then these tradeoffs tend to multiply each other. If you have a design which is fast when branches are predicted correctly and slow when they're mispredicted, and you build a big branch predictor, it works well. But you're committed to powering all that.

I mean, compare http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.... (few sentences) with http://www.agner.org/optimize/microarchitecture.pdf (admittedly a lot more detail, but it's clearly a more complicated system too)


Fast processors work by “speculating” — calculating several possible next steps in advance before knowing for sure what the next step will be. Modern Intel chips do this several steps ahead, at multiple levels. It uses more power, because it does work that is ultimately wasted.

Large amounts of speculation make sense for data center CPUs, but not for battery-powered devices.
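A toy model of that waste (every number here is an illustrative assumption, not a measurement of any real core):

    # toy model: speculative work flushed on branch mispredicts
    issue_width = 4           # instructions issued per cycle (assumed)
    pipeline_depth = 15       # stages of in-flight speculative work (assumed)
    branches_per_insn = 0.2   # assumed branch frequency
    mispredict_rate = 0.05    # assumed mispredictions per branch
    flushed_per_miss = issue_width * pipeline_depth
    wasted_slots = branches_per_insn * mispredict_rate * flushed_per_miss
    print(f"~{wasted_slots:.1f} flushed issue slots per retired instruction")

A wider, deeper, more aggressively speculating core wins on single-thread performance, but every one of those flushed slots still burned dynamic power, which is the data-center-versus-phone tradeoff.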


I'd say this is a myth unless you have actual numbers.


If you compare energy per calculation, Intel server processors readily beat ARM server processors. When under constant high load, Intel processors surely aren't bad with regard to power consumption. So that can't be it.

Maybe it's more about peak performance vs. average power consumption under very low load as you usually have in a mobile phone that's only used occasionally. For example when starting an app and scrolling you need very high performance but the rest of the time the CPUs are mostly idling.


There are different design considerations when designing a CPU to run in a chilled data center vs your pocket. Intel was trying to make everything x86 from data center to pocket and that just wasn't going to work.


On one hand, this article seems to make a lot of sense. OTOH, INTC is up 40% in the past year. It seems likely that there are quite a few parties out there who probably don't agree with this analysis.


Yes, and Microsoft profits were rising at the same time the web, iPhone, Chrome, et al were eating out its foundation. Bell Labs invented VoIP; AT&T or Verizon could have built Skype at any time. Xerox and PARC. RCA and LCDs. This sad story repeats itself continuously.

That's the whole problem with the innovator's dilemma and why so many people look up to Apple - one of the few companies able to cannibalize an existing product line in pursuit of the future. Mac & (Lisa/Apple II), iPhone & iPod.

It is extremely rare to be able to see the future coming, time it correctly, then undercut your current extremely successful product line while you prepare for the change. Everything in business school 101 screams "NO!". Managers are paid to optimize returns and the current business.

Ben is 100% correct about Intel's history. They clung to their identity as a DRAM company and it almost killed them. This time they're clinging to their identity as an integrated processor company when most of the world wants fully or slightly customized SOCs. Intel directly passed on the most lucrative mobile SOC business (iPhone). It is unfortunate that Intel missed this second opportunity. Maybe new management can pivot, but whoever it is will only have about 90 days to make the change.*

* You typically have about 90 days where you're still an "outsider" who can proffer opinions on problems with current practices. For management this is also the best time to make sweeping changes since most people will be willing to jump on the new direction. IMHO after that you have to spend a few years building credibility to get things done without people ignoring or sabotaging you.


> Yes and Microsoft profits were rising at the same time the web, iPhone, Chrome, et al were eating out its foundation.

So... the scenario you have in mind for Intel's coming collapse is... a not-quite-as-dominant major industry player in the coming decades? Microsoft got passed by Google and Facebook and Amazon, sure. They're still making, y'know, crazy bank.


I don't know about the other examples, but Xerox made all their money back on PARC and then some, via the laser printer patent IIRC.

They did miss the boat on personal computing though...


Intel's biggest threat is the decline of Wintel, not its inability to make stuff for consumer products.

Intel has legions of top-tier IC designers; they could make Apple A10- or Qualcomm 845-class chips as weekend projects. What they don't have is the determination to abandon the proven Wintel business model, which is still an extreme cash cow.


Weekend project? I thought the Apple chips were getting pretty competitive with Intel chips despite the thermal/power constraints of a phone. I thought AMD was really catching up with Ryzen and Threadripper. I thought Intel was slowing from tick-tock to tick-tock-tweak. I thought Intel had a couple of large layoffs in the past years which axed a lot of their older^w underperforming workers. Are they truly still such a powerhouse?


>Are they truly still such a powerhouse?

Even after the layoffs, they still have a humongous IC designer headcount. Possibly still the largest in the industry.


Intel embracing RISC-V would be a super-bold move. Given the support we can already see from the long list of RISC-V foundation members, it's bound to succeed. In the next couple of years it will probably wrestle a few royalties away from ARM.

Beyond that, I think it will end up a legit heavy-duty apps processor for Android phones. It might start out on the mid/low-end phones, but manufacturers will absolutely love the idea of cutting out ARM from their BOM.


PowerPC had support from Apple, IBM and Motorola, plus it saw use in every video game console for a time period, it got a Windows port... it's still around and has its uses, but I don't know that you'd say it "succeeded." RISC-V may have support, but it's a hard market to break into.


It would be very interesting to see an Intel- or AMD-designed RISC-V chip. I'm curious just how well the ISA could do given similar resources brought to it.

AMD should be in a better position to try because they were already planning to put ARM cores in their SoCs a few years ago. While that effort was dropped, I assume they still have a bunch of people around that know what the issues are and are ready to tackle them. Problem is ARM doesn't really have enough of those people to divert to alternatives.


AMD or Intel could beat others to the punch and make a mobile SoC with RISC-V applications cores. They'd be the first to face the pain of porting Dalvik and NDK apps, but in the end they'd be way out ahead of the competition.


I've just been assuming Google would bring Dalvik to RISC-V. They're already porting Go and some other things. If they were to release Android for RISC-V it could be devastating to ARM. Not that I think Google cares about ARM; they've been enjoying competition among SoC designers due to ARM's reasonable licensing (that's a relative term of course).


I meant the more abstract pain of introducing the new ISA and probably taking blame for problems with functionality and performance. Yes, Google would probably be the ones executing this task.


Does Google force developers to upload an IR or bytecode for the NDK so that they can just recompile for a new architecture?


Oops. I meant AMD doesn't have enough resources...


RISC-V is a much bigger problem for ARM than it is for Intel.


Hence Intel should get on the RISC-V bandwagon and make it a freight train.

But I suspect they won't do that.


Nokia/Symbian vs the Open Handset Alliance/Android, all over again?


If Intel positioned itself as the main source for RISC-V SoCs, it could steal ARM's thunder.


I don't think integration is what's wrong with Intel right now. It's how they have miscalculated everything, every time, for the past several years, even before BK. Planning/vision and execution: I am not sure there is anything other than these two attributes that matters to a company, and Intel has lacked both.

Intel used to win because they had an effective monopoly in the PC market. The high-price, high-margin and high-volume CPU drove investment in technology and their fabs. That was possibly the only product I know of, other than the iPhone, that had those three factors together. When the smartphone revolution started, everything changed. The scale changed: TSMC now produces possibly close to 5x the volume Intel does at Intel's fabs, and TSMC's leading 7nm node will likely reach a quantity similar to Intel's 14nm within 12 months.

TSMC now has 2x+ the total addressable market on the leading node compared to Intel. So while the investment cost of each new leading node continues to increase, TSMC also has an increasing market size over which to spread it. If you look at Intel's mainstream CPU die sizes, the trend has been downwards over recent years, i.e. they are optimising for cost.

Intel could also have expanded their scale. They could have kept their integration and made more chips, expanding their fab advantage in both scale and technology. They did try, but it was either too late or took far too long. Their GPU is only looking to launch in 2020; as a first-gen product I doubt it will make much of an impact in volume. Imagine they had had a dGPU ready 4 years ago, when the crypto wave hit: they could have ridden the wave and taken the money earned from crypto, or offered a dGPU at market prices cheaper than whatever Nvidia's or AMD's bumped-up prices were. Their 14nm process would actually have been miles ahead of the 28nm or 16nm those GPUs were on.

It took them 8 years, 8 years! from acquiring Infineon's modem business to actually producing a modem in their own fabs. Next time some M&A pitch talks about synergy, one should ask how long it will take. Now that they have the contract for the iPhone modem and could produce it themselves instead of at TSMC, they face a new problem that no one has raised so far: how is Intel going to find the additional capacity to produce an extra 100M modems for Apple over the next year? My guess is that Intel's original plan was that the much-delayed move to 10nm would finally be fixed in 2018, leaving some room for the modem. The modem is much smaller in size, so it is more like 40M units at Intel's median chip size, but that is still roughly 20% additional volume Intel has to fit on its leading node.
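To make that 20% figure concrete, a back-of-envelope (every input here is an assumption for illustration, not a real Intel number):

    # back-of-envelope for the extra fab load from the Apple modem contract
    modem_units = 100e6          # assumed modems per year for Apple
    modem_die_mm2 = 50.0         # assumed modem die size
    median_die_mm2 = 125.0       # assumed median Intel leading-node die size
    cpu_equivalents = modem_units * modem_die_mm2 / median_die_mm2   # ~40M units
    leading_node_units = 200e6   # assumed annual Intel output on the leading node
    print(f"~{cpu_equivalents / 1e6:.0f}M CPU-equivalents, "
          f"~{100 * cpu_equivalents / leading_node_units:.0f}% extra load")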

And if you have read the rumours about Intel delaying, or renaming, the chipsets that were originally moving from 22nm to 14nm, that is a possible reason why. Maybe they shouldn't have deferred Fab 42's construction in the first place.

I am pretty sure BK's departure has more to do with his performance than whatever they wrote in that PR, and it may well have something to do with Apple. I wouldn't be surprised if Apple has been unhappy about Intel's lying and execution. Remember, when Apple started to design their "thinner" MacBook and MacBook Pro, it was obvious they had Intel's 10nm and 7nm CPUs in mind. The first was supposed to happen last year and the latter was scheduled for next year. This two-year delay, which Apple may or may not have known about, put Apple in an awkward position. Intel's custom foundry was supposed to attract Apple to build an integrated SoC with an Intel modem inside, which is now even less likely to happen when Intel has not been open about its problems. When Apple contracted Morris Chang at TSMC, TSMC managed to bring up a new fab for Apple in less than 6 months. It was a big bet on both sides, billions of dollars invested. Could you imagine Intel doing that? These relationships take time to build, and TSMC right now has literally zero faults.

I do agree Intel now faces another turning point. Last time they had Andy Grove; now I am not sure who they can hire to get this behemoth running. The task is enormous, and many of Intel's best engineers and executives have left the company. The only one who may have a slim chance to completely transform Intel is Pat Gelsinger; if Andy Grove saved Intel last time, it will be his apprentice who saves Intel again. Unfortunately, given what Intel did to Pat during his last tenure, I am not sure he is willing to take the job, especially with Bryant as the board's chairman; I'm not sure how well they would go together. But we know Pat still loves Intel, and I know a lot of us miss Pat.


>I am pretty sure BK's departure has more to do with his performance than whatever they wrote in that PR, and it may well have something to do with Apple.

Yes

>When Apple contracted Morris Chang at TSMC, TSMC managed to bring up a new fab for Apple in less than 6 months. It was a big bet on both sides, billions of dollars invested. Could you imagine Intel doing that? These relationships take time to build, and TSMC right now has literally zero faults.

Apple contacted both TSMC and Intel about their own modem. They came with "We are ready to put a cash bond of $n billion on the table today." And Intel refused.


> I am pretty sure BK's departure has more to do with his performance

"The strife of Brian", as the wags at ElReg titled it last week, suspecting the same: https://www.theregister.co.uk/2018/06/23/brian_krzanich_inte...


Not sure why I am getting lots of downvotes; it would have been nice if people actually gave some feedback or a response.


Well, Microsoft is about to be fully integrated if they make their own CPUs for their PCs like Apple is about to do (and sorta like how they designed their own Xbox processors)... Let's see how that plays out for them and whether this anti-vertical-integration theory holds.


Does this mean we can have more than 16GB in our laptops now?

Also, is "stratechery" named based on the old Celebrity Jeopardy skits on SNL?


I love the article's general quotation about disruption:

... what makes disruption so devastating is the fact that, absent a crisis, it is almost impossible to avoid. Managers are paid to leverage their advantages, not destroy them ...

As other firms innovate, those advantages become obsolete practices, so leveraging them means taking on more risk, not less.


Does anybody know a good technical history of Intel?

I read “Inside Intel” and there was some good stuff, but it seemed too superficial.


I think the most common issue would be neglecting constant values.


> it was already clear that arguably the most important company in Silicon Valley’s history was in trouble: PCs, long Intel’s chief money-maker, were in decline, leaving the company ever more reliant on the sale of high-end chips to data centers;

Where PCs were replaced, they were succeeded by laptops, and for the most part they use Intel processors.


The term “PC” refers to both desktops and laptops. Both are in decline as general purpose computing moves to the smartphone.


I wouldn't overestimate the impact of smartphones. It's just that PCs (both desktops and laptops) don't need upgrades from year to year anymore. You can use your PC from 2012 (an Intel i5-2xxx, if you remember), possibly even farther back, and you'd still be fine for almost all tasks. Gamers and other high-performance users are a niche. The biggest performance gain for these machines is an SSD, not a new PC.

The same will happen to smartphones. You can already see it: manufacturers trying so hard to invent features nobody needs, just to give people an excuse to buy a new device.


That's the point. The so-called "post-PC era" has never arrived. Personally I don't know any adult who doesn't own a computer. We no longer upgrade so often but we use them on a daily basis. Pretty much anyone doing any kind of white-collar work will use a PC (that includes Macs...) Your needs outside of the work environment will vary, but people are still not throwing their laptops away just because they use their phones more often.


Perhaps the post-PC era did not arrive in your immediate bubble; however, I assure you it happened.

There are whole parts of non-first-world societies where laptops are not a thing and everything is done on phones and tablets.


The question is: do new adults (think 18-24 year olds) buy desktops or laptops?


At least one, yes? There's no university student without one, and there's no white-collar worker without one.

Literally nobody is typing up their 10-page essay on their phone.


Tablets + keyboards.


Tablets remain truly mediocre devices, in the case of Android running a mediocre OS.

- garbage multitasking

- slightly laggy typing

- no real ability for multiple apps to work on the same data

- 10" is still a tiny screen and almost as unwieldy as a small laptop; 7" is too tiny yet still not pocket-size

Since a tablet by definition usually does not have a cellular modem, it can't replace your phone; plus the cameras and microphone normally blow.

I find the idea that most people are using a tablet in place of a real computer laughable.


If gaming is an issue: desktop. Otherwise: laptop. Or both; you can get cheap laptops for €300 which will work just fine, for the reasons in the parent.


Heck, I'm a gamer, and I'm still using a rig with an i5-2500K at its core. I've upgraded the graphics card twice since I put the machine together, and put in an SSD to use as a boot drive, but I've never felt a burning need to upgrade the CPU. That's just not where the bottlenecks are anymore, even for games.


> The same will happen to smartphones. You can already see it. Manufacturers trying so hard to invent features nobody needs, just to have an excuse to buy a new device.

Unfortunately, smartphones exist in this world where everyone has far tighter control over hardware/software integration. So it's much harder to run the latest software on an old smartphone than it is to run the latest software on an old PC.

Of course it's not because the hardware is necessarily incapable, but more because everyone assumes they can get away with new market/platform/support rules "because mobile."


As long as the apps from the app store are installable, this is a non-issue for non-technical users.


More like "as general purpose computing dies, replaced by the smartphone."


People are doing everything on smartphones they used to do on "personal computers", the form factor doesn't change that.

As far as extensibility and/or openness of the platform goes, that may define general purpose computing for you but the industry has abundantly shown that it doesn't for most people.


> People are doing everything on smartphones they used to do on "personal computers", the form factor doesn't change that.

I don't think folks are doing programming and graphic design on their phones...


Not even writing text. The interface is just terrible for that.

Obviously a day may come where people plug their phones into docking stations that are equipped with a keyboard and a monitor, but it seems likely that we'll stick to lightweight laptops for a long time.


True, but we've gone from millions of people using PCs to billions using smartphones, and a majority of the people using PCs never used them for programming or graphic design.


There are billions of PCs.


There are certainly more than a billion PCs in existence. I'm skeptical there are much more than a billion PC users.

Right now it takes about 4 years to sell one billion PCs[1]. Considering that includes business sales, a significant chunk of that includes people with multiple PCs.

Compare that to smartphones, which sell well over a billion per year[2].

[1]: https://www.statista.com/statistics/272595/global-shipments-...

[2]: https://www.investopedia.com/news/smartphone-sales-be-flat-y...


You started by claiming a difference of one to several orders of magnitude between PC users and mobile users ("millions vs billions"), which is clearly nonsense.

Now you are using sales as a proxy for users which is nearly as false.

People buy and rapidly dispose of phones for a variety of reasons.

They break, are lost, get stolen. Having a new model is a status symbol. Current models are supported by the manufacturer for only a few years on the Android side. Newer models are significantly better than last years.

PCs easily last 5-7 years. Decreasing revenue from new models annually doesn't imply that the users or the platform is no longer relevant.

Reports of the death of the PC are greatly exaggerated.


A "billion per year" is clearly unsustainable, given the world population. I mean, shouldn't we step back and observe that clearly if they're selling at this rate, some grave defect in human behavior is being exploited to the extent that something will dramatically collapse soon? Also keeping in mind the end of Moore's law as people have mentioned.


I don't know about the other poster's opinion, but mine is that it isn't a personal computer unless I can program it with itself. The ability for anyone to write and distribute software was what made PCs great. Current smartphones fight the idea that the user can own their device and can do whatever they want with it, preferring a walled garden approach that's helpful to consumers, but not personal computing.


I'm also of the opinion that current smartphones are not "personal computers" in the original sense of the term. The power is in the hands of the OS and apps, they are the real users, making the decisions of how to use the computer - while the consumers are just that, consuming programs that other people/companies designed.


It's a disturbing trend, but I see some dim hope. Everywhere across the world the importance of programming education is recognized. The BBC micro:bit is probably the epitome of this, with the aim of the entire population knowing how to program a device at some point in the future.

The people who learn to program in this way will appreciate the ability to control a device and will naturally steer towards platforms allowing them to do so.

Moreover, even Apple is changing these days. Making Swift available is a step towards more ubiquitous app development. And with Swift Playgrounds you can learn to actually program on your device, even though it's not the same as controlling it the way we're used to. It's not unthinkable they make further concessions and at some point iOS devices could be programmable to a certain extent.


Even if they do, it is just a form factor; everyone doing serious work with 21" screens or larger will be using docking stations, and it will be business as usual.


Mobile sales are plateauing now as well. Moore's law has hit them just as hard. It would be foolish for Intel to dump everything to jump into a market just as it matures and peaks.


Is that an absolute decline now, or just relative?

(I'm asking about usage, not sales - I know the latter have fallen off a cliff since Moore's Law topped out.)


https://bgr.com/2016/11/02/internet-usage-desktop-vs-mobile/

Mobile internet usage worldwide blew past desktop usage in October of 2016 and hasn't slowed down since, and even that's just looking at web page views. I'd argue that holistically, smartphones have been the dominant technology platform for many years even before that, since apps are the primary interface to most services on the platform.


Thanks, but that doesn't answer the question. It's showing percentages, not absolute numbers.


This may mean so many things... From "people who use desktops/laptops do it in a work environment and use the Internet less" to "mobile apps use bandwidth more intensively than local apps (which is pretty normal in an office environment)"


Isn't the web the biggest platform still? All platform users use web sites/web apps, even though on mobile the percentages are a bit smaller, for now, because of mobile apps.


I’m not so sure that not choosing to produce ARM based chips for the iPhone was such an obvious miss for Intel. Why would they want their business subject to the whims of a single company over which they exert zero control?

Just as an example, a couple of months ago Apple, seemingly out of the blue, announced they would no longer be using Intel chips in future laptop models, and the markets barely noticed. Were Intel more reliant on PC sales, that would not have been the case.

Perhaps they are way behind AMD with respect to the latest chip designs, but even that is far from proven.

I think this article is attempting to make a company with a very complex history and product lineup sound simple.


- 260,509,000 products with ARM (iPhone + iPad)

- 19,251,000 products with Intel (Macs)

These are the numbers for 2017 from http://investor.apple.com/secfiling.cfm?filingid=320193-17-7...

Apple sells twice as many iPads as they sell Macs, and more than 10x as many iPhones as Macs.

Revenue-wise it's a little bit closer:

- $160,541,000,000 for iPhones + iPads

- $25,850,000,000 for Macs


Intel makes tens to hundreds of dollars for each x86 processor they sell. They would make pennies to dollars manufacturing chips for other companies. If Intel made their own mobile chips they would similarly have to dramatically reduce their enormous margins.

In any discussion about Intel this is the point that really reigns supreme -- Intel was in the enviable position where its primary concern was always the risk of undercutting its own profits. In the long term of course they should move to new, growing markets (e.g. mobile), but from a closer-term perspective they knew that such products, if not seriously crippled, would start chipping into their higher-end processors.

Hence why their mobile processor offerings were built on laughably dated processes, using very obsolete designs. If Intel made them better they knew that someone would figure out a way to drop 32 of them on a motherboard with a shared memory controller, etc.


Well, according to the Innovator's Dilemma, if you're Intel now you throw 1-2 of those billions of dollars of yours into a totally separate startup that you fully own.

That startup is free to make ARM chips or whatever the markets wants. It's better long term if you cannibalize yourself than if other companies do it.


If you do that, as a public company, don't investors come along and say "hey, you really need to spin that off"? I mean, I see there are quite a few companies that just seem to have a grab-bag of random stuff they make, as though they were a mutual fund of different tangentially related businesses. And it certainly occurs that people decide they are more valuable broken up.

From the larger perspective of society in general, why should a company be eternal; why does it need to always have a way to pivot?


> Hence why their mobile processor offerings were built on laughably dated processes, using very obsolete designs. If Intel made them better they knew that someone would figure out a way to drop 32 of them on a motherboard with a shared memory controller, etc.

Interesting... Would you care to share a reference for this? Thanks.


> Hence why their mobile processor offerings were built on laughably dated processes, using very obsolete designs.

I thought the various Airmont-based chips were made on the same 14nm process as everything else they made at the time?


Airmont came after Intel gave up on mobile, and they targeted the premium market (e.g. Microsoft Surface), thinking they could eke out a niche there with their compromised solutions.


Oh, right. Apparently they still made phone SoCs with it, but the vast majority of the push was earlier.


I think you’re missing the point. It’s not about the decision to do ARM mobile chips instead of PC chips, it’s about the decision to go for data center dominance instead of consumer electronics dominance.


They had a mobile-dominance strategy -- it was Atom. They even partnered with Google/Android. (Maybe you're saying the execution was lacking?)


The article points to (if only by implication) x86 not being a great design for low power processing.


>a couple of months ago Apple seemingly out of the blue, announced they would no longer be using Intel chips in future laptop models

Clarification: this was a (likely true) rumor, not an official Apple announcement.


Even if those Apple ARM chips were low-profit, two reasons:

1 - volume over which to amortize process investments;

2 - a chance to be the dominant vendor in a new, albeit lower-profit, market.



