This is stupid, stop trying to milk silicon beyond 11 nm and start using gallium arsenide or indium gallium arsenide. The Fujitsu AP2000 used BiCMOS and GaAs fabrication back in the early 1990s, for crying out loud. Get rid of silicon, it's outdated and needs to be replaced! You have BILLIONS of dollars and get government subsidies, why not take a risk?!
The only problem, of course, is that gallium-based semis are about 100X the cost, which is why no one uses them for anything outside certain categories where the cost is justifiable.
III-V materials can also only result in n-channel HFETs or NPN BJTs. You can't make an active pull-up device, which is required for any VLSI use. Only very high power resistive pull-up or current steering logic is possible.
Price is about more than volume production: the materials are far more difficult to obtain, purify, combine, and handle than silicon, and they frequently involve highly toxic chemicals. Silicon, on the other hand, requires an enormous amount of energy but doesn't involve toxic chemical processing. Take a look at the wiki article and the processes to produce GaAs crystals, it's not simple. https://en.wikipedia.org/wiki/Gallium_arsenide
Let me summarize silicon's properties from Wikipedia...
- Abundant and cheap
- Has economies of scales on its side
- Very stable and can be grown to large diameters with good yields
- Very good thermal conductor for densely packed transistors
- Its native oxide works very well as insulator with excellent electrical properties
- Has much higher hole mobility which enables CMOS transistors with much lower power consumption
- Is a pure element, avoiding the problems of stoichiometric imbalance and thermal unmixing of GaAs
- It has a nearly perfect lattice, impurity density is very low and allows very small structures to be built
And Gallium Arsenide drawbacks include...
- Naturally, GaAs surface cannot withstand the high temperature needed for diffusion
- GaAs does not have a native oxide, does not easily support a stable adherent insulating layer, and does not possess the dielectric strength or surface-passivating qualities of the Si-SiO2 interface
- Because they lack a fast CMOS structure, GaAs circuits must use logic styles which have much higher power consumption; this has made GaAs logic circuits unable to compete with silicon logic circuits
- Problems of stoichiometric imbalance and thermal unmixing
- GaAs has a very high impurity density which makes it difficult to build integrated circuits with small structures, so the 500 nm process is a common process for GaAs
So you know, usually statements like "This multitrillion dollar industry is wrong, what they do is stupid, they should do X" coming from random dudes on the internet are bullshit.
"So you know, usually statements like "This multitrillion dollar industry is wrong, what they do is stupid, they should do X" coming from random dudes on the internet are bullshit. "
yeah but Dear Leader has done that for a lifetime, and look where it got him. :)
And your argument is wrong. Economy of scale is one factor that determines cost, but it is not the only factor.
If economy of scale were the only determinant of cost, then we would expect every product to be produced by a single monopolist, with a single factory. Any time there were two competing factories, they could reduce their costs by building a single factory of twice the size. The result would be that every product on earth would be produced in a single factory, owned by a single company.
But that's obviously not the case in the real world.
" Price is only an outcome due to lack of volume production"
absolutely not, in this case. the cost is in quantities of raw materials. silicon is plentiful and cheap. the rest are far from it. and, as other comments have described, not technically superior.
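To make the cost decomposition in the comments above concrete, here is a minimal sketch in Python; the fixed-cost and per-die figures are made-up placeholders, not real fab economics:

```python
# Average cost per die = amortized fixed cost + per-die material/processing cost.
# All numbers are hypothetical, chosen only to illustrate the argument.
def average_cost(fixed_cost, per_die_cost, volume):
    return fixed_cost / volume + per_die_cost

FIXED = 5_000_000_000                 # assumed fab + R&D outlay, $
SI_PER_DIE, GAAS_PER_DIE = 10, 1_000  # assumed per-die material/processing cost, $

for volume in (10**6, 10**8, 10**10):
    si = average_cost(FIXED, SI_PER_DIE, volume)
    gaas = average_cost(FIXED, GAAS_PER_DIE, volume)
    print(f"volume {volume:>14,}: Si ~${si:,.2f}/die, GaAs ~${gaas:,.2f}/die")
```

At low volume the amortized fixed cost dominates and scale helps both materials equally; at high volume each curve bottoms out at its per-die floor, so no amount of volume closes a large raw-material cost gap.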
It doesn't matter, move away from silicon, this is getting ridiculous. You can't keep milking this technology for decades and expect it to keep scaling, and these multi-billion-dollar technology conglomerates sure can afford to switch to InGaAs or GaAs and still make countless billions more, all while not paying taxes and receiving massive government subsidies.
Moving away from silicon has been an industry goal for decades now. When anyone manages to find a cost-effective method of doing so they will become wildly rich. To date, nobody has pulled that off. Not for lack of effort.
"Moving away from silicon has been an industry goal for decades now. "
it's worth noting that the original transistors were made of germanium. hard to find, hard to use. the engineers figured out how to use abundant silicon (though, not just sand scooped up from the beach). the laws of physics can't be flummoxed.
Evidence seems pretty strong that you can in fact milk silicon tech for decades, since that has already happened. And other companies seem to feel pretty confident about milking it for a couple more decades at least. I'm pretty sure if GaAs or some other technology was easy to make profitable, someone would rush to it and crush the competition. Intel has billions in the bank and knows that Samsung is a mortal threat if they don't catch up. Pretty sure they would leapfrog to another technology if they could have an edge on Samsung of even 6 months.
What do you WANT from your logic process? THAT is the question -- figure that out, then we can decide if GaAs or other III-V materials make sense.
What's holding back the performance of existing CPUs? There are three contenders (which one bites first depends on exactly your target market):
- power density (too much power generated and you can't get enough current in, and you can't cool it fast enough)
- metal speed (also RC): you're limited by how fast wires can transport info from one part of the chip to another
- transistor speed
GaAs ONLY helps with transistor speed. In the process it goes berserk with power and doesn't help with wire speed.
Beyond that the primary reason our current CPUs don't hit their potential is waiting on memory, which halves (or more) their IPC compared to their full potential. All the ways of dealing with this are built on massive transistor budgets (starting with huge caches). But if you want small transistors, there's no magic in GaAs that makes this easier. It's the exact same lithography, with the exact same difficulties, whether it's multi-patterning or EUV.
It's like a car company wants to make a better car, and your answer is "we need to pour ALL our resources into making better paint". WTF? Does better paint make the car faster? Does it improve power efficiency? Does it solve any actual engineering problem?
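A rough way to quantify the "waiting on memory" point above, using the standard stall-cycle model; the miss rate and penalty below are assumed values, not measurements:

```python
# Effective CPI = base CPI + (cache misses per instruction) * (miss penalty in cycles).
cpi_base = 0.25            # assume a wide core that could otherwise retire 4 instructions/cycle
misses_per_instr = 0.005   # assume 5 last-level-cache misses per 1000 instructions
miss_penalty = 60          # assume ~60 core cycles to fetch from DRAM

cpi_eff = cpi_base + misses_per_instr * miss_penalty
print(f"ideal IPC {1 / cpi_base:.1f} -> effective IPC {1 / cpi_eff:.2f} "
      f"({cpi_base / cpi_eff:.0%} of potential)")
```

With these assumptions the core loses more than half its potential throughput to memory stalls, which is why the big caches and other latency-hiding machinery eat so much of the transistor budget.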
More companies would do better by having their own fabs, not less. Being vertically integrated via either merger, acquisition, or organic means is a much better way to remain competitive than solely remaining fabless and outsourcing. Intel is only as large as they are in the semiconductor world because they did not so foolishly sell off all their fabrication facilities like AMD did.
"More companies would do better by having their own fabs, not less. "
I've always been puzzled about how much scale there really is. ASML (and who else?) supply the fundamental equipment that all fabs use. did ASML/whoever figure out, say, EUV, and supply all the fabs; or did Samsung, say, figure it out and ASML/whoever built the machines to Samsung/whoever's spec?? from the pictures of fabs I've seen, it looks very much like some number of sequential, discrete work-stations, repeated X times. it's not like a car assembly line, where one line makes all the cars.
Yeah I noticed that too. The first photo makes it look like there are a bunch of semi-mobile machines arranged in groups forming the stations. Then the next photo might be more of an assembly line - at least it has a bunch of overhead tracks for things to move along, not sure what, maybe transporting items around.
It wasn't foolish for AMD to sell: TSMC is a lot cheaper for them than GlobalFoundries while being on a better node, and AMD didn't have the money to invest in smaller nodes.
By going with TSMC they are at least mostly competitive on the node side of their cards.
You're either trolling or a fool to suggest that selling the fabs was a bad idea for AMD. For a company of that size they're nothing but a millstone. They limited their output capacity when AMD actually had good processors to sell, and then cost them billions when they didn't have a competitive product.
The stupid bit was the agreement to produce a minimum number of wafer starts with GF after they sold the fabs. That bit them hard in the financials.
Smart guy over here figured out the secret to business success in semiconductors if you ignore... *checks notes* ...the need for decades of R&D and many 10s of billions of dollars to acquire companies to supply goods, design products, and leverage talent.
LOLOL. Yeah, AMD is doing horrible since they sold off their outdated fabs and are paying less for superior nodes now. Their stocks have just plummeted in the last few years. The investors are super pissed that their $3 stocks have gone up to $20 since they don't run their own fabs (which, btw, are themselves on the way down the drain after sinking billions into 7nm R&D and scrapping it to focus on old nodes). Look at AMD's balance sheets, then look at TSMC or Samsung. It's 1B vs FIVE HUNDRED BILLION cash on hand. AMD is a tiny designer of fancy chips compared to massive semiconductor firms that produce chips for the whole world. Even Intel is a speck compared to these companies, and more than a few analysts feel Intel's best future move is also letting TSMC use their much deeper pockets to compete in bleeding edge node wars. (If you haven't noticed, Intel is getting the shit kicked out of them in node advancement.)
The savings from vertical integration is absolutely nothing compared to the R&D costs and ramping up manufacturing facilities for each new node. Even if the vertical integration made the wafers literally free, it would forever run AMD in the red trying to pay for the R&D and new facilities while selling only a tiny fraction of the chips required to justify the cost of each node advancement. AMD would have to sell over 100X the amount of chips they did in 2018 to maintain a fab that could keep up with TSMC and Samsung. TSMC can do it since they sell a staggering amount of chips compared to either AMD or Intel. AMD accounts for under 1% of TSMC's contracted chips produced. In the tech enthusiast world, AMD is very exciting since they play such a large role in gaming hardware design, but in the business world, they're a little ant.
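A back-of-envelope version of the volume argument above; every figure is an illustrative assumption, not an AMD or TSMC number:

```python
# How much fab capex each wafer has to carry, as a function of how many wafers
# you can push through the plant. Figures are hypothetical placeholders.
fab_capex = 15_000_000_000     # assumed cost to build a leading-edge fab, $
amortization_years = 5         # assumed useful life before the next node

def capex_per_wafer(wafers_per_month):
    return fab_capex / (wafers_per_month * 12 * amortization_years)

for wpm in (5_000, 50_000, 100_000):   # small designer vs. pure-play foundry scale
    print(f"{wpm:>7,} wafers/month -> ~${capex_per_wafer(wpm):,.0f} of capex per wafer")
```

A designer whose own demand fills only a small fraction of a fab carries an order of magnitude more capex per wafer than a foundry that aggregates many customers.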
It turns out that at very small dimensions compound semiconductors such as indium gallium arsenide do not offer any benefits over silicon. It's fundamental, in the physics, quite apart from any economic considerations.
Will IBMs huge mainframe chips be one of the specialized products they'll still be supporting, or is big blue being unceremoniously dumped in a ditch and left to scramble to migrate their chip designs to either a Samsung or TSMC process?
according to the wiki (no, I didn't run the links :) ) the Z14 is a 6.1 billion transistor chip. not, by any means, the highest transistor count among cpus. not the biggest, but big, by area either. here: https://en.wikipedia.org/wiki/Transistor_count#Mic...
Look at what Intel has done with 14nm, +, ++, etc. There's a lot of optimization left in leading edge hardware for the obvious reason that you're always time constrained -- you do things the safe way, you drop ideas that don't fit the schedule, and so on. IBM can certainly cruise for a few years (with both POWER and z) on just optimizing what they have at the µArchitecture and system level, sticking with GloFo's 14nm (which will doubtless pick up small improvements at the process level each year); looking at how they can exploit advanced packaging (ie things like interposers or EMIB) and advanced memories (MRAM, Nantero, Optane, ...).
That's not an ideal on-going situation, but it's absolutely feasible for a few years as the market sorts itself out and GloFo figures out its future. Maybe GloFo partners with UMC? Maybe they license Samsung's 7nm as soon as ASML can make enough machines? Maybe they conclude that the economics (for their set of clients) works best not by standing still at 14nm, but if they are always about three years behind the absolute leading edge?
If this means that AMD will no longer pay penalties to fab their chips at competent leading-edge foundries, then this is fantastic news, and Global Foundries can go climb into the rubbish bin of silicon history.
Didn't read the article, did you? GloFo is abandoning the market because they don't believe they can compete economically against Samsung and TSMC. Basically the EUV fabs cost $20 billion, and to afford that you need to produce more volume than GloFo physically can. As a result they are going to start fabbing specialized chips (probably ASICs and others) on their existing process and let TSMC and Samsung have the market.
No, the stated reason is "economics". It is certainly true that one day there will probably just be the one true fab, so in that sense this is inevitable. But, let's be honest here as well, the article also says they are behind schedule. Which is not not a factor. Worse, they use marketing nm, so they are behind even Intel, which is already way late.
I don't agree in this case; both Samsung and TSMC are in volume production on 7nm. Though I have no doubt it's difficult, the logic provided by GloFo for why they pulled the plug is convincing IMO. Fab costs continue to rise, well beyond the predictions of even 2 years ago. IIRC they were predicting about 8 billion for a 7nm fab and now we're looking at close to 20; that's a significant change and dramatically alters the accounting. I think GloFo, being a bit behind, decided to do a cost analysis, saw how out of whack they were, and realized it was time to stop, as proceeding could drag them into bankruptcy. This is essentially what they said.
In silicon manufacturing, technology = economy. I'm sure there are very super expensive processes in a lab somewhere that can produce awesome silicon at an equally awesome price. But that is not what we are talking about here: either your technology can produce wafers of dies at a cheaper cost or your technology is not up to par. Also, if there is no technological reason, then they would sell the process rights, correct?
Take it as you will, but no one IN the semicon industry buys the reason GloFo is selling. While economics are certainly a factor (leading edge is expensive), there are most likely technical issues which tipped the scale solidly towards the decision to kill R&D.
IF THEY only have revenues of $6 billion, then it becomes very, very hard to afford a fab costing $20B. Mubadala didn't want to keep feeding the cashless cow as oil is losing its value. Is this a market that China is going to get into? (long-term)
Interesting story, but I believe a bit more skepticism in GloFo's official line is in order. To me, only two scenarios make sense:
1. They just couldn't get their 7 nm process up-and-running at any volume and yield that would be at least somewhat competitive with TSMC and Samsung by Q1 2018, and decided to not throw good money after billions already sunk into this.
2. They could have gotten the process working at yields and volumes that would have made sense in theory, but couldn't find the customers/orders anywhere near the volume to at least break even here.
If it's the second reason (tech actually works and is sort-of on target time-wise), then why not try to wrap the IP, the personnel (expertise) and the expensive equipment up in one package and spin the "7 nm subsidiary" off as its own entity, just as they are doing with their ASIC line?
And here's a really wild idea, again all based on the tech actually working and being basically ready: maybe Intel would like to pick GloFo's "7 nm bones" clean and get the goodies (IP, expertise, equipment) for cheap. Chipzilla could really use a below-10 nm process right now, so provided it does work, who knows? Also, that solution would keep at least one 7 nm fab in the US, which might be of interest from a national security POV.
Damn typos. I meant 2019 for point 1. "1. They just couldn't get their 7 nm process up-and-running at any volume and yield that would be at least somewhat competitive with TSMC and Samsung by Q1 2019,..."
As the article clearly states, the GloFo 7nm is behind the TSMC 7nm, which itself is equivalent to Intel's 10nm. So the IP might not be very interesting to Intel, although they've gone in a slightly different direction.
I actually agree with the notion that x nm doesn't equal x nm when comparing different processes. However, Intel's 10 nm process has (continues to have?) had many teething problems, and scale-up has been delayed and apparently almost painful. My thought was that this is a potential chance for Intel to buy valuable IP, expertise and basically new state-of-the-art equipment at bargain basement prices. I know that they like to do everything as much in-house as possible, but Intel has made large acquisitions in recent years if they thought they would add to their bottom line.
1) is the most likely scenario. Everyone is having problems, as EUV is late for everyone and the alternative is quad patterning, which adds cost and slows production. It is a bag of hurt. 5 nm was looking like it could only be developed by masochists.
2) is not improbable, as GF was never that aggressive with their roadmap. The problem is that the other players hit delays, so in reality GF was likely never going to be that far behind. Perception is everything. Spinning off 7 nm production would still require customers to come forward. This is a clear chicken-and-egg scenario where the customers decided to roost elsewhere.
As for other players purchasing GF, it is unlikely as companies like Intel actually have an excess of capacity for logic right now (DRAM and flash are slightly different, being more highly utilized). The exception as noted would be keeping the US fabs in the Trusted Foundry Program. Certain stipulations had to be adhered to for GF to inherit those as part of the purchase of IBM's fabs. As such, there are likely conditions that still exist for the sale of those fabs. If another member of the Trusted Foundry Program just wanted those fabs, it would be far easier to change ownership.
Rumours last year said that the previous CEO, Mr. Sanjay Jha, and his team wanted to stop investing in 7 nm because the IBM-led management team in Malta was behind and struggling. The current management team, it seems, was against 22FDX but pushed for 7nm. Now, after failing to deliver, the same management team, under pressure from their investors and AMD, have given up and seem to be calling it a strategy shift. Billions more wasted is what is said. GF needs new management, not dead-weight IBM'ers that are slow to move, if at all.
The article clearly paints it as your number 2 scenario. When they did the math on buildout and customers they realized they'd lose money, so they bagged the whole thing and decided to focus on their current process.
Keep in mind there is still a ton of stuff fabbed on older process nodes; it can in fact be quite lucrative. With AMD unbound on manufacturing, it also frees GloFo from having to spend billions building and upgrading fabs. Given the rising costs in fabrication as they move up each node, the market is likely to consolidate further, and we could even see TSMC or Samsung drop out of the race due to the costs. Fab prices have gotten to the point of breaking budgets for even huge companies; there are very few companies that can afford to build a $20 billion EUV fab.
Actually, you can, if you have a fab or two that can house the equipment and the people (the expertise) who are now being "made redundant", as the British say. Processes have been moved ("migrated") from one fab to another in the past, although it can be a major pain in the derriere to do so.
I wonder how this plays out for the Trusted Foundry programs that GF inherited from IBM. They allegedly had several contracts open with three letter agencies for specialized parts. Outside of that, they still provide some radiation hardened nodes as well (though far from state of the art). I'm not sure how profitable, if at all, things would be catering to those types of customers. While not high in volume, those do produce a nice margin. While the number of bleeding edge nodes within the Trusted Foundry program is slim, there are plenty of players able to fab using older nodes. As the rest of the industry marches toward 7 nm and beyond, those other small time fabs will catch up.
While IBM's contract with GF runs out at the end of this year, GF did develop a 12 nm FD-SOI process that looked like it was being used for a POWER9+ revision. Still those products would have been initially planned for a late 2019 release date.
That's one of the open questions here. 80% of TSMC's 7 nm capacity is contractually bound for Apple's needs, leaving precious few wafers for everybody else. eetimes published an interesting article on this a few days ago. Basically, players like Qualcomm and Huawei (HiSilicon) will likely have to get chummy with Sammy if they want their Snapdragon 855 in 7 nm in usable quantities, because Samsung is now the only other fab operator supposedly close or ready for 7 nm. I do wonder where all the other 7 nm silicon for EPYCs, Ryzens, and assorted GPUs is supposed to come from now with GloFo out. AFAIK, TSMC is currently the only player with a 7 nm fab fully up-and-running, and that fab is very busy (and contractually obligated) to make all the A12 chips it can. So, this pull-back by GloFo is great news for Samsung; wouldn't surprise me if their stock just jumped a bit.
Good news for Samsung, sure. However, they are only an option for AMD if they can get the yields up. Historically, Samsung's process has been competitive with other contract fabs. Though (if I recall correctly), they've struggled with larger chips. ARM chips targeted at phone should be good (they need that for their own chips), but larger ARM chips targeted at servers and larger GPUs have historically been problematic. It is uncertain if large, high performance x86 chips would work out well. If Apple moved a large portion of their orders to Samsung, then TSMC might have the capacity to service all these larger chips. However, with so much of the TSMC capacity tied up by Apple, AMD will likely need to prioritize either CPUs or GPUs. Given recent history, I suspect they will favor server CPU production and probably back down on high end GPUs. Perhaps we'll get another round of smaller mid-range GPUs fabricated at Samsung. Unless Samsung suddenly develops proficiency for fabricating larger chips or TSMC suddenly frees up significant capacity, I suspect this bodes very poorly for competition in high end GPUs.
I think they are saying that the company isn't big enough to pay for the investment to continue process development at 7 nm and below. You have to make enough wafers to spread the development cost around, and they are just too small, make too few wafers. I worked at companies that have made this same decision, it's called a going out of business strategy. Any customers that might need to migrate to 7 nm and smaller will move elsewhere, usually they are the more profitable customers.
That’s the sound of Moore’s law colliding with physics.
A few years ago, an ex-Intel boffin held a talk on chipmaking at the end of Moore's law. He predicted it would come in 2016-2020 and around 10nm.
That’s really it then. We’ll get another node at 7nm (eventually) but after that? Slow, incremental improvement.
Few people understand how big of a deal this is, but think about this. All of us grew up in a world where the power and speed of computers and electronics would double roughly every two years.
This exponential growth was a virtual certainty and it fed the biggest prosperity engine in human history. There aren't many fields and businesses that didn't in some way benefit from it.
And that’s almost over. There’s a few years left where it almost seems like nothing has changed, but it has.
Some of us have children. Those children won’t live in a world with electronics doubling in speed and capability every few years.
The next generation of video games for them will look pretty much like the last one, unless you know where to look.
For them, it’ll be nothing but slow, incremental improvement year after year: A 2% improvement here. A 1% improvement there. The way it was before the transistor.
The scariest thing is, what will happen to the economy. Trillions of dollars and billions of man hours have been spent in the shadow of Moore’s law the past decades. What will happen, once there isn’t a reason to get a new TV every 4 years? If computers only get a few percent faster every year, will computers become a thing you buy once a decade?
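The gap between the two growth regimes described above is easy to put numbers on (purely illustrative rates):

```python
# Doubling every two years vs. low single-digit annual improvement.
years = 10
moores_law = 2 ** (years / 2)   # 2x every 2 years
incremental = 1.02 ** years     # assume 2% per year

print(f"After {years} years: ~{moores_law:.0f}x (doubling every 2 years) "
      f"vs ~{incremental:.2f}x (2% per year)")
# roughly 32x versus about 1.22x over a decade
```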
I don't think the future is quite as grim quite yet. After 7nm there is still EUV for a few more nodes (probably 5 and 3nm, and maybe even 1nm). That's probably over a decade right there.
Maybe in that time, someone will finally figure out an alternative. Carbon-based solutions have been mysteriously hyped for a long time. Graphene transistors? Carbon nanotubes?
the snark is well done. but... a static socio-economic environment is otherwise called "The Dark Ages". but this one is forever. if Social Darwinism is the paradigm going into the permanent Dark Age, said children's biggest worry won't be the missing electronics that double in speed every few years.
There never was a need to buy a new TV every 4 years. The crazy consumerism making people replace their perfectly working electronics every 1-2 years is a phenomenon almost exclusively happening in the USA and it cannot end soon enough. Believe it or not, life will keep going on after the death of Moore's law.
I think most people just keep their TVs until they die or the provider does something to make them obsolete. In my case, in 2008 my cable provider decided to kill analog over cable completely, so I had to throw out that CRT TV; otherwise I'd still have it for a few more years, but that was a reasonable switch. The flat panel that replaced the CRT is still working in the bedroom, whilst we bought a new TV for the living room sometime in 2014, and I intend to keep it until it dies, or the standards change and it becomes useless, but that likely won't be happening before it actually dies by itself.
It’s not like Intel is running a furniture factory, where anyone can get their table built, if IKEA is taking up all of the production at the usual plant.
(This is also why Intel's last attempt at "renting out fab space" went so poorly.)
Intel's fabrication methods are tied into their design methods. They're using their own tools and even design things somewhat differently from other companies. (Even AMD.)
Intel's fabs and designers are used to a process that's much more reliant on full-custom hand-crafted circuits which are intimately tied to the process.
Pretty much all of their IP, including the stuff for fabrication is designed and developed in-house, and is sometimes CPU specific.
Compare that to the rest of the industry that mostly uses common tools, and you can see the problem.
Going from TSMC to Samsung with a design might mean a delay of a few months, as you rejig the design to be built on Samsung’s process and tools.
Going from TSMC to Intel would mean a major delay as you redo the design for a process that was designed specifically for Intel's needs and internal tools. If the chip is really complex, in some cases it would be easier and faster just to junk your design and start from close to scratch.
Very interesting. To me, this represents the canary in the silicon mine. Though, I did not expect it as soon as this. Maybe the other foundries will see these changes for what they are- submitting to physics. No sense in expensive pursuits that hasten arriving at the end of the road.
I think the writing's been on the wall for a while. Fab prices have been far exceeding estimates; almost exponential growth in costs is simply unsustainable.
There is an industry view that mature markets evolve into 3 main players. (with occasional disruptions & consolidations) There is a big player (e.g., 70% market share) ... and 2nd player (20-30% market share) ... and a niche/3rd player.
I think the semiconductor market is evolving this way also. (my guess is Samsung will evolve to be the 2nd player -- but we'll see)
This is what the end of Moore’s Law and exponential growth in computing looks like.
Foundries dropping out, not because they don’t see a way forward, or can’t figure out how to get to the next node, but because it’s too expensive to move on.
that's only half the story. the other half is product demand. everybody from user device producers to the materials miners can only thrive if there's sufficient unmet demand to require expanding output. for chip makers, whatever happened to the >300mm wafer?? never happened. same will happen with smaller nodes: while they *could* make more chips with smaller nodes, even on 300, unless there's demand for more chips, it won't happen.
that gets us into macroeconomics, which is to say the disappearing middle class. the 1% isn't (and never has been) growing fast enough to absorb higher cost output. not enough output to drive down average cost. the bean counters care only about that. the current bowing to Social Darwinism from certain places keeps a lid on demand.
The books will look better in the short term but once 7nm becomes mainstream, it's not hard to imagine what will happen to Global Foundries. 90nm was the first sub-micron fabrication technology. Imagine what would have happened if a company says "we are already sub-micron, let's stop investing in 65nm and focus on specialized manufacturing". Who still uses 90nm now?
Or they could have stayed open but not move beyond 90nm. Sooner or later they would have recouped the investment and become profitable.
There is still a huge market out there for older foundries that make ICs bigger than 60nm. And there always will be.
Out of the hundreds (or thousands) of ICs we all come in contact with every day, it’s only a handful that really benefit from being manufactured as small as possible.
For the vast majority, 90nm or 22nm doesn’t matter, except in terms of cost, and the old 90nm fabs can make them a lot cheaper than any newer node.
Out of the entire worldwide IC market, only 40% is 40nm and smaller.
The other 60% of the market are all older, bigger nodes. Old 90nm or 130nm tools that have recouped the huge investment in them years ago, and are virtual money-printing machines now.
To answer your question, 90nm made up 6% of sales from pure play foundries last year. And that’ll probably only grow in the coming years, since 130nm sales made up 7%. 14% of sales were of IC that were bigger than 130nm but smaller than 180nm.
And don’t forget btw, that it’s probably a smaller risk to stay on 12-14nm (or heck: 14-22nm) than to invest billlions of dollars in a new node that might never become profitable.
GF was facing a huge risk. The tens of billions of dollars that a new 7nm fab would cost is also a virtual guarantee that it would never be cost effective against older, cheaper nodes where a huge part of the market is.
How much of those billions had they already invested though? At this point, I'd expect that over 95% of the funds were already committed and they are actually not saving too much.
Sure - they would have lost more money on initial ramp of 7nm - but would they then gain more back over the next 20 years by having 7nm to offer clients, and stopping development of any smaller nodes beyond 7nm?
On the other hand, (assuming things were coming along as well as they say) they're not far off from their targeted launch date and most of the research costs and initial fabrication capability costs (and associated equipment) are already sunk. One might make the case that they could have recouped some of those losses if they had launched the first 7nm process and ceased development after that. As it is, they have two extremely expensive ASML Twinscan NXE devices already installed that they no longer have use for (re-purposing them may or may not be practical). This all seems like they gave up a bird in the hand for two in the bush.
The biggest factor in the cancellation is probably the cost escalation. If the 7nm fab had come in at the $8 billion predicted a few years ago they probably would have gone ahead, but when EUV costs went crazy the resulting fab costs became ridiculous. There is no conceivable way GloFo could have built a fab costing $20 billion. They didn't have the revenue or financing for such a cost. At $8 billion the fab was within their spending with revenues of $6 billion or so, but a capital expense that's 4 times your annual revenue? I doubt there's a company in existence that could afford that without absolutely guaranteed returns.
This is the consequence of the EUV price increases and those costs weren't known until very recently.
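The capex-to-revenue arithmetic behind that comment, using the $20B fab cost and ~$6B annual revenue figures quoted above; the straight-line depreciation period is an assumption:

```python
fab_cost = 20e9            # 7nm EUV fab cost quoted above
annual_revenue = 6e9       # GloFo annual revenue quoted above
depreciation_years = 5     # assumed useful life before the next node

print(f"Fab cost is {fab_cost / annual_revenue:.1f}x annual revenue; "
      f"straight-line depreciation alone would consume "
      f"{fab_cost / depreciation_years / annual_revenue:.0%} of each year's revenue.")
```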
Plenty of people still use 65nm. Besides, 14/12nm is very competitive, and 5/3nm won't be much different from 7nm; we are talking about 3 steps before the end of the road here.
90nm was the first sub-micron fabrication technology? You're forgetting about 700 nm, 500 nm, 350 nm, 250 nm, 180 nm, and 130 nm. It's been a long road, and many of these older nodes are still in volume production.
Well...the former IBM fab (now GF) in Vermont is just that. It's a 200mm fab that never went below 90nm. Instead they diversified their offerings, found a niche space in RF, and have been running close to fully loaded for years now (sometimes a little under, sometimes over). While I don't think this move is smart in the long term, there are plenty of chips to be made on older technologies. I think you would be surprised how many 130nm chips there are in current cell phones. People forget there is more than just a processor in there.
Well that seems to hang IBM out to dry. On the other hand I guess it releases them from whatever wafer agreement they have with GF. Maybe IBM will sell their Power CPU business.
So AMD has already done the work of porting the Zen 2 design to the TSMC process. The only question is whether TSMC has the capacity to produce as many chips as AMD can sell.
We are told that the Wafer Supply Agreement is being renegotiated rather than abandoned. The point of the WSA, as I understand it, was to commit AMD to buying some minimum number of chips each year in order to allow Global Foundries to recover the cost of developing new nodes. Since GF is no longer going to be developing new nodes or processes for AMD, it would make sense to dump the WSA. Perhaps AMD wants to have a WSA in effect for another year or so in order to have GF commit to producing a sufficient number of 14nm and 12nm chips for AMD to meet demand until AMD can switch most of its product line over to 7nm.
Zen 2 is in no trouble. Vega, most likely, and the APUs were planned to use GlobalFoundries, so those may be delayed one or two years. Or they will stay on 14nm for some years. No big deal.
It kind of is? Heat and power use are bad in themselves and limit the frequency you can reach; die size also relates closely to cost. For APUs, you don't need more cores, but increasing the number of compute and rendering units could be the difference between reaching 4K or not.
Sure, you can make that die on 14nm, but you will pay for it one way or another. Maybe thanks to size or power it doesn't fit into an AM4 socket at all, and you need to go to TR4.
To all the people calling for a move away from Si - Tell that to Intel, Samsung and TSMC, as those three will be the ones leading development of new nodes. For the vast majority of tech products, we don't need those smaller/faster nodes. Think IoT and normal household tech. The bulk of tech. The need for computing power in a single chip isn't huge for that market. For that market we need a reliable, efficient, profitable and cheap node. 40nm, 22nm and 14nm can do just that. GF fabs them all. Smart move.
Smart short term move, but increasing the chances they'll fade into oblivion as their tech can't compete in the longer term. 7nm will be extremely appealing to IoT in 5-10 years once it's reasonably cheap.
Doesn't really sound like they had a choice though.
Just because they are focusing on non-bleeding edge nodes doesn't mean they won't revisit 7nm when it is no longer bleeding edge. 7nm fabrication will likely be cheaper by the time 3nm is ramping up. At the very least, they could skip DUV with quadruple masking and go straight to EUV allowing them to make more wafer starts with the same amount of production space.
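A sketch of the exposure-count arithmetic behind that point; the layer count is hypothetical, and litho-etch-style quadruple patterning is counted as four exposures per critical layer, which is a simplification:

```python
# Rough scanner-pass accounting for the critical layers of a hypothetical process.
critical_layers = 10
passes_quad_duv = 4 * critical_layers   # ~4 exposures per layer with quadruple patterning
passes_euv = 1 * critical_layers        # single EUV exposure per critical layer

print(f"Critical-layer exposures per wafer: {passes_quad_duv} (quad DUV) vs {passes_euv} (EUV), "
      f"a {1 - passes_euv / passes_quad_duv:.0%} reduction in litho passes")
```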
Right. They don't have a choice. It is quite puzzling that they are withdrawing this late. I think there was a change with their competitors: it seems that TSMC must have been giving better prices for contracts. It also seems that TSMC can serve all customers as long as they are willing to wait. In short, GF lost some potential customers, making the latest node not worth it.
It is sad to read this news - i worked at Fab 8 for 3 years and know the struggles involved in developing bleeding edge processes first hand, especially at the brutal timelines that are demanded from the market. Hopefully people affected by this action manage to find work elsewhere.
Good points by the commenters, but I think many miss the main point. Yes, we all know and have known how expensive new nodes are to develop. And we all have known how crazy it can be to chase Moore's Law. BUT, so did Global. They must have had a business plan when they launched 7nm two years ago. They knew how much capital it would take, how much it would cost to make a wafer, and how much revenue they could make. They spent billions of dollars on development and tools like EUV over the last two years. According to their press releases they should be ready for production this year. So why do you cancel it now??? Something doesn't add up. Think of how much it must have cost to outfit the fab for EUV tools alone. Yes, there is a lot of demand for older nodes like 28nm. But 28nm was the end of the line for planar devices, and finfet arrival was delayed several years. In contrast, 14nm is first generation finfet and probably not a good place to park a fab. So there must be more to this story than just "we woke up and discovered how expensive these new nodes are". Something is very amiss at Global to get this far down the road, then cancel.
Hardly the best managed company out there, and not exactly known for their sterling leadership and flawless execution.
They also had huge problems transitioning to previous nodes.
Having said that, it’s kinda like a gambler who already lost his paycheck, and is about to throw the deed to his house on the table.
YES, GF already lost 5-7 billion dollars researching 7nm, investing in EUV tools etc. But continuing the move to 7nm would have cost them an additional 10-20 billion dollars.
And that investment might never have paid for itself, since GF in 5 years might have been stuck with a 7 nm fab that’s too slow for the leading edge (Apple, Qualcomm etc.) but too expensive for everyone else compared with a cheap “good enough” 14nm or 22nm fab.
I used to work for IBM Research, though not in the part they sold to Global Foundries, but in the part where they invented the new wrinkles to make better transistors and such. I finished my doctorate in 2012. It was just about a year after I finished that the deal with GF was announced. When this happened I was not sure how much longer IBM would invest in the physical science research for such things. They promised five years I believe. But it made me think about the business case for transistors and I concluded there was not much of a future left.
Simply put, quantum effects and statistical effects hurt you more and more as you go smaller, and costs blow up too. We can debate the end of silicon scaling till we are blue in the face. Will it be 7 nm or 5 nm, or will it be 10-12-14 or something else? It doesn't matter: costs go up, and amortizing your cost is the better business play at some point.
Can we research something to replace Silicon? Yes. But silicon has 40+ years of investment, you are not likely to invent a technology to overtake it in a few years or for a few dollars. Do you choose to invest in research or driving down the cost of building 14 nm fabs?
I thought there would be a break point in 5-6 years where the CEOs would have to choose: amortize, or invest a massive amount in researching something totally new. I fully expected them to choose the former. GF just did. Will Intel or Samsung choose differently? I doubt it. Which means a big brick wall is coming. Mr. Moore, we have reached the end.
IBM paid Global Foundries $1.5 billion to take over their foundries. They also gave them patents. In return, Global Foundries said they would provide IBM 10 nm CPUs. How are Global Foundries not in breach of that agreement? AMD is obligated to buy a certain number of CPUs from Global Foundries each year. If they don't buy the minimum amount, they have to pay anyway. But what if Global Foundries is producing out-of-date products? Would AMD have signed that agreement knowing GF was going to just give up on research suddenly? Not only is this decision a waste of the research and development they have already put into 7 nm, it also appears to be dishonest and underhanded to IBM and AMD. I hope they are compensating IBM and AMD for going back on their word.
Looking back at previous announcements I am now under the impression that AMD and most likely IBM have known this was coming a long ways off so they could adjust their strategies as needed.
As to the agreements in place, if AMD and/or IBM were getting screwed over due to this change I am sure a lawsuit would have already been announced.
What I find interesting is that their 7nm process is more or less developed, and surely that has some value to someone that has deep pockets and wants to be even more vertically integrated, like Apple. Yes, it's a huge investment, but does the AnandTech readership think there is a chance that Apple would be interested in getting into the fab business?
Apple tied itself to TSMC (and vice versa). In return for Apple booking huge quantities and making TSMC their exclusive supplier, TSMC gave Apple first right of refusal for their 7 nm capacity. They both took a risk, but that deal gave TSMC what GloFo was missing: a full order book for 7 nm silicon with the attached large, guaranteed revenue stream (many billions of $$$) that made TSMC's investing the necessary multi-billion $$$ into 7 nm tech possible. Right now, I simply don't know any other (fabless) company except Apple that would (or could) sign a purchase order in the tens of billions of dollars like that; AMD simply doesn't have anywhere near that volume, or the financial muscle, and it looks like even Qualcomm and Huawei had to settle for 7 nm table scraps at TSMC's.
While some here have commented on the coming end of Moore's Law, that scenario may have to be amended: in addition to physical limitations on how much one can shrink semiconductor structures, it looks more and more like the costs of shrinking nodes will put the brakes on that development even before we hit the actual physical limits.
@Anandtech: if you guys have the numbers, could you publish a plot of node sizes/nm vs. estimated costs to get a fab up-and-running? This seems to approach an (inverse) exponential function as we go from 14 nm to 10 nm to now 7 nm.
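The plot requested at the end of that comment could be sketched along these lines; apart from the ~$20B 7 nm figure mentioned elsewhere in this thread, the cost values below are placeholders to be replaced with real estimates:

```python
import matplotlib.pyplot as plt

node_nm = [28, 14, 10, 7]
fab_cost_busd = [5, 8, 12, 20]   # PLACEHOLDER estimates in $ billions; only ~20 for 7 nm comes from this thread

plt.plot(node_nm, fab_cost_busd, marker="o")
plt.gca().invert_xaxis()   # smaller (newer) nodes toward the right
plt.yscale("log")          # exponential cost growth shows up as a straight line
plt.xlabel("Process node (nm)")
plt.ylabel("Estimated leading-edge fab cost ($B, log scale)")
plt.title("Illustrative fab cost vs. node (placeholder data)")
plt.show()
```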
GF has been very aggressive in growing the company through private equity acquisition - which means debt, which means interest payments. It'd be interesting to see how the transaction and financing costs of the acquisitions factor into this.
GF would not be the first company sunk by ownership's shitty business strategy. Blaming it all on R&D is the idiot's way out. AMD, eg, was sunk by borrowing to buy ATI - not by overspending on R&D.
Now I get it. Brilliant strategy by Global. Instead of having to announce that they didn’t execute on 7nm development and have no technology or customers or both, and they wasted billions of dollars of NY taxpayer, IBM, and oil money...they shift the focus to the endless debate “is Moore’s law over”. Brilliant diversion and spin on the story. Based on comments on the article it worked on many people.
so that stupid interview about glofo's 7nm and chips at 5ghz (which people were assuming meant ryzen instead of ibm's upcoming chips) was just hot air. do you know how many fanboys have been citing that stupid thing. and it was all nothing. and where are all those people now?
GF has suffered since the beginning from a lack of good leadership. The current CEO and his team had no vision at all. How did they invest billions on 7nm and then find out they have no capacity to manufacture 7nm devices even if the production was ready? They wasted over $500 million on EUV tools that GF will now sell back to ASML for pennies on the dollar. GF management was playing with the house money (Mubadala's); when Mubadala said enough is enough, the party was over. The Malta site will shut down in 2 years max when AMD moves 100% to 7nm at TSMC and stops ordering 14nm products.
A GF EVP I worked with overseas years earlier asked if I would look at their brand new FAB 8 = very disappointing at best. They tried to hire me after my assessment of FAB 8 (6+ years ago). I met with the CEO & CFO, the FAB 8 (ex-AMD) Manager, the FAB 8 (ex-GE) HR Director, personality profiling, 80 emails, etc., and finally got an offer 3 months later, which I turned down the next day. I saw no opportunity for a successful money-making operation, no profit sharing, no IPO, just a paycheck and politics. Way too many people having influence without any accountability for the company's success. Some of these people have been milking the UAE for 10 years. Too many losers milking the UAE investors who couldn't care less if the company ever made a profit. Instead of purging these milk-the-UAE-for-a-paycheck people, GF promoted them, like the HR director.
History = UAE (ATI) buys AMD (cannot compete in a 2 horse race). Then buys Chartered (a company that lost money for the Singapore Gov for 20+ years); I worked next door to Chartered for 7 years watching Chartered never make a profit while the gov financed more fabs. Then the straw that broke the camel's back: GF gets paid a billion+ dollars to take over money-losing IBM. Next, put a bunch of money-losing IBM people in charge of making a profit. Promote an ex-IBMer to CEO, and now the camel's back is finally broken after 10 years of losers padding their pockets with oil money.
Whatever happened to building that FAB in the UAE to employ the people at home??? The UAE still has the $$$$$$$$$$$$ to listen to BS = break up and sell off the assets for the least loss. Nothing like paying people to convince you to Buy High and Sell Low so you will lose less of your country's money. The Chinese have a saying: the cow who drinks its own milk will not thrive. The UAE should have told everyone in management: you have 3 years to make a profit or you're terminated (period), especially the original FAB 8 HR Director.
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
127 Comments
Back to Article
NuclearArmament - Monday, August 27, 2018 - link
This is stupid, stop trying to milk silicon beyond 11 nm and start using gallium arsenide or indium gallium arsenide. The Fujitsu AP2000 used BiCMOS and GaAs fabrication back in the early 1990s, for crying out loud. Get rid of silicon, it's outdated and needs to be replaced! You have BILLIONS of dollars and get government subsidies, why not take a risk?!rahvin - Monday, August 27, 2018 - link
The only problem of course is that Gallium based Semi's are about 100X to the cost which is why no one uses them for anything outside certain categories where the cost is justifiable.Khenglish - Monday, August 27, 2018 - link
iii-v materials can also only result in n-channel HFETs or NPN BJTs. You can't make an active pull-up device, which is required for any VLSI use. Only very high power resistive pull-up or current steering logic is possible.evanh - Monday, August 27, 2018 - link
Price by itself doesn't explain the why. Price is only an outcome due to lack of volume production. This is true for any mass produced product.Khenglish gives a real reason.
rahvin - Monday, August 27, 2018 - link
Price is more than volume production, the materials are far more difficult to obtain, purify combine and handle than silicon and frequently involve highly toxic chemicals. Silicon on the other hand requires an enormous amount of energy but doesn't involve toxic chemical processing. Take a look at the wiki article and the processes to produce GaAs cyrstals, it's not simple. https://en.wikipedia.org/wiki/Gallium_arsenideevanh - Tuesday, August 28, 2018 - link
None of which makes it expensive.SirPerro - Tuesday, August 28, 2018 - link
Let me resume Silicon properties from wikipedia...- Abundant and cheap
- Has economies of scales on its side
- Very stable and can grow to high diameters with good yields
- Very good thermal conductor for densely packed transistors
- Its native oxide works very well as insulator with excellent electrical properties
- Has much higher hole mobility which enables CMOS transistors with much lower power consumption
- Is a pure element, avoiding the problems of stoichiometric imbalance and thermal unmixing of GaAs
- It has a nearly perfect lattice, impurity density is very low and allows very small structures to be built
And Gallium Arsenide drawbacks include...
- Naturally, GaAs surface cannot withstand the high temperature needed for diffusion
- GaAs does not have a native oxide, does not easily support a stable adherent insulating layer, and does not possess the dielectric strength or surface passivating qualities of the Si-SiO2
-Because they lack a fast CMOS structure, GaAs circuits must use logic styles which have much higher power consumption; this has made GaAs logic circuits unable to compete with silicon logic circuits
- Problems of stoichiometric imbalance and thermal unmixing
- GaAs has a very high impurity density which makes it difficult to build integrated circuits with small structures, so the 500 nm process is a common process for GaAs
So you know, usually statements like "This multitrillion dollar industry is wrong, what they do is stupid, they should do X" coming from random dudes on the internet are bullshit.
FunBunny2 - Tuesday, August 28, 2018 - link
"So you know, usually statements like "This multitrillion dollar industry is wrong, what they do is stupid, they should do X" coming from random dudes on the internet are bullshit. "yeah but Dear Leader has done that for a lifetime, and look where it got him. :)
evanh - Tuesday, August 28, 2018 - link
My sole argument is economies of scale is what defines the cost.Mikewind Dale - Monday, July 19, 2021 - link
And your argument is wrong. Economy of scale is one factor that determines cost, but it is not the only factor.If economy of scale were the only determinant of cost, then we would expect every product to be produced by a single monopolist, with a single factory. Any time there were two competing factories, they could reduce their costs by building a single factory of twice the size. The result would be that every product on earth would be produced in a single factory, owned by a single company.
But that's obviously not the case in the real world.
FunBunny2 - Tuesday, August 28, 2018 - link
" Price is only an outcome due to lack of volume production"absolutely not, in this case. the cost is in quantities of raw materials. silicon is plentiful and cheap. the rest are far from it. and, as other comments have described, not technically superior.
evanh - Tuesday, August 28, 2018 - link
The raw materials are nothing in the final cost for a high value high volume product.Samus - Thursday, August 30, 2018 - link
Damn you all are WAY the fuck over my head lolboozed - Monday, August 27, 2018 - link
Do you think they haven't thought about it?Azethoth - Monday, August 27, 2018 - link
Nope. He is a very special boy. He thinks of things nobody else can even dream of.NuclearArmament - Monday, August 27, 2018 - link
It doesn't matter, move away from silicon, this is getting ridiculous. You can't keep milking this technology for decades and expect it to, and these multi-billion-dollar technology conglomerates sure can afford to switch to InGaAs or GaAs and still make countless billions more, all while not paying taxes and receiving massive government subsidies.Reflex - Monday, August 27, 2018 - link
Moving away from silicon has been an industry goal for decades now. When anyone manages to find a cost-effective method of doing so they will become wildly rich. To date, nobody has pulled that off. Not for lack of effort.FunBunny2 - Tuesday, August 28, 2018 - link
"Moving away from silicon has been an industry goal for decades now. "it's worth noting that the original transistors were made of germanium. hard to find, hard to use. the engineers figured out how to use abundant silicon (thought, not just sand scooped up from the beach). the laws of physics can't be flummoxed.
surt - Monday, August 27, 2018 - link
Evidence seems pretty strong that you can in fact milk silicon tech for decades, since that has already happened. And other companies seem to feel pretty confident about milking it for a couple more decades at least. I'm pretty sure if GaAs or some other technology was easy to make profitable, someone would rush to it and crush the competition. Intel has billions in the bank and knows that Samsung is a mortal threat if they don't catch up. Pretty sure they would leapfrog to another technology if they could have an edge on Samsung of even 6 months.name99 - Tuesday, August 28, 2018 - link
What do you WANT from your logic process? THAT is the question -- figure that out, then we can decide if GaAs or other III-V materials make sense.What's holding back the performance of existing CPUs? There are three contenders (which one bites first depends on exactly your target market)
- power density (too much power generated and you can't get enough current in, and you can't cool it fast enough)
- metal speed (also RC) you're limited by how fast wires can transport info from one part of the chip to another
- transistor speed
GaAs ONLY helps with transistor speed. In the process it goes bezerk with power and doesn't help with wire speed.
Beyond that the primary reason our current CPUs don't hit their potential is waiting on memory, which halves (or more) their IPC compared to their full potential. All the ways of dealing with this are built on massive transistor budgets (starting with huge caches). But if you want small transistors, there's no magic in GaAs that makes this easier. It's the exact same lithography, with the exact same difficulties, whether it's multi-patterning or EUV.
It's like a car company wants to make a better car, and your answer is "we need to pour ALL our resources into making better paint". WTF? Does better paint make the car faster? Does it improve power efficiency? Does it solve any actual engineering problem?
Alexvrb - Tuesday, August 28, 2018 - link
LOL let's bring Fujitsu into an argument as a good example of how to win... LMAO. How's Fujitsu doing in the fab biz, again?NuclearArmament - Tuesday, August 28, 2018 - link
More companies would do better by having their own fabs, not less. Being vertically integrated via either merger, acquisition, or organic means is a much better way to remain competitive than solely remaining fabless and outsourcing. Intel is only as large as they are in the semiconductor world because they did not so foolishly sell off all their fabrication facilities like AMD did.FunBunny2 - Tuesday, August 28, 2018 - link
"More companies would do better by having their own fabs, not less. "I've always been puzzled about how much scale there really is. ASML (and who else?) supply the fundamental equipment that all fabs use. did ASMl/whoever figure out, say EUV, and supply all the fabs; or did Samsung, say, figure it out and ASML/whoever built the machines to Samsung/whoever's spec?? from the picture of fabs I've seen, it looks very much like some number of sequential, discrete work-stations, repeated X times. it's not like a car assembly line, where one line makes all the cars.
mikato - Tuesday, August 28, 2018 - link
Yeah I noticed that too. The first photo makes it look like there are a bunch of semi-mobile machines arranged in groups forming the stations. Then the next photo might be more of an assembly line - at least it has a bunch of overhead tracks for things to move along, not sure what, maybe transporting items around.RSAUser - Tuesday, August 28, 2018 - link
It wasn't foolish for AMD to sell; TSMC is a lot cheaper for them than GloFo while being on a better node, and AMD didn't have the money to invest in smaller nodes. Going with TSMC, they are at least mostly competitive in the node portion of the cards.
Spunjji - Tuesday, August 28, 2018 - link
You're either trolling or a fool to suggest that selling the fabs was a bad idea for AMD. For a company of that size they're nothing but a millstone. They limited their output capacity when AMD actually had good processors to sell, and then cost them billions when they didn't have a competitive product.
The stupid bit was the agreement to produce a minimum number of wafer starts with GF after they sold the fabs. That bit them hard in the financials.
FullmetalTitan - Thursday, August 30, 2018 - link
Smart guy over here figured out the secret to business success in semiconductors if you ignore...*checks notes*
...the need for decades of R&D and many tens of billions of dollars to acquire companies to supply goods, design products, and leverage talent.
gglaw - Sunday, December 16, 2018 - link
LOLOL. Yeah, AMD is doing horribly since they sold off their outdated fabs and are paying less for superior nodes now. Their stocks have just plummeted in the last few years. The investors are super pissed that their $3 stocks have gone up to $20 since they don't run their own fabs (which, btw, is itself going down the drain after sinking billions into 7nm R&D and scrapping it to focus on old nodes). Look at AMD's balance sheets, then look at TSMC or Samsung. It's 1B vs FIVE HUNDRED BILLION cash on hand. AMD is a tiny designer of fancy chips compared to massive semiconductor firms that produce chips for the whole world. Even Intel is a speck compared to these companies, and more than a few analysts feel Intel's best future move is also letting TSMC use their much deeper pockets to compete in the bleeding edge node wars. (If you haven't noticed, Intel is getting the shit kicked out of them in node advancement.)
gglaw - Sunday, December 16, 2018 - link
The savings from vertical integration are absolutely nothing compared to the R&D costs and the cost of ramping up manufacturing facilities for each new node. Even if vertical integration made the wafers literally free, it would forever run AMD in the red trying to pay for the R&D and new facilities with only a tiny fraction of the chip volume required to justify the cost of each node advancement. AMD would have to sell over 100X the number of chips they did in 2018 to maintain a fab that could keep up with TSMC and Samsung. TSMC can do it since they sell a staggering number of chips compared to either AMD or Intel. AMD accounts for under 1% of TSMC's contracted chip production. In the tech enthusiast world, AMD is very exciting since they play such a large role in gaming hardware design, but in the business world, they're a little ant.
transistortechnologist - Tuesday, August 28, 2018 - link
It turns out that at very small dimensions compound semiconductors such as indium gallium arsenide do not offer any benefits over silicon. It's fundamental, in the physics, quite apart from any economic considerations.
DanNeely - Monday, August 27, 2018 - link
Will IBM's huge mainframe chips be one of the specialized products they'll still be supporting, or is Big Blue being unceremoniously dumped in a ditch and left to scramble to migrate their chip designs to either a Samsung or TSMC process?
Ian Cutress - Monday, August 27, 2018 - link
Technically the IBM contract runs out at the end of the year. Beyond that, not sure.
Alexvrb - Tuesday, August 28, 2018 - link
I'm sure they'll continue working with IBM for the foreseeable future. But that still revolves around 14HP, I'd bet.
FunBunny2 - Tuesday, August 28, 2018 - link
According to the wiki (no, I didn't run the links :) ) the Z14 is a 6.1 billion transistor chip. Not, by any means, the highest transistor count among CPUs. Not the biggest, but big, by area either. Here: https://en.wikipedia.org/wiki/Transistor_count#Mic...
name99 - Tuesday, August 28, 2018 - link
Look at what Intel has done with 14nm, +, ++, etc.
There's a lot of optimization left in leading edge hardware for the obvious reason that you're always time constrained -- you do things the safe way, you drop ideas that don't fit the schedule, and so on.
IBM can certainly cruise for a few years (with both POWER and z) on just optimizing what they have at the µarchitecture and system level, sticking with GloFo's 14nm (which will doubtless pick up small improvements at the process level each year), and looking at how they can exploit advanced packaging (i.e. things like interposers or EMIB) and advanced memories (MRAM, Nantero, Optane, ...).
That's not an ideal on-going situation, but it's absolutely feasible for a few years as the market sorts itself out and GloFo figures out its future.
Maybe GloFo partners with UMC?
Maybe they license Samsung's 7nm as soon as ASML can make enough machines?
Maybe they conclude that the economics (for their set of clients) works best not by standing still at 14nm, but by always staying about three years behind the absolute leading edge?
Dragonrider - Saturday, September 1, 2018 - link
Then again, IBM just might get Intel to build their stuff. Stranger things have happened.
klagermkii - Monday, August 27, 2018 - link
If this means that AMD will no longer pay penalties to fab their chips at competent leading-edge foundries, then this is fantastic news, and Global Foundries can go climb into the rubbish bin of silicon history.
CajunArson - Monday, August 27, 2018 - link
This clearly means that 5nm is so far ahead of schedule that GloFo will have it out in January! Because advanced lithography is SUPER EASY for literally everybody but those idiots at Intel.
Right! Right???
rahvin - Monday, August 27, 2018 - link
Didn't read the article, did you? GloFo is abandoning the market because they don't believe they can compete economically against Samsung and TSMC. Basically, the EUV fabs cost $20 billion, and to afford that you need to produce far more volume than GloFo physically can. As a result they are going to start fabbing specialized chips (probably ASICs and others) on their existing processes and let TSMC and Samsung have the market.
phoenix_rizzen - Monday, August 27, 2018 - link
<Whoosh>
The sound of sarcasm flying over your head. :)
rahvin - Monday, August 27, 2018 - link
This has nothing to do with easy or hard; it's about economics, which is why your sarcasm wasn't a factor in my post.
Azethoth - Monday, August 27, 2018 - link
No, the stated reason is "economics". It is certainly true that one day there will probably just be the one true fab, so in that sense this is inevitable. But, let's be honest here as well, the article also says they are behind schedule, which is certainly a factor. Worse, they use marketing nm, so they are behind even Intel, which is already way late.
smilingcrow - Monday, August 27, 2018 - link
In this case hard = expensive so it's not exactly irrelevant.
rahvin - Monday, August 27, 2018 - link
I don't agree in this case; both Samsung and TSMC are in volume production on 7nm. Though I have no doubt it's difficult, the logic GloFo provided for why they pulled the plug is convincing IMO. Fab costs continue to rise, well beyond the predictions of even 2 years ago. IIRC they were predicting about 8 billion for a 7nm fab and now we're looking at close to 20; that's a significant change and dramatically alters the accounting. I think GloFo, being a bit behind, decided to do a cost analysis, saw how out of whack they were, and realized it was time to stop, as proceeding could drag them into bankruptcy. This is essentially what they said.
ironargonaut - Tuesday, August 28, 2018 - link
In silicon manufacturing, technology = economics. I'm sure there are very super expensive processes in a lab somewhere that can produce awesome silicon at an equally awesome price. But that is not what we are talking about here: either your technology can produce wafers of dies at a cheaper cost, or your technology is not up to par. Also, if there is no technological reason, then they would sell the process rights, correct?
FullmetalTitan - Thursday, August 30, 2018 - link
Take it as you will, but no one IN the semicon industry buys the reason GloFo is selling. While economics are certainly a factor (leading edge is expensive), there are most likely technical issues which tipped the scale solidly towards the decision to kill R&D.
dromoxen - Tuesday, August 28, 2018 - link
IF THEY only have revenues of $6 billion, then it becomes very, very hard to afford a fab costing $20B. Mubadala didn't want to keep feeding the cashless cow as oil is losing its value. Is this a market that China is going to get into (long-term)?
eastcoast_pete - Monday, August 27, 2018 - link
Interesting story, but I believe a bit more skepticism about GloFo's official line is in order. To me, only two scenarios make sense: 1. They just couldn't get their 7 nm process up-and-running at any volume and yield that would be at least somewhat competitive with TSMC and Samsung by Q1 2018, and decided to not throw good money after billions already sunk into this. 2. They could have gotten the process working at yields and volumes that would have made sense in theory, but couldn't find the customers/orders anywhere near the volume to at least break even here.
If it's the second reason (tech actually works and is sort-of on target time-wise), then why not try to wrap the IP, the personnel (expertise) and the expensive equipment up in one package and spin the "7 nm subsidiary" off as its own entity, just as they are doing with their ASIC line?
And here a really wild idea, again all based on the tech actually works and is basically ready: Maybe Intel likes to pick GloFo's "7 nm bones" clean and get the goodies (IP, expertise, equipment) for cheap. Chipzilla could really use a below 10 nm process right now, so provided it does work, who knows? Also, that solution would keep at least one 7 nm fab in the US, which might be of interest from a national security POV.
eastcoast_pete - Monday, August 27, 2018 - link
Damn typos. I meant 2019 for point 1.
"1. They just couldn't get their 7 nm process up-and-running at any volume and yield that would be at least somewhat competitive with TSMC and Samsung by Q1 2019,..."
Setout - Monday, August 27, 2018 - link
As the article clearly states, the GloFo 7nm is behind the TSMC 7nm, which itself is equivalent to Intel's 10nm. So the IP might not be very interesting to Intel, although they've gone in a slightly different direction.
RSAUser - Tuesday, August 28, 2018 - link
Their 7nm is probably slightly better than Intel 10nm. I think someone linked some papers a while back stating that TSMC and Samsung have likely already overtaken or are soon to overtake Intel.
eastcoast_pete - Tuesday, August 28, 2018 - link
I actually agree with the notion that x nm doesn't equal x nm when comparing different processes. However, Intel's 10 nm process has had (and continues to have?) many teething problems, and scale-up has been delayed and apparently quite painful. My thought was that this is a potential chance for Intel to buy valuable IP, expertise and basically new state-of-the-art equipment at bargain basement prices. I know that they like to do everything as much in-house as possible, but Intel has made large acquisitions in recent years if they thought they would add to their bottom line.
Kevin G - Monday, August 27, 2018 - link
1) is the most likely scenario. Everyone is having problems, as EUV is late for everyone and the alternative is quad patterning, which adds cost and slows production. It is a bag of hurt. 5 nm was looking like it could only be developed by masochists.
2) is not improbable, as GF was never that aggressive with their roadmap. The problem is that the other players hit delays, so the reality likely was that GF was never going to be that far behind. Perception is everything. Spinning off 7 nm production would still require customers to come forward. This is a clear chicken-and-egg scenario where the customers decided to roost elsewhere.
As for other players purchasing GF, it is unlikely, as companies like Intel actually have an excess of capacity for logic right now (DRAM and flash are a slightly different story and are more highly utilized). The exception as noted would be keeping the US fabs in the Trusted Foundry Program. Certain stipulations had to be adhered to for GF to inherit those as part of the purchase of IBM's fabs. As such, there are likely conditions that still exist for the sale of those fabs. If another member of the Trusted Foundry Program just wanted those fabs, it would be far easier to change ownership.
bourbon_high - Monday, August 27, 2018 - link
Rumours last year said that the previous CEO, Mr. Sanjay Jha, and his team wanted to stop investing in 7nm because the IBM-led management team in Malta was behind and struggling. The current management team, it seems, was against 22FDX but pushed for 7nm. Now, after failing to deliver, the same management team, under pressure from their investors and AMD, have given up and seem to be calling it a strategy shift. Billions more wasted is what is said. GF needs new management, not dead-weight IBM'ers that are slow to move, if they move at all.
rahvin - Monday, August 27, 2018 - link
The article clearly paints it as your number 2 scenario. When they did the math on buildout and customers they realized they'd lose money, so they bagged the whole thing and decided to focus on their current processes.
Keep in mind there is still a ton of stuff fabbed on older process nodes; it can in fact be quite lucrative. With AMD unbound on manufacturing, it also frees GloFo from having to spend billions building and upgrading fabs. Given the rising costs in fabrication as they advance each node, the market is likely to consolidate further, and we could even see TSMC or Samsung drop out of the race due to the costs. Fab prices have gotten to the point of breaking budgets for even huge companies; there are very few companies that can afford to build a $20 billion EUV fab.
levizx - Tuesday, August 28, 2018 - link
You can't "spin off" a process with NO FAB to produce it.
eastcoast_pete - Tuesday, August 28, 2018 - link
Actually, you can, if you have a fab or two that can house the equipment and the people (the expertise) who are now being "made redundant", as the British say. Processes have been moved ("migrated") from one fab to another in the past, although it can be a major pain in the derriere to do so.
FullmetalTitan - Thursday, August 30, 2018 - link
Bigger issue then would be cost. Fab tool-in is expensive, even with retrofit work.
Kevin G - Monday, August 27, 2018 - link
I wonder how this plays out for the Trusted Foundry programs that GF inherited from IBM. They allegedly had several contracts open with three-letter agencies for specialized parts. Outside of that, they still provide some radiation-hardened nodes as well (though far from state of the art). I'm not sure how profitable, if at all, catering to those types of customers would be. While not high in volume, those do produce a nice margin. While the number of bleeding edge nodes within the Trusted Foundry program is slim, there are plenty of players able to fab using older nodes. As the rest of the industry marches toward 7 nm and beyond, those other small-time fabs will catch up.
While IBM's contract with GF runs out at the end of this year, GF did develop a 12 nm FD-SOI process that looked like it was being used for a POWER9+ revision. Still, those products would have been initially planned for a late 2019 release date.
btmedic04 - Monday, August 27, 2018 - link
My question in all of this is how the fallout of GloFo abandoning 7nm will affect AMD's 2nd gen Ryzen/Epyc processors.
phoenix_rizzen - Monday, August 27, 2018 - link
It's mentioned in the article that AMD is targeting TSMC 7nm.
eastcoast_pete - Tuesday, August 28, 2018 - link
That's one of the open questions here. 80% of TSMC's 7 nm capacity is contractually bound for Apple's needs, leaving precious few wafers for everybody else. EETimes published an interesting article on this a few days ago. Basically, players like Qualcomm and Huawei (HiSilicon) will likely have to get chummy with Sammy if they want their Snapdragon 855 in 7 nm in usable quantities, because Samsung is now the only other fab operator supposedly close to or ready for 7 nm. I do wonder where all the other 7 nm silicon for EPYCs, Ryzens, and assorted GPUs is supposed to come from now with GloFo out. AFAIK, TSMC is currently the only player with a 7 nm fab fully up-and-running, and that fab is very busy (and contractually obligated) making all the A12 chips it can. So, this pull-back by GloFo is great news for Samsung; wouldn't surprise me if their stock just jumped a bit.
BurntMyBacon - Tuesday, August 28, 2018 - link
Good news for Samsung, sure. However, they are only an option for AMD if they can get the yields up. Historically, Samsung's process has been competitive with other contract fabs, though (if I recall correctly) they've struggled with larger chips. ARM chips targeted at phones should be good (they need that for their own chips), but larger ARM chips targeted at servers and larger GPUs have historically been problematic. It is uncertain whether large, high performance x86 chips would work out well. If Apple moved a large portion of their orders to Samsung, then TSMC might have the capacity to service all these larger chips. However, with so much of the TSMC capacity tied up by Apple, AMD will likely need to prioritize either CPUs or GPUs. Given recent history, I suspect they will favor server CPU production and probably back down on high end GPUs. Perhaps we'll get another round of smaller mid-range GPUs fabricated at Samsung. Unless Samsung suddenly develops proficiency for fabricating larger chips or TSMC suddenly frees up significant capacity, I suspect this bodes very poorly for competition in high end GPUs.
dogzilla - Monday, August 27, 2018 - link
I think they are saying that the company isn't big enough to pay for the investment to continue process development at 7 nm and below. You have to make enough wafers to spread the development cost around, and they are just too small, make too few wafers. I worked at companies that have made this same decision; it's called a going-out-of-business strategy. Any customers that might need to migrate to 7 nm and smaller will move elsewhere, and usually they are the more profitable customers.
Zoomer - Tuesday, August 28, 2018 - link
Exactly. What are you going to sell in 5, 10, 15 years' time?
PhrogChief - Monday, August 27, 2018 - link
'Strategical' is NOT a word.
levizx - Tuesday, August 28, 2018 - link
And the Earth is flat.
https://www.dictionary.com/browse/strategical
https://www.thefreedictionary.com/strategical
V900 - Monday, August 27, 2018 - link
*CRASMASH!*
That's the sound of Moore's law colliding with physics.
A few years ago, an ex-Intel boffin held a talk on chipmaking at the end of Moore's law. He predicted it would come in 2016-2020 and at around 10nm.
That’s really it then. We’ll get another node at 7nm (eventually) but after that? Slow, incremental improvement.
Few people understand how big of a deal this is, but think about this. All of us grew up in a world where the power and speed of computers and electronics would double roughly every two years.
This exponential growth was a virtual certainty and it fed the biggest prosperity engine in human history. There aren't many fields or businesses that didn't in some way benefit from it.
And that’s almost over. There’s a few years left where it almost seems like nothing has changed, but it has.
Some of us have children. Those children won’t live in a world with electronics doubling in speed and capability every few years.
The next generation of video games for them will look pretty much like the last one, unless you know where to look.
For them, it’ll be nothing but slow, incremental improvement year after year: A 2% improvement here. A 1% improvement there. The way it was before the transistor.
The scariest thing is what will happen to the economy. Trillions of dollars and billions of man-hours have been spent in the shadow of Moore's law over the past decades. What will happen once there isn't a reason to get a new TV every 4 years? If computers only get a few percent faster every year, will computers become a thing you buy once a decade?
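To put rough numbers on the contrast this comment is drawing, here is a minimal back-of-the-envelope sketch; the growth rates are illustrative assumptions, not measured figures:

```python
# Rough illustration only: doubling every two years vs. low single-digit
# annual gains. The rates here are assumptions for the sake of comparison.
def compound(rate_per_period, periods):
    return (1 + rate_per_period) ** periods

years = 20
moore = compound(1.0, years / 2)       # doubling every 2 years -> 10 doublings
incremental = compound(0.02, years)    # ~2% improvement per year

print(f"Doubling every 2 years over {years} years: ~{moore:.0f}x")
print(f"2% per year over {years} years: ~{incremental:.2f}x")
# Prints roughly 1024x vs. 1.49x
```

The exact rates are beside the point; the sketch just shows how quickly exponential doubling dwarfs incremental improvement.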
nevcairiel - Monday, August 27, 2018 - link
I don't think the future is quite as grim just yet. After 7nm there is still EUV for a few more nodes (probably 5 and 3nm, and maybe even 1nm). That's probably over a decade right there.
Maybe in that time, someone will finally figure out an alternative. Carbon-based solutions have been mysteriously hyped for a long time. Graphene transistors? Carbon nanotubes?
Manch - Tuesday, August 28, 2018 - link
Manch - Tuesday, August 28, 2018 - link
"Some of us have children. Those children won’t live in a world with electronics doubling in speed and capability every few years."Oh....the.....horror.....
FunBunny2 - Tuesday, August 28, 2018 - link
"Oh....the.....horror....."the snark is well done. but... a static socio-economic environment is otherwise called "The Dark Ages". but this one is forever. if Social Darwinism is the paradigm going into the permanent Dark Age, said children's biggest worry won't be the missing electronics that double in speed every few years.
benedict - Tuesday, August 28, 2018 - link
There never was a need to buy a new TV every 4 years. The crazy consumerism making people replace their perfectly working electronics every 1-2 years is a phenomenon almost exclusively happening in the USA and it cannot end soon enough.
Believe it or not, life will keep going on after the death of Moore's law.
FMinus - Wednesday, August 29, 2018 - link
I think most people just keep their TVs until they die or the provider does something to make them obsolete. In my case, in 2008 my cable provider decided to kill analog over cable completely, so I had to throw out that CRT TV; otherwise I'd still have it for a few more years, but that was a reasonable switch. The flat panel that replaced the CRT is still working in the bedroom, while we bought a new TV for the living room somewhere around 2014, and I intend to keep it until it dies or the standards change and it becomes useless, though that likely won't happen before it actually dies by itself.
V900 - Monday, August 27, 2018 - link
AMD will land on their feet, but this will REALLY suck for IBM, who must regret betting the farm (and their fabrication guys) on Global Foundries.
And of course, with only 2 fabs doing leading edge work, at some point it could mean delays for everyone who isn't Apple or Samsung.
nevcairiel - Monday, August 27, 2018 - link
It's really still 3 fabs: TSMC, Samsung and Intel. Maybe Intel will use that chance and do more third-party fabbing.
V900 - Monday, August 27, 2018 - link
It's not as easy as that.
It's not like Intel is running a furniture factory, where anyone can get their table built, if IKEA is taking up all of the production at the usual plant.
(This is also why Intel's last attempt at "renting out fab space" went so poorly.)
Intel's fabrication methods are tied into their design methods. They're using their own tools and even design things somewhat differently from other companies (even AMD).
Intel's fabs and designers are used to a process that's much more reliant on full-custom, hand-crafted circuits which are intimately tied to the process.
Pretty much all of their IP, including the stuff for fabrication is designed and developed in-house, and is sometimes CPU specific.
Compare that to the rest of the industry that mostly uses common tools, and you can see the problem.
Going from TSMC to Samsung with a design might mean a delay of a few months, as you rejig the design to be built on Samsung’s process and tools.
Going from TSMC to Intel would mean a major delay as you redo the design for a process that was designed specifically for Intel's needs and internal tools. If the chip is really complex, in some cases it would be easier and faster to just junk your design and start from close to scratch.
zodiacfml - Monday, August 27, 2018 - link
Now it seems that if a fab produces mobile chips, it is more likely to thrive.
Comdrpopnfresh - Monday, August 27, 2018 - link
Very interesting. To me, this represents the canary in the silicon mine, though I did not expect it as soon as this.
Maybe the other foundries will see these changes for what they are: submitting to physics. No sense in expensive pursuits that hasten arriving at the end of the road.
rahvin - Monday, August 27, 2018 - link
I think the writing's been on the wall for a while. Fab prices have been far exceeding estimates; almost exponential growth in costs is simply unsustainable.
ChrisGar15 - Wednesday, August 29, 2018 - link
There is an industry view that mature markets evolve into 3 main players (with occasional disruptions & consolidations): there is a big player (e.g., 70% market share) ... a 2nd player (20-30% market share) ... and a niche/3rd player.
I think the semiconductor market is evolving this way also. (My guess is Samsung will evolve to be the 2nd player -- but we'll see.)
V900 - Monday, August 27, 2018 - link
This is what the end of Moore's Law and exponential growth in computing looks like.
Foundries dropping out, not because they don't see a way forward, or can't figure out how to get to the next node, but because it's too expensive to move on.
FunBunny2 - Tuesday, August 28, 2018 - link
"it’s too expensive to move on."that's only half the story. the other half is product demand. everybody from user device producers to the materials miners can only thrive if there's sufficient unmet demand to require expanding output. for chip makers, whatever happened to the >300mm wafer?? never happened. same will happen with smaller nodes: while they *could* make more chips with smaller nodes, even on 300, unless there's demand for more chips, it won't happen.
That gets us into macroeconomics, which is to say the disappearing middle class. The 1% isn't (and never has been) growing fast enough to absorb higher cost output. Not enough output to drive down average cost, and the bean counters care only about that. The current bowing to Social Darwinism from certain places keeps a lid on demand.
Koenig168 - Monday, August 27, 2018 - link
The books will look better in the short term, but once 7nm becomes mainstream, it's not hard to imagine what will happen to Global Foundries. 90nm was the first sub-micron fabrication technology. Imagine what would have happened if a company had said "we are already sub-micron, let's stop investing in 65nm and focus on specialized manufacturing". Who still uses 90nm now?
grant3 - Monday, August 27, 2018 - link
If GloFo had shut their doors after 90nm, they'd have saved a few billion dollars....
V900 - Monday, August 27, 2018 - link
Or they could have stayed open but not moved beyond 90nm. Sooner or later they would have recouped the investment and become profitable.
There is still a huge market out there for older foundries that make ICs bigger than 60nm. And there always will be.
Out of the hundreds (or thousands) of ICs we all come in contact with every day, it’s only a handful that really benefit from being manufactured as small as possible.
For the vast majority, 90nm or 22nm doesn’t matter, except in terms of cost, and the old 90nm fabs can make them a lot cheaper than any newer node.
ChrisGar15 - Wednesday, August 29, 2018 - link
Actually some of GF's most profitable fabs are 90nm and above. (It won't last forever ... but has worked really well the last 10 years.)
V900 - Monday, August 27, 2018 - link
Ehm... A lot... More than you'd think anyways.
Out of the entire worldwide IC market, only 40% is 40nm and smaller.
The other 60% of the market is all older, bigger nodes. Old 90nm or 130nm tools that recouped the huge investment in them years ago, and are virtual money printing machines now.
To answer your question, 90nm made up 6% of sales from pure play foundries last year. And that’ll probably only grow in the coming years, since 130nm sales made up 7%. 14% of sales were of IC that were bigger than 130nm but smaller than 180nm.
KAlmquist - Tuesday, August 28, 2018 - link
I learned something new today. Thank you.
BurntMyBacon - Tuesday, August 28, 2018 - link
The situation seems less daunting for GF when you put it in perspective. (O_o)#GIVEMEBACKMYDOOMANDGLOOM
V900 - Monday, August 27, 2018 - link
Sorry, forgot the link to the figures: https://www.statista.com/statistics/553271/worldwi...
And don't forget, btw, that it's probably a smaller risk to stay on 12-14nm (or heck: 14-22nm) than to invest billions of dollars in a new node that might never become profitable.
GF was facing a huge risk. The tens of billions of dollars that a new 7nm node would cost are also a virtual guarantee that it would never be cost effective against older, cheaper nodes, where a huge part of the market is.
Atari2600 - Tuesday, August 28, 2018 - link
How much of those billions had they already invested though? At this point, I'd expect that over 95% of the funds were already committed and they are actually not saving too much.
Sure - they would have lost more money on the initial ramp of 7nm - but would they then gain more back over the next 20 years by having 7nm to offer clients, and stopping development of any smaller nodes beyond 7nm?
BurntMyBacon - Tuesday, August 28, 2018 - link
On the other hand (assuming things were coming along as well as they say), they're not far off from their targeted launch date, and most of the research costs and initial fabrication capability costs (and associated equipment) are already sunk. One might make the case that they could have recouped some of those losses if they had launched the first 7nm process and ceased development after that. As it is, they have two extremely expensive ASML Twinscan NXE machines already installed that they no longer have a use for (re-purposing them may or may not be practical). This all seems like they gave up a bird in the hand for two in the bush.
rahvin - Tuesday, August 28, 2018 - link
The biggest factor in the cancellation is probably the cost escalation. If the 7nm fab had come in at the $8 billion predicted a few years ago, they probably would have gone ahead, but when EUV costs went crazy the resulting fab costs went from expensive to ridiculous. There is no conceivable way GloFo could have built a fab costing $20 billion; they didn't have the revenue or financing for such a cost. At $8 billion the fab was within reach with revenues of $6 billion or so, but a capital expense that's more than three times your annual revenue? I doubt there's a company in existence that could afford that without absolutely guaranteed returns.
This is the consequence of the EUV price increases, and those costs weren't known until very recently.
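A minimal sketch of the affordability arithmetic rahvin is describing; the figures are just the ballpark numbers quoted in this thread, not actual GlobalFoundries financials:

```python
# Illustrative only: fab capex as a multiple of annual revenue, using the
# rough numbers mentioned in the comments above (not company data).
old_fab_cost = 8e9     # ~$8B estimate for a 7nm fab a few years ago
new_fab_cost = 20e9    # ~$20B for a leading-edge EUV fab today
annual_revenue = 6e9   # ~$6B GloFo revenue ballpark cited in this thread

print(f"Old estimate: {old_fab_cost / annual_revenue:.1f}x annual revenue")
print(f"New estimate: {new_fab_cost / annual_revenue:.1f}x annual revenue")
# ~1.3x vs. ~3.3x -- the escalation, not the node itself, breaks the business case
```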
levizx - Tuesday, August 28, 2018 - link
Plenty of people still use 65nm. Besides, 14/12nm is very competitive, and 5/3nm won't be much different from 7nm; we are talking about 3 steps before the end of the road here.
levizx - Tuesday, August 28, 2018 - link
*90nm
transistortechnologist - Tuesday, August 28, 2018 - link
90nm was the first sub-micron fabrication technology? You're forgetting about:
700 nm
500 nm
350 nm
250 nm
180 nm
130 nm
it's been a long road and many of these older nodes are still in volume production
HiDensity - Wednesday, August 29, 2018 - link
Well... the former IBM fab (now GF) in Vermont is just that. It's a 200mm fab that never went below 90nm. Instead they diversified their offerings, found a niche space in RF, and have been running close to fully loaded for years now (sometimes a little under, sometimes over). While I don't think this move is smart in the long term, there are plenty of chips to be made on older technologies. I think you would be surprised how many 130nm chips there are in current cell phones. People forget there is more than just a processor in there.
evanh - Tuesday, August 28, 2018 - link
None of which makes it expensive.
Yojimbo - Tuesday, August 28, 2018 - link
Well that seems to hang IBM out to dry. On the other hand I guess it releases them from whatever wafer agreement they have with GF. Maybe IBM will sell their Power CPU business.
KAlmquist - Tuesday, August 28, 2018 - link
So what does this mean for AMD? First of all, AMD has already announced that it will be building some Zen 2 chips on the TSMC 7nm process:
https://www.anandtech.com/show/13122/amd-rome-epyc...
So AMD has already done the work of porting the Zen 2 design to the TSMC process. The only question is whether TSMC has the capacity to produce as many chips as AMD can sell.
We are told that the Wafer Supply Agreement is being renegotiated rather than abandoned. The point of the WSA, as I understand it, was to commit AMD to buying some minimum number of chips each year in order to allow Global Foundries to recover the cost of developing new nodes. Since GF is no longer going to be developing new nodes or processes for AMD, it would make sense to dump the WSA. Perhaps AMD wants to have a WSA in effect for another year or so in order to have GF commit to producing a sufficient number of 14nm and 12nm chips for AMD to meet demand until AMD can switch most of its product line over to 7nm.
haukionkannel - Tuesday, August 28, 2018 - link
Zen 2 is in no trouble. Vega (most likely) and the APUs were planned to use GlobalFoundries, so those may be delayed one or two years. Or they will stay on 14nm for some years. No big deal.
GreenReaper - Tuesday, August 28, 2018 - link
It kind of is? Heat and power use are bad in themselves and limit the frequency you can reach; die size also relates closely to cost. For APUs, you don't need more cores, but increasing the number of compute and rendering units could be the difference between reaching 4K or not.
Sure, you can make that die on 14nm, but you will pay for it one way or another. Maybe thanks to size or power it doesn't fit into an AM4 socket at all, and you need to go to TR4.
OwCH - Tuesday, August 28, 2018 - link
This is actually a really smart move.
To all the people calling for a move away from Si - tell that to Intel, Samsung and TSMC, as those three will be the ones leading development of new nodes. For the vast majority of tech products, we don't need those smaller/faster nodes. Think IoT and normal household tech. The bulk of tech. The need for computing power in a single chip isn't huge for that market. For that market we need a reliable, efficient, profitable and cheap node. 40nm, 22nm and 14nm can do just that. GF fabs them all. Smart move.
spronkey - Tuesday, August 28, 2018 - link
Smart short term move, but increasing the chances they'll fade into oblivion as their tech can't compete in the longer term. 7nm will be extremely appealing to IoT in 5-10 years once it's reasonably cheap.
Doesn't really sound like they had a choice though.
BurntMyBacon - Tuesday, August 28, 2018 - link
Just because they are focusing on non-bleeding edge nodes doesn't mean they won't revisit 7nm when it is no longer bleeding edge. 7nm fabrication will likely be cheaper by the time 3nm is ramping up. At the very least, they could skip DUV with quadruple patterning and go straight to EUV, allowing them to make more wafer starts with the same amount of production space.
zodiacfml - Tuesday, August 28, 2018 - link
Right. They don't have a choice. It is quite puzzling that they are withdrawing this late.
I think there is a change with their competitors. It seems that TSMC must have been giving better prices for contracts. It also seems that TSMC can serve all customers as long as they are willing to wait.
In short, GF lost some potential customers, making the latest node not worth it.
MananDedhia - Tuesday, August 28, 2018 - link
It is sad to read this news - I worked at Fab 8 for 3 years and know the struggles involved in developing bleeding edge processes first hand, especially under the brutal timelines demanded by the market. Hopefully the people affected by this action manage to find work elsewhere.
Crazyguy9 - Tuesday, August 28, 2018 - link
Good points by the commenters, but I think many miss the main point. Yes, we all know and have known how expensive new nodes are to develop. And we all have known how crazy it can be to chase Moore's Law. BUT, so did Global. They must have had a business plan when they launched 7nm two years ago. They knew how much capital it would take, how much it would cost to make a wafer, and how much revenue they could make. They spent billions of dollars on development and tools like EUV over the last two years. According to their press releases they should be ready for production this year. So why do you cancel it now??? Something doesn't add up. Think of how much it must have cost to outfit the fab for EUV tools alone.
Yes, there is a lot of demand for older nodes like 28nm. But 28nm was the end of the line for planar devices, and finfet arrival was delayed several years. In contrast, 14nm is first generation finfet and probably not a good place to park a fab.
So there must be more to this story than just “we woke up and discovered how expensive these new nodes are”. Something is very amiss at Global to get this far down the road, then cancel.
V900 - Tuesday, August 28, 2018 - link
It’s Global Foundries...Hardly the best managed company out there, and not exactly known for their sterling leadership and flawless execution.
They also had huge problems transitioning to previous nodes.
Having said that, it’s kinda like a gambler who already lost his paycheck, and is about to throw the deed to his house on the table.
YES, GF already lost 5-7 billion dollars researching 7nm, investing in EUV tools etc.
But continuing the move to 7nm would have cost them an additional 10-20 billion dollars.
And that investment might never have paid for itself, since GF in 5 years might have been stuck with a 7 nm fab that’s too slow for the leading edge (Apple, Qualcomm etc.) but too expensive for everyone else compared with a cheap “good enough” 14nm or 22nm fab.
del42sa - Thursday, August 30, 2018 - link
No, it would have cost them a lot less: "It would have cost GF $2-4 billion to ramp up the 40-50,000 wafers/month capacity needed to have a chance of making a return on the node."
https://www.eetimes.com/document.asp?doc_id=133363...
drwho9437 - Tuesday, August 28, 2018 - link
The first shoe has dropped.
I used to work for IBM Research, though not in the part they sold to Global Foundries, but in the part where they invented the new wrinkles to make better transistors and such. I finished my doctorate in 2012. It was just about a year after I finished that the deal with GF was announced. When this happened I was not sure how much longer IBM would invest in the physical science research for such things. They promised five years I believe. But it made me think about the business case for transistors and I concluded there was not much of a future left.
Simply put, quantum effects and statistical effects hurt you more and more as you go smaller, and cost also blows up. We can debate the end of silicon scaling till we are blue in the face. Will it be 7 nm or 5 nm, or will it be 10-12-14 or something else? It doesn't matter; costs go up, and amortizing your cost is the better business play at some point.
Can we research something to replace Silicon? Yes. But silicon has 40+ years of investment, you are not likely to invent a technology to overtake it in a few years or for a few dollars. Do you choose to invest in research or driving down the cost of building 14 nm fabs?
I thought there would be a break point in 5-6 years where the CEOs would have to choose: amortize, or invest a massive amount in researching something totally new. I fully expected them to choose the former. GF just did. Will Intel or Samsung choose differently? I doubt it. Which means a big brick wall is coming. Mr. Moore, we have reached the end.
evanh - Tuesday, August 28, 2018 - link
My sole argument is that economies of scale are what define the cost.
Follower - Wednesday, August 29, 2018 - link
If the future needs to be in our hands, then dropping out of 7nm may not be a good idea, as it will play a major role in 5-10 years.
richmaxw - Wednesday, August 29, 2018 - link
IBM paid Global Foundries $1.5 billion to take over their foundries. They also gave them patents. In return, Global Foundries said they would provide IBM 10 nm CPUs. How is Global Foundries not in breach of that agreement? AMD is obligated to buy a certain number of CPUs from Global Foundries each year. If they don't buy the minimum amount, they have to pay anyway. But what if Global Foundries is producing out-of-date products? Would AMD have signed that agreement knowing GF was going to just give up on research suddenly? Not only is this decision a waste of the research and development they have already put into 7 nm, it also appears to be dishonest and underhanded toward IBM and AMD. I hope they are compensating IBM and AMD for going back on their word.
Holliday75 - Wednesday, August 29, 2018 - link
Looking back at previous announcements, I am now under the impression that AMD and most likely IBM have known this was coming from a long way off, so they could adjust their strategies as needed.
As to the agreements in place, if AMD and/or IBM were getting screwed over due to this change, I am sure a lawsuit would have already been announced.
Alien959 - Wednesday, August 29, 2018 - link
What I find interesting is that their 7nm process is more or less developed, and surely that has some value to someone with deep pockets who wants to be even more vertically integrated, like Apple. Yes, it's a huge investment, but does the AnandTech readership think there is a chance that Apple would be interested in getting into the fab business?
eastcoast_pete - Wednesday, August 29, 2018 - link
Apple tied itself to TSMC (and vice versa). In return for Apple booking huge quantities and making TSMC their exclusive supplier, TSMC gave Apple first right of refusal for their 7 nm capacity. They both took a risk, but that deal gave TSMC what GloFo was missing: a full order book for 7 nm silicon with the attached large, guaranteed revenue stream (many billions of $$$) that made TSMC's investment of the necessary multi-billion $$$ into 7 nm tech possible. Right now, I simply don't know any other (fabless) company except Apple that would (or could) sign a purchase order in the tens of billions of dollars like that; AMD simply doesn't have anywhere near that volume, or the financial muscle, and it looks like even Qualcomm and Huawei had to settle for 7 nm table scraps at TSMC's.
While some here have commented on the coming end of Moore's Law, that scenario may have to be amended: in addition to physical limitations on how much one can shrink semiconductor structures, it looks more and more like the costs of shrinking nodes will put the brakes on that development even before we hit the actual physical limits.
@Anandtech: if you guys have the numbers, could you publish a plot of node size (nm) vs. the estimated cost to get a fab up-and-running? The cost seems to grow roughly exponentially as we go from 14 nm to 10 nm to now 7 nm.
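For what it's worth, here is a rough sketch of what such a plot might look like. The cost figures below are illustrative placeholders only (the ~$20B figure for a 7nm EUV fab is the one floated in this thread; the rest are made up just to show the shape of the curve), so treat this as a template rather than data:

```python
# Illustrative template only -- the cost numbers are placeholders, not
# sourced estimates, except the ~$20B 7nm figure mentioned in this thread.
import matplotlib.pyplot as plt

nodes_nm = [28, 14, 10, 7]
est_fab_cost_billions = [5, 8, 12, 20]   # placeholder values

plt.plot(nodes_nm, est_fab_cost_billions, marker="o")
plt.gca().invert_xaxis()                 # smaller node toward the right
plt.xlabel("Process node (nm)")
plt.ylabel("Estimated cost to stand up a fab ($B)")
plt.title("Illustrative sketch: fab cost vs. node")
plt.show()
```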
Sahrin - Wednesday, August 29, 2018 - link
GF has been very aggressive in growing the company through private equity acquisition - which means debt, which means interest payments. It'd be interesting to see how the transaction and financing costs of the acquisitions factor into this.
GF would not be the first company sunk by ownership's shitty business strategy. Blaming it all on R&D is the idiot's way out. AMD, e.g., was sunk by borrowing to buy ATI - not by overspending on R&D.
del42sa - Thursday, August 30, 2018 - link
Bunch of losers....
Crazyguy9 - Thursday, August 30, 2018 - link
Now I get it. Brilliant strategy by Global. Instead of having to announce that they didn't execute on 7nm development and have no technology or customers (or both), and that they wasted billions of dollars of NY taxpayer, IBM, and oil money... they shift the focus to the endless debate "is Moore's law over". Brilliant diversion and spin on the story. Based on the comments on this article, it worked on many people.
bobhumplick - Friday, August 31, 2018 - link
So that stupid interview about GloFo's 7nm and chips at 5 GHz (which people were assuming meant Ryzen instead of IBM's upcoming chips) was just hot air. Do you know how many fanboys have been citing that stupid thing? And it was all nothing. And where are all those people now?
YoloPascual - Sunday, September 2, 2018 - link
OMG! TSMC gonna have it all for the next 2-3 years??
s.yu - Monday, September 3, 2018 - link
There's still Samsung right behind them.
The Free Agent - Thursday, September 6, 2018 - link
GF has suffered since the beginning from a lack of good leadership. The current CEO and his team had no vision at all. How did they invest billions in 7nm and then find out they had no capacity to manufacture 7nm devices even if the process was ready? They wasted over $500 million on EUV tools that GF will now sell back to ASML for pennies on the dollar. GF management was playing with the house's money (Mubadala); when Mubadala said enough is enough... the party was over. The Malta site will shut down in 2 years max, when AMD moves 100% to 7nm at TSMC and stops ordering 14nm products.
GF Milking UAE - Friday, September 21, 2018 - link
A GF EVP I worked with overseas years earlier asked if I would look at their brand new FAB 8 = very disappointing at best. They tried to hire me after my assessment of FAB 8 (6+ years ago). I met with the CEO & CFO, the FAB 8 (ex-AMD) manager, the FAB 8 (ex-GE) HR director, went through personality profiling, 80 emails, etc., and finally got an offer 3 months later, which I turned down the next day. I saw no opportunity for a successful money-making operation, no profit sharing, no IPO, just a paycheck and politics. Way too many people having influence without any accountability for the company's success. Some of these people have been milking the UAE for 10 years. Too many losers milking the UAE investors who couldn't care less if the company ever made a profit. Instead of purging these milk-the-UAE-for-a-paycheck people, GF promoted them, like the HR director. History = the UAE (ATIC) buys AMD (which cannot compete in a 2 horse race). Then buys Chartered (a company that lost money for the Singapore government for 20+ years) - I worked next door to Chartered for 7 years watching Chartered never make a profit while the government financed more fabs. Then the straw that broke the camel's back: GF gets paid a billion+ dollars to take over money-losing IBM fabs. Next, put a bunch of money-losing IBM people in charge of making a profit. Promote an ex-IBMer to CEO, and now the camel's back is finally broken after 10 years of losers padding their pockets with oil money. Whatever happened to building that fab in the UAE to employ the people at home??? The UAE still has the $$$$$$$$$$$$ to listen to BS = break up and sell off the assets for the least loss. Nothing like paying people to convince you to buy high and sell low so you will lose less of your country's money. The Chinese have a saying: the cow who drinks its own milk will not thrive. The UAE should have told everyone in management you have 3 years to make a profit or you're terminated (period), especially the original FAB 8 HR director.