Graphics card prices really are something. In late 2019 I purchased an Nvidia GTX 1060 used for $120. Now just yesterday I sold that same card with an additional year of wear on it for $220. This is a four-year-old card design that can barely run modern games at 1080p/60fps/medium settings.
I'd also been keeping track of used i7-4790k CPU prices, looking to upgrade from my i5-4670k. They started out at $120-$130 earlier in 2020 and are up to $160-$170 now. For an eight-year-old CPU!
It used to be the case that computer components aged like milk, price-wise, but with the death of Moore's law in the 2010s that is less and less true. I remember RAM prices spiked a couple of years back too.
Huh, wow. I think I have a 1060 in my closet. I bought two for $100 each in 2019. I was planning to use one for GPU passthrough to a VM and could never get it to work properly.
I have been looking to upgrade my i7-4770k rig to a Ryzen 5900X, but after seeing this insanity and paying attention to what the manufacturers are being vague about, it looks like I'll be waiting until June (or later). By then, there may be an entirely new generation of chips ready.
CPUs aren't too bad. I've got a 5950X I bought at MSRP, and I just found a site with a 5800X in stock for MSRP. The 5900X on the same site is backordered; I didn't see how long the wait is, but I doubt it's months, considering the 5800X was in stock.
Intel is releasing Rocket Lake shortly, but it won't have any options with more cores than the 5800X. Zen 4 isn't launching until late fall. Not sure availability will be any better on either of these launches anyway.
GPUs are a bit more ridiculous. You can get one if you don't need it to show up tomorrow, but getting one at a reasonable price any time soon is much more of a challenge than doing the same for a CPU. That said, I did manage to get a reasonably priced 3070 in a build early on, but I put that down to luck more than anything.
There's okay stock for the 5600X and 5800X, but the 59xx series is nowhere to be found and probably won't be until later this year, if you read between the lines of what the industry is saying.
I broke down waiting for a 5950X and bought one off a scalper last week :/ luckily got it before the recent spike in price.
I want a 5900X, but I will end up buying another Intel CPU if AMD can't get their act together and provide stock that I can buy without jumping through hoops or encouraging scalping.
That is the interesting thing about AMD on the GPU side and Intel on the CPU side: demand so far outstrips supply that products with worse performance per dollar are still viable, because MSRP has gone completely out the window.
>Now just yesterday I sold that same card with an additional year of wear on it for $220. This is a four-year-old card design that can barely run modern games at 1080p/60fps/medium settings.
A 6GB version of it would generate about $3/day mining Ethereum as of today, so it would be a sweet, even by cryptocurrency standards, ROI of ~70 days. The rise of Ethereum prices makes even the 2-3x GPU price growth of the last couple of months still a good deal. If anyone wonders where all the GPUs are, I think that provides a pretty good idea: https://cdn.videocardz.com/1/2021/01/GeForce-RTX-3080-Mining...
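For anyone who wants to sanity-check that payback figure, here's the back-of-envelope arithmetic as a tiny Python sketch. The ~$3/day gross is the number quoted above; the card price, power draw, and electricity rate are my own assumptions, so treat the output as a rough illustration rather than a mining calculator:

    # Rough mining-ROI arithmetic; all inputs are assumptions except the
    # quoted ~$3/day gross from the comment above.
    card_price_usd = 220.0            # roughly what a used 1060 6GB sells for now
    gross_usd_per_day = 3.0           # quoted Ethereum earnings for this card
    power_watts = 90.0                # assumed draw while mining Ethash
    electricity_usd_per_kwh = 0.12    # assumed residential rate

    power_cost_per_day = power_watts / 1000 * 24 * electricity_usd_per_kwh
    net_usd_per_day = gross_usd_per_day - power_cost_per_day
    print(f"net ${net_usd_per_day:.2f}/day, payback in {card_price_usd / net_usd_per_day:.0f} days")

With those numbers it comes out to roughly $2.74/day net and an ~80-day payback, in the same ballpark as the ~70 days above. Even doubling the electricity rate barely moves it, which is why every spare card is mining.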
I built a computer with an AMD 5600XT over a year ago and was having a hell of a time getting it to not crash, so I bought an Nvidia GeForce RTX 2060 to maybe replace it with (which apparently scared the AMD card into working properly). I feel like a fool right now for not just sticking it in my garage for a year.
Ha! I do believe they cleaned up the drivers roughly around that time. I got my 5600XT in April 2020 and have not had any issues with it. Of course there's luck of the draw, too, but I read a lot of early adopters of the 5600/5700 cards had some issues with drivers.
Looking on eBay, it seems I could sell that $290 card for about $500...
I sold a 10-year-old Asus Rampage III Black X58 motherboard last month on eBay for north of $650, which is more than what I paid for it new (even after accounting for inflation).
Granted, this was a limited edition board, but it is still 10 year old tech!
CPUs are immortal, it's the motherboards that die. If you want to run a certain chipset, after some amount of time it's motherboard availability that will keep you from doing that!
A motherboard has many more discrete components, each tested less rigorously than a CPU (which goes through serious QA and binning). For example, capacitors on motherboards break or explode, and PCB traces degrade or crack, far more often than a CPU fails.
CPUs have a lower failure rate than motherboards, which makes sense if you think about it. (One is an IC, one is a complex assembly of many ICs and passive components.)
Anecdotally, I'm still running a computer I mostly bought in 2011, i5-2500K still going strong, but the motherboard has a few issues: one of the PCI-E slots works sometimes, one of the RAM slots is completely broken, and the RTC seems to lose a few minutes a week.
CPUs should go pretty much forever, provided they're run on clean power. They're a little square, entirely encased. Motherboards are bigger, with physical connectors you need to apply force to.
I'm also on an i5-2500K from 2012; I've only upgraded the GPU so far. I recently ordered a new PC for roughly the same price as the one I built in 2012; curious to see how much of a difference I'll notice day-to-day.
Back in the day the story was that a knockoff manufacturer of capacitors used an incomplete recipe—the caps started out fine but didn’t last. Wikipedia says the story is unverified but still includes it under the “Investigation” section:
Anecdotally, I built four computer labs at a college with Dell OptiPlex desktops, and a year later, 30 of the 104 had had their motherboards replaced due to blown caps. And at the time, the nearest certified tech was 70 miles away, over a mountain pass.
No moving parts, small and stable voltages, relatively stable temperatures, no capacitors to experience electromechanical strain. There's essentially nothing that can "wear out" in a CPU. Basically the only things that can degrade one are running it on a noisy voltage source (unlikely behind a power supply) or repeated thermal cycling: turning it on, heating it up a lot, and letting it cool, over and over. But if you keep a CPU well cooled, or just run it constantly, it will last essentially forever.
This isn't true. Aging is now enough of a concern at 16/7nm that extra timing margin is added specifically for it. But all that margin does is delay the impact beyond the realistic lifetime of the chip (normally approx. 10 years for consumer stuff). Sometimes you can't afford the extra timing margin and have to use anti-aging circuitry instead.
I read that the reason Intel switched to LGA (pads on the chip, pins on the motherboard) from pinned packages is that the chip is more expensive and less likely to die than the motherboard, so it makes more sense to put the fragile part of the interface on the motherboard.
I have a 4770k and, because of the shortage, rock a 3060 Ti I got out of sheer luck. My friend may have snagged a 5950X, but until it ships he's not taking any chances ordering other parts. He's been trying to get one since launch.
CPU pricing is weird. If you look at the 6th-10th gen Core i7s, they are not too far apart in price [1]; for example, the 6700K is currently $272, and for ~$30 more you can get the 9700K. The current generation usually commands a bit of a premium.
The 6700K has the same "most powerful CPU for its socket" situation going on. When I was looking to upgrade my 6600K, it was absurd how little extra it was to get a brand new Ryzen and motherboard compared to the 6700K.
> This is a four-year-old card design that can barely run modern games at 1080p/60fps/medium settings.
Nonsense. It runs all games on High/60fps or better, with the possible exception of Cyberpunk and RTX marketing gimmicks. That's why it's still a relevant card 4.5 years later and goes for $200+. I own one.
> I'd also been keeping track of used i7-4790k CPU prices, looking to upgrade from my i5-4670k. They started out at $120-$130 earlier in 2020 and are up to $160-$170 now. For an eight-year-old CPU!
Funny, since I own one as well (a 4670k), and I haven't ever felt CPU-limited. Mine is mildly overclocked to 4GHz, and it's been able to crunch everything I've thrown at it since I bought it in 2014, again with the possible exception of HEVC encoding. It's still a very relevant CPU and handily beats 1st-gen Ryzen, for example.
It's also the last generation of Intel CPUs that lets you run a non-spyware version of Windows.
You pretty much need ray tracing for reflections of objects that are not directly visible; hacks like SSR cannot give you that. Even if you don't care about graphics, RT can absolutely have an impact on gameplay too. And if you do care about graphics, RT makes realistic lighting much easier and far less restricted. Modern games have to fake so many effects just to make things look good, effects that come for "free" with RT.
Now, the current iteration of RT graphics cards is still very limited, so calling it a gimmick is a bit more warranted. But the hardware will improve (and already has since the first RT generation), which will allow more games and engines to be built around RT instead of having it tacked on as an option.
I bought twelve 2070s for our render farm, and the account manager thought I was crazy. The 3000 series had already launched and would be back in stock next month, he said. Who's laughing now? I don't think they ever got enough stock of any card to serve our needs. A week later they didn't have any Nvidia cards in stock, just none at all; luckily I had already bought all their 2070s or we would've run into delays.
One month of twelve 2070s today produces a hell of a lot more rendered frames than waiting until next month to maybe get our hands on something. Let me just run and ask the client if we can push their deadline back while I'm at it.
I have a 4790k; it's still a pretty good workhorse, but it's finally getting to the point where it'd be worth going to an older Ryzen instead. The pricing is similar and you get faster memory, etc. You do need a new mobo/RAM, but still.
Yeah, I was weighing just upgrading the whole motherboard to a Ryzen chipset or something, but I have an ITX build, so it would be pretty expensive for the features I want.
Transistors have gotten smaller and transistor counts have continued to rise over the last 10 years. You might be thinking of frequency scaling, which is not the same as Moore's law.
Moore's Law isn't dead. Dennard scaling died, Intel's process hit a wall, and CPU architectural improvements stalled while Intel was stalling, but transistor density improvements have stayed steady year after year since 1970.
No, the parent is right. Moore's law is about the transistor density of the least cost process. This is a quote from Moore's paper [1]:
"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year [...].".
When the cost part became hard to sustain, the "cost side" was progressively dropped by getting more and more creative about which costs to count: first ignoring NRE to look only at the marginal cost per transistor, then, when even that got problematic, ignoring cost entirely.
The "true" Moore low creates a virtuous cycle: everybody move to the next node as it's better and cheaper, so there's no point to stay back. Also, because cost gets down the addressable market gets bigger, sustaining more costly fabs. This virtuous cycle operated for a long while, so much that many takes it for granted now.
But if Moore's law dies by no longer lowering cost (the other way is by no longer increasing density, which is still progressing, if slower than before), this virtuous cycle is broken. You can get higher density, but at a higher cost. With costs increasing, only a subset of the market will move on to higher density. Little by little, the market supporting the costlier high end gets smaller, while fab costs keep increasing, contributing to slower progress too. The virtuous cycle becomes vicious. This is happening now, although it's not too bad yet, with enough very large actors able to sustain higher prices (thanks, Apple; I don't use your products but do benefit from all the money you're pouring into TSMC ;).
And in the end, if this becomes a serious slowdown, there is the specter of commoditization. It won't happen soon, only after a possibly very long transition period, as the tech is really hard. But if progress slows a lot, the laggards may eventually catch up. With more competition would come lower margins, which is definitely bad for stock valuations. We're still far away from this, but it's a possibility. Look how the process leader has always been the biggest cheerleader for "Moore's law is still alive and well" (and let's not talk cost between friends, shall we?). See how it was Intel when they started selling their 10nm future, how much more discreet they are now, and how TSMC has replaced them as the "everything is fine" voice.
I hate that you're getting downvoted for this. The original paper [1] isn't even four pages long, yet most haven't bothered to read it. If they had, they'd see he was speaking about fabrication costs. In particular, it's this passage that got distorted by the media:
"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000."
Somehow what became known as "Moore's Law" excluded any reference to costs. For a similar distortion, see the "Turing Test".
That's not really true. Moore's 1965 paper was the first reference to periodic doubling in ICs, but nobody called it "Moore's Law" then. Moore put out a few other papers; then, at the conference where Dennard gave his famous presentation on scaling laws, someone coined the phrase "Moore's Law" in an interview. In that interview, and in usage since, it has always been ambiguous exactly which doubling it refers to, because until the breakdown of Dennard scaling all the good things doubled together.
No, Moore's law is transistor density (technically the ‘Number of Components Per Integrated Function’) at the minimum cost per component. Since cost per transistor is still falling on newer nodes, it holds.
Cost per transistor is still a relevant property, but it's harder to analyze since cost numbers aren't public, verifiable figures, plus the estimates I've seen were heavily criticized by people with better information, but from what I can tell it would just offset the rate of growth, not negate it.
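To make the "minimum cost per component" framing concrete, here's a toy calculation. Every number is a placeholder I made up, since, as noted, real wafer prices aren't public; the point is only the shape of the comparison:

    # Toy cost-per-transistor comparison between two nodes. All figures are
    # invented placeholders, not real foundry quotes. NRE (mask sets, design
    # cost) is deliberately ignored, which is exactly the simplification the
    # thread complains about.
    def usd_per_billion_transistors(wafer_cost_usd, mtr_per_mm2, good_mm2_per_wafer):
        transistors = mtr_per_mm2 * 1e6 * good_mm2_per_wafer
        return wafer_cost_usd / transistors * 1e9

    mature = usd_per_billion_transistors(3_000, 30, 55_000)     # hypothetical mature node
    leading = usd_per_billion_transistors(16_000, 170, 50_000)  # hypothetical leading edge
    print(f"mature: ${mature:.2f}/Btr, leading edge: ${leading:.2f}/Btr")

With these made-up inputs the two nodes land within pennies of each other per billion transistors, which illustrates why the "is cost per transistor still falling?" argument is so hard to settle without verifiable wafer prices.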
That is really interesting. I've got a GTX 1060 3GB that I purchased for $280; looks like I could reasonably sell it for $200 CAD on eBay. $80 for 3 years of GPU usage wouldn't be a bad return.
But what are you going to do in the meantime? It's like people happy that their houses are increasing in value, but so do all the other houses. You've still got to live somewhere.
Interest rates have gone down though. Presumably you sell your house and break the current 3.5% mortgage. Then buy a bigger / more expensive house and lock in at 1.5%. Overall your monthly payments remain similar.
This is not a good way to measure things. Did you just exchange 10 years of monthly payments for 30 years of "similar" monthly payments?
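To put numbers on that, here's the standard amortization formula as a quick sketch. The 3.5% and 1.5% rates come from the comment above; the principals are invented for illustration:

    # Standard fixed-rate mortgage payment: P * r / (1 - (1 + r)^-n),
    # with r the monthly rate and n the number of months.
    def monthly_payment(principal, annual_rate, years):
        r = annual_rate / 12
        n = years * 12
        return principal * r / (1 - (1 + r) ** -n)

    old = monthly_payment(300_000, 0.035, 30)  # hypothetical original loan
    new = monthly_payment(400_000, 0.015, 30)  # pricier house at the lower rate
    print(f"old: ${old:,.0f}/mo, new: ${new:,.0f}/mo")  # ~$1,347 vs ~$1,380

So yes, the payments are "similar", but in the second case you owe $100k more and the 30-year clock has restarted, which is exactly the trap described above.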
My mother protested to me over the cost of her new house, defending the idea that she could afford it by pointing out that monthly payments were similar -- property taxes on the new house were about the same size as mortgage payments on the old one. I was kind of horrified by this; conceptually, when you make a mortgage payment, your net worth goes up. When you make a tax payment, it goes down.
This is how we get a housing bubble. The expectation of never ending credit that gets cheaper and cheaper. One day interest rates will go up and your house will not be as valuable as your mortgage says it is.
Well the 4790k is special in that regard. It is the best processor you can buy for the LGA1150 socket. I had my eye on it too; my plan was to get the biggest performance boost for the smallest capital expenditure. The 4790k would have been a simple swap-in, giving me an at least 33% faster processor.
But instead of plunking down $140 for one, I got a used workstation with a Xeon E3-1271 (the second-fastest Xeon without an iGPU) for $180. It included other parts, which made the upgrade free (if you don't count the work I did) once I sold my old CPU+mobo+GPU. The GPU I had (an RX 570) was at the same price level on eBay as when I bought it.
If you badly want to upgrade (from a 4670k the delta is way smaller than from the CPU I had), I'd suggest looking for an E3-1281 or 1271 if you have a dGPU; if you need an iGPU, replace the last digit with a 6.
> Well the 4790k is special in that regard. It is the best processor you can buy for the LGA1150 socket
Little-known fact (and the chip is even less available), but the best processor for the LGA1150 socket is not the i7-4790K, it's the i7-5775C: higher IPC, a significantly bigger cache, and a stronger iGPU, and despite not carrying the K suffix it's overclockable. https://www.pc-kombo.com/us/benchmark/games/cpu/compare?ids%... shows a comparison with many gaming benchmarks.
The Xeon route can be cheaper though, it's a good alternative.
> You can't just swap it out too, you need a mainboard with a 9-Series chipset, which you usually don't if you are running a Haswell.
There are many Z97 boards out there currently running a Haswell processor, and all of them support this Broadwell alternative.
> Well, it has the faster iGPU thats for sure. But if you use it for work it is usually slower:
The benchmarks pc-kombo collects do not use the integrated graphics; that's all CPU performance, usually with a high-end GPU.
The 5775C not only has better IPC, it has 128 MB of eDRAM as an "L4" cache. That's just cool and helps some applications a lot. That it is slower in some benchmarks is just down to the lower turbo clock. Overclock it and it will always be faster than the 4790K; it likely won't reach the 4790K's turbo clock, but it just needs to come close, and together with the higher IPC that should be enough. Then you get the performance advantages the gaming benchmarks show and match the 4790K in the multithreaded application benchmarks.
It's just seldom available, but sometimes when searching for used processors like the 4790K, a 5775C turns up and can be had for less.
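A rough way to see the overclocking argument is to treat single-thread performance as IPC times clock. The ~5% IPC edge for Broadwell here is my assumption for illustration, not a measured figure:

    # Toy model: single-thread performance ~ IPC x clock. The Broadwell IPC
    # uplift is an assumed illustrative figure, helped in practice by the eDRAM.
    haswell_ipc = 1.00
    broadwell_ipc = 1.05
    i7_4790k_turbo_ghz = 4.4

    # Clock the 5775C would need to match the 4790K under this model:
    required_ghz = i7_4790k_turbo_ghz * haswell_ipc / broadwell_ipc
    print(f"{required_ghz:.2f} GHz")  # ~4.19 GHz, vs the 5775C's 3.7 GHz stock turbo

Under that (simplistic) model, a mild overclock to around 4.2 GHz already matches the 4790K at stock, before the L4 cache does anything for cache-sensitive workloads.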
Sure, there are many available, but it adds cost. If you don't want the hassle of swapping mainboards, the 4790k is a drop-in replacement that gives you 4.0-4.4 GHz out of the box.
Only if you have a Z87 board. If you have a Z97 board, the 5775C already is a drop-in replacement. Z97 sold like crazy; it's a common starting point for upgrades right now. Just to make sure that comes across :)
Mindfactory is one of the biggest hardware sellers in Germany. They always have a budget graphics card for around 50 EUR on their front page, but recently they have been replacing it every week or so with an older and worse model at the same price. Right now they're featuring the 7-year-old GT 710! Two years ago, I'd have thought this could only happen in a total economic collapse.
GPU prices are really insane. 13 years ago I bought a brand-new Radeon HD3850, which at the time was one of the best GPUs available; the budget version of a line where the HD3870 (and later the HD3870 X2) was the high end.
I paid €128 for it.
That's not going to get you anything today. Three years ago I paid €350 for a GTX 1050 Ti. Now everything new starts at €500, and everything older is a waste of money, says everybody.
The GPU improvement rate is stalling (or, more likely, pixel-count growth is slowing), so there's less visual improvement to be had easily? And given that cryptocoins created a new use for GPUs, we get price inflation even in "old" cards.
GPU improvement rate does not look to be stalling. Nvidia is still introducing impressive new features (support for HW accelerated ray-tracing, NN acceleration for stuff like DLSS, etc.) while steadily improving performance. It's just getting harder to make games look nicer, because they already look really nice. Artists/Graphics devs have learned to do really impressive things using just rasterization. As someone who is interested in GPU programming, ML and rendering, I personally find it really impressive what Nvidia has been doing over the last couple of years.
Though I have to say, for all its faults, CP2077 really blew me away graphically (with RTX) like no other game has so far. VR would also really benefit from better GPUs, and 4K gaming is not quite there yet. So actually, maybe I wouldn't even say that; performance improvements are still very welcome.
The problem is more that there is unprecedented demand for compute. ML, crypto mining, gaming being more popular than ever with corona, etc. mean demand is higher than ever. Combine that with the new console launch and exhausted semiconductor manufacturing capacity and you can easily see why customers are biting the bullet and going for older, less pricey cards.
4K gaming is fine unless you brute-force lighting with ray tracing; even Cyberpunk, which is very unoptimized, manages. VR is a separate story: it needs both high resolution and high frame rates.
8K is not handled by current GPUs.
DLSS is a cute hack. We have better scalers than that in the video market; its main advantage is speed. A fixed-function scaler should do better than an outdated 5-tap filter. Remind me again when it matches the basic scalers put in smart TVs.
Gaming GPUs are hilariously underpowered for compute, even something like the 3090, vs. an MI100 or A100. It's just that the latter cost $10k per unit.
I’m just on 1080p and a GTX 1070, so haven’t experienced it for myself, but from Digital Foundry’s breakdown of DLSS 2.0 in Control, it sure looks to be more than a “cute hack”.
Are there specific chips, or products from particular companies or geographic regions that are in short supply, or is this just a general lack of fab capacity?
Source? Ajinomoto Build-up Film is used only for high-end flip-chip packaging. Very few automotive ICs use such advanced packaging. Most non-EV cars probably don't have any at all.
Certain manufacturers have been hit hardest, NXP and Infineon for example, at least partly due to an increase in demand for automotive components. It seems as though motor control and 'connected home'/IoT devices may also have contributed to the shortages.
Whether due to maxed-out fabs or shifts in uC/uP usage, the shortages are widespread.
I asked this question the other day. From what I heard, it varies, but for some items like microcontrollers the lead times are out to 52 weeks.
There's a lot of noise about the shortage of high end parts from TSMC, but I'm much more concerned about shortages that can impact more common consumer goods. Having issues like this when the economy is trying to recover from a pandemic could have far reaching consequences now that so many products contain semiconductors.
From what I gather from my sources on the procurement side, there is a general shortage that affects certain chips more than others. Say you are using a chip that has only a handful of customers; fabs aren't going to make your part when they have higher-volume chips they can run instead.
It's stressing me out. I'm an EE at a startup and I'm scrambling to pre purchase whatever's left on the market to get us through the next 6-12 months. Thank goodness I already laid in a huge stock of processors, the parts I'm scrounging for right now are for battery management, signal isolation, and switching voltage regulator control.
I was doing electronics design and prototyping a couple of years ago when MLCC capacitors were getting impossible to find in common values. I feel your pain.
How long until the TSMC in Arizona gets built? Or other similar things.
I feel like we need to find peaceful ways to resolve our severe differences with China. Either that or stop depending on China's neighbors for critical technology.
I just hope that there can be some kind of progress with the new administration in Washington.
I was not trying to get into the genocide thing. I just know that Taiwan and South Korea are very close to China, so I feel that even before China builds up its military much more, the proximity of those places to China is a pretty significant military issue for the US and its allies. It's hard to see how it does not provide military advantages.
More generally, it seems like we want to hold out hope that serious military confrontation can somehow be avoided down the line. But if not, chip production is a fairly key item that you don't want happening on your adversary's doorstep.
I just need a GT 1030 for an old PC, and the prices seem outrageous for such a low-end card. It's more expensive now than it was at release almost 4 years ago!
The really scary thing is that we're down to TSMC and Samsung as the only ones who can hit the latest node; that seems to be the problem. The cost of building fabs is insane; it looks like we really need research into bringing costs down.
We are doing research into cost. This stuff gets massively cheaper every year. It's cheaper and easier than ever to build a 16nm fab.
Profit oriented companies take those cost savings and choose to drive further down the technology stack (to more complex and expensive nodes), where there are fewer competitors that have enough profits to build competing production lines.
Moore's law is almost our friend here. Being at 3nm instead of 16nm is decreasingly important.
> It's cheaper and easier than ever to build a 16nm fab.
Is anyone doing ~16nm microprocessors besides Samsung/TSMC/Intel/GloFo? I wouldn't call it "cheap and easy" if only they can do it.
(After a quick Google, it seems that China's state-owned SMIC is also starting to do ~16nm, but that's all I could find, unless you mean memory and not processors).
"it's supply chain limitations by design, because the customers told us, we don't need them, and then coming back and said, oops! we might have been wrong."
According to the CEO, it's due to customers thinking it was going to be bad (for various reasons), and then it wasn't. The processes have long lead times, and when you pause them you can't just expedite them later; they have a fixed lead time.
[edit: this post appears to be getting a lot of downvotes. please consider replying if you disagree]
> The shortage is amplified by the us china tech war
I keep hearing this vague accusation, over and over, with no evidence provided.
Taiwan is not part of China -- at least as far as the sanctions are concerned.
China exports lots of finished electronics but very, very few semiconductors -- and virtually none for the automotive sector. SMIC is not competitive. They're kept afloat by the Chinese government and supply almost exclusively for domestic products.
> that forced key supplier asml to limit deliveries of chip making equipment.
Source?
> Also tsmc cancelled euv equipment orders due to the huawei ban:
How would that make any sense? TSMC is completely booked to the hilt right now. Would any sane person think, "gee, what a great time to cancel our expansion plans"?
That $150 million ASML EUV scanner order that got cancelled was from SMIC, not TSMC:
I'm not going to read the whole thing, but the word "Huawei" appears only once on that webpage, as part of an anonymous blog commenter's assertion. That is not a credible source.
Cancellation of production facilities at TSMC due to Huawei/HiSilicon ban:
"... as clearly our key foundry customer came back and said, listen, our key customer for N3 is now blacklisted. So we cannot ship. So we need to adjust our 2021 outlook for EUV systems."
Could sell more had it not been for the US sanctions:
"... but clearly see potential upside to these numbers where we can disregard any further impact of export control regulations resulting from the current geopolitical situation."
The CEO is a businessman, and for him to do any kind of politics, or worse, geopolitics, would hurt his business. So he obviously cannot just say: "Listen, let us sell a bunch of these bad boys to the Chinese and the chip shortage will be a thing of the past in 6 months." Even though that just might be the case.
Current shortages impact processes that aren't EUV. For example, the consoles, AMD CPUs and GPUs all use non-EUV N7. And Nvidia is on Samsung's 8nm, also not EUV. There are delays and shortages all over the industry if you read DigiTimes. Packaging in particular. Capital and lead time sensitive industries like this suffer greatly when demand increases unexpectedly.
Second, that linked forum thread is really unclear. Some even think Intel was the customer who delayed EUV machine orders (which would make sense given their 7nm problems).
I've noticed that used video card prices have skyrocketed. I put up an eBay auction for a GTX 1060 6GB and it ended up selling for twice what I paid for it used two years ago. Also, lots of people are DMing me asking if I have more to sell. Is there some mining craze going on with old cards?
And mining. Ethereum is still often mined on video cards, and in the wake of Bitcoin skyrocketing, ETH prices are way up too. So it pays more (in dollars) to mine now.
> Is there some mining craze going on with old cards?
A combination of the mining craze and massive demand for entertainment electronics and GPUs, since everybody has been stuck at home due to the pandemic and decided to build or upgrade their computers.
This has also led scalping outfits to extend their services from covering mostly designer apparel to electronics like GPUs and consoles. The popularity of this also attracts a lot more people to use these services as a form of investment, which most certainly does not help an already supply-constrained market.
According to the efficient market hypothesis, if there's profit to be made and those companies aren't taking it, someone else will. Have you considered eBay?
What are you talking about? Aren't scalpers doing exactly what efficient market hypothesis would suggest by buying products that are underpriced (graphics cards, consoles) relative to demand and selling them at the highest price people are willing to pay?
They only become that valuable when you completely restrict the supply by scalping them. If I went to the shop, bought all the food, and resold it at a higher price, that's not 'the market at work'; I'm just being a dickhead.
This! Particularly when we've reached a point where "scalping as a service" has become a thing.
There are outfits out there that charge a monthly subscription for access to prioritized acquisition channels in combination with bot networks.
This is streamlined to such a degree that people don't even need to hold physical inventory themselves: the scalping outfit handles all of that for them. All they need to do is supply the capital for the bots and set the parameters for buying and selling; everything else is handled by the service.
What you are describing is market manipulation, not scalping. How many scalpers do you think truly have $30 million to buy 100,000 GPUs?
The situation is far simpler. Consumer demand exceeds supply which drives prices up. Some people figured out where the current equilibrium price is and it turns out that it is really high.
Those high prices encourage consumers with old GPUs to sell them on the market, providing more supply, instead of throwing their old GPUs away every time they buy a new one. If you were to artificially restrict the price, fewer people would sell their GPUs and you would end up with fewer GPUs for everyone.
If the manufacturer is selling a product that is worth X at market rate for a lower price Y then it basically gifted the difference X-Y to the buyer. If the manufacturer is selling the product at market rate prices they will have higher profit margins which encourages them to build a slightly bigger fab next time. It would also ensure that existing manufacturers make enough money to continue running their business despite the paper thin margins outside of a shortage.
We're obviously not in an efficient market, because supply is limited and market entry is nigh-impossible for truly new suppliers, and also because scalpers are viewed as evil and explicitly acted against.
It's already stalled onboarding of HFC customers in Australia due to a lack of modems. Apparently the modem supplier (Arris) is unable to source enough SoCs from Broadcom.
Due to this, people needing replacements and any in-flight new orders apparently have wait times of up to 9-10 weeks...
Not helping this is that some people try to sell them on the secondary market despite them being premises-locked (or rather segment-locked).
For others who didn't know (like myself), HFC in this case refers neither to Henry Ford College nor to hydrofluorocarbons, but to hybrid fiber-coaxial networks:
I would think "right to repair" has a lot to do with this. When you have to replace everything only after a few years, a lot more chips would be needed.
Maybe this will somehow stop the insanity of premature obsolescence. A 6-year-old Android tablet is, hardware-wise, in great condition. The games that still happen to install work great.
However, the last software update was about 4 years ago.
The tablet has storage expansion, but due to artificial limitations in Android, only a fraction of software will utilize the SD card.
These are only minor examples.
I could rant for hours on the "who cares, it'll be garbage in a year" attitude that has invaded much of the tech world, but others have done a better job. I myself am sometimes party to this attitude.
It all results in a lot of waste of all kinds and also bad user experience.
To look at the silver lining, maybe this scarcity and some other trends will produce a small corrective trend in engineering and management.
In the meanwhile I’ll keep dreaming of a new gaming PC for a year or two.
It's small, and on the grand scale probably negligible, but Fairphone, for example, is making phones designed to be repaired and maintained. They are growing fast, year over year.
If anything, such companies show there is demand and money to be made, there.
I certainly stress a lot less, since I decided to get off the hype and upgrade train. I've got 600+ games on my Steam account that run perfectly on my PC, even though most of the hardware is 2011 vintage and the only "new" hardware in it is an SSD and an RX560.
Yeah, although I'm getting a little nervous about failure-rates now that some core parts of my system are in the 8-10 years old range. (Basically everything but the video card and SSD.)
Particularly the factory-sealed liquid-loop, which has both moving parts in a pump and gradual liquid chemical degradation to consider. On the flip-side, that particular component seems like it would be less sensitive to silicon-fabbing issues.
My view is that if a component fails, I can either find a second-hand replacement or buy a refurb PC and move the necessary components over. I can get by with just my laptop for a while; if I needed a PC for work purposes, that would obviously change the equation.
> In the meanwhile I’ll keep dreaming of a new gaming PC for a year or two.
I was finally looking at replacing the 8-ish-year old parts of my gaming PC which has otherwise been running like a champ (Intel 2500K), and now... Well, I guess I'll just focus a lot on pixel art games :P
I was going to post much the same. To add to this: everything is deprecated so quickly now. I like to use my phones until they break or I'm almost forced to replace them; I'm just not inclined to buy a new model every year. Same with my laptops, TVs, etc. I have an old Sony Ericsson which I can tell they no longer want to support: the updates are coming much more slowly, and I swear each one slows the device down (but perhaps that's paranoia). I have an old ThinkPad that runs a rolling-release Linux distribution, so I don't need to upgrade all the time, and my TV is 720p and non-smart, which looks fine to me. I am honestly happy with this lot, but the manufacturers most likely see me as a bad customer.
I think you can go too far in that direction. A 720p TV doesn't look nearly as good as Full HD or 4K. I remember watching a shot of a sky filled with stars on a Blu-ray movie and having to pause it, just because it was that breathtaking.
There is nothing wrong with a solid ThinkPad if you can replace the battery so it still works well, but my days of jumping from outlet to outlet because I got 20 minutes on a full charge are over. So are the days of carrying a heavy laptop with a bulky charger.
I bought a OnePlus phone some years ago, and I paid too much for it because I wanted a "great phone". I don't regret the ultra-fast charging and day-plus battery though, and my next phone will probably also be a OnePlus, just a much cheaper model.
So if there is something to sum up this rambling comment, it's probably this: know what you are missing in your current tech, and upgrade when it makes sense for your personal use case.
That's what happens when you centralize silicon production among a few players. Integrated circuits are a requirement for modern life. Every country should have its own fab, even if it isn't competitive, to cover its own needs during shortages like these. What if the shortage were longer and more severe? Are people in smaller countries willing to go without electronics?
Smaller fabs do exist, but they don't enjoy economies of scale unless they can project very strong demand down the road. Nobody invests in a venture guaranteed to lose a lot of money.
Another issue more pertinent to automotive manufacturing is certification. For example, car makers once had to wait almost a year for Renesas to rebuild their facilities in Naka following the 2011 earthquake despite the company operating fabs in multiple countries. The latter were perfectly capable of producing the product in demand but the cost of certification is so prohibitive that moving production elsewhere was never brought up as an option.
Yeah, but 20 or so years ago they realized that luring new grads with foosball tables and free soda to work on getting people to click on ads is a faster and easier way to make money than making chips, which is a slow, capital-intensive business where you need to think long-term and respect the graybeards. So there you go.
I wonder what kinds of chips would result from minimizing capex instead of transistor size. Some people are targeting DIY garage fabs, but from what I know they basically target old processes and old chips. I wonder whether, once that picks up speed, current knowledge, techniques, and material availability (all of which differ from the 70s) will produce some proper DIY innovation...
I bet it's still cheaper to roll the dice at a 1 in 50 chance capacity is down 20% for a year than overbuild fabs. They're semiconductors, not staple foods.
Not really. People thought that countries were too interdependent to wage war, but they were wrong, and interdependence did nothing against the myriad reasons for the war (rivalries historic and current, Russian development (which put Germany's war plans on a timeline), Austria-Hungary falling apart and trying to compensate, the UK being skeptical of Germany, France wanting revenge, etc., etc.).
Fabs already run 24/7. Decentralizing them across more countries wouldn't automatically give you any spare capacity.
As far as I see it, we need more capacity, and for a situation like this it wouldn't matter if that extra capacity was in 50 different countries or if it was all in a 50 km radius in Taiwan.
It would be nice if the silicon required to run critical infrastructure were manufactured locally, or at least not by an opponent in a (cold or economic) war. iPhones are pretty low on my list of concerns, but I would like to have working power plants, power distribution, and communications.
Power plants, power distribution, and other industrial tasks outside of IT don't require latest-generation parts. There are dozens of fabs in many countries that make ICs on older nodes for these applications: cheap, proven, and quite ubiquitous. Anyway, anything safety-critical would likely require re-certification when ported to newer hardware.
Edit: just realized that the supply of these ICs has stalled as well. Right now it's hard to tell whether these supply shortages are just due to temporarily increased demand or not. If they're not, it is still easier and cheaper to set up fabs for older nodes.
I agree, but I worry that some of this capacity will go away because it's not profitable enough. I also worry that although the core functionality of, say, a water treatment plant may run on quite old and easily sourced hardware (most likely Siemens PLCs where I live), some of the monitoring and coordination stuff depends on latest-gen hardware.
At least Taiwan isn't a country in any kind of war (cold or hot) with the west. They are, however, in a cold war with China.
The current technical situation does make peaceful cross-strait relations critical to global technology, however. A hot war between China and Taiwan would send the silicon state-of-the-art backwards by quite a bit, as would any trade embargo post-annexation if China somehow accomplished that peacefully.
Taiwan's silicon fab prowess is a pretty potent strategic weapon against a much larger adversary!
I'm in the market to replace our 11-year-old car, and the supply of what we'd been looking at has just dried up. Best case seems to be maybe the summer, but at that point I'd almost rather wait for the next model year.
Presumably this will be good for the foundries and makers of industrial semiconductor machinery, and bad for fabless companies. Hard to know the time horizon, however.
Well, this is really a long shot, but the chip shortage is causing production cuts across the car industry. Roughly half of all palladium is used in catalytic converters. The cutback in new vehicle production should have a pretty noticeable effect on the price of palladium. If you can time the end of the chip shortage, you can make some money in the palladium futures market.
How far fetched do you think this is?
Big suppliers of ABF substrate include multiple companies from Taiwan: Unimicron, Kinsus Interconnect, and Nan Ya PCB, all of which trade on the Taiwan stock market.
I considered investing in those companies... but it seems like only Nan Ya has been showing an upward trend. Perhaps we are still early?
I can't answer broadly, or about profit specifically, but the stocks of all the big wafer producers have risen sharply since fall 2020, to the tune of 30% or more in a couple of months.
I really wish people would stop butchering the English language.
`Monopoly producers...`
And then it goes on to list two fucking suppliers, TSMC and Samsung. That's not a monopoly, never mind that there are many other major semiconductor companies: Intel, GlobalFoundries, etc.
That is why, when IBM made the IBM PC based on the Intel 8088, they made Intel license the chip design to AMD and NEC, in case Intel could not make chips quickly enough to meet demand.
And people thought Intel's fab problems were going to doom it. Maybe in 5 years, but right now, they've got a product 80% as good as competitors when there's a shortage. It's a compromise a lot of people are willing to make right now.
A car doesn't need a computer; it's more needless complexity and more points of failure. A classic car can't be hacked, locked, or broken by software.
The crisis is a direct consequence of adding useless bloatware to cars, which only need simple microcontrollers, not the latest 7nm CPUs.
No car is using anything even near 7nm. The automotive sector is very cost sensitive and if anything, there’s too much complexity because they insist on using parts that are too old.
The market is dictated by 7nm, as it's the current-gen lithography tech (with some 5nm fabs operating now). You don't see them rushing to build fresh 40/65nm fabs just because cheap processors are in deficit, because it's not profitable long-term.
And still, parts on nodes far larger than 7nm are currently in shortage. 7nm is obviously extreme, due to heavy competition over limited fab capacity, but it's not the only area with problems.
Ha, it's bothering me the way you worded that, since you can't get those chips without the attached hardware and software! But it does get me thinking. Obviously we think of Apple systems as premium, and so we can expect premium chips to get prioritized: Apple likely paid TSMC handsomely to get (most/all?) of the 5nm allocation for the A14 Bionic and M1 chips. But are other high-end chips "premium"? Because the AMD Ryzen 5, 7, and 9 (5600X, 5800X, 5900X, 5950X) are all in very short supply (at least in my U.S. online stores).
BUT... those are just for enthusiast builders. You can find OEM systems with those chips.
So I guess the only lesson is that being a computer manufacturer will give you priority on high-end chips, and in times of short supply, that's where they'll go first, drying up the supply for small buyers.
Yes, prices for low-end AMD processors are crazy and stock is nonexistent. Just before Christmas I decided to build a spare computer and wanted to get an AMD Ryzen 3100; prices for it were hovering around the 99 GBP mark. Then I had to put it off until the new year. Now 3100s are like hen's teeth, and when any show up the prices are around 180 GBP. I'm not that desperate right now; I just hope prices will drop, but the question is when.
The pricing and availability is crazy because the suppliers are afraid to ramp production.
In many categories, like cheap laptops, webcams, and small printers, demand exploded. More Chromebook orders came in during 2020 Q3 than normally come in a year. Laptop LCDs are super-constrained as well. The dilemma is that expanding manufacturing capacity now will reduce margins tomorrow, especially since the order books will empty as COVID fades and schools open again.
My guess is you’ll see prices in free fall as the edu backlog clears and COVID restrictions fade. Probably this time next year.
All these articles seem to miss why the car industry is facing shortages, conflating it with the shortage of high-end silicon devices (which has very different causes).
High-end devices have seen a massive bottleneck at TSMC and in some required materials like ABF, combined with very high demand (the new console cycle, people buying new computers to work and study from home).
The automotive sector suffers from a different problem. For decades all manufacturers have converted to JIT (lean manufacturing), which means most of them counted on almost daily deliveries of parts to keep operating. With the transport restrictions and difficulties in many countries, this has become unmanageable, and factory operators have made the obvious call: increase stock margins. Typically a factory might keep only enough stock for a day of operations.
A manager sees that they will have problems getting deliveries in time, so they increase their orders to cover a week. This depletes the local distributor, which also tries to avoid keeping large stocks. Now many other managers at other companies see that part vanish and, fearing they'll run out, increase their own stock keeping too. With boards carrying 100 to 1000 unique references, the effects propagate from one manufacturer to another, while everyone runs to get enough stock to avoid having to close.
There are no fewer parts than a year ago, but the massive "de-risking" means there's now too much demand to keep everyone happy.
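This dynamic (the classic "bullwhip effect") is easy to simulate. A minimal sketch, with all numbers invented, assuming each tier naively treats the order spike from downstream as its new demand level:

    # Bullwhip sketch: end demand never changes, but every tier raises its
    # safety stock from `old_days` to `new_days` of cover in the same period,
    # and naively reads the resulting spike as real demand. All numbers invented.
    def bullwhip(end_demand_per_day, tiers, old_days, new_days):
        orders = end_demand_per_day
        for tier in range(1, tiers + 1):
            # Pass demand through, plus a one-time buffer top-up.
            orders += orders * (new_days - old_days)
            print(f"tier {tier} orders {orders:,.0f} units/day (steady state: {end_demand_per_day})")

    bullwhip(end_demand_per_day=100, tiers=4, old_days=1, new_days=7)

Four tiers of "cover a week instead of a day" turn a steady 100 units/day of real consumption into an apparent order in the hundreds of thousands at the top of the chain, even though actual consumption never moved.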
Interesting and clear explanation, thanks!
Do you also see this as a possible driver for general inflation? Or is this JIT production mostly restricted to the automotive sector?
On a tangent: is there any good place to learn about the current state of supply chains? I search YouTube occasionally, looking for talks or interviews with people who have in-depth insight into what is happening right now (and over the last year), but I could barely find anything. Shouldn't this be one of the hottest topics right now?!
> Or is this JIT production mostly restricted to the automotive sector?
JIT is a major trend everywhere logistics are involved, but it's most pronounced among car manufacturers, simply because the sheer physical size of many components required for a car would mean the manufacturers would need big warehouses, which they tore down or sold decades ago.
Instead, society has picked up the cost of storage by building ever bigger highways for all the trucks.
Not sure about JIT requiring bigger highways. The components all need to be shipped, sooner (and warehoused) or later (JIT). The same amount of matter is transported either way.
Holy sh*t, it is so obvious. Yet I have been deceived by the media into thinking the evil auto corps jam our roads to save a few bucks. It is very tiring that at this point we have to be skeptical about every statement dealt to us and pick it apart from first principles. I thought that was the job of proper journalists...
It's not that easy. In ye olde times, car manufacturers had rail links for shipping stuff around... they still have these, but trucks are (way) cheaper than rail.
JIT is used everywhere, as it improves financial indicators: companies aren't carrying inventory on their balance sheets. It's also a big reason why companies in an industry cluster together, as proximity to the supply chain de-risks JIT.
The shift to modern inventory management coincided with offshoring. It became easier to source parts in Northern Mexico and now China as a result.
> Or is this JIT production mostly restricted to the automotive sector?
It has more to do with the human side of things. No one wants to take the blame. Or, similar to the old saying, no one gets fired for buying IBM. The concept is a lot easier to understand once you have worked in a large enterprise and witnessed it firsthand.
If you hold too much inventory as backup, someone will point the finger at you: why are we holding so much inventory? The world is at peace. We don't need that much.
Your CFO doesn't like your department? Great! You have now given them enough ammunition to fire the bullet in your direction.
Your inventory might also depreciate in price. How do you know your component won't drop in price in three months' time? The same goes for prices rising, but politics is more about protecting your downside, not your upside.
You want a contract that keeps supply stable for at least a year, backed by an accurate forecast of your sales. Which department is in charge of that forecast? Let me tell you a little secret: there is no such thing as a "forecast", only an educated guess.
You want a stable, guaranteed supply? Your manufacturer also has to safeguard its interests. You can't forecast your buying volume at 10M/year and then buy only 6M, leaving 4M of capacity that was allocated for you standing empty. If you don't fulfil those requirements, you have to pay a penalty. If you follow news around Apple, you might have heard of Apple paying Samsung such a penalty because their OLED panel orders did not meet the contract's minimum target.
In most circumstances, unless your vendor absolutely hates you, the penalty will be paid back to you as a deduction on your next large-volume order, i.e. you pay less for the same component next year. That way your manufacturer gets to keep you as a customer for a longer period.
None of this is specific to silicon or tech. Substitute the above with Walmart, Cargill, or any player from any industry; supply chain and logistics are pretty much the same everywhere.
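As a concrete (and entirely hypothetical) illustration of the penalty mechanics above, using the 10M/6M volumes from the example; the unit price and penalty rate are made up:

    # Take-or-pay sketch using the 10M-committed / 6M-purchased example above.
    # Unit price and penalty rate are invented placeholders.
    committed_units = 10_000_000
    purchased_units = 6_000_000
    unit_price_usd = 2.00
    penalty_per_missed_unit_usd = 0.50

    shortfall = committed_units - purchased_units
    penalty = shortfall * penalty_per_missed_unit_usd
    print(f"penalty owed: ${penalty:,.0f}")

    # Typical resolution per the comment above: the vendor credits the penalty
    # back against next year's large order, effectively discounting the part.
    next_year_units = 8_000_000
    effective_price = (next_year_units * unit_price_usd - penalty) / next_year_units
    print(f"effective price next year: ${effective_price:.2f}/unit vs ${unit_price_usd:.2f} list")

Both sides end up roughly whole, which is why the relationship usually survives a missed forecast.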
I broadly agree, especially with your description of internal politics.
But I also think it's important to admit that JIT has real benefits. It's not "just" a finance or paper-magic exercise. JIT is an approach to answering the question of "how much inventory do I hold at each stage of distribution?", where inventory held really IS money you can't spend elsewhere: money you can't spend on salaries, R&D, line expansion, or anything else.
The risk management part is where shit goes astray.
While big company politics play a role, IMO the main driver is that all of these companies have shifted their activities.
It used to be the case that these companies built stuff (this doesn't only apply to cars), charged for that stuff, and thus had an incentive to keep their IP well guarded. This era is long gone.
The manufacturing business has become a race to the bottom because this IP is no longer considered profitable. There’s a mantra in MBA parlance “externalize cost centers, internalize profit centers”. For the manufacturing sector this meant keeping the sales business, while externalizing engineering. This has gone to the point that even the subcontractors have undergone that process (that’s why there’s a “tiered” model). Nowadays most of the time Engineering is 4 subcontractors down. This also means that most manufacturers are buying from the same tech providers, and are sharing most of the IP (and their costs).
The only reason this makes sense is that car companies don’t sell cars anymore, their real business is loaning, because it turns out that manufacturing big things is an amazingly good way of generating free cashflows. Look at the earnings report of any carmaker (or GE for that matter), more than half of their earnings come from finance.
Yes. I thought everyone learned a thing or two during the pandemic, with toilet paper and flattening the curve.
Nope. No one cared. They just decided to make an excuse, or an enemy to blame and point fingers at.
No one questions why Apple didn't have the same problem, considering they are selling 200M+ iPhones (likely 220M+ this year), 20M Macs, and 50M iPads: together, 300M+ units of their own silicon.
No one questions why Apple doesn't have the same problem with "their" suppliers either, i.e. Broadcom with their WiFi, Qualcomm with their modems, Skyworks, USB4 controllers, etc.
Everyone thought TSMC was a 7-Eleven, or a Circle K, or whatever corner shop you have in your region, and that they somehow would have an unlimited amount of shelf space for "you" and for everyone else.
Supply chain and logistics management isn't hard. All it needs is some logic and common sense. But over the past 20 years, all I have seen is that common sense is rather uncommon.
>Supply chain and logistics management isn't hard. All it needs is some logic and common sense.
I find your comment very ironic, since I remember when Apple was notorious for logistical problems that led to shortages of new products. They are very strong in that respect now, and that's how they got to be a >$1T company.
That was because demand always exceeded supply, and Steve Jobs refused to give up on the JIT management system. After the iPhone 4 they finally gave up; some things just don't scale.
That is not to say Apple doesn't use JIT anymore. They still do, with supply guarantees.
This is exactly what's happening currently to an extreme. If you talk to people in the semiconductor supply chain you'll find that this exact effect regularly happens, but it is currently compounded by the lack of production from the beginning of 2020.
I thought it was also because they cancelled their reservations around March, when the outlook was quite pessimistic, and their allotted capacity was picked up by others?
The car industry canceled a lot of orders during the first virus wave in Q1 2020, and when they wanted to increase their orders in Q3 and Q4, their former capacity had already been sold. The German carmakers tried to regain capacity through government lobbying; Taiwan's answer was chip capacity in exchange for COVID vaccines, so it's doubtful that helps (https://www.taiwannews.com.tw/en/news/4114962).
Volkswagen and BMW have already wound down some production capacity and are using their chip stock primarily for high-margin models.
Are you really sure this is how it works, even for automotive electronics?
I am involved in the automotive industry, more on the tech side than the commercial one, but what I know is:
- car makers don't make their own electronics; I don't know of exceptions. What they normally do is design the architecture and the requirements, and buy from Tier 1 suppliers: Bosch, Continental, Vitesco, Valeo, Delphi, Magneti Marelli, etc.
- the contracts, when awarded, are multi-year and usually specify the units per year during the project's lifetime production and in serial life (after the product is no longer produced, the supplier MUST GUARANTEE the ability to provide the part)
- these contracts usually run to numbers like N million parts (ECUs) over a 4-5 year total (SoP date to EoP date) plus serial life for X years. Then they get detailed; even the plants where the parts are made are negotiated. Slips on both sides come with penalties. I don't know all the details.
- lots of the electronics are safety-relevant (ISO 26262); especially in this case, once the HW design is frozen, it's frozen. You cannot change silicon components easily, as that would force the product through a long series of product and design validations taking many months => the Tier 1 supplier MUST SECURE its own supply from chip makers.
- the usual "suspects" supplying the Tier 1s are chip makers like Infineon, Renesas, NXP, STM. Probably at some point TSMC comes into the picture. There are many providers of small electronic parts.
Yes, I’ve worked with a lot of these different tiers. The thing here is that these subcontractors are squeezed so hard on margins that lean manufacturing is the only way to go.
Most automotive electronics is built around COTS (components off the shelf), which means the fabs are a long way away. For instance, the manufacturers you've mentioned work with TSMC for some things, but those products are certainly not the automotive ones; most automotive still works on 45nm. NXP announced about 18 months ago that they had a design for an M7 processor on 28nm (the i.MX RT1170, M7 at 1GHz), and for most automotive vendors this would be considered too expensive (NXP markets it for the edge computing sector).
This is a far cry from the 7nm EUV processes that are slowing down the Ryzen 5000 rollout.
This past month I’ve had to redesign a board myself, because we were using a component from Texas Instruments targeted for the automotive sector (a DC/DC converter). Not only have all stocks vanished, some “big client” (around my area this means automotive) has already bought the whole stock (3 consecutive shipments) until Q3. And that’s from Mouser, a distributor that for a lot of these companies, is a last resort (like Digi-key, mouser provides parts and fast delivery, which of course means that you pay a premium).
This would be a perfect moment for China to invade Taiwan and disrupt TSMC. That would leave only China and Korea as the two main sources of modern semiconductors.