
> I studied microelectronics. I am aware of the technical challenges. Can you explain those challenges that are primarily non-technical?

We spent far more time buying EDA software, installing it, talking to foundries, getting the PDKs, signing NDAs, dealing with buggy EDA software, dealing with slow EDA response times, etc. than actually working on our chip.

>Things like Silicon-on-insulator, high-k dielectrics, finfets, extreme ultraviolet lithography are not innovative or new ideas?

I'm not saying they aren't. But the general level of openness, and with it innovation and open-mindedness, has dropped dramatically in the past decade or so. To be fair, the semi industry proper has stayed generally innovative; much of my criticism is directed at the rest of the electronics industry. That said, progress still moves at a glacial pace.

Example of a real conversation I had with an engineer at one of the major (can't name the exact one) foundries about a device that's actually pretty close to reality:

Me: "Why don't you use this X device?"

Him: "Because it's still research"

Me: "Sure, but it's very promising, why aren't there at least any industrial research efforts to commercialize it?"

Him: "Because it's still research"

Me: -__-

SOI is innovative, but it's been held back by cost and the self-heating effect, neither of which is really that much of a problem.

FinFETs were launched by a DARPA initiative.

High-k dielectrics I will say are the single most interesting (if not innovative) innovation in the last decade in the semi industry, although I have some bias there.

EUV is a feat of engineering, no doubt, but again, my grievances aren't really focused in that area.




> Try to make a chip, you'll quickly realize that your challenges will be primarily non technical in nature.

I studied microelectronics. I am aware of the technical challenges. Can you explain those challenges that are primarily non-technical?

> It wasn't always this way, the semi industry was once incredibly innovative and open to new ideas, and that helped drive exponential progress.

Things like Silicon-on-insulator, high-k dielectrics, finfets, extreme ultraviolet lithography are not innovative or new ideas?


>Secondly, having worked for semi: there's a lot of conservative force holding back development. We could have had current tech with much less worries than we have now if they didn't respond so allergically to everything that looks a little exotic in the CMOS process, like high-k dielectrics.

Such as? Getting new materials into manufacturing is only the last step. Before that there has to be a significant benefit of doing it. And yes, that is usually benchmarked against risks and capex.


> So, as someone who used to be an engineer in semiconductor manufacturing facilities, this reminds me a lot of Sematech.

However, the prior initiative was VHSIC and that basically produced the modern semiconductor industry (including the software tools!) in the US.

I was always under the impression that the difference between VHSIC and Sematech was simply that VHSIC threw around a LOT more money.


>I suspect knowing the fundamentals gets you pretty far,

Not much further than the eighties, and you are not getting anything done with seventies-era ICs.

I'd say that after the seventies, most academic research became almost wholly separated from the fab floor, and of course fab owners and equipment makers keep their valuable secrets.

At TSMC, for example, the EUV workgroup is a very select cohort of people with 10+ years at the company who have posted security bonds, been screened for loyalty, and signed an omerta-style NDA.


> I really doubt that the EDA software is really that much of an investment these days

If EDA tools were so simple, why don't the big IC design companies -- e.g., Intel, Apple, Samsung -- just build their own in-house tools instead of paying millions in license fees?

> It's the lithography tech that's the real crown jewel, and it is being protected.

Sure, but lithography tech is useless unless you have access to solid tooling for design verification. In IC design, no one can afford to debug issues after the chip is fabricated. EDA toolchains include complex simulation software that, when combined with fab-provided models, allows you to simulate circuit behavior down to the lowest level.


> Semiconductor engineering is complex, but probably not so complex that your average Math Olympiad couldn't pick it up in a month or two.

We have had a chip shortage going on for months now. If all it took were two months and a bunch of smart people, those billions of dollars in chip orders would have made it happen. That tells me it has to be a bit more complex than you think.


> Can you back this up with any links?

Before I became a developer, I worked QA in chip fabs (in both Northern and Southern California) from the '90s to the '00s. The writing was on the wall, so to speak.

Do a casual google search.

https://www.marketplace.org/2021/04/21/shortage-of-semicondu...

https://en.wikipedia.org/wiki/List_of_semiconductor_fabricat...

https://www.businessinsider.com/why-us-doesnt-make-chips-sem...

https://www.bloomberg.com/graphics/2021-chip-production-why-...

et al.


>Once you see a modern fab with your own eyes, it will change you in a deep way. I felt a sense of compassion for this incredibly complex and valuable thing that humanity is just barely able to scrape together. Nothing you see in media can prepare you for the real thing.

I interned at Micron as an industrial engineer looking at capacity and equipment purchases. I can very much relate to that. The whole summer my mind was blown by the orders of magnitude involved across the board, and by the fact that it's somehow an economically viable process.

First, we take this giant silicon crystal taller than a human and cut it into wafers. Then we take these pure materials (like 99.99999% pure) and transfer a tiny bit onto the wafers in a successive layering process. Oh no, the deposition process wasn't perfectly even across the whole wafer (because of annoying laws of physics), so we'll throw it into the chemical mechanical planarization process to skim off a tiny layer and keep the internal mechanical stresses down. Add in other mechanical, etching, lithography, and measurement processes and it gets crazy.

Wafers go through hundreds of manufacturing steps depending entirely on purpose, and with each step there's a possibility of messing up part or all of the wafer.

99% yield on a per-machine basis is sub-par in most manufacturing environments, but even achieving that would be devastating in semiconductors. For the sake of demonstration:

0.99^100 (manufacturing steps) = 36.6% total process yield.

0.995^200 = 36.7% total process yield.

0.9975^400 = 36.7% total process yield.

Each additional 9 on yield is really expensive to add.
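The compounding above is easy to sanity-check with a few lines of Python (the per-step yields and step counts are just the illustrative numbers from the example, not real fab data):

```python
# Total process yield: the per-step yield compounded over every step.
def total_yield(per_step: float, steps: int) -> float:
    return per_step ** steps

# Even excellent per-step yields collapse over hundreds of steps.
print(f"{total_yield(0.99, 100):.1%}")    # ~36.6%
print(f"{total_yield(0.995, 200):.1%}")   # ~36.7%
print(f"{total_yield(0.9975, 400):.1%}")  # ~36.7%
```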

In school we talked about 1/1000th of an inch being kinda tight to hold on a CNC mill, with 1/10,000th needing a lot more specialized processing and time. Suddenly I was hearing about nanometer thick layers with tolerances measured in angstroms.

And the capital expenditure was nuts! Six figures hardly gets you anything in a fab; it's really in the 7-8 figure range where you see most of your equipment landing. And that will be old but still viable equipment in a few years.

Somehow depreciating all that equipment designed to make chips sold at pennies to dollars each is profitable.

As far as I'm concerned, it's black magic and truly an incredible achievement for humanity.


> Semiconductor industry is extremely fragile.

Not sure that's true. After all, it doesn't break down all the time, does it?


>The semiconductor guys don’t know much about computers

And you'd expect the opposite, huh? Can anyone comment on how accurate this statement is?

Hopefully one day I'll finally get to the point of being able to design my own circuits.


> BayBal, since you're an expert in Semi, I wonder if you can fill in the blanks?

An expert? Ahahh, I never even had a formal education in the field; I've just been trying to enter it, and to start studies in it, for a few years.

My only real experience with ICs was with a company developing a fancy synchronous rectifier chip that was capable of doing a few more tricks with the output waveform besides rectification, and that was mostly just hanging around and doing complete trivialities like routing, or minimal layout wiggling. I was more useful there as a coffee porter.

> How much do you estimate the full chip manufacturing cost for this would be ?

I don't know how many wafers they buy. I don't know whether they ordered masks from TSMC, or somebody else. I don't know how short they want lead times to be. I don't know if they want to have any device inspection provided. I don't know if they have any agreements on repeated runs, or a flexible capacity purchase. I don't know if they order test, and packaging from TSMC.

From a man who was on Allwinner's original A10 chip team, I heard that the most bare-bones 65nm run, without mask cost, inspection, or packaging, was possible at 1k wafers at $2,400-$2,500 in 2013-2014, by paying cash a year in advance.

I'm not even sure whether clients are allowed to, or can, order masks on the side these days for the latest processes.

The universal advice I heard is that you don't get into 300mm game without at least $10m, or better $20m if you have a brand new, untested design.


> Please don't assume the OP is male or not solving technical challenges. She's a YC founder recently profiled in a technical magazine (PCB Design - see: http://www.iconnect007.com/index.php/article/100587/).

1) Press coverage does not automatically imply either vision or competency. See: Theranos.

2) Do not assume that I do not have the bona fides to critique properly just because I am willing to have my rudeness pointed out. I've probably been fighting with EDA tools longer than she has been alive. And I ran a group which had a budget of more than $70 million for EDA tools.

I assure you I have very strong opinions about EDA tools that are backed up by very hard won experience.

And one of the many of those opinions is that not once did any engineer with experience ever say "Gee, we really need somebody else to manage our parts or footprints." TO A PERSON they all said "We'll do it. Anybody else will screw it up."

So, quite certainly, I have enough vision of the field to deem it "not a technical challenge". You can feel free to disagree, but you're going to have to bring some experience or data to convince me.

And, in case you think I'm being unfair, I can assure you that I ripped several layers of upper managers at Altium a new one for buying Octopart instead of fixing their tool.


> Forgive me for not holding my breath.

You're the one who chose to click on the link to enter this discussion. You know that the Mill isn't in silicon yet and you're personally only interested in things that are, so why are you here? You're just trolling while other people are trying to have a productive academic discussion.


> what's so different in that coming from goo chemistry instead of electric bits?

Very insightful comment reply. If you don't mind, I'll respond with an analogy: what's so different about just building a bigger rocket to move more stuff into space at one time? You can't, because you hit a hard limit in the relationship between fuel weight and lift. Extrapolating this analogy to the context of your comment: sure, in a theoretical sandbox environment that could happen with the silicon circuits and transistors we have today, but I find that unlikely.

Silicon-based circuits, and the current model and understanding we have of machines and the way they should be built, are, from a first-principles level, wasteful and thermally inefficient. Modeling the human brain with them would require a sum of resources that surpasses our ability to house, power, construct, and source the precursor materials for. Given our current technology and the parameters involved, it's just not going to happen for us.

Maybe tomorrow something will get invented that will push Moore's law even further into the dirt, but I just don't see that happening in this generation or lifetime.


> Assembling a motherboard is to semiconductor fabs as flying a toy drone is to landing a Boeing 747.

I disagree. I live next to a fab and have coworkers that have worked the fabs.

Humans in fabs are taken out of the loop as much as possible. That's because when dealing with nanoscale structures, human error is simply too common.

One of my coworkers worked at the fab during a period when they had humans running the forklift that moved the wafers from one stage to the next. That was cut out because the tiny bumps caused by a human operating the controls caused imperfections in the chips that decreased yield (the metric that matters most for a fab). They ultimately eliminated that job and replaced it with robots that carefully move the wafers.

What's complex about a fab ends up being not the frontline work, but rather the layer or two in the back (like designing the lithography masks for a given chip). That stuff happens outside the actual plant.


> It's probably possible, but expensive as always.

Frankly, I doubt anyone but a rich eccentric would have the money to get into SoC design in their garage. My university acquired the resources to run a semiconductor class in the form of some antique (70s vintage) silicon lab equipment from Micron. Even though this stuff was brand new before I was born, Micron charged them in the neighborhood of $200k for each device. That's six machines for a total of $1.2 million. And that doesn't even encompass the whole cost of what you'd need to get a proper semiconductor development environment - you'd still need a cleanroom, electrical and gas hookups, chemical certification (you need all sorts of nasty stuff to make semiconductors - HF in particular is quite common in fabrication, and has horrifying effects on exposed skin when mishandled), etc.

Yeah, sorry - with all respect, I don't think anyone's gonna be starting a semiconductor company in their garage any time soon.


> Once you have the desired circuit, you don’t have to build it out of discrete components, you also can send it to a fab

You are still going to use a very old and obsolete process, compared to the microcontroller.

As a rule of thumb, every generation of lithography that has made transistors smaller and more efficient has also roughly doubled the NRE costs. As you move down the feature-size slope, you get all kinds of useful properties, but the tradeoff is that you have to manufacture more of any given design for it to make any economic sense. To the point where you can get an amazing chip that has an ARM core, storage, and memory in a single package that costs pennies (well, not right now it doesn't, but it did in the past and will again) and uses almost no power, so long as you can use the exact same device that is also shipped in the millions for other things too.
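A toy amortization sketch makes that tradeoff concrete. All dollar figures and volumes below are invented for illustration; only the shape matters, namely that doubling NRE pays off only when volume scales with it:

```python
# Unit cost = marginal die cost + NRE (masks, tooling, design) amortized
# over the production volume. All numbers here are hypothetical.
def unit_cost(nre: float, die_cost: float, volume: int) -> float:
    return die_cost + nre / volume

for volume in (10_000, 100_000, 10_000_000):
    old = unit_cost(nre=500_000, die_cost=1.00, volume=volume)    # mature node
    new = unit_cost(nre=4_000_000, die_cost=0.40, volume=volume)  # newer node
    print(f"{volume:>10,} units: old ${old:.2f}  new ${new:.2f}")
# At small volumes the mature node wins; only at millions of units does
# the cheaper die on the newer node overcome its much larger NRE.
```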


Quote:

Impact:

----

The chip will redefine the standards of Chinese industry by setting totally new quality requirements in the future.

Risks:

Of course Atmel or Infineon might come and make trouble because our chip "looks too much like theirs", but there is no patent or the like backing such claims. We're less expensive and more open. In short: we are better.

And $20,000 to get a run of custom chips done, by someone with apparently no previous connections to this sort of thing? Sounds to me like someone who thinks this is a neat idea (it is!) but doesn't really know what they are doing.


> I do not think the people who designed the chip were amateurs.

How do you define amateurs?

Most chip designers in the US will work on fewer than 10 designs in their lifetimes.

In Asia, a chip outsourcing shop pumps out that many in a month.
