Re: The CHIPS Act

Fred Spinner

Many ASIC designs could also be done on an FPGA first, further speeding up design and verification. The prototype could be a power-hungry but otherwise exact logical equivalent of the final ASIC product.

Fred W0FMS 

On Wed, Jun 29, 2022, 11:39 AM Mark KB0US wrote:
Once upon a time in the 1980s when chips were made out of stone (a little silicon humor), the U.S. Department of Defense (DoD) cared about one thing -- CPU speed. That was where the intellectual property was and where there was an advantage in warfighting and space (aka Star Wars). Memory was less important (except from a radiation-hardening perspective) and support chips weren't very important at all. Standard ICs mattered only to the extent you could get them in Mil-Std packaging. And so there was a lot of money available for CPU fabs in the U.S., along with pressure to keep the IP in the U.S.

And then along came ASICs (Application-Specific ICs), pioneered by TSMC in Taiwan under a model where you would design the chip at a "design center" near you and TSMC would make it at their "foundry" (i.e., fab). ASICs allowed faster time to market than a full-custom IC design, reduced size (something built from standard parts could be collapsed into a single part), reasonable speed, and reasonable cost. None of this was the least bit interesting to the DoD. ASIC design is basically tying together standard "blocks" (such as a NAND gate or D flip-flop) without having to worry about how they wind up being implemented in silicon. Over time, these "blocks" have become quite complex and more general purpose.
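The "tying together standard blocks" idea can be sketched in software. Here is a minimal, hypothetical Python illustration (not from the original posts, and not any real cell library or EDA flow): each block is modeled behaviorally, and the designer only connects outputs to inputs, never touching the silicon-level implementation.

```python
def nand(a: bool, b: bool) -> bool:
    """Behavioral model of a NAND-gate standard cell."""
    return not (a and b)

class DFlipFlop:
    """Behavioral model of a D flip-flop: captures D on each clock tick."""
    def __init__(self):
        self.q = False  # output starts low at "power-on"

    def tick(self, d: bool) -> bool:
        # On the clock edge, the stored value becomes the input D.
        self.q = d
        return self.q

# "Netlist" step: wire the NAND output into the flip-flop input,
# the way an ASIC design ties cell outputs to cell inputs.
ff = DFlipFlop()
out = ff.tick(nand(True, True))  # NAND(1,1) = 0, latched on the clock edge
print(out)  # False
```

The point of the sketch is the division of labor: the designer composes pre-verified blocks, while how `nand` or `DFlipFlop` is actually realized in transistors is the foundry's problem.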

As time moved along, as it usually does, TSMC (and others) offered more services and built more complex fabs. DoD spending ebbed and flowed, and chips became fast enough that the DoD began to move to COTS (commercial off-the-shelf) parts instead of building custom ones where commercial products already existed. Having fabs offshore had a lot of benefits: environmental requirements weren't as strict, labor was cheaper, and there were a lot of talented engineers available at low cost. And as these fabs became obsolete, you could build a new one wherever the economics were most favorable.

And then there's the Internet, which makes it irrelevant whether your foundry is in the next building or in Asia. Files are moved instantly, and having a team spread across continents became normal instead of "are you absolutely insane?"

So here we are with a system optimized for economics (as capitalism tends to do) but with weaknesses that are only exposed by some sort of global upheaval. This has always been the case, but it used to involve raw materials such as iron, water, gold, and other commodities. Now it's access to technology.

I don't have a strong opinion about the CHIPS Act other than to say we've essentially been down this road once before with the early DoD CPU spending. Just as before, we're at risk of the newest technologies going offshore because of the economics. And there will be unintended consequences that we can't predict.

By the way, semiconductors aren't the only thing to worry about. Electric vehicles depend on batteries from China (and Taiwan), as do all of our laptops and cell phones. A huge number of drugs are manufactured in China, and a disruption in their production won't be easily remedied either.
