And The 45-nm Winner Isn't . . . Intel
If you've been following the <a href="http://www.information.com/news/showArticle.jhtml?articleID=201200377">quad-core wars</a>, you know that Intel has repeatedly beaten AMD over the head with the news that it (Intel) will be first to market with 45-nm processors. What you didn't know is that both Intel and AMD have been beaten to the 45-nm punch by two companies, one of which you probably wouldn't suspect (the other one is an easy guess).
In anointing the "winner" of the race to 45-nm, Intel might say I'm splitting hairs, since the chip giant says it has already demonstrated its first 45-nm device in-house. That would be the processor code-named Penryn, which is scheduled to ship before the end of this year. AMD won't ship 45-nm processors until sometime in 2008.
So let me define my rules of engagement: I'm not talking demo, I'm talking shipment. In that regard, the winners are Panasonic and IBM.
Panasonic, the Japanese consumer-electronics giant, started making a 45-nm video codec (coder/decoder) chip, called UniPhier, for use in high-definition video displays, back in June.
The IBM effort is a little more arcane, involving 45-nm ASICs, or application-specific integrated circuits. This is a catch-all covering chips which do "other stuff"--i.e., radio frequency, analog, supporting functions--basically, non-PC-processor circuitry. The same (or a very similar) 45-nm process is also being offered by a consortium of IBM, Singapore's Chartered Semiconductor, and Samsung, which goes by the name Common Platform.
Lest you think that the Panasonic and IBM efforts are a diversion from the more "mainstream" work of Intel, think again. All chip makers of note are tilting toward 45 nm. A case in point: Qualcomm, an important maker of communications chips which power many of the world's cell phones, recently put out a press release crowing about the "tape out"--the design is done, it's ready to go to the manufacturing fab--of their first 45-nm chip.
They didn't specify what chip, but since we're talking Qualcomm, it's probably part of a CDMA chipset for 3G handsets, which is what they do.
By now you're probably asking, why is 45-nm important? Basically, because it enables lower-power operation and allows one to pack more transistors onto each chip. As Intel explains it in more technical terms, 45-nm offers:
- Twice the density of its predecessor 65-nm process (i.e., you can pack double the number of transistors into the same silicon real estate).
- A 30-percent reduction in power (in Intel's case), which stems from the fact that lower switching voltages are used in 45-nm chips.
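Here's a quick back-of-envelope sketch, in Python, of where those two numbers come from. The scaling math is the standard rule of thumb; the supply voltages are illustrative guesses on my part, not Intel's actual figures.

```python
# Back-of-envelope check of the two claims above (illustrative numbers only).

FEATURE_OLD_NM = 65.0   # previous process node
FEATURE_NEW_NM = 45.0   # new process node

# Density scales roughly with the inverse square of the linear feature size,
# so shrinking from 65 nm to 45 nm packs about (65/45)^2 ~ 2x the transistors
# into the same silicon real estate.
density_gain = (FEATURE_OLD_NM / FEATURE_NEW_NM) ** 2
print(f"Density gain: ~{density_gain:.1f}x")            # ~2.1x

# Dynamic (switching) power goes roughly as C * V^2 * f, so it is very
# sensitive to supply voltage.  The voltages below are made-up examples:
# a ~17% drop in V alone yields roughly 30% less switching power.
V_OLD, V_NEW = 1.2, 1.0   # volts -- hypothetical, not vendor specs
power_ratio = (V_NEW / V_OLD) ** 2
print(f"Dynamic power: ~{power_ratio:.0%} of the 65-nm level "
      f"(a {1 - power_ratio:.0%} reduction)")
```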
The aforementioned effort by Qualcomm is significant because, when you think about it, 45-nm is even more important for wireless systems than it is for PCs, since its low-power advantages are crucial for handsets. (In fairness, 45-nm is also a key enabling technology for the Ultra-Mobile PCs, or handheld Web browsers, envisioned by Intel. For its part, AMD is aiming at similar handhelds with its Imageon chip line.)
Texas Instruments is also pursuing 45-nm.
Really Arcane Stuff
There's one further point that's important to make. Most mainstream coverage of chip technology, and certainly the majority of the stuff spouted by the vendors themselves, talks about 45-nm as if it's a wonder without any issues or challenges. Unfortunately, that's not true. The ability to even use 45-nm comes after a hard-fought battle to come up with new and exotic materials, which have temporarily allowed chip manufacturers to sidestep the big problem posed by their embrace of ever-smaller chip geometries.
That would be, leakage.
Leakage in nanometer-scale chips is really hard to define in a way that's comprehensible to the average person (which makes one wonder how well defined it really is, period). Notice that even an article which talks about leakage as a matter of course--this one--as if everyone understood it, can't offer a succinct definition.
Wikipedia does about as good a job as I've found. Roughly speaking, leakage means charge (electrons) going where it's not supposed to. The problem gets worse as chip features get finer, which means that an issue which wasn't a big deal in 130-nm and 90-nm processors becomes very important indeed at 45-nm.
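To put a rough number on the trend, here's an illustrative Python sketch of how off-state subthreshold leakage balloons as threshold voltages come down with each shrink. The 90-mV-per-decade swing is a typical ballpark, and the per-node threshold voltages are made up for illustration, not measured data for any real process.

```python
# Rough illustration of why leakage explodes as geometries shrink.
# Off-state subthreshold leakage grows roughly as 10^(-Vth / S), where S is
# the subthreshold swing (~80-100 mV per decade of current in real devices).
# The threshold voltages below are hypothetical, ballpark figures per node.
SWING_MV_PER_DECADE = 90.0

node_vth_mv = {           # hypothetical threshold voltages, in millivolts
    "130 nm": 400.0,
    "90 nm": 350.0,
    "65 nm": 300.0,
    "45 nm": 250.0,
}

baseline = 10 ** (-node_vth_mv["130 nm"] / SWING_MV_PER_DECADE)
for node, vth in node_vth_mv.items():
    leakage = 10 ** (-vth / SWING_MV_PER_DECADE)
    print(f"{node}: ~{leakage / baseline:.0f}x the 130-nm leakage per transistor")
```

The exact figures don't matter; the exponential trend is the point--each step down in threshold voltage multiplies the off-state leakage current.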
Leakage is part of a whole grab-bag of stuff which relates to "fundamental limits of physics" issues. These are looming limits on semiconductor production, which chip designers worried a lot about in the mid-1990s, when it seemed like semiconductor manufacturing techniques would get so fine that you'd be talking about atomic-level geometries.
"Atomic level" means just what it says; wires or electrical conduction paths which are no thicker than a single atom. I don't know much about modern physics, but I do know enough to understand that, when you're down at the atomic level, it gets impossible to reliable predict or model behavior.
Concerns about fundamental physics got pushed to the back burner once a bunch of funky materials (high-K dielectrics and silicon-on-insulator) were discovered which made 65-nm, 45-nm, and smaller processes viable. Now it's no longer a "sky is falling" problem, where chip manufacturing as a whole is screwed. Rather, the issue has become focused around leakage, which is a good thing, since it means there's one central problem to attack.
However, don't be fooled: The issues raised by fundamental physics will rear their heads again, and in the not-too-distant future.