through Internet Information
by Yasuharu Dando
The Peak of Semiconductor Technology Is in Sight
(July 2002) (Japanese edition: 2002/03/28)
In mid-March, Intel announced that it had developed the world's first one-square-micron SRAM cell, proving the practical viability of its 90-nanometer (nm) circuit-building "process" technology. A CPU built with sub-100-nanometer technology (100 nanometers is one ten-millionth of a meter) will reach homes and offices next year as the next-generation successor to the current state-of-the-art Pentium 4. Twenty years ago, 100nm was thought to be the theoretical limit of process technology. Though that limit has been surpassed, we are now approaching the point where the true limits come into sight.
At this epoch-making time in the history of science and technology, Japan's semiconductor makers seem unmotivated. They are not even trying to prepare for the prediction made twenty years ago -- "When further advances cannot be made in hardware, software will become king." It leaves me feeling somewhat bitter that all I can do is confirm the chilly reality that Japan's hardware makers face today.
The Insulating Layer That Sets the Limit of High Performance
Last November, Intel announced a high-speed transistor that switches at 1.5 terahertz (THz). Moore's Law has ruled over advances in chip technology ever since Intel co-founder Gordon Moore famously observed that chip performance and logic density roughly double every 18 months. Keeping pace with this law, which has now held for decades, process technology is moving from 180-nanometer to 130-nanometer design, and will continue to shrink through 90nm, 65nm, and 45nm, finally reaching 30nm around the year 2010. Intel's announcement means that it has successfully tested a next-generation transistor with a 15-nanometer gate length, which is nothing less than the necessary next step toward achieving the 30nm process.
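The shrink schedule quoted above follows from simple arithmetic: doubling the density of transistors each generation means shrinking every linear dimension by a factor of 1/sqrt(2), roughly 0.7. A minimal Python sketch of that progression (the 180nm starting point and five generations come from the article; the node names the industry actually chose differ slightly from the raw numbers):

```python
# Sketch: why process nodes shrink by ~0.7x per generation.
# Halving the area a transistor occupies (a Moore's Law doubling of
# density) means each linear dimension shrinks by 1/sqrt(2) ~= 0.7.
import math

shrink = 1 / math.sqrt(2)        # ~0.707 linear shrink per generation
node = 180.0                     # starting node in nanometers
nodes = [node]
for _ in range(5):
    node *= shrink
    nodes.append(node)

print([round(n) for n in nodes])  # [180, 127, 90, 64, 45, 32]
```

The computed values land close to the article's 180 / 130 / 90 / 65 / 45 / 30 sequence, which is exactly why the industry's roadmap nodes take those names.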
Can this technology advance indefinitely? No, because when Intel announced that the 45nm process was within the realm of possibility, it also revealed that the insulating layer, or gate oxide, used in that process was only 0.8 nanometers thick.
0.8 nanometers is the thickness of just three atoms. Though it had been assumed that at least ten atomic layers would be needed, Bell Labs has shown that a silicon dioxide insulating layer can remain reliable at just six atomic layers, or 1.5 nanometers. Once the 30nm process is reached, the current silicon dioxide gate insulator will have to be abandoned, and a new material, for now known only as a "high-k gate dielectric" (a high-permittivity insulator), will have to be chosen. It had been thought that once a substance gets down to the level of ten atomic layers, it ceases to behave as matter in the traditional sense and begins to exhibit the characteristics of a wave; so it was assumed twenty years ago that ten atomic layers would be the physical limit for this type of insulating layer. In fact, I could hardly believe it when I read the announcement that the insulator could still keep electricity from leaking at just three atomic layers.
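The atomic-layer figures above can be checked with back-of-the-envelope arithmetic. Assuming, as the article's own numbers imply, that one atomic layer of silicon dioxide is about 0.25nm thick:

```python
# Sketch of the arithmetic behind the atomic-layer figures quoted above.
# The per-layer thickness is inferred from the article's own numbers
# (1.5 nm for six atomic layers), not from an independent source.
layer_nm = 1.5 / 6        # ~0.25 nm per atomic layer
print(layer_nm)           # 0.25
print(0.8 / layer_nm)     # 3.2 -- roughly three atomic layers
print(10 * layer_nm)      # 2.5 nm: the old "ten layer" assumed limit
```

So the 0.8nm oxide Intel described really does come down to about three atoms of material standing between working circuits and electrical leakage.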
The latest Pentium 4 is the 2.2-gigahertz (GHz) model, meaning its clock ticks 2.2 billion times per second. By integrating transistors that operate at terahertz speeds, 20GHz or even 200GHz performance should become attainable in 2007 and 2010. I would like to believe that today's form of semiconductor will find its outer limit somewhere around that point.
Manufacturing integrated circuits at this level will render the current method of using light waves to burn, or "write," circuit features useless. It is easy to see why: light waves are more than 100 nanometers long, so they cannot be used to write 30-nanometer circuits. Ultra-short-wavelength, 10nm-class "extreme ultraviolet" rays or electron beams will have to be used instead, and the entire process will have to take place in a vacuum, since these rays are absorbed by the atmosphere. Plants that can support this completely new fabrication process will require massive capital investment, and it is predicted that only a few manufacturers worldwide will be able to afford to build them.
But we won't be able to use these futuristic devices, which for now exist only at the limits of our imagination, for some time. Though researchers can successfully test single-electron transistors or quantum transistors one by one in a lab setting, they cannot yet be manufactured to specification in quantity, nor is the manufacturing theory completely established. For this reason, it is thought to be another ten or twenty years before transistors of this type can be mass-produced.
But I actually have doubts as to whether we really need CPUs so advanced that they perform at 20GHz or 200GHz. The only applications where current CPUs in the single-GHz range feel constraining involve detailed graphics processing or MPEG compression of motion pictures. Most ordinary software is already so fast that it spends most of its time waiting for the response of the human operating it.
Recently, I had the opportunity to test several companies' newspaper-publishing computers, so I can confirm that a single-gigahertz machine is powerful enough to accomplish the task fairly briskly. Newspaper production requires a variety of layout technology and can be a fairly heavy desktop-publishing workload, so I can see that a slightly more advanced CPU would make a difference. But the day may come before too long when ordinary computer users refuse to pay more for a CPU that is faster than they need. If that happens -- that is, if high-priced state-of-the-art computers are made but cannot be sold in any quantity -- then advances in product development will probably slow down as well.
Deficient Strategy of Japan's Makers Has Deprived Them of Competitive Strength
Japan's makers have left CPU development up to Intel, feeling it beyond their power to compete, and continue to wage war in ordinary dynamic random access memory, or DRAM, which requires constant refreshing to allow instant access to data. And what is the DRAM situation now? Countering the onslaught of Korean and Taiwanese makers in the late 90s by simply insisting, "Our technology is better," the Japanese makers lost their ability to compete. Today the Korean maker Samsung is number one, and the Japanese makers, mere shadows of their former selves, have disappeared from the top ranks in market share.
Toshiba withdrew from the ordinary DRAM business at the end of last year, selling its US plants and facilities to Micron Technology, the world's number-two-ranked US maker. Though the semiconductor industry in Japan has been working since 2001 on shifting from 100nm to 70nm process technology, the rest of the world's technological development will not come to a halt while they complete their plans. They need a second strategy.
The situation as of March this year is as Reuters reports: "The big five Japanese electronics makers will endeavor to develop highly integrated semiconductors with circuits just 100 nanometers wide that can be produced using existing technologies." This is "thanks to a 31.5 billion yen grant earmarked in the second supplementary budget approved in 2001 for what the government terms Next Generation Semiconductor Design & Manufacturing Technology Joint Research Facilities," though it has not been decided what will actually be done. What is consistent is the deficiency in strategic thinking. If the electronics makers could discern that things were nearly hopeless in the face of international competition, they should at least have had the foresight to decide quickly to concentrate ordinary DRAM production in a single company. Even now, the Japanese-style "yoko-narabi" mindset (copycatting, or cooperation among domestic rivals in the face of third-party competition) persists.
An article in the November 21, 2001 issue of Newsweek Japan, "How to Resuscitate the High Tech Kingdom," describes by way of introduction how Dutch manufacturers took over the stepper market (steppers are the machines used to "write" on chips), long the domain of the Japanese, and compares that story with Finland's Nokia, now the dominant maker of cell phones. "Ten years ago Finland had fallen into the worst depression of the postwar years, till it was whispered that the only Finnish companies that could turn a profit were the government-run liquor sales companies. If we remember that, we can see that Japan still has a chance," it notes, ending with the natural admonishment, "That is, if Japan can manage to forget the way it has done things up till now." The government's money is just a sop to a flailing industry if all five electronics makers try to build the same production capabilities, when it is more than apparent that it will be impossible to bring all of them up to the level of world-class competition.
No One Is Working on Software Development?!
Software should be ready and waiting in the wings for hardware development to take a breather. Kimihiro Ohta, a research coordinator with the National Institute of Advanced Industrial Science & Technology, an independent administrative institution under the aegis of METI, claims that there is an overwhelming lack of software researchers in this country, and insists that it is dangerous for Japan to be unable to produce software that is meaningful on an international basis. Other than game machine software, we have very few products that have any meaning to the rest of the world.
"According to statistics from the Ministry of Public Management, Home Affairs, Posts and Telecommunications, there are presently 25,311 researchers devoted to software development in Japanese industry." Adjusting for the number of research support staff, which is low compared to the rest of the world, and multiplying by Japan's share of citations of published papers -- the best indicator that work of significance is being done (Japan's share is 8.7%, against 50% for the US) -- yields "just 1,500 people who can actually be called serious researchers. This field, unlike others, has almost no international papers being presented," so real researchers who could be considered internationally competitive "are very limited in number, and it is entirely possible that there are far fewer than 1,000." Independent serious researchers are as rare as phantoms.
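Ohta's estimate can be roughly reconstructed from the figures the article gives. The support-staff adjustment factor is not stated anywhere, so the value computed below is inferred to make the published numbers meet, not sourced:

```python
# Rough reconstruction of Ohta's estimate, under stated assumptions.
# The article gives 25,311 industry software researchers and Japan's
# 8.7% citation share; the support-staff adjustment is not stated,
# so the factor below is merely what the 1,500 figure implies.
researchers = 25_311
citation_share = 0.087                 # Japan's share (US: 50%)
raw_estimate = researchers * citation_share
print(round(raw_estimate))             # 2202 before any adjustment

support_staff_factor = 1_500 / raw_estimate  # implied adjustment
print(round(support_staff_factor, 2))        # ~0.68
```

Even before the downward adjustment, the citation-weighted figure is only a few thousand, which is what makes the "fewer than 1,000 internationally competitive researchers" claim plausible.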
India has grown over the past fifteen years to produce 120,000 software engineers a year, more than the 75,000 cranked out every year in the US. In contrast to this, Japan is estimated to have only a few thousand.
Ohta continues, "In programming technology needed to produce software, it is necessary to learn a programming language. You must also grasp the mathematical logic structure that is the grammar of these languages, and must be capable of creating your own language as necessary. You need to be able to use these languages freely in order to write programs. And in addition to language know-how you also need imaging capabilities."
Isn't there something wrong with this country's education system? The "Data Basics" subject, introduced into the junior high school "Home & Technology" curriculum in 1993, propels young students into learning BASIC and drawing schematics, engendering a fear and dislike of computing. This fundamental mistake dominates computer education all the way through higher education.
To work closely with computers is a much richer occupation than just churning out code. These curricula seem to offer none of the generosity or elasticity that let one borrow the amazing power of the machine, taking a fluid image bursting with imagination and making it leap even further ahead. I don't think students of these curricula will be capable of creating superior software. In my own experience, programming languages can be learned as the need arises. Isn't that what life is all about -- coming up with ideas and images?
In 2003 a new subject known as "Information" will begin in the high schools. After reading the "Intensive Course Manual for Information," sociologist Kazuo Nomura offered this critique: "As the latest information sciences reflect recent social trends and continue to expand culturally, we have come to regard information science as an academic field; but the curriculum currently proposed for high school Information does not reflect these new information sciences, and looks rather as if pure computer science had won out. It's as if they had turned the clock back to before the Internet."
It is true that the topics lined up for discussion are all technical ones like those you might have seen at a data processing study group.
"There is no sense of that open mind we experienced, when our hearts and minds were moved, when we first encountered the Internet culture. What is there is all discussion of security, keeping everything inside. We have been told for some time that it is the age of Content, but though there is some discussion of 'information appliances' as 'vessels' for Content, there is no concrete study of Content itself."
The malignant growth pattern of junior high school "Data Basics" has morphed into senior high school "Information." The Ministry of Education, Culture, Sports, Science, and Technology has not recognized that junior high school "Data Basics" is a failure. The Ministry is completely ignoring the Internet and seems to think its recent advent is insignificant. In the field of Information Sciences above all, we need a liberal arts style education. We need teachers who have a free-ranging capability and superior cultural sense instead of teachers who have "finally" learned how to use a computer. But maybe this is more than we can hope for.
(Special thanks for the translation to PHP Institute Inc., "JAPAN CLOSE-UP," July 2002)