"Nanometer structures will foster a revolution in information technology hardware rivaling the microelectronics revolution begun 30 years ago." -- The National Science and Technology Council

We are entering the Nanotechnology Age, an epoch more significant than any preceding age identified by a single material such as stone, bronze, iron or silicon. Nanotechnology's ability to work at the molecular level changes our ability to use all materials. Almost ironically, this enabling technology will once more refocus our attention on information technology, including machine control and sensors. The reason: Nanotechnology is becoming a big part of the next generation of electronics and data storage, and a facilitator of new ways of doing computing, says Clayton Teague, chief of the Manufacturing Engineering Laboratory of the National Institute of Standards and Technology, Gaithersburg, Md.

That view is affirmed by R. Stanley Williams, HP Fellow and director of Hewlett-Packard Co.'s Quantum Science Research Labs in Palo Alto, Calif. While noting that the power efficiency of computing has improved by a factor of a billion from the ENIAC computer of the 1940s to today's handheld devices, Williams says that fundamental physics indicates it should be possible to compute another billion times more efficiently still. "That would put the power of all of today's computers in the palm of your hand," Williams says. "That says to me that the age of computing really hasn't even begun yet."

His assertions are supported by research in molecular electronics that made HP an IW Technology of the Year winner for 2002 (see Technologies Of The Year -- Molecular Electronics). HP's strategy is to reinvent the integrated circuit with molecular rather than semiconductor components.
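Williams' two factors of a billion are easiest to appreciate as back-of-envelope arithmetic. The sketch below is illustrative only: the ENIAC figures are commonly cited historical values, the handheld figures are rough assumptions, and the theoretical floor used is the Landauer limit (kT ln 2 per bit operation at room temperature), not any number supplied by HP.

```python
# Back-of-envelope check of the "billion times" efficiency claims (illustrative only).
import math

# ENIAC (commonly cited figures): ~5,000 additions/s at ~150 kW.
eniac_ops_per_joule = 5_000 / 150_000

# A 2003-era handheld processor (rough assumption): ~400 million ops/s at ~0.5 W.
handheld_ops_per_joule = 400e6 / 0.5

# Landauer limit at room temperature: minimum energy per bit operation is kT*ln(2).
k_boltzmann = 1.38e-23          # J/K
room_temp = 300                 # K
landauer_ops_per_joule = 1 / (k_boltzmann * room_temp * math.log(2))

print(f"ENIAC -> handheld improvement: {handheld_ops_per_joule / eniac_ops_per_joule:.1e}x")
print(f"Handheld -> theoretical headroom: {landauer_ops_per_joule / handheld_ops_per_joule:.1e}x")
# Both ratios land in the billion-to-trillion range, which is the spirit of
# Williams' statement that another factor of a billion remains available.
```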
IW's award recognized HP's demonstration of the highest-density electronically addressable memory reported to date. The laboratory demonstration circuit, a 64-bit memory using molecules as switches, occupied a square micron of space -- an area so tiny that more than 1,000 such circuits could fit on the end of a strand of human hair. The bit density of the device is more than 10 times greater than that of today's silicon memory chips. It combined, for the first time, both memory and logic using rewritable, non-volatile molecular-switch devices.

The lab fabricated the circuits using an advanced manufacturing system called nano-imprint lithography, essentially a printing method that allows an entire wafer of circuits to be stamped out quickly and inexpensively from a master. Thus the work at HP extends Moore's law, which postulates that the number of transistors on a chip doubles about every two years. Williams talks in terms of extending Moore's law by 50 years. HP's research also tackles Moore's lesser-known second law -- that the cost of a fabrication facility increases at an even greater rate.

The strategy behind HP's accumulation of patents is to have easily manufacturable products ready in five to 10 years, Williams says. He speculates that the first product could be a replacement for flash memory (a memory chip that holds its content without power) for use in high-density, portable devices.

"The question, 'When will we see the first complete nanoelectronic-based device?' is the basic competitive issue," says analyst Satish P. Nair at Frost & Sullivan's Technical Insights Division, New York. "Research indicates that the time-to-market for commercial applications of nanoelectronic-based devices is shrinking with the years."
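The headline numbers are easy to sanity-check. The short sketch below reruns the arithmetic; the hair diameter and the comparison density for 2002-era flash are rough assumptions for illustration, not figures from HP.

```python
# Rough arithmetic behind the HP memory-density claims (illustrative assumptions only).
import math

bits_per_circuit = 64
circuit_area_um2 = 1.0                      # the demo: 64 bits in one square micron

# End of a human hair, assuming ~80 um diameter (hair varies roughly 50-100 um).
hair_diameter_um = 80
hair_end_area_um2 = math.pi * (hair_diameter_um / 2) ** 2
print(f"Circuits on a hair end: {hair_end_area_um2 / circuit_area_um2:,.0f}")   # ~5,000

# Density in bits per square inch (1 inch = 25,400 um).
um2_per_in2 = 25_400 ** 2
hp_bits_per_in2 = bits_per_circuit / circuit_area_um2 * um2_per_in2
flash_bits_per_in2 = 2e9                    # assumed ballpark for 2002-era flash chips
print(f"HP demo density: {hp_bits_per_in2:.1e} bits/in^2, "
      f"~{hp_bits_per_in2 / flash_bits_per_in2:.0f}x the assumed flash density")
```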
IBM's Millipede Race

One example of the shrinking time frame is IBM Corp.'s Millipede project, research pursuing the commercial development of a nanoscale-inspired data storage system. The concept combines ultrahigh storage density, terabit capacity, small size and a high data rate. By 2005 IBM could be producing -- possibly with its data storage partner Hitachi Global Storage Technologies -- postage-stamp-sized memory cards, each of which could hold several feature films or possibly an entire CD collection, says Tom Albrecht, manager of micro and nano-mechanics at IBM's Zurich research lab.

Millipede's inspiration comes from scanning tunneling microscopy techniques developed at IBM in 1981. In this data storage adaptation, the microscope's probe functions as a surface-modification tool that creates small marks to store the data. The tip is heated to create tiny depressions in a polymer film, and the same probe reads the data back. Since the data rate per probe is low, IBM's approach is to use an array of probes to be competitive with other technologies, Albrecht explains. He says the technique is capable of achieving data densities as high as one terabit per square inch, well beyond the limits for magnetic recording (60 to 70 gigabits per square inch).

Millipede is transitioning from a research project to a product development phase, but Albrecht admits final plans have yet to be made on the commercialization question. "We have the majority of the components of a working system functioning, but we haven't yet operated the full system together to create a fully functional storage device. We're getting close."

Albrecht sees two possible "homes" for commercialization. "One might be in IBM's microelectronics organization, but since we recently exited the data storage components business, the other possibility might be bringing this to market through external partnerships. The former storage technology division of IBM is now part of a joint venture with Hitachi." Albrecht says the remaining possibility would be licensing the intellectual property.

IBM is still deliberating on how to architect the whole device and which sector of the overall storage marketplace to enter. "We've studied possibilities such as the hard disk drive segment, but we're closer to creating a device that is very much like solid-state flash memory. We're particularly interested in creating a replacement for solid-state flash storage technology." The primary differentiators: higher storage capacity and a lower price. For example, Albrecht predicts that in 2006, when solid-state flash memory will cost 5 to 10 cents per megabyte, the Millipede product will be 50% cheaper. He says performance characteristics such as data rates and access time are quite similar to those of solid-state flash memory. That could put Millipede into a wide variety of portable electronic products such as digital cameras -- both still and video -- and MP3 digital audio players.

Albrecht says the potential of Millipede could eventually bring storage densities to the point where individual bits occupy spaces at the atomic scale. "We're not close to that yet in what we may be introducing in early products. IBM has a way to go before it exploits the full nanoscale limits of Millipede technology."
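A postage-stamp-sized card at Millipede's projected density works out roughly as follows. The stamp dimensions, film size and the flash price point plugged in are assumptions chosen for illustration, not IBM figures, and early products would presumably ship at well below the full density, as Albrecht notes above.

```python
# Illustrative capacity and cost arithmetic for a Millipede-class card (assumptions noted).
density_bits_per_in2 = 1e12          # the "one terabit per square inch" figure cited
stamp_area_in2 = 1.0                 # assume a roughly 1 in x 1 in postage stamp

capacity_bits = density_bits_per_in2 * stamp_area_in2
capacity_mb = capacity_bits / 8 / 1e6
print(f"Capacity: ~{capacity_mb / 1000:.0f} GB per card")          # ~125 GB at full density

movie_mb = 1_500                     # assume ~1.5 GB for a compressed feature film
print(f"Feature films per card: ~{capacity_mb / movie_mb:.0f}")

# Cost comparison against Albrecht's 2006 flash projection of 5-10 cents per megabyte.
flash_cost_per_mb = 0.05             # low end of the projection, in dollars
millipede_cost_per_mb = flash_cost_per_mb * 0.5    # "50% cheaper"
print(f"Cost of a full-density card at that rate: ~${capacity_mb * millipede_cost_per_mb:,.0f}")
# The large result for a full-density card is one reason first products would be far smaller.
```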
Nanotechnology In Manufacturing

Nanoelectronics is beginning to pose a new level of opportunities and challenges to the industrial world. All relate to economic survival in a world where tomorrow's technology is challenging yesterday's presumptions about products and processes. For vendors such as HP and IBM, the nanoscience response began with developing and maintaining an R&D investment strategy for what is becoming a major disruptive technology. For other firms, the potential for major product disruptions poses a serious question: how to adapt product and process research to thrive in the age of nanoelectronics. Some firms, such as General Electric Co. and Rockwell Automation, are revitalizing corporate R&D efforts with a commitment to longer-range thinking to accommodate what promises to be a radically different industrial future.

"The challenge is to study the potential of nanoelectronics to affect customer needs, the product solutions we offer, and how we, ourselves, manufacture them," says Sujeet Chand, CTO and vice president of advanced technology, Rockwell Automation, Milwaukee. "From our perspective, we're especially interested in studying what nanoelectronics will offer for the factory floor. We see nanotechnology as opening new doors for us. It's a growth opportunity."

Chand describes Rockwell Automation's research strategy as a three-tier model. "The highest tier, basic research, is done at our basic research laboratory -- now an independent company called Rockwell Scientific Co. Once called Rockwell Science Center, it groups about 200 Ph.D.s in such specialties as materials science, electronics, information technology and optics. Their focus embraces technology that is five to 10 years from commercialization." Chand initiated Rockwell Automation's nanoscience research with that group. The company's nano team partners include other research laboratories and universities such as Northwestern, the University of Pennsylvania, Stanford and MIT, adds Chand.

Chand heads the second tier, the internal Advanced Technology Group at Rockwell Automation. "At this level we focus on technology that has commercial potential in the three- to five-year time frame." The third tier, product development, is a division-level process. The development focus: product revisions and new features scheduled for one to two years in the future.

With respect to the factory floor, Chand relates nanoelectronics to the trend toward greater distribution of control. "We're pushing electronics such as actuators and sensors lower and lower into devices on the factory floor. It has been happening for some time.

"It's routine today to see smart devices that are connected to a network and for control to happen at a very granular level on the factory floor. With nanoelectronics impacting sensor design, that trend will go faster and reach lower levels."

How low? In one scenario Chand speculates how far nanoscale sensing might reach in a hypothetical automobile paint shop: "Presume the challenge is measuring the quality of the paint that's being deposited by robots in a closed cell . . . ." He says a nanoscience solution might combine nanoelectronic sensors capable of determining quality levels by interacting with nanoparticles incorporated in the paint film. "You'd have a closed-loop paint control process that would assure quality levels unobtainable today. In addition, the nanoparticles might be designed to act as catalysts."
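Chand's paint-shop scenario is, at heart, ordinary closed-loop control with a much finer-grained sensor. The sketch below is purely hypothetical: the sensor model, quality index, and controller gain are invented for illustration and do not describe any Rockwell Automation product.

```python
# Hypothetical closed-loop paint-quality control, illustrating Chand's scenario.
# The "nanosensor" is simulated; all names and numbers are invented for illustration.
import random

TARGET_QUALITY = 0.95      # desired film-quality index reported by the simulated nanosensor
GAIN = 0.5                 # proportional controller gain (illustrative)

def read_nanosensor(flow_rate: float) -> float:
    """Simulate a quality index that rises with paint flow, plus measurement noise."""
    return min(1.0, 0.6 + 0.4 * flow_rate) + random.gauss(0, 0.01)

flow_rate = 0.5            # normalized paint flow commanded to the robot
for cycle in range(10):
    quality = read_nanosensor(flow_rate)
    error = TARGET_QUALITY - quality
    flow_rate += GAIN * error          # close the loop: nudge flow toward the quality target
    print(f"cycle {cycle}: quality={quality:.3f}, flow={flow_rate:.3f}")
```

In a few cycles the commanded flow settles where the measured quality tracks the target, which is the "closed loop paint control process" Chand describes, just with toy numbers.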
Smart Machines

Industrial interest in evolving machine intelligence could be another opportunity for the future application of nanoelectronics, suggests Jay Lee, director of the Center for Intelligent Maintenance Systems at the University of Wisconsin-Milwaukee. Lee's center is an industry research consortium that is developing new approaches to maintenance monitoring of production equipment; the University of Michigan, Ann Arbor, is a partner. The goal is to equip production equipment with sensing systems that can anticipate the earliest beginnings of anything that could lead to machine downtime. The concept is intended to predict and prevent equipment failure. Lee says the developing field of nanoelectronics could offer cost-effective improvements in configuring the embedded intelligence systems required. Industry membership in the research cooperative includes about 40 manufacturers such as Rockwell Automation, General Motors Corp., Intel Corp., Harley-Davidson Inc. and Hitachi.

Lee predicts that the eventual convergence of nanoelectronics with intelligent maintenance systems will be typical of the future for both industrial components and consumer products. "For example, industrial pumps will routinely contain embedded computational intelligence. The lower cost of such embedded systems will gradually win out over today's add-on approach."

Nanoelectronic-based sensors will be a critical benefit for machine tools as builders continue to add diagnostic capabilities, adds Paul Warndorf, vice president of technology, the Association for Manufacturing Technology, McLean, Va. "Eventually nanoelectronics will add a kind of cognition -- giving machine tools the ability to sense and adapt to changes that could impact tolerances and quality. Customers are beginning to ask for machines that, in effect, understand their tasks, enabling them to compensate when necessary." He also sees the nano-based approach leading to multifunctional sensors that will change the economics of utilization via greater capability at less cost. "Intelligence will increase at the sensor level."
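At its simplest, the kind of embedded intelligence Lee and Warndorf describe reduces to watching a sensor signal for drift before it crosses a failure threshold. The sketch below is a generic, hypothetical illustration of that idea; the vibration data, threshold and trend test are invented and are not taken from the Center for Intelligent Maintenance Systems.

```python
# Hypothetical trend-based health monitor, illustrating the predictive-maintenance idea.
# All data and thresholds below are invented for illustration.
from statistics import mean

FAILURE_THRESHOLD = 5.0    # vibration level (arbitrary units) assumed to precede downtime
WINDOW = 5                 # number of recent readings used to estimate the trend

def readings_until_failure(history: list[float]) -> float | None:
    """Extrapolate the recent linear trend and estimate readings left before the threshold."""
    if len(history) < WINDOW:
        return None
    recent = history[-WINDOW:]
    slope = (recent[-1] - recent[0]) / (WINDOW - 1)   # crude per-reading trend
    if slope <= 0:
        return None                                   # signal is not degrading
    return (FAILURE_THRESHOLD - mean(recent)) / slope

# Simulated vibration readings from a slowly degrading pump bearing.
vibration = [1.0, 1.1, 1.0, 1.2, 1.4, 1.7, 2.1, 2.6, 3.2]
estimate = readings_until_failure(vibration)
if estimate is not None and estimate < 10:
    print(f"Maintenance alert: ~{estimate:.0f} readings until threshold")
```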
Nano's Engineering Challenge

Don't conclude that nanoelectronics is only a technology challenge, albeit a profound one, warns Uzi Landman, director of the Center for Computational Materials Science and a professor of physics at the Georgia Institute of Technology, Atlanta. In Landman's view, organizations planning to secure a nanoelectronic edge will have to rethink the engineering function, especially the training and education of engineers assigned to nanoscience projects. "They will be less the people that follow schematics and more the people that actually participate in the act of discovery. And there will be a new continuum from the abstract physicist to the engineering people who are actually going to build devices."

He says one of the biggest problems in all of nanotechnology is not knowing how the material is going to behave. "All of the concepts have to be revisited again. Engineering success with nanotechnology will hinge on understanding the critical difference between capacity and capability." Landman defines capacity as the ability to do more per unit of time or to do more for less money; a computer with large capacity can grind more numbers. "Capability is different. In a computer it signifies being able to do completely new things. With nanoelectronics, capability may improve -- not by a factor of two, but possibly by a factor of 2,000!

"The most difficult thing for the engineering function is to ponder the question of what to do. How to do it is a different, more traditional question for the engineering function. The how is easier to solve than the what. For example, when you try to connect nanoelectronics to the macro world -- it's at the point of contact that things can go wrong. We may know how to make the connection, but we don't know what to expect when we do. The connection can actually overwhelm the nanoscale device. It's an exciting new voyage of discovery."

Landman says companies that want to leverage the future with nanotechnology need engineers schooled in quantum mechanics. "They will be basically doing applied physics and chemistry while they pursue materials science." He says the trend is already observable in the special course he teaches on the physics of small systems. "More than half the class are engineers. They understand the new fundamental -- that a different background is needed for working with nanotechnology. Things have to be approached differently. It's not only the facts; the thinking has to be different. For example, virtually no rethinking is required for the design of a very large integrated circuit these days. It's known. Few questions have to be asked.

"In contrast, with nanoscale devices, the engineer needs to rethink the whole problem.

"Expect a prolonged period without standardization of processes. Science and technology will become completely intermingled in this new engineering infrastructure."

Nanoscience is not a case of applying existing know-how, Landman stresses. It is a voyage of discovery, case by case, product by product. Engineers will have the triumph of discovery and the responsibility of translating that knowledge into a manufactured device.