Friday, August 29, 2008

Plenty of room at the bottom

We do have plenty of room at the bottom; however, just a few years after Feynman's vision was published, J. Morton of Bell Laboratories noticed what he called the tyranny of large systems. This tyranny arises from the fact that scaling is, in general, not part of the laws of nature. … Feynman was only interested in fundamental limitations. The exponential growth of silicon technology with respect to the number of transistors on a chip seems to prove Feynman right, at least up to now.

How can this be if the original transistors were not scalable? How could one always find a modification that permitted further scaling? One of the reasons for the continued miniaturization of silicon technology is that its basic idea is very flexible: use solids instead of vacuum tubes. The high density of solids permits us to create very small structures without hitting the atomic limit. Gas molecules or electrons in tubes have a much lower density than electrons or atoms in solids typically have: about 10^18 atoms in a cm^3 of gas but 10^23 in a cm^3 of a solid. Can one therefore go to sizes that would contain only a few hundred atoms with current silicon technology? No; the reason is that current technology is based on the doping of silicon with donors and acceptors to create electron- and hole-inversion layers. The doping densities are much lower than the densities of atoms in a solid, usually below 10^20 per cm^3. Therefore, to go to the ultimate limits of atomic size, a new type of transistor, without doping, is needed.

But even if we have such transistors, can they be interconnected? Interestingly enough, interconnection problems have always been overcome in the past, because the third dimension has been used for interconnects. Chip designers have used the third dimension not to overcome the limitations that two dimensions place on the number of transistors, but to overcome the limitations that two dimensions present for interconnecting the transistors. There is an increasing number of stacked metal interconnect layers on chips: two, five, eight. How many can we have? One can still improve the conductivity of the metals in use, for example with copper technology. … Feynman suggested that there will be plenty of room at the bottom only when the third dimension is used. Can we also use it to improve the packing density of transistors? The current technology is based on a silicon surface that contains patterns of doping atoms and is topped by silicon dioxide. To use the third dimension, a generalization of the technology is needed: another layer of silicon on top of the silicon dioxide, and so forth. Such technology does already exist: silicon-on-insulator (SOI) technology. SOI devices may be scalable further than current devices and may open the horizon to the use of the third dimension. … No doubt it is business income that will determine the limitations of scaling to a large extent.
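A quick back-of-the-envelope check makes the doping argument concrete. The sketch below (in Python, with the order-of-magnitude densities quoted above and an illustrative 10 nm cube size chosen by me, not taken from the text) counts how many silicon atoms and how many dopant atoms such a cube contains:

    # Rough estimate: atoms vs. dopants in a 10 nm silicon cube.
    # Densities are the order-of-magnitude figures quoted above;
    # treat the result as an illustration, not a device calculation.

    ATOM_DENSITY = 5e22      # silicon atoms per cm^3 (~10^23, as above)
    DOPANT_DENSITY = 1e20    # upper end of typical doping, per cm^3

    edge_nm = 10.0
    edge_cm = edge_nm * 1e-7            # 1 nm = 1e-7 cm
    volume_cm3 = edge_cm ** 3           # volume of the cube

    atoms = ATOM_DENSITY * volume_cm3
    dopants = DOPANT_DENSITY * volume_cm3

    print(f"{edge_nm:.0f} nm cube: ~{atoms:.0e} Si atoms, ~{dopants:.0f} dopant atoms")
    # -> roughly 5e4 silicon atoms but only ~100 dopant atoms, which is why
    #    doped-silicon transistors cannot be shrunk to the atomic limit.

A 10 nm cube still holds tens of thousands of silicon atoms, but only on the order of a hundred dopants, so the statistics of the dopants, not of the atoms, set the practical size limit.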

Transistors of the current technology have been developed and adjusted to accommodate the tyranny from the top, in particular the demands set by the von Neumann architecture of conventional computers. It is therefore not surprising that new devices are always looked at with suspicion by design engineers and are always found wanting with respect to some tyrannical requirement. Many regard it as extremely unlikely that a completely new device will be used for silicon chip technology. Therefore, architectures that deviate from von Neumann's principles have received increasing attention. These architectures invariably involve some form of parallelism. Switching and storage are not localized to a single transistor or small circuit. The devices are connected to each other, and their collective interactions are the basis for computation. It has been shown that such collective interactions can perform some tasks in ways much superior to von Neumann processing.
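As a toy illustration of what "collective interactions as the basis for computation" can mean (my own hypothetical example, not a device discussed in the text), a Hopfield-style associative memory recalls a stored pattern through the joint dynamics of many coupled binary units rather than through any single switching element:

    import numpy as np

    # Toy Hopfield-style associative memory: pattern recall emerges from the
    # collective interaction of all units, not from any single one.

    rng = np.random.default_rng(0)
    N = 64                                        # number of binary units (+1/-1)
    patterns = rng.choice([-1, 1], size=(3, N))   # three stored patterns

    # Hebbian coupling matrix: every unit is connected to every other unit.
    W = sum(np.outer(p, p) for p in patterns) / N
    np.fill_diagonal(W, 0)

    # Start from a corrupted version of the first pattern (20% of bits flipped).
    state = patterns[0].copy()
    flip = rng.choice(N, size=N // 5, replace=False)
    state[flip] *= -1

    # Asynchronous updates: each unit aligns with the field produced by all others.
    for _ in range(5 * N):
        i = rng.integers(N)
        state[i] = 1 if W[i] @ state >= 0 else -1

    overlap = (state @ patterns[0]) / N
    print(f"overlap with stored pattern after relaxation: {overlap:.2f}")  # ~1.0

No single unit "knows" the answer; the stored pattern is recovered by the network as a whole, which is the flavour of non-von Neumann, parallel computation the paragraph refers to.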

Unifying science based on the unifying features of nature at the nanoscale provides a new foundation for knowledge, innovation, and integration of technology. Revolutionary and synergistic advances at the interfaces between previously separated fields of science, engineering, and areas of relevance are poised to create nano-bio-info-cogno (NBIC) transforming tools, products, and services. There is a longitudinal process of convergence and divergence in major areas of science and engineering. For example, the convergence of sciences at the macroscale was proposed during the Renaissance, and it was followed by narrow disciplinary specialization in science and engineering in the 18th to 20th centuries. The convergence at the nanoscale reached its strength in about 2000, and one may estimate a divergence of nanosystem architectures in the following decades. Fundamental changes envisioned through nanotechnology have required a long-term R&D vision. A two-decade timescale was planned for transitioning from the focus on passive nanostructures in 2001 to 2005 to molecular active nanosystems after 2015.

We provided detailed technical input for two hearings in Congress, in the Subcommittee on Basic Science, Committee on Science, US House of Representatives (June 22, 1999), and in the Senate, and support was received from both parties. We had the attention of Neal Lane, the presidential science advisor, and Tom Kalil, economic assistant to the president. The preparatory materials included a full 200-page benchmarking report, 10 pages of research directions, and a 1-page summary of immediate goals. After the hearing in the House, Nick Smith, the chair of the first public hearing in preparation for the NNI, said, "Now we have sufficient information to aggressively pursue nanotechnology funding." Rick Smalley came and testified despite his illness. In November 1999, the OMB recommended nanotechnology as the only new R&D initiative for FY 2001. The National Nanotechnology Initiative was prepared with the same rigor as a science project, between 1997 and 2000; we prepared a long-term vision of research and development (Roco et al. 2000), and we completed an international benchmarking of nanotechnology in academe, government, and industry (Siegel et al. 1999). Other milestones included a plan for the US government investment (NSTC 2000), a brochure explaining nanotechnology for the public (NSTC 1999), and a report on the societal implications of nanoscience and nanotechnology (Roco and Bainbridge, 2001).

In the first year (2001) of National Nanotechnology Initiative approval, the six agencies of the NNI invested about $490 million. In Fiscal Years 2002 and 2003, the NNI grew significantly, from 6 to 16 departments and agencies. The presidential announcement of the NNI, with its vision and program, partially motivated or stimulated the international community. The average annual rate of increase of the NNI budgets was over 35%, including congressionally directed funding, in the first five years (FYs 2001-2005). The total R&D investment in FYs 2001-2006 was over $5 billion, increasing from an annual budget of $270 million in 2000 to $1.3 billion, including congressionally directed projects, in 2006. An important outcome is the formation of an interdisciplinary nanotechnology community with about 50,000 contributors; 60 large R&D centres, networks, and user facilities have been established since 2000. This expanding industry consists of more than 1,500 companies with nanotechnology products with a value exceeding $40 billion, at an annual rate of growth estimated at about 25%. With such growth and complexity, participation of a coalition of academic organisations, industry, businesses, civil organisations, government, and NGOs in nanotechnology development becomes essential as an alternative to the centralized approach. The role of government continues in basic research and education, but its emphasis is changing, while the private sector becomes increasingly dominant in funding nanotechnology applications.

Only since 1981 have we been able to measure the size of a cluster of atoms on a surface (IBM, Zurich) and begun to provide better models for self-organisation and self-assembly in chemistry and biology. Ten years later, in 1991, we were able to move atoms on surfaces (IBM, Almaden), and after ten more years, in 2002, we assembled molecules by physically positioning the component atoms. Yet we cannot visualize or model with proper spatial and temporal accuracy a chosen domain of engineering or biological relevance at the nanoscale. Nanoscale phenomena hold the promise of fundamentally new applications. Possible examples include chemical manufacturing using designed molecular assemblies, processing of information using photons or electron spin, detection of chemicals or bioagents using only a few molecules, detection and treatment of chronic illnesses by subcellular interventions, regenerating tissue and nerves, enhancing learning and other cognitive processes by understanding the society of neurons, and cleaning contaminated soils with designed nanoparticles.

Using input from industry and academic experts in the US, Asia-Pacific countries, and Europe between 1997 and 1999, we projected that $1 trillion in products incorporating nanotechnology and about 2 million jobs worldwide would be affected by nanotechnology by 2015 (Roco and Bainbridge, 2001). Extrapolating from information technology, where for every worker another 2.5 jobs are created in related areas, nanotechnology has the potential to create 7 million jobs overall by 2015 in the global market.
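The 7 million figure follows directly from the multiplier quoted above; a one-line check (using only the 2 million direct jobs and the 2.5 multiplier from the projection) looks like this:

    # Worked check of the job projection quoted above.
    direct_jobs = 2_000_000   # workers directly affected by nanotechnology (projection)
    multiplier = 2.5          # additional jobs created per direct worker (IT analogy)

    total_jobs = direct_jobs * (1 + multiplier)
    print(f"total jobs by 2015: {total_jobs:,.0f}")   # 7,000,000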

New governance approach

…Just as nanotechnology is changing how we think about the unity of matter at the nanoscale and about manufacturing, it is also changing how we think about the management of the research enterprise.

This switch can be seen as the specialization of scientific disciplines migrates toward more unifying concepts for scientific research and system integration in engineering and technology.

Most of the major US science and technology programmes in the 20th century, such as space exploration and the energy and environmental programmes, were pulled primarily by external factors. The economy, natural resources, national security, and international agreements and justifications have initiated top-down R&D funding decisions. In contrast, nanotechnology development was initially pushed by fundamental knowledge (nanoscience and nanoengineering) and by the long-term promise of its transformative power. For this reason, the preparation and governance of nanotechnology have been handled differently: research policies have been motivated by a long-term vision rather than by short-term economic and political decisions.

Brenner et al. (2007), Handbook of Nanoscience, Engineering, and Technology, Taylor & Francis



…In addition to structural information, local chemical composition, bonding, and electronic states are key aspects of nanostructures. A range of spectroscopic techniques is now available for probing local structures on the atomic scale. Among these, electron energy-loss spectroscopy and energy-dispersive X-ray analysis have emerged as the most powerful and widely used. These techniques probe, respectively, the energy loss of electrons passing through a thin specimen and the X-rays generated by the incident electron beam. In many applications of nanotechnology the measurement of the magnetic and electrical properties of individual nanoscale objects is becoming increasingly important. Relatively recent instrumental developments have enabled the technique of electron holography, first proposed by Gabor in 1947, to be used to obtain such measurements. …As nanostructures become smaller, their three-dimensional shape assumes much greater importance. This is particularly the case for supported catalysts, where active sites may be determined by such features. Although electron tomography had previously been applied successfully to biological structures, it has only recently been extended to applications in nanoscience.
John Hutchison et al., Nanocharacterisation, RSC, 2007

One of the characteristic features of nanosystems is their high surface-to-volume ratio. Their electronic and magnetic properties are often distinguished by quantum mechanical behaviour. Their mechanical and thermal properties can be formulated within the framework of the classical statistical mechanics of small systems as presented by Hill, who in 1964 introduced the subject of the thermodynamics of small systems, covering the chemical thermodynamics of mixtures, colloidal particles, polymers, and macromolecules. Carnot's study of the efficiency of heat engines, in 1824, established that the efficiency of a heat engine depends only on the temperatures of its heat source and heat sink and not on the working substance. Clapeyron developed the relationship between vapour pressure and an unknown function of the empirical temperature scale. Gibbs's memoir "On the Equilibrium of Heterogeneous Substances" was published in 1875. Thomson formulated the first and second laws of thermodynamics in 1851.
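To see why the surface-to-volume ratio dominates at the nanoscale, a simple geometric sketch helps (the particle sizes and the ~0.3 nm atomic diameter below are illustrative assumptions of mine, not values from the text): it estimates what fraction of a spherical particle's atoms sit in the outermost atomic shell.

    import math

    # Illustrative estimate of the fraction of atoms at the surface of a
    # spherical particle, assuming an atomic diameter of ~0.3 nm.

    ATOM_DIAMETER_NM = 0.3

    def surface_fraction(particle_diameter_nm: float) -> float:
        """Approximate fraction of atoms lying in the outermost atomic shell."""
        r = particle_diameter_nm / 2
        shell = ATOM_DIAMETER_NM                      # thickness of the surface shell
        total = (4 / 3) * math.pi * r ** 3
        core = (4 / 3) * math.pi * max(r - shell, 0) ** 3
        return (total - core) / total

    for d in (2, 10, 100, 1000):                      # particle diameters in nm
        print(f"{d:5d} nm particle: ~{surface_fraction(d) * 100:5.1f}% of atoms at the surface")
    # A 2 nm particle is roughly two-thirds surface; at 1 micron the fraction
    # has fallen to about 0.2%, which is why surface effects dominate only
    # at the nanoscale.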