Saturday, January 24, 2009

Atomistic visualization of deformation in gold

A mechanical force acting on a solid causes deformation and fracture. These two processes are closely related to several fields of technology and have been studied for a long time by scientists and engineers. In particular, the elucidation of the deformation process and its mechanism has been a fundamental subject in solid-state physics and metallurgy. Various types of deformation mechanisms have been proposed from mechanical tests and structure analyses, for example, dislocation mechanisms [1-3], twinning [4-6], and grain boundary sliding [7,8]. All of these mechanisms have been explained using atomistic models. According to these models, deformation proceeds through generation, multiplication, growth, and annihilation of line- and plane-type internal lattice defects, such as dislocations, stacking faults, and twins. Detailed studies have been performed for the elucidation of the relations between the deformation processes and the mechanisms. Several types of transmission electron microscopy (TEM) have played a significant role in this elucidation. For example, the structures of dislocations were investigated by conventional static TEM (Ref. 9), and it is known that their behaviors can be analyzed by conventional dynamic TEM using cinema photography [10,11]. In particular, the behaviors were directly observed by in situ deformation and conventional dynamic TEM [12,13]. The electron-irradiation-induced motions of twins in gold thin films [14] and gold clusters [15], and of surfaces and stacking faults in cadmium telluride [16,17], were also observed at an atomic level by dynamic high-resolution TEM (DHRTEM) using television camera and video tape recording systems. The deformation process, however, has not been directly observed in real space on an atomic scale, and the elemental atomic processes in deformation have still not been clarified. A different type of microscopy is required in order to investigate the atomic processes: DHRTEM using a television camera and video tape recording system with a piezodriving specimen holder is expected to be the optimum method to analyze the atomic process of deformation [18,19]. The purpose of the present study is to elucidate the atomic processes of mechanical deformation in gold by direct atomistic visualization by DHRTEM.

EXPERIMENTAL PROCEDURES

A piezodriving specimen holder for a transmission electron microscope was developed for subnanometer-scale mechanical deformation and in situ observations. Figure 1 is an illustration of the specimen holder. The mobile side is connected with a pipe-type piezoelectric device for fine displacement and a microscrew motor for coarse displacement. The specimen on the mobile side is mounted on the tip of a lever connected with the piezodevice. The mobile side is displaced along the x direction from 0 to 1 mm by the motor. The fine displacement along the x direction is controlled by homogeneous elongation and shrinkage of the piezodevice. The fine displacements along the y and z directions are controlled by elongation and shrinkage on one side of the pipe. The resolution of the fine displacement by the piezodevice is less than 0.16 nm along the x direction and 0.22 nm along the y and z directions. Piezodriving methods are used for the displacement of scanning needles in several combination-type microscopes of reflection electron microscopy and scanning tunneling microscopy (STM), or TEM and STM.

Physical Review B, Vol. 57, No. 18, 1 May 1998

Tuesday, January 20, 2009

The Lincoln Inaugural Bible

Oxford is playing a key role in a historical event today – as President-Elect Barack Obama takes the oath of office on a Bible published by Oxford University Press.

In addition to this, a number of Oxford University alumni will be playing key roles in President Obama’s new team.

Today (20 January 2009), President-Elect Obama will swear on the Lincoln Inaugural Bible – the same one upon which Abraham Lincoln swore March 4, 1861, to uphold the Constitution.

The 1,280-page Bible was published in 1853 by the Oxford University Press, and was originally purchased by William Thomas Carroll, Clerk of the Supreme Court.

The Bible itself is bound in burgundy velvet with a gold-washed white metal rim around the three outside edges of both covers. Its edges are all heavily gilt. In the centre of the top cover is a shield of gold wash over white metal with the words "Holy Bible" set into it. The book is 15 cm long, 10 cm wide, and 4.5 cm deep when closed.

In the back of the volume, along with the seal of the Supreme Court, it is annotated: ‘I, William Thomas Carroll, clerk of the said court, do hereby certify that the preceding copy of the Holy Bible is that upon which the Hon. R. B. Taney, Chief Justice of the said Court, administered to His Excellency, Abraham Lincoln, the oath of office as President of the United States ...'

Today’s inauguration is even more historic than Lincoln’s remarkable swearing-in, almost 150 years ago. On that occasion, the oath of office was administered by Chief Justice Roger Brooke Taney, then 84 years old. As the author of the infamous "Dred Scott" decision of 1857, which held in part that Congress did not have the power to abolish slavery in the territories, Taney was no friend to Lincoln or the cause of emancipation. In the Inaugural Address which followed, President Lincoln appealed to his countrymen to follow ‘the better angels of our nature.’

In addition to the OUP-published Bible playing such a key role, there are also a number of Oxonians in President Obama’s team. Dr Susan Rice (New College, 1986), who will be the Ambassador to the UN, has an MPhil and DPhil in International Relations from Oxford. Michèle Flournoy (Balliol College, 1983), Under Secretary of Defense (Policy), also holds an MLitt in International Relations. And Retired Admiral Dennis C Blair (Worcester College, 1968), Director of National Intelligence, has a BA in Modern History and Modern Languages (Russian). Finally Oxford Professor Diana Liverman was appointed to the new committee on ‘America’s Climate Choices’ convened by the US National Academies at the request of Congress, to advise the Government on responses to climate change.

http://www.ox.ac.uk/media/news_stories/2009/090120_2.html

Sunday, January 18, 2009

Feynman's Path Integral Formulation

The path integral formulation of quantum mechanics is a description of quantum theory which generalizes the action principle of classical mechanics. It replaces the classical notion of a single, unique trajectory for a system with a sum, or functional integral, over an infinity of possible trajectories to compute a quantum amplitude.

The path integral formulation was developed in 1948 by Richard Feynman. Some preliminaries were worked out earlier, in the course of his doctoral thesis work with John Archibald Wheeler.

This formulation has proved crucial to the subsequent development of theoretical physics, since it provided the basis for the grand synthesis of the 1970s which unified quantum field theory with statistical mechanics. If we realize that the Schrödinger equation is essentially a diffusion equation with an imaginary diffusion constant, then the path integral is a method for summing up all possible random walks.

wikipedia.com
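
A minimal numerical sketch of that "sum over random walks" picture (my own illustration, not from the article above): the script averages over many discrete random walks and compares the result with the Gaussian propagator of the ordinary diffusion equation. All parameter values are arbitrary choices for the demo.

# Minimal sketch (Python): the diffusion-equation analogue of the path integral.
# Each walker is one "path"; averaging over many walks reproduces the Gaussian
# propagator of the diffusion equation. All parameter values are illustrative.
import math
import random

D = 0.5           # diffusion constant (arbitrary units)
t = 1.0           # total diffusion "time"
n_steps = 100     # steps per walk
n_walks = 20000   # number of random walks ("paths") to average over

dt = t / n_steps
step = math.sqrt(2.0 * D * dt)   # step length chosen so that <x^2> = 2 D t

def endpoint():
    """Endpoint of one unbiased random walk starting at x = 0."""
    x = 0.0
    for _ in range(n_steps):
        x += step if random.random() < 0.5 else -step
    return x

# Monte Carlo estimate of the probability of ending up in the interval [0.9, 1.1].
hits = sum(1 for _ in range(n_walks) if 0.9 <= endpoint() <= 1.1)
mc_prob = hits / n_walks

# Diffusion propagator P(x, t) = exp(-x^2 / (4 D t)) / sqrt(4 pi D t),
# integrated approximately over the same small interval of width 0.2.
x0, dx = 1.0, 0.2
exact = math.exp(-x0 ** 2 / (4 * D * t)) / math.sqrt(4 * math.pi * D * t) * dx

print(f"random-walk estimate: {mc_prob:.4f}   diffusion propagator: {exact:.4f}")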


Caging Schrödinger's Cat - Quantum Nanotechnology, 3 Dec 2008

Weird new possibilities emerge as we explore the nanoworld, the universe at the length scale of a billionth of a metre. Here the theory of quantum mechanics bewilders our everyday common sense, as Erwin Schrödinger famously expressed when he imagined a cat that was both dead and alive at the same time! Now Dr Simon Benjamin shows us how experts in physics, chemistry and materials science are working together to harness this strange reality. Underlying their research is the promise of building what may be the most exotic and powerful technology ever conceived: the quantum computer.

Simon Benjamin, Oxford University
http://media.podcasts.ox.ac.uk/mat/nanotechnology/quantum4-medium-video.mp4?CAMEFROM=podcastsGET


Poisson Bracket

In mathematics and classical mechanics, the Poisson bracket is an important operator in Hamiltonian mechanics, playing a central role in the definition of the time-evolution of a dynamical system in the Hamiltonian formulation. In a more general setting, the Poisson bracket is used to define a Poisson algebra, of which the Poisson manifolds are a special case. These are all named in honour of Siméon-Denis Poisson.

Wikipedia

Gold nanoparticles in glass

Spherical gold particles, 25 nm, in glass reflect red
Spherical gold particles, 50 nm, in glass reflect green
Spherical gold particles, 100 nm, in glass reflect orange
Spherical silver particles, 100 nm, in glass reflect yellow
Spherical silver particles, 40 nm, in glass reflect blue
Prism-shaped silver particles, 100 nm, in glass reflect red


http://mrsec.wisc.edu/Edetc/SlideShow/slides/quantum_dot/stained_glass.html


Measuring the glass transition temperature of glasses

In contrast to the viscosity, the thermal expansion, heat capacity, and many other properties of inorganic glasses show a relatively sudden change at the glass transition temperature. This effect is used for measurement by differential scanning calorimetry (DSC) and dilatometry.

The viscosity at the glass transition temperature depends on the sample preparation (especially the cooling curve), the heating or cooling curve during measurement, and the chemical composition.[4] In general, the glass transition temperature is close to the annealing point of glasses at 10^13 poise = 10^12 Pa·s. For dilatometric measurements heating rates of 3-5 K/min are common, for DSC measurements 10 K/min, considering that the heating rate during measurement should equal the cooling rate during sample preparation.

www.Wikipedia.com

O. V. Mazurin, Yu. V. Gankin: "Glass transition temperature: problems of measurements and analysis of the existing data"; Proceedings, International Congress on Glass, July 1-6, 2007, Strasbourg, France.

Friday, January 16, 2009

Important medieval records to go online

The most important unpublished records of the Hundred Years War, the Gascon Rolls, will be made available to academic researchers and the general public, thanks to a project led by Oxford.

Academics from Oxford are collaborating with the University of Liverpool and King’s College London on the initiative and have been funded almost three-quarters of a million pounds by the Arts and Humanities Research Council. The National Archives and The Ranulf Higden Society are also co-operating in the project.

The Hundred Years War is a significant era of history, which ended after a massive defeat of an English army by the French on the battlefield at Castillon, near Bordeaux. This ended three hundred years of English rule in southwest France and marked the end of England's standing as a continental European land power.

Dr Malcolm Vale, of St John’s College, said: 'The history of the old enmity between England and France today still arouses interest and, in some quarters, passion. Its origins lay in the Middle Ages, and some parts of the story have not yet been fully told. One phase of the conflict - now known as the Hundred Years War (1337-1453) - was provoked and fuelled by English claims to hold overseas territories, particularly the duchy of Aquitaine.

‘This research project aims to make available the most important unpublished documentary source for that war, its prelude, course and aftermath so we can arrive at a better understanding of how and why relations between the two countries deteriorated, leading to a century-long conflict. Its consequences have resonances even today - in, for example, the Joan of Arc story and the mythologies, which have grown up around it on both sides of the Channel. This project will make an important contribution to international scholarship and to the history of a region of France with which British connections have always been close.'

There are 113 unpublished manuscripts, covering the years 1317 to 1468, which are currently held in the National Archives in London. They contain copies of letters, grants and many other documents, mostly written in Latin, and will be published as English summaries in online and printed form. The work of the project will be highly innovative, producing a resource which will include online indices, a search function and the facility to view both images and text within a highly sophisticated and interconnected framework. The project is expected to take three years to complete.

Dr Vale is the project’s director, and Paul Booth, of the University of Liverpool, co-director. They will work with two post-doctoral researchers, from Oxford and Liverpool, to read, translate, and summarise the entries on the rolls. The Centre for Computing in the Humanities at King's College London will develop the technical framework.

Finally, The Ranulf Higden Society, a group of experienced, independent researchers, will produce a full edition (text and translation) of the roll for 1337-38, which covers the outbreak of the Hundred Years War.

www.ox.ac.uk/media

Diagrammatic Monte Carlo Method

In an exciton, the electron and the hole are bound together by an electric attraction, known as the Coulomb force, in a fashion very similar to that of the electron and the proton in a hydrogen atom. The presence of the host lattice and its thermal and magnetic excitations, which consist of phonons and magnons respectively and are known collectively as the ‘bosonic’ field, can affect the excitons considerably.

The researchers, including Andrei Mishchenko from the RIKEN Advanced Science Institute in Wako, aimed to develop a technique to study the excitons’ interaction with phonons in an exact way. In particular, they focused on taking into consideration the fact that phonons do not act instantaneously as occurs in the Coulomb attraction. “Previously, the only way to treat the exchange [between electrons and holes] by bosons was an instantaneous approximation, where the influence of particle–boson interaction was included into the model by renormalization of the instantaneous coupling,” explains Mishchenko.

Mishchenko and colleagues’ technique is known as a Diagrammatic Monte Carlo Method and is based on the diagrams that the Nobel laureate Richard Feynman introduced to quantum field theory. The method per se existed already and was normally used with all variables expressed as a function of spatial coordinates. This, however, limits the size of the area that can be examined in a calculation. The team therefore formulated the algorithm for momentum space. This provides the “possibility to overcome the limitation of the direct space method [for] finite systems and handle the problem [in] a macroscopic system,” says Mishchenko.

Like any new theoretical method, the team’s numerical technique must be compared with known scenarios to verify its validity, so Mishchenko and colleagues used it to study excitons with different values for the electron and hole masses. They found very good agreement with previous theories within the limit in which it is reasonable to neglect any retardation effect. Importantly however, the results show that in standard conditions it is incorrect to neglect the retardation.

As Mishchenko explains: “Our ‘free-from-approximations’ results show that the domain of validity of the instantaneous approximation is very limited.”

*Burovski, E., Fehske, H. & Mishchenko, A.S. Exact Treatment of Exciton-Polaron Formation by Diagrammatic Monte Carlo Simulations. Physical Review Letters 101, 116403 (2008).



The Wiedemann-Franz law: a metal conducts electricity best when it is static and cold. This is because the transport properties of electrons diminish as thermal agitation increases, owing to more frequent collisions between electrons. The same is true of heat: higher temperatures increase electron collisions and hence decrease thermal conductivity. Electric conduction depends on free electrons moving in one direction. Wiedemann and Franz showed that the ratio of thermal to electrical conductivity is the same for all metals at the same temperature.

Thermal conductivity / electrical conductivity = constant L x temperature
κ / σ = L T (1)
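
A worked example of equation (1), a sketch of my own using the textbook Lorenz number L ≈ 2.44 x 10^-8 W·Ω·K^-2 and a typical room-temperature conductivity for copper (neither value comes from the notes or references here):

# Wiedemann-Franz law, kappa / sigma = L * T  (equation (1) above).
# The Lorenz number and the copper conductivity are illustrative textbook
# figures, not values taken from the cited sources.
L_lorenz = 2.44e-8   # Lorenz number, W·Ω·K^-2
sigma = 5.96e7       # electrical conductivity of copper, S/m (room temperature)
T = 300.0            # temperature, K

kappa = L_lorenz * sigma * T   # predicted thermal conductivity, W/(m·K)
print(f"predicted thermal conductivity: {kappa:.0f} W/(m K)")
# ~436 W/(m K), close to the measured ~400 W/(m K) for copper.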

Electrical measurements are difficult to carry out on nanoscale materials, but STM with a fixed tip (2) has been used to measure conductivity. Electrostatic force microscopy (EFM), which applies a voltage between the tip and the sample, has also been used to image local charge domains on a sample surface. (3) Electrical resistivity decreases with size because of improved order and fewer defects in the lattice, ρ (resistivity) = 1/σ. (4) But scattering of free electrons at the nanosurfaces may have an adverse effect on electrical conductivity.

(1) http://en.wikipedia.org/wiki/Wiedemann-Franz_law
(2) Kelsall et al., 2007, Nanoscale Science and Technology, Wiley, p. 126
(3) Kelsall et al., 2007, Nanoscale Science and Technology, Wiley, p. 92
(4) Owens et al., 2007, The Physics and Chemistry of Nanosolids, p. 81

- Jones, William; March, Norman H. (1985). Theoretical Solid State Physics. Courier Dover Publications.

The Casimir force

When two uncharged objects are placed in a vacuum with no external fields, we wouldn’t expect them to have any force between them other than gravity. Quantum electrodynamics says otherwise. It shows that tiny quantum oscillations in the vacuum will give rise to an attraction called the Casimir force.

Scientists at the RIKEN Advanced Science Institute in Wako, and co-workers at the National Academy of Sciences of Ukraine (NASU), have shown for the first time that the Casimir force has a complex dependence on temperature. They propose a related experiment that could clarify the theory around this important interaction, which has widespread applications in physics and astronomy, and could eventually be exploited in nano-sized electrical and mechanical systems.

“The Casimir force is one of the most interesting macroscopic effects of vacuum oscillations in a quantum electromagnetic field,” says Franco Nori from RIKEN and the University of Michigan in the USA. “It arises because the presence of objects, especially conducting metals, alters the quantum fluctuations in the vacuum.”

The Casimir force was first predicted in 1948, but has only recently been measured in the laboratory because experiments are difficult—the force is negligible except when the distance between objects is very small. More experiments are needed to understand how the force depends on temperature, an important practical consideration.
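
To see why the force "is negligible except when the distance between objects is very small", one can evaluate the ideal zero-temperature Casimir pressure between two perfectly conducting parallel plates, P = π²ħc / (240 d⁴). This is a textbook result, not part of the RIKEN calculation (which concerns thin films and temperature corrections); the sketch below just shows the steep distance dependence.

# Ideal zero-temperature Casimir pressure between perfectly conducting
# parallel plates: P = pi^2 * hbar * c / (240 * d^4). This textbook formula
# ignores the finite-conductivity and thermal effects discussed above; it is
# only meant to show the steep 1/d^4 dependence on separation.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J·s
c = 299792458.0          # speed of light, m/s

def casimir_pressure(d):
    """Attractive pressure in pascals for plate separation d in metres."""
    return math.pi ** 2 * hbar * c / (240.0 * d ** 4)

for d_nm in (100, 1000, 10000):   # 100 nm, 1 micron, 10 microns
    d = d_nm * 1e-9
    print(f"d = {d_nm:6d} nm  ->  P = {casimir_pressure(d):.3e} Pa")
# Roughly 13 Pa at 100 nm, but only ~1e-7 Pa at 10 microns.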

“As the temperature increases, metal objects in a vacuum experience two competing effects,” explains Sergey Savel’ev from RIKEN and Loughborough University in the UK. “They lose some of their electrical conductivity, which tends to cause a decrease in the Casimir force. At the same time they are bombarded with more radiation pressure from the thermal heat waves, and this increases the Casimir force.”

Nori and co-workers derived the temperature dependence for Casimir attractions between a thin film and a thick flat plate, and between a thin film and a large metal sphere. They found that the Casimir force will tend to decrease near room temperature, but can increase again at higher temperatures as the thermal radiation effects take over.

RIKEN’s Valery Yampol’skii, who also works at NASU, says that “if these temperature effects were observed in an experiment, they would resolve some fundamental questions about electron relaxation in a vacuum”. Such an experiment would be near-impossible with pieces of bulk metal, but could be done using extremely thin metal films.

Yampol’skii, V.A., Savel’ev, S., Mayselis, Z.A., Apostolov, S.S. & Nori, F. Anomalous temperature dependence of the Casimir force for thin metal films. Physical Review Letters 101, 096803 (2008).

http://www.azonano.com/news.asp?newsID=9452

Coloured glass

Colour in glass may be obtained by addition of coloring ions that are homogeneously distributed and by precipitation of finely dispersed particles (such as in photochromic glasses).[6] Ordinary soda-lime glass appears colorless to the naked eye when it is thin, although iron(II) oxide (FeO) impurities of up to 0.1 wt%[27] produce a green tint which can be viewed in thick pieces or with the aid of scientific instruments. Further FeO and Cr2O3 additions may be used for the production of green bottles. Sulfur, together with carbon and iron salts, is used to form iron polysulfides and produce amber glass ranging from yellowish to almost black.[38] Manganese dioxide can be added in small amounts to remove the green tint given by iron(II) oxide.

Wikipedia



John Dalton's Vision of Colours

In 1794, shortly after his arrival in Manchester, Dalton was elected a member of the Manchester Literary and Philosophical Society, the "Lit & Phil", and a few weeks later he communicated his first paper on "Extraordinary facts relating to the vision of colours", in which he postulated that shortage in colour perception was caused by discolouration of the liquid medium of the eyeball. In fact, a shortage of colour perception in some people had not even been formally described or officially noticed until Dalton wrote about his own. Although Dalton's theory lost credence in his own lifetime, the thorough and methodical nature of his research into his own visual problem was so broadly recognized that Daltonism became a common term for colour blindness. Examination of his preserved eyeball in 1995 demonstrated that Dalton actually had a less common kind of colour blindness, deuteranopia, in which medium-wavelength-sensitive cones are missing (rather than functioning with a mutated form of their pigment, as in the most common type of colour blindness, deuteranomaly). Besides the blue and purple of the spectrum he was able to recognize only one colour, yellow, or, as he says in his paper,

that part of the image which others call red appears to me little more than a shade or defect of light. After that the orange, yellow and green seem one colour which descends pretty uniformly from an intense to a rare yellow, making what I should call different shades of yellow.

This paper was followed by many others on diverse topics: on rain and dew and the origin of springs, on heat, the colour of the sky, steam, the auxiliary verbs and participles of the English language, and the reflection and refraction of light.



Valency bonds (Combining power)

In chemistry, valence bond theory is one of two basic theories, along with molecular orbital theory, that developed to use the methods of quantum mechanics to explain chemical bonding. It focuses on how the atomic orbitals of the dissociated atoms combine on molecular formation to give individual chemical bonds. In contrast, molecular orbital theory has orbitals that cover the whole molecule.[1]

In 1916, G.N. Lewis proposed that a chemical bond forms by the interaction of two shared bonding electrons, with the representation of molecules as Lewis structures. In 1927 the Heitler-London theory was formulated which for the first time enabled the calculation of bonding properties of the hydrogen molecule H2 based on quantum mechanical considerations. Specifically, Walter Heitler determined how to use Schrödinger’s wave equation (1925) to show how two hydrogen atom wavefunctions join together, with plus, minus, and exchange terms, to form a covalent bond. He then called up his associate Fritz London and they worked out the details of the theory over the course of the night.[2] Later, Linus Pauling used the pair bonding ideas of Lewis together with Heitler-London theory to develop two other key concepts in VB theory: resonance (1928) and orbital hybridization (1930). According to Charles Coulson, author of the noted 1952 book Valence, this period marks the start of “modern valence bond theory”, as contrasted with older valence bond theories, which are essentially electronic theories of valence couched in pre-wave-mechanical terms.


....The exact inception, however, of the theory of chemical valencies can be traced to an 1852 paper by Edward Frankland, in which he combined the older theories of free radicals and “type theory” with thoughts on chemical affinity to show that certain elements have the tendency to combine with other elements to form compounds containing 3, i.e. in the three atom groups (e.g. NO3, NH3, NI3, etc.) or 5, i.e. in the five atom groups (e.g. NO5, NH4O, PO5, etc.), equivalents of the attached elements. It is in this manner, according to Frankland, that their affinities are best satisfied. Following these examples and postulates, Frankland declares how obvious it is that:[3]
“ A tendency or law prevails (here), and that, no matter what the characters of the uniting atoms may be, the combining power of the attracting element, if I may be allowed the term, is always satisfied by the same number of these atoms. ”

This “combining power” was afterwards called quantivalence or valency (and valence by American chemists).


1- Murrel, J. N.; S. F. Tedder (1985). The Chemical Bond. John Wiley & Sons. ISBN 0-471-90759-6.
2- Partington, J.R. (1989). A Short History of Chemistry. Dover Publications, Inc. ISBN 0-486-65977-1.
3- Frankland, E. (1852). Phil. Trans., vol. cxlii, 417.

Thursday, January 15, 2009

Economic shock

As many as one million working-age men died due to the economic shock of mass privatisation policies followed by post-communist countries in the 1990s, according to a new study published in The Lancet.

The Oxford-led study measured the relationship between death rates and the pace and scale of privatisation in 25 countries in the former Soviet Union and Eastern Europe, dating back to the early 1990s. The researchers found that mass privatisation came at a human cost: an average surge in the number of deaths of 13 per cent, or the equivalent of about one million lives.

The rapid privatisation programme, part of a plan known by economists as ‘shock therapy’, led to a 56 per cent increase in unemployment, which the study says played an important role in explaining why privatisation claimed so many lives. Many employers provided extensive health and social care for their employees, so through privatisation workers experienced the ‘double whammy’ of losing not only their livelihood but also their means of surviving the crisis.

David Stuckler from Oxford, and colleagues Dr Lawrence King from Cambridge University and Professor Martin McKee from the London School of Hygiene and Tropical Medicine, took death rates reported by the World Health Organisation for men of working age (15-59 years) in 25 post-communist countries and compared them to the timing and extent of participation in mass privatisation and other transition policies.

The team took into account other factors that might affect rising death rates (such as economic depression, initial conditions and health infrastructure). They also examined other measures of privatisation from the European Bank for Reconstruction and Development, a bank which gave loans in support of radical mass privatisation.

www.ox.ac.uk/media

AFM

In 1986, Gerd Binnig and Heinrich Rohrer shared the Nobel Prize in Physics “for their design of the scanning tunnelling microscope”. The AFM has since inspired a variety of other scanning probe techniques.



AFMs can, however, image the topography of a surface much faster than they can map forces, and instruments that are capable of video-rate imaging will soon be available [1-3]. Although methods such as pulsed-force-mode AFM [4] now permit much faster force measurements, we cannot even begin to dream of mapping mechanical forces with nanoindentation at this rate. However, we can try to gain similar information about mechanical forces from dynamic AFM and the other high-speed techniques that are used for imaging. In the amplitude modulation or 'tapping' mode of dynamic AFM, the tip is attached to a long cantilever that oscillates (with an amplitude of a few nanometres) at or near the resonance frequency of the cantilever. During each cycle the tip gently touches the surface, in effect performing a full nanoindentation cycle.

Stark, R. Atomic force microscopy: Getting a feeling for the nanoworld. Nature Nanotechnology 2, 461-462 (2007).

Yamanaka, K., Ogiso, H. & Kolosov, O. Ultrasonic force microscopy for nanometer resolution subsurface imaging. Appl. Phys. Lett. 64, 178-180 (1994).

Maivald, P. et al. Using force modulation to image surface elasticities with the atomic force microscope. Nanotechnology 2, 103 (1991).

Ge, S. et al. Shear modulation force microscopy study of near surface glass transition temperatures. Phys. Rev. Lett. 85, 2340-2343 (2000).

Dinelli, F., Buenviaje, C. & Overney, R. M. Glass transition measurements on heterogeneous surfaces. Thin Solid Films 396, 138-144 (2001).
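
The "resonance frequency of the cantilever" mentioned above can be estimated by treating the cantilever as a simple harmonic oscillator, f0 = (1/2π)·sqrt(k/m*). The spring constant and effective mass below are typical tapping-mode orders of magnitude chosen purely for illustration, not parameters from any of the cited papers.

# Resonance frequency of an AFM cantilever modelled as a simple harmonic
# oscillator: f0 = sqrt(k / m_eff) / (2 * pi). The values of k and m_eff are
# typical tapping-mode orders of magnitude, chosen only for illustration.
import math

k = 40.0        # spring constant, N/m
m_eff = 5e-12   # effective mass, kg (a few nanograms)

f0 = math.sqrt(k / m_eff) / (2.0 * math.pi)
print(f"resonance frequency ~ {f0 / 1e3:.0f} kHz")   # ~450 kHz for these values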

Experimental and theoretical investigations of processes at surfaces have developed through measuring and analysing forces and properties of surfaces, films, and interfacial phenomena in nanostructures. Microscopes that use a sharp tip instead of lenses, such as the scanning tunnelling microscope, have given access to the nanoworld since 1981. In the scanning tunnelling microscope (STM), the wavefunctions of electrons on the tip overlap with those of atoms on the surface, and a change of atomic dimensions in the distance between the tip and the surface changes the tunnelling current exponentially. The initial introduction to scanning tunnelling microscopy, entitled “Tunnelling through a controllable vacuum gap”, was published in Applied Physics Letters in January 1982. (1) Atomic force microscopy (AFM) detects chemical interaction forces by measuring the repulsion between atoms at the tip apex of the microscope and atoms of the surface. AFM experiments, initiated by scientists at IBM and Stanford in 1986, involve analysis of surface topography, bonding, resistance, corrosion, friction, lubricant-film thickness, and mechanical properties at the nanoscale.


Binnig, G., Rohrer, H., Gerber, Ch. & Weibel, E. Appl. Phys. Lett. 40, 178-180 (1982).
- Gerber, C. et al. How the doors to the nanoworld were opened. Nature Nanotechnology 1, October 2006.
- Sugimoto, Y. et al. Complex patterning by vertical interchange atom manipulation using atomic force microscopy. Science 322, 413-417 (2008).
- Bhushan, B. (ed.), 1997 Micro/nanotribology and its applications (proceedings of the NATO Advanced Study Institute on Micro/Nanotribology and its Applications, held in Sesimbra, Portugal, June 16-28 1996), NATO ASI Series, Series E: Applied Sciences, vol. 330, Dordrecht; London, Kluwer Academic.

Nanotribology: to predict the level of wear and friction of devices at the atomic scale, engineers from several disciplines, such as mechanical, materials, and chemical engineering, have conducted various tests on machine components.




Originally the AFM was used to image the topography of surfaces, but by modifying the tip it is possible to measure other quantities (for example, electric and magnetic properties, chemical potentials, friction and so on), and also to perform various types of spectroscopy and analysis.

Binnig, G., Quate, C. F. & Gerber, Ch. Phys. Rev. Lett. 56, 930-933 (1986). (The most-cited AFM paper, with around 4,700 citations.)


The Planck constant (denoted h), also called Planck's constant, is a physical constant used to describe the sizes of quanta in quantum mechanics. It is named after Max Planck, one of the founders of quantum theory. The Planck constant is the proportionality constant between energy (E) of a photon and the frequency of its associated electromagnetic wave (ν). This relation between the energy and frequency is called the Planck relation.

A closely related constant is the reduced Planck constant, denoted ħ ("h-bar"), which is equal to the Planck constant divided by (or reduced by) 2π. It is used when frequency is expressed in terms of radians per second instead of cycles per second. The expression of a frequency in radians per second is often called angular frequency (ω), where ω = 2πν.
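
A quick numerical illustration of the Planck relation E = hν (a sketch of my own; the 532 nm green-photon example is an arbitrary choice):

# Planck relation E = h * nu, and the reduced constant hbar = h / (2 * pi).
# The example wavelength (532 nm, green light) is an illustrative choice.
import math

h = 6.62607015e-34   # Planck constant, J·s
c = 299792458.0      # speed of light, m/s

wavelength = 532e-9  # metres
nu = c / wavelength  # frequency, Hz
E = h * nu           # photon energy, J

hbar = h / (2.0 * math.pi)
omega = 2.0 * math.pi * nu

print(f"nu = {nu:.3e} Hz, E = {E:.3e} J ({E / 1.602176634e-19:.2f} eV)")
print(f"hbar * omega = {hbar * omega:.3e} J (the same energy in angular-frequency form)")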


The de Broglie relations
The first de Broglie equation relates the wavelength λ to the particle momentum as

λ = h / p = h / (γ m v) = (h / (m v)) √(1 − v²/c²),

where h is Planck's constant, m is the particle's rest mass, v is the particle's velocity, γ is the Lorentz factor, and c is the speed of light in a vacuum.

The greater the energy, the larger the frequency and the shorter (smaller) the wavelength. Given the relationship between wavelength and frequency, it follows that short wavelengths are more energetic than long wavelengths. The second de Broglie equation relates the frequency of the wave associated to a particle to the total energy of the particle such that

f = E / h,

where f is the frequency and E is the total energy. The two equations are often written as

p = ħk
E = ħω,

where p is momentum, ħ is the reduced Planck's constant (also known as Dirac's constant, pronounced "h-bar"), k is the wavenumber, and ω is the angular frequency.

See the article on group velocity for detail on the argument and derivation of the de Broglie relations. Group velocity (equal to the electron's speed) should not be confused with phase velocity (equal to the product of the electron's frequency multiplied by its wavelength).

Matter wave phase

In quantum mechanics, particles also behave as waves with complex phases. By the de Broglie hypothesis, we see that the phase velocity is

v_p = ω / k = E / p.

Using relativistic relations for energy and momentum, we have

v_p = E / p = (γ m c²) / (γ m v) = c² / v = c / β,

where E is the total energy of the particle (i.e. rest energy plus kinetic energy in the kinematic sense), p the momentum, γ the Lorentz factor, c the speed of light, and β the velocity as a fraction of c. The variable v can either be taken to be the velocity of the particle or the group velocity of the corresponding matter wave. See the article on group velocity for more detail. Since the particle velocity v < c for a massive particle according to special relativity, the phase velocity of matter waves always exceeds c, i.e.

v_p > c,

and as we can see, it approaches c when the particle velocity is in the relativistic range. The superluminal phase velocity does not violate special relativity, for it doesn't carry any information. See the article on signal velocity for detail.


wikipedia
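
A numerical check of the first relation, λ = h / (γ m v), for an electron (my own sketch; the chosen speed of 10% of c is arbitrary):

# de Broglie wavelength lambda = h / (gamma * m * v) for an electron.
# The chosen speed, 10% of the speed of light, is an illustrative value.
import math

h = 6.62607015e-34       # Planck constant, J·s
m_e = 9.1093837015e-31   # electron rest mass, kg
c = 299792458.0          # speed of light, m/s

v = 0.1 * c                                   # particle speed
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)   # Lorentz factor
p = gamma * m_e * v                           # relativistic momentum
lam = h / p                                   # de Broglie wavelength

print(f"gamma = {gamma:.4f}, lambda = {lam * 1e12:.2f} pm")
# About 24 pm: far shorter than optical wavelengths, which is why electron
# microscopes can resolve atomic-scale detail.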

Saturday, January 10, 2009

Measuring functionality of cells in quantitative manner

Blue Gene project: a tiny part of the brain has been simulated by IBM in the “Blue Brain” project – 100,000 neurons with 30 million connections carrying precise electrical conversations between them. IBM intends to scale it up to the real size of the human brain.
If “a neuron's natural firing time is delayed by just a few milliseconds, the entire sequence of events was disrupted. The connected cells became strangers to one another.” (1) The main question remains to be answered, as in all other areas of systems biology, and it concerns the system rather than its components. “The problem is that if you ask a hundred computational neuroscientists to build a functional model, you'll get a hundred different answers.” (1) To model the functional interactions between molecules in just one single cell of the heart, we would need 10-30 giant computers such as Blue Gene. Hence, there are not enough computing resources available to actually enable us to simulate the entire human heart. (2) Information regarding interactions resides neither in the genome nor even in the individual proteins that genes code for; it lies at the level of protein interactions within the context of subcellular, cellular, tissue, organ, and system structures. (3) Advanced computer algorithms and hardware have made it possible to quantify these interactions between different components of cells. Computer-intensive quantitative measures of functionality are being developed to provide precise quantification of systems physiology.

(1) http://seedmagazine.com/news/2008/03/out_of_the_blue.php
(2) Denis Noble, computational modelling of biological systems; http://noble.physiol.ox.ac.uk/People/DNoble/
(3) Modelling the heart from genes to cells up to the whole organs; http://www.sciencemag.org/cgi/content/abstract/295/5560/1678

Sunday, January 04, 2009

Food influencing cognitive ability


Brain foods: the effects of nutrients on brain function

Advances in molecular biology have revealed the ability of food-derived signals to influence energy metabolism and synaptic plasticity and, thus, mediate the effects of food on cognitive function....The omega-3 fatty acid docosahexaenoic acid (DHA), which humans mostly attain from dietary fish, can affect synaptic function and cognitive abilities by providing plasma membrane fluidity at synaptic regions. DHA constitutes more than 30% of the total phospholipid composition of plasma membranes in the brain, and thus it is crucial for maintaining membrane integrity and, consequently, neuronal excitability and synaptic function. Dietary DHA is indispensable for maintaining membrane ionic permeability and the function of transmembrane receptors that support synaptic transmission and cognitive abilities. Omega-3 fatty acids also activate energy-generating metabolic pathways that subsequently affect molecules such as brain-derived neurotrophic factor (BDNF) and insulin-like growth factor 1 (IGF1). IGF1 can be produced in the liver and in skeletal muscle, as well as in the brain, and so it can convey peripheral messages to the brain in the context of diet and exercise. BDNF and IGF1 acting at presynaptic and postsynaptic receptors can activate signalling systems, such as the mitogen-activated protein kinase (MAPK) and calcium/calmodulin-dependent protein kinase II (CaMKII) systems, which facilitate synaptic transmission and support long-term potentiation that is associated with learning and memory. BDNF has also been shown to be involved in modulating synaptic plasticity and cognitive function through the phosphatidylinositol 3-kinase (PI3K)/Akt/ mammalian target of rapamycin (mTOR) signalling pathway. The activities of the mTOR and Akt signalling pathways are also modulated by metabolic signals such as insulin and leptin (not shown). 4EBP, eukaryotic translation-initiation factor 4E binding protein; CREB, cyclic AMP-responsive element (CRE)-binding protein; IGFR, insulin-like growth factor receptor; IRS1, insulin receptor substrate 1; p70S6K, p70 S6 kinase.

Brain foods: the effects of nutrients on brain function, Nature Review Neuroscience, Vol 9 July 2008

Friday, January 02, 2009

NanoKTN Announces Two Major Conferences in 2009

The NanoKTN has announced it will hold two major conferences in the first quarter of 2009 in healthcare and energy.

Nano 4 Life will explore the key areas within the life sciences where nanotechnology offers the most opportunity to advance healthcare provision and improve product discovery and development.

Nano4Energy will focus on the exploitation of nano-enabled clean energy generation, storage and conversion technologies.

http://mnt.globalwatchonline.com

Biology advances into unknown

Biology's Big Bang

Molecular biologists have gone from thinking that they know roughly what is going on in their subject to suddenly realising that they have barely a clue.

That might sound a step backwards; in fact, it is how science works. The analogy with physics is deeper than just that between RNA and the neutron. There is in biology at the moment a sense of barely contained expectations reminiscent of the physical sciences at the beginning of the 20th century. It is a feeling of advancing into the unknown, and that where this advance will lead is both exciting and mysterious.

Know thine enemy

As Samuel Goldwyn so wisely advised, never make predictions--especially about the future. But here is one: the analogy between 20th-century physics and 21st-century biology will continue, for both good and ill.

Physics gave two things to the 20th century. The most obvious gift was power over nature. That power was not always benign, as the atomic bomb showed. But if the 20th century was distinguished by anything from its predecessors, that distinctive feature was physical technology, from motor cars and aeroplanes to computers and the internet.

It is too early to be sure if the distinguishing feature of the 21st century will be biological technology, but there is a good chance that it will be. Simple genetic engineering is now routine; indeed, the first patent application for an artificial living organism has recently been filed (see page 96). Both the idea of such an organism and the idea that someone might own the rights to it would have been science fiction even a decade ago. And it is not merely that such things are now possible. The other driving force of technological change--necessity--is also there. Many of the big problems facing humanity are biological, or are susceptible to biological intervention. The question of how to deal with an ageing population is one example. Climate change, too, is intimately bound up with biology since it is the result of carbon dioxide going into the air faster than plants can remove it. And the risk of a new, lethal infection suddenly becoming pandemic as a result of modern transport links (see page 67) is as biological as it gets. Even the fact that such an infection might itself be the result of synthetic biology only emphasises the biological nature of future risks.

At the moment, policymakers have inadequate technological tools to deal with these questions. But it is not hard to imagine such tools. Ageing is directly biological. It probably cannot be stopped, but knowing how cells work--really knowing--will allow the process to be transformed for the better. At least part of the answer to climate change is fuel that grows, rather than fuel that is dug up. Only biotechnology can create that. And infections, pandemic or otherwise, are best dealt with by vaccines, which take a long time to develop. If cells were truly understood, that process might speed up to the point where the vaccine was ready in time to do something useful.

Economist, Biology's Big Bang, 6/16/2007, Vol. 383 Issue 8533, p13-13, 1p




Nanotechnology: Only Natural

Although nanoscience has opened an era of new questions about relationships at the nanoscale whose implications are still unknown, the pace of curiosity has only grown following the landmark support of the National Nanotechnology Initiative (NNI) by a politician, Bill Clinton, in the year 2000. Many respectable scientists were reluctant to push the agenda forward because of the way scenarios and fairy tales were unfolding in the media. But support from a highly unexpected source for the advancement of this seemingly magical technology set the agenda for science to go deep. The aim was to mobilise the scientific community to engage in further research in the small world, to uncover relationships and explore the wealth of information at the nanoscale.
Nanoscience raised as many questions about technology as about life itself and its relation to biology (Jones R, 2004). Scanning probe microscopy techniques provided clear visibility, with the prospect of watching cells doing their daily jobs. Materials at smaller sizes do not simply behave like their bulk counterparts: the relative strength of attractive (van der Waals) forces becomes much greater, with active atoms exposed on the surfaces. “At the atomic level we have new kinds of forces” (Feynman, 1959), and new kinds of effects and possibilities, including all the natural programming that is at work in biology. Biology is filled with executable information already explored, with which relationships at a very small scale, involving the roughly 10^14 atoms active in every single cell, can be exposed. For instance, membrane proteins account for 50% of known drug targets (Oxford University research into cystic fibrosis).
With powerful support pouring in continuously from official sources, media interest turned realistic. There was a collective realisation that scientific research, unlike fiction, is funded with the sole interest of solving medical and environmental issues. Scientists are working on the problems that threaten life, nature and the environment; no one other than fiction writers is funded, or interested, to work on the fictional side of nanotechnologies. Nanoscience hence found further legitimacy with the recent announcement from the Department for Innovation, Universities and Skills (DIUS) that made a £1 billion fund available for R&D programmes. Equally, in the US, increasing government investment in R&D showed a rising budget curve, from $270 million in 2001 to $1.3 billion in 2006. More than 1,500 companies were reported to be involved in producing nanomaterials, with an annual growth rate of 25% and a market that reached over $40 billion (Roco 2007). Elsewhere, EU investment summed to $1.6 billion in 2008. Thus nanotechnology found its momentum. The hope is that nanotechnologies will create 2 million jobs for skilled labour by 2015.
Nanoscale technology follows the laws of nature in computing machines as well, using bottom-up manufacturing with tiny wires and transistors with the objective of mimicking the human brain's nonlinear networks of connections. Computers ought to do remarkable things as they get smaller and use more complicated wiring. Consider recognising a face, which our brains process readily, not only from the same angle or at the same age, but by recognising traces that are searchable and can be evoked in the brain. IBM's research project funded by DARPA involves several universities whose task is to build a complex, purpose-designed computer, dubbed iBrain, destined for problem solving rather than responding to specific questions. This is an attempt to mimic the mammalian brain, which translates into a nonlinear computer design in which connections are developed by sensory devices, similar to a brain that receives streams of feedback data from various senses. One envisaged use is particularly unprecedented: to gather global financial information and make smart decisions in the light of the processed complex information.
Such a non-linear computing machine may equally assist us in computing the much-sought measures of nanoparticle exposure risk, particularly as there are competing forces present, from which we need to calculate the dominant force and direction in order to anticipate the dynamics of reactivity. The strange behaviours of nanoparticles below 100 nm, and of engineered nanoparticles, have become points of safety concern, although their risk potentials are poorly understood. The larger surface area of nanoparticles, which increases their reactivity with the environment, may accelerate production of reactive oxygen species (ROS), including free radicals. ROS are considered to cause oxidative stress, which may damage proteins and membranes (Nel, 2006). But how rapidly ROS are produced, and how the scale of nanomaterials relates to the speed of ROS production, is the key to measuring nanoparticle toxicity. For instance, Peter Dobson from Oxford has discovered how to eliminate the exposure risks of the photocatalytic properties of nanotitania used in sunscreens by changing their functionality.
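
The surface-area point can be made quantitative with the specific surface area of spherical particles, SSA = 6 / (ρ·d). The sketch below is my own illustration, using the density of rutile titania simply as an example input.

# Specific surface area of spherical particles: SSA = 6 / (rho * d).
# The density of rutile TiO2 (~4230 kg/m^3) is used only as an example input.
rho = 4230.0   # particle density, kg/m^3

for d_nm in (10, 100, 1000):
    d = d_nm * 1e-9             # diameter in metres
    ssa = 6.0 / (rho * d)       # surface area per unit mass, m^2/kg
    print(f"d = {d_nm:5d} nm  ->  {ssa / 1000:.1f} m^2/g")
# Shrinking the diameter tenfold multiplies the surface area per gram tenfold,
# which is the geometric reason smaller particles are more reactive (for
# example in ROS generation) per unit mass.
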
There is already a great deal of detailed information available from research studies conducted by drug companies on large enough populations. Similarly, research findings from descriptive analyses by universities and research organisations are plentiful. However, there are still gaps between experimental and computational measures in terms of quantification. Additionally, there are issues in making descriptive characterisation meaningful across diverse functionalities, so that mechanism-based observations can be extrapolated into a predictive scientific model. Likewise, there are still gaps in comparing various research findings that might be used for the quantification and qualification of processes in systems biology. For example, in diabetes the measurement of blood sugar enabled people to take control of the disease; for cancer there is no such marker, but the assumption is that electronic devices can measure the changes that cause cells to turn cancerous.

The key question is whether, with a holistic approach that simplifies and models a systemic response, some regular, repeating patterns can be drawn out of the pile of information hidden in research results. Jim Heath from Caltech expresses concern that “implants of putting something in the body turned out to be a much different game even in a mouse. It is much more complicated environment that we thought”. But the pathway of discovery is extremely rich. Closing the skills gap in identifying exposure, health, and environmental risk potentials calls for a similar approach. The latter has been recognised by regulatory bodies and officials aiming to ensure “a high level of protection of human health and the environment”. REACH and the NNI are under enormous pressure to compile Environment, Health and Safety (EHS) information and turn it into workable computational and statistical power for determining measures of human exposure toxicity.

Nonetheless, nanoparticles have always been present in nature. “If we consider atmospheric dust alone, estimates indicate about one billion metric tons per year are produced globally” (Kellogg and Griffin 2006). http://www.springerlink.com/content/940g431151604721/.

Oxford University’s David Pyle, in multidimensional research on volcanic nanoparticles, has found trace metals travelling thousands of miles from their source. http://www.earth.ox.ac.uk/~davidp/

Even so, although omnipresent in nature, nanoparticles require whole new categories of classification for their diverse functionality, dynamics, structure and size. These gaps will eventually be filled by combining in vivo findings on the behaviour of nanomaterials already present in consumer goods with in vitro research results, to take accurate account of how nanoparticles behave towards life.

Nasrin Azadeh-McGuire
Post Graduate Nanotechnology,
Begbroke Science Park, Oxford University


References:

Big Picture on Nanoscience
http://www.wellcome.ac.uk/Professional-resources/Education-resources/Big-Picture/Nanoscience/index.htm

Dobson P, Centenary Lectures, University of Oxford, 6 May 2008
www.eng.ox.ac.uk/events/centenary/programme.html
www.isis-innovation.com/documents/Dobson-precisandbiographicalnote.pdf

Canton, J., 2004 'Designing the future - NBIC technologies and human performance', Coevolution of Human Potential and Converging Technologies, vol 1013, pp. 186-198.

Nanotechnology KTN, Knowledge Transfer Networks,
http://mnt.globalwatchonline.com/epicentric_portal/site/MNT?mode=0

Goddard W et al, (2007), Handbook of Nanoscience, Engineering, and Technology, Taylor & Francis

Oxford Centre for Integrative Systems Biology, OCISB, News (2008), University of Oxford leads cystic fibrosis research with Mac.
http://www.sysbio.ox.ac.uk/newsevents/index.html

Nel A, Xia T, Li N (2006). “Toxic potential of materials at the nanolevel”. Science Vol 311:622-627

Service R, Report Faults U.S. Strategy for Nanotoxicology Research, Science, Vol 322, 19 Dec 2008, p 1779, http://www.sciencemag.org/cgi/content/full/322/5909/1779a

NTP (National toxicology Programme) Nanotechnology Safety Initiative,http://ntp-server.niehs.nih.gov/?objectid=7E6B19D0-BDB5-82F8-FAE73011304F542A

REACH and the regulation of nanotechnology, http://www.safenano.org/Uploads/NanoREACH.pdf

By Erika Morphy, TechNewsWorld, IBM, Academics Seek to Create a Computer That's More Like Us, http://www.technewsworld.com/story/65237.html

http://icon.rice.edu

http://www.safenano.org/nanoREACH.aspx

Prof Jim Heath at Caltech, Woodrow Wilson nano-programmes
www.nanotechproject.org
http://www.penmedia.org/podcast/nano/Podcast/Entries/2008/8/29_Episode_5_-_Creating_Tomorrows_Tools_Today.html