
Nanotechnology and nanomaterials represent a multi-billion-dollar industry today and will continue to grow over the next few years, driven by demand from the electronics, energy, chemicals, life sciences, coatings, and catalysts sectors. The markets in which nanotechnology is creating growth opportunities are covered in this report, including:

- Aerospace
- Automotive
- Coatings
- Cleaning and sanitary
- Composites
- Construction and exterior protection
- Cosmetics and personal care
- Electronics, optoelectronics and data storage
- Energy
- Environment, water and filtration
- Medical and biotechnology
- Military and defense
- Textiles

Browse: World Market for Nanotechnology

The market is forecast from 2002 through 2016. End-user markets and applications are also outlined. The report covers the following nanomaterials:

- Metal oxide nanopowders
- Carbon Nanotubes
- Fullerenes and POSS
- Graphene
- Nanoclays
- Nanocapsules
- Nanoporous materials
- Nanofibers
- Nanosilver
- Quantum Dots

Table of Contents

1 EXECUTIVE SUMMARY

2 METHODOLOGY

3 MARKET VOLUMES AND DEMAND

3.1 Applications of nanomaterials

3.2 Production estimates 2010

3.3 Demand by material type and market

3.3.1 Aluminium Oxide

3.3.2 Antimony Tin Oxide

3.3.3 Bismuth Oxide

3.3.4 Carbon Nanotubes

3.3.5 Cerium Oxide

3.3.6 Cobalt Oxide

3.3.7 Copper Oxide

3.3.8 Fullerenes and POSS

3.3.9 Graphene

3.3.10 Iron Oxide

3.3.11 Magnesium Oxide

3.3.12 Manganese Oxide

3.3.13 Nanoclays

3.3.14 Nanofibers

3.3.15 Nanosilver

3.3.16 Nickel Oxide

3.3.17 Quantum Dots

3.3.18 Silicon Oxide

3.3.19 Titanium Dioxide

3.3.20 Yttrium Oxide

3.3.21 Zinc Oxide

3.3.22 Zirconium Oxide

Browse: Nanotechnology Market

4 MARKETS FOR NANOTECHNOLOGY

4.1 Aerospace

4.2 Automotive

4.3 Coatings

4.4 Cleaning and sanitary

4.5 Composites

4.6 Construction and exterior protection

4.7 Cosmetics and personal care

4.8 Electronics, optoelectronics and data storage

4.9 Energy

4.10 Environment, water and filtration

4.11 Medical and biotechnology

4.12 Military and defense

4.13 Packaging

4.14 Textiles

5 CARBON NANOTUBES PRODUCERS

6 FULLERENES AND POSS PRODUCERS

7 GRAPHENE PRODUCERS

8 METAL OXIDE NANOPOWDER PRODUCERS

9 NANOCOMPOSITES PRODUCERS

10 NANOCOATINGS PRODUCERS

11 NANOCLAY PRODUCERS

12 NANOFIBERS PRODUCERS

13 NANOSILVER PRODUCERS

14 QUANTUM DOTS PRODUCERS


Nanotechnology for Wood

Nanotechnology-treated wood

The use of nanotechnology is a real innovation in the development of wood coatings, particularly in relation to UV absorption and penetration. Although wood is naturally durable and strong, the effects of outdoor exposure can degrade not only these physical properties but its natural beauty as well. To meet this challenge, Nanovations has introduced a new VOC-free technology for clear, impregnating wood protection.
The only way to slow the UV degradation of the surface is to incorporate a pigment or a UV stabilizer into the formulation. The colorless, UV-resistant, water-repellent preservative represents the broadest category of clear natural finishes on the market.

Nanoscale UV absorbers offer unique benefits in protecting coatings and coated substrates from degradation by UV radiation. The small size of the particles makes it possible to offer high protection without affecting the transparency of the impregnation.
The water-based nature of the products is a functional and environmentally friendly solution against rot, moss, and algae build-up. Water absorption is significantly reduced.
By absorbing the most damaging rays of the Sun's spectrum, Nanovations Lignol® wood coatings protect wood for longer under the harsh solar radiation of Australia and around the world, without the discoloring effect of those rays.
Nanoscale UV absorbers protect the wood substrate from degrading UV radiation and increase the lifetime of the coating. These products are most valuable in harsh outdoor environments.



Wooden house - Nanovations
Please note:

Hardwood with low water absorption rates and a density higher than 900 kg/m³ can be difficult to treat. Some hardwoods in that range, such as Merbau, Ironwood, Jarrah, and Belian, do not absorb finishes well, not even solvent-based oils. If you select such a wood species for external use, be aware that keeping its natural look will require more regular maintenance.


1. INTRODUCTION 
Nanotechnology has affected nearly every field of engineering and science, but most of the innovation and private funding in nanotechnology has come from electronics giants searching for ways to make faster computers. The fields that worked hand in hand with nanoelectronics were nanophotonics and nanoinstrumentation. The marketing and manufacture of nano-scale devices also started with computers and mobile phones, which were the only machines made at the nanoscale that were economically available in the market at an early stage. There is thus no doubt that the area where nanotechnology has penetrated most deeply is electronics, where it has led to cost advantages and performance gains, especially in transistors; the latest processors contain on the order of one billion transistors. The backbone of nanotechnology in electronics is the physics of the nanoscale, namely quantum physics and solid-state physics, because when we deal with things at the nanoscale these are the two branches of physics that allow us to predict behavior. Ultimately, electronics is about electrons and how we use them in various devices to get the required result, so it is very important to understand how electrons behave at the nanoscale.

Introduction and the Importance of Quantum Mechanics. A fundamental aspect of quantum mechanics is the particle-wave duality, introduced by de Broglie, according to which any particle can be associated with a matter wave whose wavelength is inversely proportional to the particle's linear momentum. Whenever the size of a physical system becomes comparable to the wavelength of the particles that interact with that system, the behavior of the particles is best described by the rules of quantum mechanics. All the information we need about the particle is obtained by solving its Schrödinger equation; the solutions of this equation represent the possible physical states in which the system can be found. Quantum mechanics is not required to describe the movement of objects in the macroscopic world. The wavelength associated with a macroscopic object is in fact much smaller than the object's size, and therefore the trajectory of such an object can be derived perfectly well using the principles of classical mechanics. Things change, for instance, in the case of electrons orbiting a nucleus, since their associated wavelength is of the same order of magnitude as the electron-nucleus distance. We can use the concept of particle-wave duality to give a simple explanation of the behavior of carriers in a semiconductor nanocrystal. In a bulk inorganic semiconductor, conduction-band electrons (and valence-band holes) are free to move throughout the crystal, and their motion can be described satisfactorily by a linear combination of plane waves whose wavelength is generally of the order of nanometers. This means that, whenever the size of a semiconductor solid becomes comparable to these wavelengths, a free carrier confined in the structure behaves as a particle in a potential box. The solutions of the Schrödinger equation in that case are standing waves confined in the potential well, and the energies associated with two distinct wave functions are, in general, different and discontinuous. This means that the particle energies cannot take on arbitrary values, and the system exhibits a discrete energy-level spectrum. Transitions between any two levels are seen as discrete peaks in the optical spectra, for instance, and the system is then referred to as "quantum confined". The main point here is that in order to rationalize (or predict) the physical properties of nanoscale materials, such as their electrical and thermal conductivity or their absorption and emission spectra, we first need to determine their energy-level structure.
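
To make the quantum-confinement picture concrete, the sketch below evaluates the textbook infinite-well ("particle in a box") energy levels for a few box sizes. The standard formula is used here only as an illustration; it is not taken from the text above, and a real nanocrystal calculation would use the carrier's effective mass and a finite potential well.

```python
# Minimal sketch (assumption, not from the text above): the standard 1-D
# infinite-well energies E_n = n^2 h^2 / (8 m L^2), used as a stand-in for a
# quantum-confined carrier. A real nanocrystal would need the effective mass
# and a finite well depth.
H = 6.626e-34        # Planck constant, J*s
M_E = 9.109e-31      # free-electron mass, kg
EV = 1.602e-19       # joules per electronvolt

def well_energy_ev(n: int, length_m: float, mass_kg: float = M_E) -> float:
    """Energy of level n for a particle in a 1-D infinite well of width length_m, in eV."""
    return n**2 * H**2 / (8 * mass_kg * length_m**2) / EV

for size_nm in (2, 5, 10):
    width = size_nm * 1e-9
    gap = well_energy_ev(2, width) - well_energy_ev(1, width)
    print(f"box {size_nm:>2} nm: E1 = {well_energy_ev(1, width):.3f} eV, E2 - E1 = {gap:.3f} eV")
```

Shrinking the box from 10 nm to 2 nm widens the level spacing by roughly a factor of 25, which is why confinement effects dominate only at the nanoscale.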

2. THEORY OF NANO-ELECTRONICS 
2.1 PRESENT STATE OF NANO-ELECTRONICS. Moore's law states that the number of transistors on an integrated chip doubles roughly every two years. However, this law does not hold perfectly true for RAM (random access memory). The law is not simply about the raw number of transistors on a chip, but about the density of transistors at which the cost per transistor is lowest. Currently, processors are fabricated at 90 nm, with 65 nm being introduced by Intel. The 90 nanometer (90 nm) process refers to the level of semiconductor fabrication technology that was achieved in the 2002-2003 period by most leading semiconductor companies, such as Intel, Texas Instruments, and IBM. However, the same trend does not hold for RAM and hard disks. New materials are seeing possible use in nano-electronics and will probably keep this law on track. The future is seen as molecular electronics, but there is still a great deal of work to be done to make that possible.
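
As a simple arithmetic illustration of the doubling rule stated above, the short sketch below projects a transistor count forward in time, starting from the roughly one billion transistors per processor mentioned in the introduction; the baseline and time spans are only examples.

```python
# Arithmetic illustration of the doubling rule stated above. The baseline of
# ~1e9 transistors is the figure quoted in the introduction; the time spans
# are arbitrary examples.
def projected_transistors(start_count: float, years: float, doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward assuming one doubling per period."""
    return start_count * 2 ** (years / doubling_period_years)

baseline = 1e9
for years in (2, 6, 10):
    print(f"after {years:>2} years: ~{projected_transistors(baseline, years):.1e} transistors")
```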

2.2 SILICON NANOTECHNOLOGY 
2.2.1 CMOS Nanotechnology 
For the past several decades, miniaturization in silicon integrated circuits has progressed steadily with an exponential scale described by Moore's Law. This incredible progress has generally meant that critical dimensions are reduced by a factor of two every three years, while chip density increases by a factor of four over this period. However, modern chip manufacturers have been accelerating this pace recently, and currently chips are being made with gate lengths in the 45 to 65 nm range. More scaling is expected, however, and 15-nm gate lengths are scheduled for production before the end of this decade.

In a MOSFET, the electric field between the gate and the semiconductor is such that an inverted carrier population is created and forms a conducting channel. This channel extends between the source and drain regions, and transport through this channel is modulated by the gate potential. As the channel length has become smaller, there has been considerable effort to incorporate a variety of new effects into the simple (as well as the more complex) models. These include short-channel effects, narrow-width effects, degradation of the mobility due to surface scattering, hot-carrier effects, and velocity overshoot. Ballistic transport in the MOSFET is discussed later. Thermodynamics is just as significant in limiting scaling as the preceding effects. The first way it limits scaling is through its control of the subthreshold behavior of MOSFETs. The subthreshold current of a MOSFET originates in the high-energy tail of the statistical distribution of carriers in its source region. The carriers in the source are governed by Fermi-Dirac statistics, so the tail of the distribution is essentially Boltzmann.
There are two major scattering regions: the barrier between the channel and the source, and the channel itself.

There is also the phenomenon of granularity, which is the failure of thermodynamic averaging in small devices. Quantum behavior in the device matters as well, including the behavior of the effective carrier wave packet. These effects also include tunneling through the gate insulator, tunneling through the band gap, quantum confinement issues, interface scattering, discrete atomistic effects in the doping and at interfaces, and thermal problems associated with very high power densities.

Ballistic Properties:- 
Ballistic transport is the phenomenon in which the contribution to electrical resistivity from scattering by the atoms, molecules, or impurities in the medium itself is negligible or absent, meaning the electron can move without hindrance. There is no loss of kinetic energy from collisions of electrons with the atoms of the metal; the electrons travel within a mean free path over which they move freely.
Quantum mechanical scaling limitations include both confinement effects and tunneling effects. Confinement effects occur when electron or hole wave functions are squeezed into narrow spaces between barriers. In FETs this primarily happens in the channel, where the charges are squeezed between the gate insulator on one side and the built-in field of the body on the other side. Quantum confinement in this approximately triangular well raises the ground-state energy of the electrons or holes, which increases the threshold voltage, and shifts the mean position of the carriers a little farther from the Si-SiO2 interface. Quantum mechanical tunneling is generally more detrimental to scaling than the confinement effects. When electrons or holes tunnel through the barriers of the FET, it causes leakage current. As scaling continues, this ultimately causes unacceptable increases in power dissipation. The leakage may also cause some types of dynamic logic circuits to lose their logic state, but the former problem usually seems to arise first.

There are primarily two forms of tunneling leakage: tunneling current through the gate insulator, and tunneling current through the drain-to-body junction. The atomistic effects that cause limitations to scaling are those in which the discreteness of matter gives rise to large statistical variations in small devices. These statistical variations occur because the atoms or molecules tend to display Poisson statistics in their number or position, and the Poisson distribution for small numbers can become very wide. 

2.2.2 Memory 
As the semiconductor device feature size enters the sub-50-nm range, two new effects come into play. One is the quantum effect, which is rooted in the wave nature of the charge carriers and gives rise to nonclassical transport effects such as resonant tunneling and quantum interference. The other is related to the quantized nature of the electronic charge, often manifested in the so-called single-electron effect: adding each electron to a small confined region requires a certain amount of energy in order to overcome the Coulomb repulsion; if this charging energy is greater than the thermal energy kB·T (kB is the Boltzmann constant, T the temperature), a single electron added to the region can have a significant effect on other electrons entering the confined region.
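
The charging-energy condition described above can be illustrated numerically. The sketch below assumes the commonly used expression E_C = e²/(2C) for the energy needed to add one electron to an island of capacitance C (this formula is an assumption, not given in the text) and compares it with the thermal energy kB·T at room temperature for a few illustrative capacitances.

```python
# Sketch of the charging-energy vs. thermal-energy comparison described above.
# Assumption (not stated in the text): the usual single-electron charging energy
# E_C = e^2 / (2C) for an island of total capacitance C.
E = 1.602e-19    # elementary charge, C
KB = 1.381e-23   # Boltzmann constant, J/K

def charging_energy_j(capacitance_f: float) -> float:
    """Single-electron charging energy e^2 / (2C) in joules."""
    return E**2 / (2 * capacitance_f)

T = 300.0          # room temperature, K
kt = KB * T
for c_af in (1, 10, 100):                 # island capacitance in attofarads (examples)
    ec = charging_energy_j(c_af * 1e-18)
    print(f"C = {c_af:>3} aF: E_C = {ec / E * 1000:6.1f} meV, E_C / kT = {ec / kt:6.2f}")
```

Only for islands with capacitances around an attofarad does the charging energy clearly exceed kB·T at room temperature, which is why single-electron effects require extremely small confined regions.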

To increase the storage density of semiconductor memories, the size of each memory cell must be reduced. A smaller memory cell also leads to higher speed and lower power consumption. This is the incentive for studying nanoscale semiconductor memory. One of the general schemes for semiconductor data storage is storing charge on a capacitor; the charged state and the uncharged state can be used to represent binary 1 and 0, respectively. Usually charges are transferred to the capacitor through a resistive path. The motivation for this work is to investigate the ultimate limit of a floating-gate MOS memory. In a conventional floating-gate memory, there are typically on the order of 10^4 electrons stored on the floating gate to represent one bit of information. The ultimate limit in scaling down the floating-gate memory is to use only one electron for the same purpose, hence the name "single-electron MOS memory" (SEMM). The advantage of such a memory is that not only can it be very small, but it can also provide unique characteristics that are not available in the conventional device, such as a quantized threshold-voltage shift and a quantized charging voltage.

To make single-electron memory practical, both thermal fluctuations and quantum fluctuations of the stored charge have to be minimized. In order to reduce variation in the device structure, we would like to build a single-electron memory device in crystalline silicon with well-controlled dimensions, defining the transistor channel and the floating gate by lithography. Finally, the single-electron memory potentially has a number of advantages over conventional memories: (1) the quantized characteristics of the device make it immune to noise from the environment; unless the noise level reaches a certain threshold, it will not affect the memory state. This immunity to noise is especially important for future terabit integration, simply because of the sheer number of devices present on a single small chip area. (2) The inherent quantized nature of the SEMM makes it possible to easily implement multilevel logic storage in a single memory cell. (3) The device can operate at higher speed due to the use of only one or a few electrons during writing and erasing. (4) For the same reason, the device can also have ultralow power consumption.

2.3 NANO TUBES, CNT ELECTRONICS. 
Single-walled carbon nanotubes (SWNTs), which are graphite cylinders made of a hexagonal carbon-atom lattice, have drawn a great deal of interest due to their fundamental research importance and their tremendous potential for technical applications. For example, they might play an important role in future molecular electronic devices, such as room-temperature single-electron and field-effect transistors, and rectifiers. A SWNT can be either a semiconductor or a metal, depending on its helicity and diameter. The electronic properties of SWNTs have been the subject of an increasing number of experimental and theoretical studies since 1995, and it is expected that SWNTs will soon find application in nanoelectronics, particularly in transistors, where they can reduce the gate length and also reduce leakage current. SWNTs have very low electrical resistance. Resistance occurs when an electron gets deflected from its path while traveling through a material. In a 3-D conductor, electrons have plenty of opportunity to scatter, since they can do so at any angle, and all these scattering events give rise to electrical resistance. The situation is different in one dimension: in a truly 1-D conductor, electrons can only travel forward or backward, and only backscattering leads to electrical resistance. Backscattering in nanotubes is impeded by the special symmetry of graphite and carbon nanotubes, and is therefore less likely to happen. Because of this, electrons can travel in nanotubes for long distances without being scattered, and this type of ballistic transport has been observed experimentally.

2.4 NANO WIRES 
A nanowire is a wire with dimensions on the order of a nanometer. Nanowires are also called quantum wires because their properties are governed by quantum mechanics. They can be used to link or connect tiny components in nanocircuits. They are referred to as one-dimensional materials because their length-to-width ratio is very high. The electrons in them are quantum confined and occupy energy levels different from those of the bulk material. Nanowires will see application in electronics, optoelectronics, and micro-electromechanical systems, and possible future applications in molecular electronic devices such as resonant tunneling diodes, single-electron transistors, and field-effect structures, as well as in logic gates.

2.5 QUANTUM DOT 
A quantum dot is a semiconductor nanostructure that confines the motion of conduction-band electrons, valence-band holes, or excitons in all three spatial directions. Quantum dots have unique quantum and optical properties and are being researched for diode lasers, amplifiers, sensors, and other applications. They are also finding application in light-emitting diodes and quantum-dot single-electron devices. The ability to control the charging of a capacitive node by individual electrons makes these devices suitable for memory applications. A quantum well, by comparison, is a potential well that confines particles, originally free to move in three dimensions, to two dimensions, forcing them to occupy a planar region.

3. MANUFACTURING CHALLENGES Nanofabrication is being developed to construct devices such as resonant tunneling diodes and transistors, single-electron transistors, and carbon nanotube transistors. The most common type of transistor being developed for use at the nanoscale is the field-effect transistor. Economic issues are constraining nano-electronics from reaching the market. Two ways of manufacturing nanomaterials are:- 
1. Bottom-up assembly (wet chemistry and self-assembly): in this type of fabrication we start from atoms or molecules and build up to the desired material. 
2. Top-down fabrication (lithography and its derivatives): in this type of fabrication the bulk material is broken down into smaller pieces. Though we have knowledge of many new materials and their physics at the nanoscale, making the technology economically available (cost-effective) and reaching state-of-the-art levels of manufacturing for nanomaterials is still a work in progress.

4. ABBREVIATION:- 
IBM - International Business Machines 
CMOS - Complementary metal-oxide semiconductor 
CNT - Carbon nanotube 
SWNT - Single-walled nanotube (a carbon nanotube in most cases) 
MOSFET - Metal-oxide-semiconductor field-effect transistor 

The advances in ultra-large-scale integration (ULSI) technology have mainly been based on downscaling of the minimum feature size of complementary metal-oxide semiconductor (CMOS) transistors. The limit of scaling is approaching, and there are unsolved problems such as the number of electrons in the device's active region. If this number is reduced to fewer than 10 electrons (or holes), quantum fluctuation errors will occur, and the gate insulator thickness will become too small to block quantum mechanical tunneling, which may result in unacceptably large leakage currents. On the other hand, the recent evolution of nanotechnology may provide opportunities for novel devices, such as single-electron devices, carbon nanotubes, Si nanowires, and new materials, which may solve these problems. Utilization of quantum effects and ballistic transport characteristics may also provide novel functions for silicon-based devices. Among various candidate materials for nanometer-scale devices, silicon nanodevices are particularly promising because of the existing silicon process infrastructure in the semiconductor industry, their compatibility with CMOS circuits, and a nearly perfect interface between the native oxide and silicon.

As noted earlier, 15-nm gate lengths are scheduled for production before the end of this decade. Such devices have been demonstrated by Intel and AMD, and IBM has recently shown a 6-nm gate-length p-channel FET. While the creation of these very small transistors is remarkable enough, the fact that they seem to operate in a quite normal fashion is perhaps even more remarkable.


nanotechnology research
Nanomaterials for energy production and storage, nanoparticles for drug delivery, biosensors for diagnostics… Nanotechnology is an emerging technology with applications in almost all sectors, and it is expected to lead the future of technological development. However, as this is a relatively new field, the use of research results is still in its early stages, and no good practices had been identified, especially from a multidisciplinary perspective (research centres, venture capitalists and experts in intellectual property protection).
Traditional technology transfer models were probably not applicable to the "nano" area. It was therefore considered fundamental to carry out a research project on this subject, because transfer models are essential for any R&D-related sector in order to market and protect research results appropriately.
As Intellectual Property Rights can be applied to different stages of the product development, from basic research to commercialisation, the first step was to examine the value chains and the appropriate IP model for the technology transfer for twelve selected case studies from five industries: electronics, energy, life sciences, materials, and water and environment. The Nano2Market partners then conducted a Technology Mapping exercise in order to list the areas of the “nano” innovations. Models and strategies were validated by a panel of experts.
As a result, the project has provided an Intellectual Property Guide for nanotechnology research, which sets out guidelines for licensing and consortium agreements, along with comments and suggestions for managing IP in nanotechnology projects. The guidelines depend on the type of application, the level of development of the research, and the size of the organisation (small or large research institutions, or small, medium or large enterprises). Size is important because the partners found that smaller organisations employed different strategies when funds were not available: people are generally more confident working, investing and collaborating with larger institutions or enterprises, so the smaller ones must wait until the technology is at a more advanced stage before licensing.
Nano2market has also designed an interactive Toolbox, which provides tools for companies that wish to collaborate on nanotech projects and tools for investors (business angels, venture capitalists and corporate venture capitalists). This instrument offers best practices reports for businesses and useful information on how to place a development in the market thanks to an adequate business plan for entrepreneurs.
Coordinated by the Office of International Projects at the University of Alicante (Spain), the Nano2Market project (Best Practices for IPR and Technology Transfer in Nanotechnology development) lasted a year and had a budget of 700,000 euros, financed by the European Commission under the Seventh Framework Programme.


carbon nanotubes
Characterizing devices at low current levels requires knowledge, skill, and the right test equipment. Even with all three, achieving accuracy in these measurements can be a challenge because the current level is often at or below the noise level of the test setup. To ensure measurement accuracy, it is important to know the type of test equipment to use, the different sources of measurement error, and the appropriate techniques to minimize these errors. Examining several test examples, such as characterization of a field-effect transistor (FET) and a carbon nanotube, can help in the learning process.
The term low current is relative, of course. A current level considered low for one application, such as 1mA, may be high for a device operating at 10nA. In general, an instrument’s noise level will establish its low-level sensitivity, with low current measurements referring to those made near an instrument’s noise level. Trends in portable and remote electronic devices, along with advances in semiconductors and nanotechnology, are requiring greater use of low current measurements. Small geometry devices, photovoltaic devices, and carbon nanotubes (CNTs) are a few examples of devices designed to operate at extremely low current levels, and all of these devices must be characterized in terms of their current-voltage characteristics (I-V measurements).
A number of instruments are available for low-current measurements, depending on the type of device under test (DUT) and the level of current to be measured. Perhaps the most ubiquitous tool on production lines and in field service is the digital multimeter (DMM), which typically provides capabilities for measuring current, voltage, resistance, and temperature. The range of commercial products is wide, from low-cost units with 3½-digit readout resolution to rack-mount and benchtop high precision laboratory units. The most sensitive DMMs available can measure current levels as low as about 10pA.
When greater precision is needed, various forms of ammeters are available to measure current. These can be as simple as older types that measure current flow from the mechanical deflection of a coil in a magnetic field. More modern digital ammeters use an analog-to-digital converter (ADC) to measure the voltage across a shunt resistor and then determine and display the current from that reading. Newer picoammeters typically use a feedback resistor, which allows more accuracy in current measurements at such low levels. They are available in various configurations, including high-speed models and logarithmic units capable of measuring a wide range of currents. While they are extremely versatile, it is useful to understand the performance limitations of feedback ammeters.
Feedback Ammeter Performance. A simple feedback ammeter can be modeled with a small number of parameters. The current source is modeled as a voltage source in series with a parallel RC circuit, i.e., the source resistance (RS) and parallel source capacitance (CS). The feedback ammeter is modeled as a feedback amplifier with a parallel RC feedback circuit across it (i.e., RF and CF), with the two amplifier inputs being the external current source and the internal voltage noise source, VNOISE. The capacitances in the source and measurement circuits are parasitic elements associated with the resistances and circuit wiring. Using this model and ignoring capacitance, the noise gain of the ammeter circuit can be found from:
Output Voltage Noise = (Input Voltage Noise) x (1 + RF/RS)
As this equation implies, the output of a feedback ammeter circuit is a voltage, which is proportional to the input current. As the source resistance decreases in value, the output noise increases. When RF = RS, the input noise is multiplied by a factor of 2. If the source resistance is too low, it can have a detrimental effect on the noise performance of the measurement system. The optimum source resistance is a function of required measurement range for an ammeter, with a minimum value of 1 MΩ to measure nanoamps of current, compared to a minimum value of 1 GΩ to measure picoamps of current.
However, source capacitance can also affect the noise performance of a low current measurement instrument. In general, as the source capacitance increases, the noise gain also increases. This means that the equation above should be modified by substituting the feedback impedance (ZF) for the feedback resistance (RF) and the source impedance (ZS) for the source resistance (RS).
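
The noise-gain relation quoted above can be evaluated directly. The short sketch below computes 1 + RF/RS (the purely resistive case) for a few source resistances around an illustrative 1 GΩ feedback resistor; the resistor values are examples, not taken from the article.

```python
# Minimal sketch of the feedback-ammeter noise-gain relation quoted above:
# output voltage noise = input voltage noise * (1 + RF / RS).
# The resistor values below are illustrative, not taken from the article.
def noise_gain(r_feedback: float, r_source: float) -> float:
    """Noise gain of a feedback ammeter for given feedback and source resistances."""
    return 1.0 + r_feedback / r_source

RF = 1e9  # 1 GΩ feedback resistor (illustrative value for a picoamp range)
for rs in (1e7, 1e9, 1e11):
    print(f"RS = {rs:.0e} ohm -> noise gain = {noise_gain(RF, rs):.1f}")
```

The output shows the factor of 2 at RF = RS and the rapid growth of noise gain when the source resistance is much lower than the feedback resistance, which is why a too-low source resistance degrades the measurement.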
Additional current measurement instruments include electrometers and source-measure units (SMUs). An electrometer is essentially a voltmeter with a high input impedance (1 TΩ and higher) that can be used to measure low current levels. It can be used as an ammeter to measure low current levels even at low voltages, and can also be used as a voltmeter to make voltage measurements with minimal effect on the circuit being measured. As an ammeter, an electrometer can measure currents as low as the instrument's input offset current, as low as 1 fA in some cases. As a voltmeter, an electrometer can measure the voltage on a capacitor without significantly discharging the device, and can measure the potential of piezoelectric crystals and high-impedance pH electrodes.
The SMU is an innovation for making low-current measurements. It combines precision current sources and voltage sources with sensitive detection circuitry for measuring both current and voltage. An SMU can simultaneously provide a source of current and measure voltage or provide a source of voltage and measure current. A well-equipped SMU may include a voltage source, current source, ammeter, voltmeter, and ohmmeter and is also programmable for use in automatic-test-equipment (ATE) systems.
Minimizing External Noise. All of these measuring instruments are effective tools for measuring current, but their sensitivity to low levels of current will be limited mainly by sources of noise, both within and external to the instrument. The DUT also affects the level of current that can be accurately measured with a given instrument, because the DUT’s source resistance (RS) establishes the level of Johnson current noise (IJ), which is low-level noise caused by temperature effects on electrons in a conductor. Johnson noise, which can be expressed in terms of either current or voltage, is essentially the voltage noise of a device divided by the device resistance:
IJ = √(4kTBRS) / RS = √(4kTB/RS),
where
k = Boltzmann's constant (1.38 × 10⁻²³ J/K),
T = absolute temperature of the source (in K),
B = the noise bandwidth (in Hz), and
RS = the resistance of the source (in ohms).
Both temperature and noise bandwidth affect the Johnson current noise, and a reduction in either parameter will reduce it. Cryogenic cooling, for example, is often used to reduce noise in amplifiers and other circuits but adds cost and complexity. The noise bandwidth can be reduced by filtering, but this slows the measurement. The Johnson current noise also decreases as the DUT's source resistance increases, but raising the source resistance is rarely a practical or even possible option.
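
For a feel for the magnitudes involved, the sketch below evaluates the Johnson current-noise expression IJ = √(4kTB/RS) at room temperature and a 1 Hz noise bandwidth for a few illustrative source resistances; the resistance values are examples only.

```python
# Sketch of the Johnson current-noise formula given above, I_J = sqrt(4kTB/RS),
# evaluated for a few illustrative source resistances (values are examples only).
import math

KB = 1.381e-23  # Boltzmann constant, J/K

def johnson_current_noise_a(r_source: float, temp_k: float = 300.0, bandwidth_hz: float = 1.0) -> float:
    """RMS Johnson current noise in amps for a given source resistance, temperature, and bandwidth."""
    return math.sqrt(4 * KB * temp_k * bandwidth_hz / r_source)

for rs in (1e6, 1e9, 1e12):
    print(f"RS = {rs:.0e} ohm: I_J ≈ {johnson_current_noise_a(rs):.2e} A rms (300 K, 1 Hz bandwidth)")
```

The numbers make the article's point concrete: with a 1 MΩ source the noise floor is around a tenth of a picoamp, while a 1 TΩ source pushes it down toward the sub-femtoamp level.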
Ideally, a current measurement would be just that of the DUT source. However, current noise from various unwanted sources can make it difficult to read a low-level DUT source current. One of these unwanted sources is part of the measurement system itself, i.e., the coaxial cables used to interconnect test instruments to each other or to the DUT. Typical test cables can generate as much as tens of nanoamps of current as a result of the triboelectric effect. This occurs when the outer shield of a coaxial test cable rubs against the cable’s insulation when the cable is flexed. As a result, electrons are stripped from the insulation, and added to the current total. In some applications, such as nanotechnology and semiconductor research, the current generated by this effect may exceed the level of current to be measured from the DUT.
Triboelectric effects can be minimized by using low-noise cable, with an inner insulator of polyethylene coated with graphite underneath the outer shield. The graphite reduces friction, and provides a path for the displaced electrons to return to their original locations, eliminating random electron motion and their contribution to the additional noise level. Excess current flow from triboelectric effects can also be minimized by reducing the length of the test cables as much as possible. The test setup should be isolated from vibration to minimize unwanted movement of the test cables, by positioning test cables on top of vibration-absorbing material, such as foam rubber. Test cable movement can also be minimized by taping the cables to a stable surface, such as the test bench.
The piezoelectric effect is another source of error in low-current measurements. It causes spurious current generation due to mechanical stress on susceptible materials. The effect varies by material, although some materials commonly used in electronic systems, such as polytetrafluoroethylene (PTFE) dielectrics, can produce a relatively large amount of current for a given amount of stress and vibration. Ceramic materials are less affected by piezoelectric effects and produce lower current levels. To minimize current generated by this effect, it is critical to minimize mechanical stress on insulators and construct the low-current test system using insulating materials with minimal piezoelectric properties.
Insulators can also degrade low-current measurement accuracy by means of dielectric absorption. This phenomenon occurs when a high-enough voltage across an insulator causes positive and negative charges to polarize. When the voltage is removed from the insulator, it gives up the separated charges as a decaying current, which is added to the total amount measured during a test. The decay time for the current from dielectric absorption to dissipate can be from minutes to hours. The effect can be minimized by applying only low-voltage levels to insulators used for low-current measurements.
Insulators can also degrade low-current measurement accuracy due to contamination from salt, moisture, oil, or even fingerprints on the surface of the insulator. Contamination effects can also plague printed circuit boards in a test fixture or in the test setup when, for example, excessive flux is used when soldering. On an insulator, the contamination acts to form a low-current battery at a sensitive current node within the insulator, generating noise currents that can be on the order of nanoamps. To minimize measurement errors from insulator contamination, an operator should wear gloves when handling insulators or simply avoid touching them. The use of solder should be minimized, and solder areas should be cleaned with an appropriate solvent, such as isopropyl alcohol. A clean cotton swab should be used for every cleaning, and cotton swabs should never be reused or dipped into the cleaning solution after having been used for cleaning.
It is critical to make low-current measurements in the absence of magnetic fields, because such fields can induce current flow in conductors. This is typically due to variations in magnetic field intensity, or motion of a conductor within a magnetic field. Both cases should be avoided to maintain measurement accuracy, which is best accomplished by properly shielding the measuring instrument or system.
Minimizing Instrument Offset Current. An instrument used for low-current measurements should show a zero reading when its input terminals are left in an open-circuit condition. Unfortunately, this is rarely the case due to a small current known as the input offset current. It is caused by bias currents of active devices in measuring instrument circuitry, as well as leakage current through insulators in the instrument or test system. Most instrument manufacturers specify the input offset current on their products’ data sheets for comparison purposes, and this small amount of current must be taken into account in any low-current measurement. In other words, the instrument’s reading is actually the sum of the DUT source current and the instrument’s input offset current.
The input offset current can be found by capping the input connector and selecting the lowest current range available on the measuring instrument. The reading shown by the instrument, after it has properly settled to a stable value, should be within the specification shown on the instrument’s data sheet and can be subtracted from DUT readings. On some instruments, a current-suppression function can partially null input offset current.
Another way to subtract input offset current from a low-current measurement is to use a relative function found on some measuring equipment, such as ammeters. The relative function stores the reading of whatever residual offset current is being measured with the input terminals left in an open-circuit condition; this reading is treated as the zero point for subsequent readings.
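
The bookkeeping behind offset correction is simple, as the toy sketch below shows: the instrument reading is treated as the sum of the DUT current and the stored open-input offset reading, and the offset is subtracted to recover the DUT current. The numerical values are invented for illustration.

```python
# Tiny sketch of the offset-correction bookkeeping described above:
# measured reading = DUT current + instrument input offset current,
# so the true DUT current is recovered by subtracting a stored offset reading.
# The numbers are illustrative only.
def corrected_current(measured_a: float, offset_a: float) -> float:
    """Subtract a previously stored open-input offset reading from a measurement."""
    return measured_a - offset_a

offset_reading = 3.0e-14    # 30 fA measured with the input capped (example)
raw_reading = 1.25e-12      # 1.25 pA reading with the DUT connected (example)
print(f"DUT current ≈ {corrected_current(raw_reading, offset_reading):.2e} A")
```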
Application Examples. Some examples of practical low-current measurements include characterization of field effect transistors (FETs) and CNT devices. A more common FET test involves evaluation of a device’s common-source characteristics. Even at low current levels, the drain current can be studied using a simple test setup with a two-channel SMU, such as the Keithley Series 2600A System SourceMeter instrument. A two-channel SMU has the capability to source current or voltage and measure current or voltage simultaneously. To characterize a FET, it is mounted in a test fixture that allows secure ground and bias connections. One SMU channel supplies a swept gate-source voltage (VGS) to the FET while the other supplies a swept drain-source voltage (VDS) and measures the FET’s drain current (ID). This simple test setup allows the measurement of drain currents as low as 10nA or less.
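
The sweep described above can be organized in software along the lines of the sketch below. The instrument-facing callables (set_gate_voltage, set_drain_voltage, read_drain_current) are hypothetical placeholders rather than a real Keithley API; in practice they would wrap the instrument's own remote-control commands.

```python
# Sketch of how the two-channel SMU sweep described above could be organized.
# set_gate_voltage, set_drain_voltage and read_drain_current are hypothetical
# placeholders (not a real Keithley API); they would wrap the instrument's own
# remote-control commands in a real setup.
from typing import Callable, List, Tuple

def sweep_fet(
    vgs_points: List[float],
    vds_points: List[float],
    set_gate_voltage: Callable[[float], None],
    set_drain_voltage: Callable[[float], None],
    read_drain_current: Callable[[], float],
) -> List[Tuple[float, float, float]]:
    """Nested VGS/VDS sweep collecting (VGS, VDS, ID) triples."""
    results: List[Tuple[float, float, float]] = []
    for vgs in vgs_points:
        set_gate_voltage(vgs)           # one SMU channel sources the gate-source voltage
        for vds in vds_points:
            set_drain_voltage(vds)      # the other channel sources the drain-source voltage
            results.append((vgs, vds, read_drain_current()))  # and measures ID
    return results

# Example wiring with stand-in callables (replace with real instrument drivers):
data = sweep_fet([0.0, 1.0, 2.0], [0.0, 0.5, 1.0],
                 set_gate_voltage=lambda v: None,
                 set_drain_voltage=lambda v: None,
                 read_drain_current=lambda: 0.0)
```

Each (VGS, VDS, ID) triple collected this way can then be plotted as a family of common-source curves.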
Electronic materials such as photovoltaic wafers and CNT sheets are typically characterized in terms of their current density, the amount of current they can generate for a given area of material. Researchers from South Korea's Seoul National University conduct such tests to evaluate multi-walled carbon nanotube (MWNT) devices fabricated on an arc-discharge CNT substrate using a Keithley Model 6517 electrometer. In these studies, current densities as low as 10⁻⁴ A/cm² were measured at applied electric fields of 5 V/μm and less. Practical analysis of the I-V characteristics of CNT-based electronics can also be performed in a manner similar to that for the FET, by using a pair of SMUs to sweep drain and gate voltages while measuring and plotting the drain current as a function of gate voltage.
The required resolution and accuracy of low-current measurements will dictate the type of measurement tool used. When accuracy is less of an issue, a basic DMM may suffice. But for more demanding requirements, a precision electrometer or SMU may be needed. These precision instruments are optimized for low-current measurements, providing measurement resolution as small as 1fA. More techniques and tips on low current measurements are contained in Keithley’s Low Level Measurements Handbook.


nanoribbon
As far back as the 1990s, long before anyone had actually isolated graphene – a honeycomb lattice of carbon just one atom thick – theorists were predicting extraordinary properties at the edges of graphene nanoribbons. Now physicists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), and their colleagues at the University of California at Berkeley, Stanford University, and other institutions, have made the first precise measurements of the “edge states” of well-ordered nanoribbons.
A graphene nanoribbon is a strip of graphene that may be only a few nanometers wide (a nanometer is a billionth of a meter). Theorists have envisioned that nanoribbons, depending on their width and the angle at which they are cut, would have unique electronic, magnetic, and optical features, including band gaps like those in semiconductors, which sheet graphene doesn’t have.
“Until now no one has been able to test theoretical predictions regarding nanoribbon edge-states, because no one could figure out how to see the atomic-scale structure at the edge of a well-ordered graphene nanoribbon and how, at the same time, to measure its electronic properties within nanometers of the edge,” says Michael Crommie of Berkeley Lab’s Materials Sciences Division (MSD) and UC Berkeley’s Physics Division, who led the research. “We were able to achieve this by studying specially made nanoribbons with a scanning tunneling microscope.”
The team’s research not only confirms theoretical predictions but opens the prospect of building quick-acting, energy-efficient nanoscale devices from graphene-nanoribbon switches, spin-valves, and detectors, based on either electron charge or electron spin. Farther down the road, graphene nanoribbon edge states open the possibility of devices with tunable giant magnetoresistance and other magnetic and optical effects.
The well-tempered nanoribbon
“Making flakes and sheets of graphene has become commonplace,” Crommie says, “but until now, nanoribbons produced by different techniques have exhibited, at best, a high degree of inhomogeneity” – typically resulting in disordered ribbon structures with only short stretches of straight edges appearing at random. The essential first step in detecting nanoribbon edge states is access to uniform nanoribbons with straight edges, well-ordered on the atomic scale.
Hongjie Dai of Stanford University’s Department of Chemistry and Laboratory for Advanced Materials, a member of the research team, solved this problem with a novel method of “unzipping” carbon nanotubes chemically. Graphene rolled into a cylinder makes a nanotube, and when nanotubes are unzipped in this way the slice runs straight down the length of the tube, leaving well-ordered, straight edges.
Graphene can be wrapped at almost any angle to make a nanotube. The way the nanotube is wrapped determines the pitch, or “chiral vector,” of the nanoribbon edge when the tube is unzipped. A cut straight along the outer atoms of a row of hexagons produces a zigzag edge. A cut made at a 30-degree angle from a zigzag edge goes through the middle of the hexagons and yields scalloped edges, known as “armchair” edges. Between these two extremes are a variety of chiral vectors describing edges stepped on the nanoscale, in which, for example, after every few hexagons a zigzag segment is added at an angle.
These subtle differences in edge structure have been predicted to produce measurably different physical properties, which potentially could be exploited in new graphene applications. Steven Louie of UC Berkeley and Berkeley Lab’s MSD was the research team’s theorist; with the help of postdoc Oleg Yazyev, Louie calculated the expected outcomes, which were then tested against experiment.
Chenggang Tao of MSD and UCB led a team of graduate students in performing scanning tunneling microscopy (STM) of the nanoribbons on a gold substrate, which resolved the positions of individual atoms in the graphene nanoribbons. The team looked at more than 150 high-quality nanoribbons with different chiralities, all of which showed an unexpected feature, a regular raised border near their edges forming a hump or bevel. Once this was established as a real edge feature – not the artifact of a folded ribbon or a flattened nanotube – the chirality and electronic properties of well-ordered nanoribbon edges could be measured with confidence, and the edge regions theoretically modeled.
Electronics at the edge
“Two-dimensional graphene sheets are remarkable in how freely electrons move through them, including the fact that there’s no band gap,” Crommie says.
“Nanoribbons are different: electrons can become trapped in narrow channels along the nanoribbon edges. These edge-states are one-dimensional, but the electrons on one edge can still interact with the edge electrons on the other side, which causes an energy gap to open up.”
Using an STM in spectroscopy mode (STS), the team measured electronic density changes as an STM tip was moved from a nanoribbon edge inward toward its interior. Nanoribbons of different widths were examined in this way. The researchers discovered that electrons are confined to the edge of the nanoribbons, and that these nanoribbon-edge electrons exhibit a pronounced splitting in their energy levels.
“In the quantum world, electrons can be described as waves in addition to being particles,” Crommie notes. He says one way to picture how different edge states arise is to imagine an electron wave that fills the length of the ribbon and diffracts off the atoms near its edge. The diffraction patterns resemble water waves coming through slits in a barrier.
For nanoribbons with an armchair edge, the diffraction pattern spans the full width of the nanoribbon; the resulting electron states are quantized in energy and extend spatially throughout the entire nanoribbon. For nanoribbons with a zigzag edge, however, the situation is different. Here diffraction from edge atoms leads to destructive interference, causing the electron states to localize near the nanoribbon edges. Their amplitude is greatly reduced in the interior.
The energy of the electron, the width of the nanoribbon, and the chirality of its edges all naturally affect the nature and strength of these nanoribbon electronic states, an indication of the many ways the electronic properties of nanoribbons can be tuned and modified.
Says Crommie, “The optimist says, ‘Wow, look at all the ways we can control these states – this might allow a whole new technology!’ The pessimist says, ‘Uh-oh, look at all the things that can disturb a nanoribbon’s behavior – how are we ever going to achieve reproducibility on the atomic scale?’”
Crommie himself declares that “meeting this challenge is a big reason for why we do research. Nanoribbons have the potential to form exciting new electronic, magnetic, and optical devices at the nanoscale. We might imagine photovoltaic applications, where absorbed light leads to useful charge separation at nanoribbon edges. We might also imagine spintronics applications, where using a side-gate geometry would allow control of the spin polarization of electrons at a nanoribbon’s edge.”
Although getting there won’t be simple — “The edges have to be controlled,” Crommie emphasizes — “what we’ve shown is that it’s possible to make nanoribbons with good edges and that they do, indeed, have characteristic edge states similar to what theorists had expected. This opens a whole new area of future research involving the control and characterization of graphene edges in different nanoscale geometries.”
