David Kirkpatrick

September 2, 2010

Cool nanotech image — the perfect nanocube

Check this out

Caption: These electron microscope images show perfect-edged nanocubes produced in a one-step process created at NIST that allows careful control of the cubes’ size, shape and composition.

Credit: NIST

Usage Restrictions: None

Related news release: The perfect nanocube: Precise control of size, shape and composition



Head below the fold for the accompanying release.

August 18, 2010

The world’s darkest material

I’ve previously blogged about a “world’s darkest material” (though I couldn’t find the post in the archives), and that one was nanotech-based as well, so it’s possible this is the same stuff. Pretty cool either way.

From the link:

Harnessing darkness for practical use, researchers at the National Institute of Standards and Technology have developed a laser power detector coated with the world’s darkest material — a forest of carbon nanotubes that reflects almost no light across the visible and part of the infrared spectrum.

NIST will use the new ultra-dark detector, described in a new paper,* to make precision laser power measurements for advanced technologies such as optical communications, laser-based manufacturing, solar energy conversion, and industrial and satellite-borne sensors.

Inspired by a 2008 paper by Rensselaer Polytechnic Institute (RPI) on “the darkest man-made material ever,”** the NIST team used a sparse array of fine nanotubes as a coating for a thermal detector, a device used to measure laser power. A co-author at Stony Brook University in New York grew the nanotube coating. The coating absorbs laser light and converts it to heat, which is registered in a pyroelectric material (lithium tantalate in this case). The rise in temperature generates a current, which is measured to determine the power of the laser. The blacker the coating, the more efficiently it absorbs light instead of reflecting it, and the more accurate the measurements.
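To see why blackness matters for accuracy, here's a toy model of the measurement chain (my own illustrative sketch with made-up numbers, not NIST's actual calibration procedure): any light the coating reflects rather than absorbs shows up directly as an error in the inferred power.

```python
# Illustrative model of a pyroelectric laser power measurement.
# The coating absorbs a fraction (1 - R) of the incident beam; the
# absorbed power heats the pyroelectric element, and the signal is
# divided by a calibrated responsivity to infer the beam power.
# All numbers here are made up for illustration.

def inferred_power(incident_w, reflectance, responsivity_v_per_w):
    """Detector signal converted back to watts, naively assuming a
    perfectly black coating (reflectance = 0)."""
    absorbed = incident_w * (1.0 - reflectance)   # power actually absorbed
    signal = absorbed * responsivity_v_per_w      # volts out of the element
    return signal / responsivity_v_per_w          # inferred power, watts

incident = 1.0  # watts

# A nanotube forest reflecting ~0.1% of the light gives a ~0.1% error;
# an ordinary black coating reflecting 5% gives a 5% error.
error_nanotube = abs(inferred_power(incident, 0.001, 1000.0) - incident) / incident
error_paint = abs(inferred_power(incident, 0.05, 1000.0) - incident) / incident
```

The darker the coating, the smaller the systematic error, which is the whole point of the nanotube forest.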

This is a colorized micrograph of the world’s darkest material — a sparse “forest” of fine carbon nanotubes — coating a NIST laser power detector. Image shows a region approximately 25 micrometers across. Credit: Aric Sanders, NIST

August 11, 2010

Better understanding graphene

Yes, graphene is something of a miracle material (hit this link for my extensive graphene blogging), and yes, it’s proving to be a very vexing material as well. There’s a whole lot of promise, but not so much practice, because graphene is turning out to be fickle. Research like this from the Georgia Institute of Technology is particularly important because unlocking the secret life of graphene will allow for more practical applications. Better understanding will lead to better utilization.

The release:

Study of electron orbits in multilayer graphene finds unexpected energy gaps

Electron transport

IMAGE: Stacking of graphene sheets creates regions where the moiré alignment is of type AA (all atoms have neighbors in the layer below), AB (only A atoms have neighbors) or BA…


Researchers have taken one more step toward understanding the unique and often unexpected properties of graphene, a two-dimensional carbon material that has attracted interest because of its potential applications in future generations of electronic devices.

In the Aug. 8 advance online edition of the journal Nature Physics, researchers from the Georgia Institute of Technology and the National Institute of Standards and Technology (NIST) describe for the first time how the orbits of electrons are distributed spatially by magnetic fields applied to layers of epitaxial graphene.

The research team also found that these electron orbits can interact with the substrate on which the graphene is grown, creating energy gaps that affect how electron waves move through the multilayer material. These energy gaps could have implications for the designers of certain graphene-based electronic devices.

“The regular pattern of energy gaps in the graphene surface creates regions where electron transport is not allowed,” said Phillip N. First, a professor in the Georgia Tech School of Physics and one of the paper’s co-authors. “Electron waves would have to go around these regions, requiring new patterns of electron wave interference. Understanding such interference will be important for bi-layer graphene devices that have been proposed, and may be important for other lattice-matched substrates used to support graphene and graphene devices.”

In a magnetic field, an electron moves in a circular trajectory – known as a cyclotron orbit – whose radius depends on the size of the magnetic field and the energy of the electron. For a constant magnetic field, that’s a little like rolling a marble around in a large bowl, First said.

“At high energy, the marble orbits high in the bowl, while for lower energies, the orbit size is smaller and lower in the bowl,” he explained. “The cyclotron orbits in graphene also depend on the electron energy and the local electron potential – corresponding to the bowl – but until now, the orbits hadn’t been imaged directly.”
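For a rough sense of the scales involved, the standard textbook formulas for graphene in a magnetic field give the characteristic orbit size and the ladder of allowed energies (Landau levels). These are general results, not numbers from the Nature Physics paper, and the Fermi velocity below is an assumed typical value:

```python
import math

# Standard graphene-in-a-magnetic-field formulas (textbook results,
# not values taken from the paper discussed above).
HBAR = 1.054571817e-34     # J*s
E_CHARGE = 1.602176634e-19 # C
V_FERMI = 1.0e6            # m/s, typical graphene Fermi velocity (assumed)

def magnetic_length_nm(b_tesla):
    """Characteristic cyclotron-orbit size: l_B = sqrt(hbar / (e B))."""
    return math.sqrt(HBAR / (E_CHARGE * b_tesla)) * 1e9

def landau_level_mev(n, b_tesla):
    """Graphene Landau-level energy E_n = v_F * sqrt(2 e hbar B n), in meV.
    n = 0 is the zero-energy level unique to graphene."""
    return V_FERMI * math.sqrt(2 * E_CHARGE * HBAR * b_tesla * n) / E_CHARGE * 1e3

# Orbits shrink as the field grows, like marbles settling lower in the bowl:
l_at_2t = magnetic_length_nm(2.0)  # roughly 18 nm
l_at_8t = magnetic_length_nm(8.0)  # roughly 9 nm
```

Note the square-root dependence on field: quadrupling the field only halves the orbit size.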

Placed in a magnetic field, these orbits normally drift along lines of nearly constant electric potential. But when a graphene sample has small fluctuations in the potential, these “drift states” can become trapped at a hill or valley in the material that has closed constant potential contours. Such trapping of charge carriers is important for the quantum Hall effect, in which precisely quantized resistance results from charge conduction solely through the orbits that skip along the edges of the material.

IMAGE: This graphic shows electrons that move along an equipotential, while those that follow closed equipotentials (as in a potential-energy valley) become localized (right). The arrows denote the magnetic field,…


The study focused on one particular electron orbit: a zero-energy orbit that is unique to graphene. Because electrons are matter waves, interference within a material affects how their energy relates to the velocity of the wave – and reflected waves added to an incoming wave can combine to produce a slower composite wave. Electrons moving through the unique “chicken-wire” arrangement of carbon-carbon bonds in the graphene interfere in a way that leaves the wave velocity the same for all energy levels.

In addition to finding that energy states follow contours of constant electric potential, the researchers discovered specific areas on the graphene surface where the orbital energy of the electrons changes from one atom to the next. That creates an energy gap within isolated patches on the surface.

“By examining their distribution over the surface for different magnetic fields, we determined that the energy gap is due to a subtle interaction with the substrate, which consists of multilayer graphene grown on a silicon carbide wafer,” First explained.

In multilayer epitaxial graphene, each layer’s symmetrical sublattice is rotated slightly with respect to the next. In prior studies, researchers found that the rotations served to decouple the electronic properties of each graphene layer.

“Our findings hold the first indications of a small position-dependent interaction between the layers,” said David L. Miller, the paper’s first author and a graduate student in First’s laboratory. “This interaction occurs only when the size of a cyclotron orbit – which shrinks as the magnetic field is increased – becomes smaller than the size of the observed patches.”

The origin of the position dependent interaction is believed to be the “moiré pattern” of atomic alignments between two adjacent layers of graphene. In some regions, atoms of one layer lie atop atoms of the layer below, while in other regions, none of the atoms align with the atoms in the layer below. In still other regions, half of the atoms have neighbors in the underlayer, an instance in which the symmetry of the carbon atoms is broken and the Landau level – discrete energy level of the electrons – splits into two different energies.
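The size of those moiré regions follows from simple geometry. Here's a quick sketch using the standard moiré-period formula; the actual twist angles in the sample aren't specified above, so the angles below are purely illustrative:

```python
import math

A_GRAPHENE = 0.246  # nm, graphene lattice constant

def moire_period_nm(twist_deg):
    """Period of the moire pattern formed by two graphene lattices
    rotated by twist_deg: L = a / (2 sin(theta / 2))."""
    theta = math.radians(twist_deg)
    return A_GRAPHENE / (2 * math.sin(theta / 2))

# Small rotations between layers produce moire patterns far larger than
# the atomic spacing, which is why whole "patches" of aligned and
# misaligned atoms appear on the surface.
period_at_2deg = moire_period_nm(2.0)   # ~7 nm
period_at_05deg = moire_period_nm(0.5)  # ~28 nm
```

The smaller the twist, the larger the patches, so even tiny rotations between layers leave a clear fingerprint at the scale the STM can image.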

Experimentally, the researchers examined a sample of epitaxial graphene grown at Georgia Tech in the laboratory of Professor Walt de Heer, using techniques developed by his research team over the past several years.

They used the tip of a custom-built scanning-tunneling microscope (STM) to probe the atomic-scale electronic structure of the graphene in a technique known as scanning tunneling spectroscopy. The tip was moved across the surface of a 100-square nanometer section of graphene, and spectroscopic data was acquired every 0.4 nanometers.
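Assuming the scanned region is 100 nanometers on a side (one reading of the phrasing above), the size of the resulting data set is easy to estimate:

```python
# Rough size of the spectroscopy data set, assuming (an interpretation
# of the text above) a 100 nm x 100 nm scan area with one spectrum
# acquired every 0.4 nm in each direction.
region_nm = 100.0
step_nm = 0.4

points_per_side = round(region_nm / step_nm) + 1  # include both edges
total_spectra = points_per_side ** 2              # spectra in one full scan
```

That's on the order of 60,000 individual spectra per scan, which gives a feel for why such maps take time to acquire.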

The measurements were made at 4.3 kelvin to take advantage of the fact that energy resolution is proportional to the temperature. The scanning-tunneling microscope, designed and built by Joseph Stroscio at NIST’s Center for Nanoscale Science and Technology, used a superconducting magnet to provide the magnetic fields needed to study the orbits.

According to First, the study raises a number of questions for future research, including how the energy gaps will affect electron transport properties, how the observed effects may impact proposed bi-layer graphene coherent devices – and whether the new phenomenon can be controlled.

“This study is really a stepping stone in the long path to understanding the subtleties of graphene’s interesting properties,” he said. “This material is different from anything we have worked with before in electronics.”


In addition to those already mentioned, the study also included Walt de Heer, Kevin D. Kubista, Ming Ruan, and Markus Kinderman from Georgia Tech and Gregory M. Rutter from NIST. The research was supported by the National Science Foundation, the Semiconductor Research Corporation and the W.M. Keck Foundation. Additional assistance was provided by Georgia Tech’s Materials Research Science and Engineering Center (MRSEC).

May 20, 2010

The latest supercomputer concept — atomtronic computers

This is a really wild idea. Once we’re able to manipulate individual atoms this way, supercomputing will be just one of many amazing things that follow.

From the link:

“The emerging field of atomtronics aims to construct analogies of electronic components, systems and devices using ultracold atoms,” say Ron Pepino and pals at the National Institute of Standards and Technology (NIST) in Boulder Colorado.

Today, they outline their vision for atomtronics, show how it works and explain why it could shape the future of information processing.

The idea is to manipulate neutral atoms using lasers in a way that mimics the behaviour of electrons in wires, transistors and logic gates. Over the last decade or two, physicists at NIST and elsewhere have become masters at creating optical lattices in which atoms can be pushed, pulled and prodded at will.

But this kind of optical lion taming has limited appeal, so Pepino and co have begun a program to put tame atoms to work.

The problem is that atoms don’t behave like electrons, so building the atomtronic equivalent of even something as straightforward as a simple circuit, a battery and resistor in series, requires some thinking out of the box.

Pepino and co say that transferring atoms from one reservoir to another is a decent enough analogy, and that this transfer can take place through an optical lattice in which atoms tunnel at a uniform rate. That’s their simple circuit analogy.
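As a toy illustration of that circuit analogy (my own sketch, not from the paper), two reservoirs connected by a uniform-tunneling lattice behave a lot like a capacitor discharging through a resistor:

```python
# Toy rate-equation sketch of the atomtronic "battery + resistor":
# atoms leak from a full reservoir to an empty one through an optical
# lattice with a uniform tunneling rate. All parameter values are
# arbitrary illustration choices.

def simulate(n_left=1000.0, n_right=0.0, rate=0.05, steps=200, dt=1.0):
    """Euler integration of dN_left/dt = -rate * (N_left - N_right)."""
    history = []
    for _ in range(steps):
        current = rate * (n_left - n_right)  # atom "current" through the lattice
        n_left -= current * dt
        n_right += current * dt
        history.append((n_left, n_right))
    return history

history = simulate()
final_left, final_right = history[-1]  # populations equilibrate near 500 each
```

The atom "current" is largest when the population imbalance (the analogue of voltage) is largest, and it dies away as the reservoirs equalize, just as in an RC circuit.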

March 18, 2010

Graphene may be key to storing hydrogen

Needless to say this will have a major impact on using hydrogen as a power source in fuel cells or other applications.

The release:

Layered graphene sheets could solve hydrogen storage issues

IMAGE: A graphene-oxide framework (GOF) is formed of layers of graphene connected by boron-carboxylic “pillars.” GOFs such as this one are just beginning to be explored as a potential storage medium…


Graphene—carbon formed into sheets a single atom thick—now appears to be a promising base material for capturing hydrogen, according to recent research* at the National Institute of Standards and Technology (NIST) and the University of Pennsylvania. The findings suggest stacks of graphene layers could potentially store hydrogen safely for use in fuel cells and other applications.

Graphene has become something of a celebrity material in recent years due to its conductive, thermal and optical properties, which could make it useful in a range of sensors and semiconductor devices. The material does not store hydrogen well in its original form, according to a team of scientists studying it at the NIST Center for Neutron Research. But if oxidized graphene sheets are stacked atop one another like the decks of a multilevel parking lot, connected by molecules that both link the layers to one another and maintain space between them, the resulting graphene-oxide framework (GOF) can accumulate hydrogen in greater quantities.

Inspired to create GOFs by the metal-organic frameworks that are also under scrutiny for hydrogen storage, the team is just beginning to uncover the new structures’ properties. “No one else has ever made GOFs, to the best of our knowledge,” says NIST theorist Taner Yildirim. “What we have found so far, though, indicates GOFs can hold at least a hundred times more hydrogen molecules than ordinary graphene oxide does. The easy synthesis, low cost and non-toxicity of graphene make this material a promising candidate for gas storage applications.”

The GOFs can retain 1 percent of their weight in hydrogen at a temperature of 77 kelvin and ordinary atmospheric pressure—roughly comparable to the 1.2 percent that some well-studied metal-organic frameworks can hold, Yildirim says.
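A quick sanity check on those figures (simple arithmetic, not from the release):

```python
# Simple arithmetic on the quoted storage figures: 1 wt% for the GOFs
# vs 1.2 wt% for some metal-organic frameworks, both at 77 K and
# atmospheric pressure.

def stored_hydrogen_g(material_kg, weight_percent):
    """Grams of H2 held by a given mass of storage material."""
    return material_kg * 1000.0 * weight_percent / 100.0

gof_g = stored_hydrogen_g(1.0, 1.0)  # grams of H2 per kg of GOF
mof_g = stored_hydrogen_g(1.0, 1.2)  # grams per kg for the MOF benchmark
```

So a kilogram of GOF holds about 10 grams of hydrogen under those conditions, within shouting distance of the metal-organic framework benchmark.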

Another of the team’s potentially useful discoveries is the unusual relationship that GOFs exhibit between temperature and hydrogen absorption. In most storage materials, the lower the temperature, the more hydrogen they normally take up. The team discovered that GOFs behave quite differently: although a GOF can absorb hydrogen, it does not take in significant amounts below 50 kelvin (-223 degrees Celsius). Moreover, it does not release any hydrogen below this “blocking temperature”—suggesting that, with further research, GOFs might be used both to store hydrogen and to release it when it is needed, a fundamental requirement in fuel cell applications.

Some of the GOFs’ capabilities are due to the linking molecules themselves. The molecules the team used are all benzene-boronic acids that interact strongly with hydrogen in their own right. But by keeping several angstroms of space between the graphene layers—akin to the way pillars hold up a ceiling—they also increase the available surface area of each layer, giving it more spots for the hydrogen to latch on.

According to the team, GOFs will likely perform even better once the team explores their parameters in more detail. “We are going to try to optimize the performance of the GOFs and explore other linking molecules as well,” says Jacob Burress, also of NIST. “We want to explore the unusual temperature dependence of absorption kinetics, as well as whether they might be useful for capturing greenhouse gases such as carbon dioxide and toxins like ammonia.”


The research is funded in part by the Department of Energy.

* J. Burress, J. Simmons, J. Ford and T.Yildirim. “Gas adsorption properties of graphene-oxide-frameworks and nanoporous benzene-boronic acid polymers.” To be presented at the March meeting of the American Physical Society (APS) in Portland, Ore., March 18, 2010. An abstract is available at http://meetings.aps.org/Meeting/MAR10/Event/122133

October 7, 2009

Small business cybersecurity guide from NIST

Cybersecurity is important at all levels of business, and is often a place where small business looks to cut corners and save money.

The release:

New computer security guide can help safeguard your small business

Just in time for October’s Cyber Security Awareness Month, the National Institute of Standards and Technology (NIST) has published a guide to help small businesses and organizations understand how to provide basic security for their information, systems and networks. NIST has also created a video that explores the reasons small businesses need to secure their data.

The guide, Small Business Information Security: The Fundamentals, was authored by Richard Kissel, who spends much of his time on the road teaching computer security to groups of small business owners ranging from tow truck operators to managers of hospitals, small manufacturers and nonprofit organizations. The 20-page guide uses simple and clear language to walk small business owners through the important steps necessary to secure their computer systems and data.

Small businesses make up more than 95 percent of the nation’s businesses, are responsible for about 50 percent of the Gross National Product and create about 50 percent of the country’s new jobs, according to a 2009 Small Business Administration report. Yet these organizations rarely have the information technology resources that larger corporations do to protect their sensitive information.

Consequently, they could be seen as easy marks by hackers and cyber criminals, who may increasingly focus their unwanted attention on small businesses. And just like big companies, the computers at small businesses hold sensitive information on customers, employees and business partners that needs to be guarded, Kissel says. He adds that regulatory agencies have requirements to protect some health, financial and other information.

“There’s a very small set of actions that a small business can do to avoid being an easy target, but they have to be done and done consistently,” Kissel says.

In the guide Kissel provides 10 “absolutely necessary steps” to secure information, which includes such basics as installing firewalls, patching operating systems and applications and backing up business data, as well as controlling physical access to network components and training employees in basic security principles.

He also provides 10 potential security trouble spots to be aware of such as e-mail, social media, online banking, Web surfing and downloading software from the Internet, as well as security planning considerations. The guide’s appendices provide assistance on identifying and prioritizing an organization’s information types, recognizing the protection an organization needs for its priority information types and estimating the potential costs of bad things happening to important business information.


NIST works with the Small Business Administration and the Federal Bureau of Investigation in this outreach to educate small businesses.

Small Business Information Security: The Fundamentals can be downloaded from the Small Business Corner Web site at http://www.csrc.nist.gov/groups/SMA/sbc/.

The related video, “Information Technology Security for Small Business. It’s not just good business. It’s essential business,” features experts from NIST and the Small Business Administration. The video is available on YouTube and the Small Business Corner of the NIST Computer Security Web pages.

September 10, 2009

Want to see what the NIST has to say about national ID cards?

Check out this release.

(For the record I am extremely against the concept of any type of ID card, national or otherwise, that incorporates this level of personal data.  Quite hackable and doesn’t make the public any safer. These tracking devices only give the government that much more information on U.S. citizens.)

The release:

New NIST publications describe standards for identity credentials and authentication systems

Two publications from the National Institute of Standards and Technology (NIST) describe new capabilities for authentication systems using smart cards or other personal security devices within and outside federal government applications. A report describes a NIST-led international standard, ISO/IEC 24727, which defines a general-purpose identity application programming interface (API). The other is a draft publication on refinements to the Personal Identity Verification (PIV) specification.

NIST is responsible for developing specifications for PIV cards required for the government under Homeland Security Presidential Directive 12. These smart cards have embedded chips that hold information and biometric data such as specific types of patterns in fingerprints called “minutiae” along with a unique identifying number. The goal is to develop methods that allow each worker to have a PIV card that works with PIV equipment at all government agencies and with all card-reader equipment regardless of the manufacturer.

Because there is growing interest in using secure identity credentials like PIV cards for multiple applications beyond the federal workplace, NIST provided its smart card research expertise in the development of an international standard—ISO/IEC 24727 – Identification cards – Integrated circuit card programming interfaces—that provides a set of authentication protocols and services common to identity management frameworks.

The new NIST report, Use of ISO/IEC 24727, is an introduction to that standard. It describes the standard’s general-purpose identity application programming interface, the “Service Access Layer Interface for Identity (SALII)”, which allows cards and readers to communicate and operate with applications seamlessly. The report also describes a proof-of-concept experiment demonstrating that existing PIV cards and readers can work interoperably with ISO/IEC 24727. The applications tested included logging on to Windows or Linux systems, signing and encrypting email, and performing Web authentications.

NIST Interagency Report 7611 Use of ISO/IEC 24727 may be downloaded at http://csrc.nist.gov/publications/nistir/ir7611/nistir7611_use-of-isoiec24727.pdf.

NIST researchers also are involved in improving PIV components and providing guidelines that the private sector and municipalities can use with a similar smart ID card. They have drafted an update to an earlier publication that contains the technical specifications for interfacing with the PIV card to retrieve and use identity credentials.

Special Publication 800-73-3, Interfaces for Personal Identity Verification, provides specifications for PIV-Interoperable and PIV-Compatible cards issued by non-federal issuers, which may be used with the federal PIV system. It also provides specifications designed to ease implementation, facilitate interoperability and ensure performance of PIV applications in the federal workplace. The new publication specifies a PIV data model, card edge interface and application programming interface. The report also provides editorial changes to clarify information in the earlier version. (For background, see “Updated Specification Issued for PIV Card Implementations,” NIST Tech Beat, Oct. 14, 2008 [http://www.nist.gov/public_affairs/techbeat/tb2008_1014.htm].)


The draft version of NIST SP 800-73-3 is open for public comment through Sept. 13, 2009. The document is available online at http://csrc.nist.gov/publications/PubsDrafts.html#800-73-3. Comments should be addressed to PIV_comments@nist.gov with “Comments on Public Draft SP 800-73-3” in the subject line.

July 2, 2009

Nanotech data storage may have hit snag

There have been high hopes for utilizing nanotechnology in data storage, with the possibility of packing very large storage volumes into very small areas. New research from NIST and the Institute of Solid State Physics has uncovered an unexpected range for the “pinning effect,” which creates changes in the material 1,000 times farther away than was thought possible. The long range of the effect may dramatically limit how small nanotech data storage devices can become.

The release:

Unexpectedly long-range effects in advanced magnetic devices

IMAGE: NIST MOIF (Magneto-optic imaging film) technique is unique in being able to image magnetic domains in real time while they are forming, growing and disappearing. Bright and dark regions represent…


A tiny grid pattern has led materials scientists at the National Institute of Standards and Technology (NIST) and the Institute of Solid State Physics in Russia to an unexpected finding—the surprisingly strong and long-range effects of certain electromagnetic nanostructures used in data storage. Their recently reported findings* may add new scientific challenges to the design and manufacture of future ultra-high density data storage devices.

The team was studying the behavior of nanoscale structures that sandwich thin layers of materials with differing magnetic properties. In the past few decades such structures have been the subjects of intense research because they can have unusual and valuable magnetic properties. The data read heads on modern high-density disk drives usually exploit a version of the giant magnetoresistance (GMR) effect, which uses such layered structures for extremely sensitive magnetic field detectors. Arrays of nanoscale sandwiches of a similar type might be used in future data storage devices that would outdo even today’s astonishingly capacious microdrives because in principle the structures could be made even smaller than the minimum practical size for the magnetic islands that record data on hard disk drives, according to NIST metallurgist Robert Shull.

The key trick is to cover a thin layer of a ferromagnetic material, in which the magnetic moments of electrons, or “spins,” tend to order themselves in the same direction, with an antiferromagnetic layer in which the spins tend to orient in opposite directions. By itself, the ferromagnetic layer will tend to magnetize in the direction of an externally imposed magnetic field—and just as easily magnetize in the opposite direction if the external field is reversed. For reasons that are still debated, the presence of the antiferromagnetic layer changes this. It biases the ferromagnet in one preferred direction, essentially pinning its field in that orientation. In a magnetoresistance read head, for example, this pinned layer serves as a reference direction that the sensor uses in detecting changing field directions on the disk that it is “reading.”

Researchers have long understood this pinning effect to be a short-range phenomenon. The influence of the antiferromagnetic layer is felt only a few tens of nanometers down into the ferromagnetic layer—vertically. But what about sideways? To find out, the NIST/ISSP team started with a thin ferromagnetic film covering a silicon wafer and then added on top a grid of antiferromagnetic strips about 10 nanometers thick and 10 micrometers wide, separated by gaps of about 100 micrometers. Using an instrument that provides real-time images of the magnetization within the structure, the team watched the grid as they increased and decreased the magnetic field surrounding it.

What they found surprised them.

As expected, the ferromagnetic material directly under the grid lines showed the pinning effect, but, quite unexpectedly, so did the uncovered material in regions between the grid lines far removed from the antiferromagnetic material. “This pinning effect extends for maybe tens of nanometers down into the ferromagnet right underneath,” explains Shull, “so you might expect that there could be some residual effect maybe tens of nanometers away from it to the sides. But you wouldn’t expect it to extend 10 micrometers away—that’s 10 thousand nanometers.” In fact, the effect extends to regions 50 micrometers away from the closest antiferromagnetic strip, at least 1,000 times further than was previously known to be possible.
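The quoted factor of 1,000 is easy to verify; the "expected" range below is an assumed representative value for "a few tens of nanometers":

```python
# Checking the quoted ratio: the sideways pinning influence reaches
# about 50 micrometers, versus the few tens of nanometers expected
# from the known vertical range of the effect.
expected_range_nm = 50.0    # "a few tens of nanometers" (assumed value)
observed_range_nm = 50.0e3  # 50 micrometers, expressed in nanometers

ratio = observed_range_nm / expected_range_nm
```

Keeping both figures in the same unit (nanometers) makes the three-orders-of-magnitude surprise explicit.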

The ramifications, says Shull, are that engineers planning to build dense arrays of these structures onto a chip for high-performance memory or sensor devices will find interesting new scientific issues for investigation in optimizing how closely they can be packed without interfering with each other.




* Y.P. Kabanov, V.I. Nikitenko, O.A. Tikhomirov, W.F. Egelhoff, A.J. Shapiro and R.D. Shull. “Unexpectedly long-range influence on thin-film magnetization reversal of a ferromagnet by a rectangular array of FeMn pinning films.” Physical Review B 79, 144435 (2009). DOI: 10.1103/PhysRevB.79.144435.

June 17, 2009

Nonstick nanogold

Cool and useful — “teflon” gold.

The release:

Nonstick and laser-safe gold aids laser trapping of biomolecules

IMAGE: The gold posts in this colorized micrograph, averaging 450 nanometers in diameter, are used to anchor individual biomolecules such as DNA for studies of their mechanical properties. The background surface…


Biophysicists long for an ideal material—something more structured and less sticky than a standard glass surface—to anchor and position individual biomolecules. Gold is an alluring possibility, with its simple chemistry and the ease with which it can be patterned. Unfortunately, gold also tends to be sticky and can be melted by lasers. Now, biophysicists at JILA have made gold more precious than ever—at least as a research tool—by creating nonstick gold surfaces and laser-safe gold nanoposts, a potential boon to laser trapping of biomolecules.

JILA is a joint institute of the National Institute of Standards and Technology (NIST) and the University of Colorado at Boulder.

JILA’s successful use of gold in optical-trapping experiments, reported in Nano Letters,* could lead to a 10-fold increase in numbers of single molecules studied in certain assays, from roughly five to 50 per day, according to group leader Tom Perkins of NIST. The ability to carry out more experiments with greater precision will lead to new insights, such as uncovering diversity in seemingly identical molecules, and enhance NIST’s ability to carry out mission work, such as reproducing and verifying piconewton-scale force measurements using DNA, Perkins says. (A one-kilogram mass on the Earth’s surface exerts a force of roughly 10 newtons. A piconewton is 0.000 000 000 001 newtons. See “JILA Finds Flaw in Model Describing DNA Elasticity” NIST Tech Beat, Sept. 13, 2007.)

Perkins and other biophysicists use laser beams to precisely manipulate, track and measure molecules like DNA, which typically have one end bonded to a surface and the other end attached to a micron-sized bead that acts as a “handle” for the laser. Until now, creating the platform for such experiments has generally involved nonspecifically absorbing fragile molecules onto a sticky glass surface, producing random spacing and sometimes destroying biological activity. “It’s like dropping a car onto a road from 100 feet up and hoping it will land tires down. If the molecule lands in the wrong orientation, it won’t be active or, worse, it will only partially work,” Perkins says.

Ideally, scientists want to attach biomolecules in an optimal pattern on an otherwise nonstick surface. Gold posts are easy to lay down in desired patterns at the nanometer scale. Perkins’ group attached the DNA to the gold with sulfur-based chemical units called thiols (widely used in nanotechnology), an approach that is mechanically stronger than the protein-based bonding techniques typically used in biology. The JILA scientists used six thiol bonds instead of just one between the DNA and the gold posts. These bonds were mechanically strong enough to withstand high-force laser trapping and chemically robust enough to allow the JILA team to coat the unreacted gold on each nanopost with a polymer cushion, which eliminated undesired sticking. “Now you can anchor DNA to gold and keep the rest of the gold very nonstick,” Perkins says.

Moreover, the gold nanoposts were small enough—with diameters of 100 to 500 nanometers and a height of 20 nanometers—that the scientists could avoid hitting the posts directly with lasers. “Like oil and water, traditionally laser tweezers and gold don’t mix. By making very small islands of gold, we positioned individual molecules where we wanted them, and with a mechanical strength that enables more precise and additional types of studies,” Perkins says.


 The research was supported by a W.M. Keck Grant in the RNA Sciences, the National Science Foundation, and NIST.

* D.H. Paik, Y. Seol, W. Halsey and T.T. Perkins. Integrating a high-force optical trap with gold nanoposts and a robust gold-DNA bond. Nano Letters. Articles ASAP (As Soon As Publishable) Publication Date (Web): June 3, 2009 DOI: 10.1021/nl901404s.

The latest cybersecurity news

This release is from today and covers the most up-to-date cybersecurity work done for national defense. Given the interconnectedness of today’s information society, cybersecurity is a very real matter of national defense. At the same time, it’s an area fraught with privacy and other concerns.

The release:

NIST, DOD, intelligence agencies join forces to secure US cyber infrastructure

The National Institute of Standards and Technology (NIST), in partnership with the Department of Defense (DOD), the Intelligence Community (IC), and the Committee on National Security Systems (CNSS), has released the first installment of a three-year effort to build a unified information security framework for the entire federal government. Historically, information systems at civilian agencies have operated under different security controls than military and intelligence information systems. This installment is titled NIST Special Publication 800-53, Revision 3, Recommended Security Controls for Federal Information Systems and Organizations.

“The common security control catalog is a critical step that effectively marshals our resources,” says Ron Ross, NIST project leader for the joint task force. “It also focuses our security initiatives to operate effectively in the face of changing threats and vulnerabilities. The unified framework standardizes the information security process that will also produce significant cost savings through standardized risk management policies, procedures, technologies, tools and techniques.”

This publication is a revised version of the security control catalog that was previously published in response to the Federal Information Security Management Act (FISMA) of 2002. This special publication contains the catalog of security controls and technical guidelines that federal agencies use to protect their information and technology infrastructure.

When complete, the unified framework will result in the defense, intelligence and civil communities using a common strategy to protect critical federal information systems and associated infrastructure. This ongoing effort is consistent with President Obama’s call for “integrating all cybersecurity policies for the government” in his May 29 speech on securing the U.S. cybersecurity infrastructure.

The revised security control catalog in SP 800-53 provides the most comprehensive, state-of-the-practice set of safeguards and countermeasures for information systems ever developed. The updated security controls—many addressing advanced cyber threats—were developed by a joint task force that included NIST, DOD, the IC and the CNSS with specific information from databases of known cyber attacks and threat information.

Additional updates to key NIST publications that will serve the entire federal government are under way. These will include the newly revised SP 800-37, which will transform the current certification and accreditation process into a near real-time risk management process that focuses on monitoring the security state of federal information systems, and SP 800-39, which is an enterprise-wide risk management guideline that will expand the risk management process.


 NIST Special Publication 800-53, Revision 3, is open for public comment through July 1, 2009. The document is available online at http://csrc.nist.gov/publications/PubsDrafts.html#800-53_Rev3. Comments should be sent to sec-cert@nist.gov.

June 4, 2009

Flexible memory

Via KurzweilAI.net: Flexible electronics are a hot area of development, and this flexible memory chip from NIST looks like a promising addition to the field.

Electronic Memory Chips That Can Bend And Twist
Science Daily, June 3, 2009

A flexible memory switch that operates on less than 10 volts, maintains its memory when power is lost, and still functions after being flexed more than 4,000 times has been developed by National Institute of Standards and Technology (NIST) researchers.


The switch can be built out of inexpensive, readily available materials, and its performance is similar to that of a memristor, which changes its resistance depending on the amount of current sent through it and retains this resistance even after the power is turned off.
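As a rough illustration of that memristor-like behavior (resistance set by how much current has passed through, retained without power), here is a toy model; the linear update rule and all constants are my assumptions for illustration, not the physics of the NIST device:

```python
class ToyMemristor:
    """Illustrative memristor-like switch: resistance drifts with the
    charge that has flowed through it, and persists when power is off."""

    def __init__(self, r_min=100.0, r_max=10_000.0):
        self.r_min, self.r_max = r_min, r_max
        self.resistance = r_max          # start in the high-resistance state

    def apply_current(self, amps, seconds, k=1e6):
        # More charge passed -> lower resistance (toy linear rule, ohms/coulomb).
        charge = amps * seconds
        self.resistance = max(self.r_min, self.resistance - k * charge)

    def read(self):
        # Reading does not change state; the state survives power loss.
        return self.resistance

m = ToyMemristor()
m.apply_current(amps=1e-3, seconds=5.0)   # a "write" pulse lowers resistance
print(m.read())                           # resistance stays put until rewritten
```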


April 24, 2009

Nanotech splitting water cells

An important finding toward developing cost-effective alternative fuel sources.

The release:

Discovery of an unexpected boost for solar water-splitting cells

IMAGE: Scanning electron microscope image of typical titania nanotubes for a photocatalytic cell to produce hydrogen gas from water. Nanotubes average roughly 90-100 nanometers in diameter.


A research team from Northeastern University and the National Institute of Standards and Technology (NIST) has discovered, serendipitously, that a residue of a process used to build arrays of titania nanotubes—a residue that wasn’t even noticed before this—plays an important role in improving the performance of the nanotubes in solar cells that produce hydrogen gas from water. Their recently published results* indicate that by controlling the deposition of potassium on the surface of the nanotubes, engineers can achieve significant energy savings in a promising new alternate energy system.

Titania (or titanium dioxide) is a versatile chemical compound best known as a white pigment. It’s found in everything from paint to toothpastes and sunscreen lotions. Thirty-five years ago Akira Fujishima startled the electrochemical world by demonstrating that it also functioned as a photocatalyst, producing hydrogen gas from water, electricity and sunlight. In recent years, researchers have been exploring different ways to optimize the process and create a commercially viable technology that, essentially, transforms cheap sunlight into hydrogen, a pollution-free fuel that can be stored and shipped.

Increasing the available surface area is one way to boost a catalyst’s performance, so a team at Northeastern has been studying techniques to build tightly packed arrays of titania nanotubes, which have a very high surface to volume ratio. They also were interested in how best to incorporate carbon into the nanotubes, because carbon helps titania absorb light in the visible spectrum. (Pure titania absorbs in the ultraviolet region, and much of the ultraviolet is filtered by the atmosphere.)

This brought them to the NIST X-ray spectroscopy beamline at the National Synchrotron Light Source (NSLS)**. The NIST facility uses X-rays that can be precisely tuned to measure chemical bonds of specific elements, and is at least 10 times more sensitive than commonly available laboratory instruments, allowing researchers to detect elements at extremely low concentrations. While making measurements of the carbon atoms, the team noticed spectroscopic data indicating that the titania nanotubes had small amounts of potassium ions strongly bound to the surface, evidently left by the fabrication process, which used potassium salts. This was the first time potassium had ever been observed on titania nanotubes; previous measurements were not sensitive enough to detect it.

The result was mildly interesting, but became much more so when the research team compared the performance of the potassium-bearing nanotubes to similar arrays deliberately prepared without potassium. The former required only about one-third the electrical energy to produce the same amount of hydrogen as an equivalent array of potassium-free nanotubes. “The result was so exciting,” recalls Northeastern physicist Latika Menon, “that we got sidetracked from the carbon research.” Because it has such a strong effect at nearly undetectable concentrations, Menon says, potassium has probably played an unrecognized role in many experimental water-splitting cells that use titania nanotubes, since potassium hydroxide is commonly used in the cells. By controlling it, she says, hydrogen solar cell designers could use it to optimize performance.




* C. Richter, C. Jaye, E. Panaitescu, D.A. Fischer, L.H. Lewis, R.J. Willey and L. Menon. Effect of potassium adsorption on the photochemical properties of titania nanotube arrays. J. Mater. Chem., published online as an Advanced Article, March 27, 2009. DOI: 10.1039/b822501j

** The NSLS is part of the Department of Energy’s Brookhaven National Laboratory.

April 9, 2009

Quantum computing news

The final release dump post. As always I prefer providing you the entire release rather than rework it into something different. Any commentary or strong feelings on the release make it into the intro, but usually it’s just news that I find interesting, cool or maybe just funny. Quantum computing news is always interesting and very, very cool.

The release:

Quantum computers will require complex software to manage errors

IMAGE: While rudimentary is a fair description of this early computer (the National Bureau of Standards’ SEAC, built in 1950), prototype quantum computers have not even reached its level of…


Highlighting another challenge to the development of quantum computers, theorists at the National Institute of Standards and Technology (NIST) have shown* that a type of software operation, proposed as a solution to fundamental problems with the computers’ hardware, will not function as some designers had hoped.

Quantum computers—if they can ever be realized—will employ effects associated with atomic physics to solve otherwise intractable problems. But the NIST team has proved that the software in question, widely studied due to its simplicity and robustness to noise, is insufficient for performing arbitrary computations. This means that any software the computers use will have to employ far more complex and resource-intensive solutions to ensure the devices function effectively.

Unlike a conventional computer’s binary on-off switches, the building blocks of quantum computers, known as quantum bits, or “qubits,” have the mind-bending ability to exist in both “on” and “off” states simultaneously due to the so-called “superposition” principle of quantum physics. Once harnessed, the superposition principle should allow quantum computers to extract patterns from the possible outputs of a huge number of computations without actually performing all of them. This ability to extract overall patterns makes the devices potentially valuable for tasks such as codebreaking.

One issue, though, is that prototype quantum processors are prone to errors caused, for example, by noise from stray electric or magnetic fields. Conventional computers can guard against errors using techniques such as repetition, where the information in each bit is copied several times and the copies are checked against one another as the calculation proceeds. But this sort of redundancy is impossible in a quantum computer, where the laws of the quantum world forbid such information cloning.
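The repetition idea works classically precisely because bits can be freely copied. A minimal sketch of that majority-vote scheme (standard textbook material, shown here only to contrast with the quantum case, where no-cloning forbids it):

```python
from collections import Counter

def encode(bit, copies=3):
    """Classical repetition code: store several identical copies of a bit."""
    return [bit] * copies

def decode(copies):
    """Majority vote recovers the bit as long as fewer than half the copies flip."""
    return Counter(copies).most_common(1)[0][0]

word = encode(1)
word[0] = 0            # a single-bit error introduced by "noise"
print(decode(word))    # majority vote still recovers the original bit, 1
```

A quantum computer cannot do this: an unknown qubit state cannot be duplicated, which is why quantum error correction needs the far subtler encodings discussed below.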

To improve the efficiency of error correction, researchers are designing quantum computing architectures so as to limit the spread of errors. One of the simplest and most effective ways of ensuring this is by creating software that never permits qubits to interact if their errors might compound one another. Quantum software operations with this property are called “transversal encoded quantum gates.” NIST information theorist Bryan Eastin describes these gates as a solution both simple to employ and resistant to the noise of error-prone quantum processors. But the NIST team has proved mathematically that transversal gates cannot be used exclusively, meaning that more complex solutions for error management and correction must be employed.

Eastin says their result does not represent a setback to quantum computer development because researchers, unable to figure out how to employ transversal gates universally, have already developed other techniques for dealing with errors. “The findings could actually help move designers on to greener pastures,” he says. “There are some avenues of exploration that are less tempting now.”




* B. Eastin and E. Knill. Restrictions on transversal quantum gate sets. Physical Review Letters, 102, 110502, March 20, 2009.

Nanotech and wireless communication

Number two of the release dump. Nanotechnology improving wireless communication.

The release:

Nano changes rise to macro importance in a key electronics material

By combining the results of a number of powerful techniques for studying material structure at the nanoscale, a team of researchers from the National Institute of Standards and Technology (NIST), working with colleagues in other federal labs and abroad, believe they have settled a long-standing debate over the source of the unique electronic properties of a material with potentially great importance for wireless communications.

The new study* of silver niobate not only opens the door to engineering improved electronic components for smaller, higher performance wireless devices, but also serves as an example of understanding how subtle nanoscale features of a material can give rise to major changes in its physical properties.

Silver niobate is a ceramic dielectric, a class of materials used to make capacitors, filters and other basic components of wireless communications equipment and other high-frequency electronic devices. A useful dielectric needs to have a large dielectric constant—roughly, a measure of the material’s ability to hold an electric charge—that is stable in the operating temperature range. The material also should have low dielectric losses—which means that it does not waste energy as heat and preserves much of its intended signal strength. In the important gigahertz range of the radio spectrum—used for a wide variety of wireless applications—silver niobate-based ceramics are the only materials known that combine a high, temperature-stable dielectric constant with sufficiently low dielectric losses.

It’s been known for some time that silver niobate’s unique dielectric properties are temperature dependent—the dielectric constant peaks in a broad range near room temperature in these ceramics, which makes them suitable for practical applications. Earlier studies were unable to identify the structural basis of the unusual dielectric response because no accompanying changes in the overall crystal structure could be observed. “The crystal symmetry doesn’t seem to change at those temperatures,” explains NIST materials scientist Igor Levin, “but that’s because people were using standard techniques that tell you the average structure. The important changes happen at the nanoscale and are lost in averages.”

Only in recent years, says Levin, have the specialized instruments and analytic techniques been available to probe nanoscale structural changes in crystals. Even so, he says, “these subtle deviations from the average are so small that any single measurement gives only partial information on the structure. You need to combine several complementary techniques that look at different angles of the problem.” Working at different facilities** the team combined results from several high-resolution probes using X-rays, neutrons and electrons—tools that are sensitive to both the local and average crystal structure— to understand silver niobate’s dielectric properties. The results revealed an intricate interplay between the oxygen atoms, arranged in an octahedral pattern that defines the compound’s crystal structure, and the niobium atoms at the centers of the octahedra.

At high temperatures, the niobium atoms are slightly displaced, but their average position remains in the center—so the shift isn’t seen in averaging measurements. As the compound cools, the oxygen atoms cooperate by moving a little, causing the octahedral structure to rotate slightly. This movement generates strain which “locks” the niobium atoms into off-centered positions—but not completely. The resulting partial disorder of the niobium atoms gives rise to the dielectric properties. The results, the researchers say, point to potential avenues for engineering similar properties in other compounds.




The work was supported in part by the U.S. Department of Energy and the U.K. Science and Technology Facilities Council.

* I. Levin, V. Krayzman, J.C. Woicik, J. Karapetrova, T. Proffen, M.G. Tucker and I.M. Reaney. Structural changes underlying the diffuse dielectric response in AgNbO3. Phys. Rev. B 79, 104113, posted online March 26, 2009.

** The study required measurements at the Advanced Photon Source at Argonne National Laboratory, the Lujan Neutron Center at Los Alamos National Laboratory and the ISIS Pulsed Neutron and Muon Source at Rutherford Appleton Laboratory (United Kingdom). In addition to NIST, researchers from Argonne, Los Alamos, ISIS and the University of Sheffield contributed to the paper.

March 14, 2009

Scientists cheer Omnibus Bill

It’s always a good sign for R&D when scientists once again cheer actions from Washington. May the theocrats go hide away in caves and read their fairy tales by the light of candles and campfires.

The release:

APS applauds Senate passage of FY09 omnibus bill

Funding will enable scientists to continue transformational research, leading to innovation, job creation and economic prosperity for the nation

WASHINGTON, D.C. – The American Physical Society (APS) is elated that the Senate has approved the FY09 Omnibus Bill, which will allow scientists to continue cutting-edge research that will lead to innovation, job creation and economic growth for the United States.

Specifically, APS lauds the bill’s support of research programs at the Department of Energy’s Office of Science, the National Science Foundation and the National Institute of Standards and Technology. Scientists, who receive funding from these agencies, can now further their research on developing solutions to some of the country’s most pressing challenges – developing clean, affordable energy, improving health care and strengthening science and math instruction in our schools.

“At a time when the nation is coping with a deep recession and striving for an economic recovery, federal investments in science and technology are more critical to America’s future than ever,” said Michael S. Lubell, APS director of public affairs. “Crises provide opportunities for creative outcomes. It is gratifying to see science high on Congress’ priority list.”

APS applauds the leadership of Congress and President Obama on the importance of funding science, the seed corn of new discoveries, job growth and economic prosperity for the nation. As policymakers seek solutions to the nation’s many challenges, funding in the FY09 Omnibus Bill, as well as predictable, sustainable increases in the future, will ensure that they can count on scientists to lead in developing those solutions.




About APS: The American Physical Society is the world’s leading professional organization of physicists, representing more than 46,000 physicists in academia and industry in the U.S. and internationally. It has offices in College Park, Md., Ridge, N.Y., and Washington, D.C.

February 12, 2009

Nanotech improving concrete

The latest in nanotechnology improving our lives.

The release:


Viscosity-enhancing nanomaterials may double service life of concrete

IMAGE: The barely visible blue-green area at the top of this X-ray image of concrete with the NIST nanoadditive shows that very few chloride ions (in green) penetrate into the concrete….


Engineers at the National Institute of Standards and Technology (NIST) are patenting a method that is expected to double the service life of concrete. The key, according to a new paper*, is a nano-sized additive that slows down penetration of chloride and sulfate ions from road salt, sea water and soils into the concrete. A reduction in ion transport translates to reductions in both maintenance costs and the catastrophic failure of concrete structures. The new technology could save billions of dollars and many lives.

Concrete has been around since the Romans, and it is time for a makeover. The nation’s infrastructure uses concrete for millions of miles of roadways and 600,000 bridges, many of which are in disrepair. In 2007, 25 percent of U.S. bridges were rated as structurally deficient or functionally obsolete, according to the Federal Highway Administration. Damaged infrastructure also directly affects large numbers of Americans’ own budgets. The American Society of Civil Engineers estimates that Americans spend $54 billion each year to repair damages caused by poor road conditions.

Infiltrating chloride and sulfate ions cause internal structural damage over time that leads to cracks and weakens the concrete.

Past attempts to improve the lifetime of concrete have focused on producing denser, less porous concretes, but unfortunately these formulations have a greater tendency to crack. NIST engineers took a different approach, setting out to double the material’s lifetime with a project called viscosity enhancers reducing diffusion in concrete technology (VERDICT). Rather than change the size and density of the pores in concrete, they reasoned, it would be better to change the viscosity of the solution in the concrete at the microscale to reduce the speed at which chlorides and sulfates enter the concrete. “Swimming through a pool of honey takes longer than making it through a pool of water,” engineer Dale Bentz says.
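Bentz’s honey analogy is the Stokes-Einstein relation at work: an ion’s diffusion coefficient scales inversely with the viscosity of the solution it moves through. A sketch with illustrative numbers (the ion radius and viscosities below are my assumptions, not values from the paper):

```python
import math

def stokes_einstein_d(viscosity_pa_s, radius_m, temp_k=298.0):
    """Stokes-Einstein diffusion coefficient D = kT / (6*pi*eta*r),
    illustrating why a more viscous pore solution slows ion transport."""
    k_b = 1.380649e-23   # Boltzmann constant, J/K
    return k_b * temp_k / (6 * math.pi * viscosity_pa_s * radius_m)

r = 1.8e-10                              # rough radius of a chloride ion, meters
d_water = stokes_einstein_d(1.0e-3, r)   # plain water, ~1 mPa*s
d_thick = stokes_einstein_d(5.0e-3, r)   # a solution made 5x more viscous
print(d_water / d_thick)                 # diffusion slows by the same 5x factor
```

Because the dependence is inverse, any factor by which the additive thickens the pore solution translates directly into the same factor of slowdown in chloride and sulfate penetration.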

They were inspired by additives the food processing industry uses to thicken food and even tested out a popular additive called xanthan gum that thickens salad dressings and sauces and gives ice cream its texture.

Studying a variety of additives, engineers determined that the size of the additive’s molecule was critical to serving as a diffusion barrier. Larger molecules such as cellulose ether and xanthan gum increased viscosity, but did not cut diffusion rates. Smaller molecules—less than 100 nanometers—slowed ion diffusion. Bentz explains, “When additive molecules are large but present in a low concentration, it is easy for the chloride ions to go around them, but when you have a higher concentration of smaller molecules increasing the solution viscosity, it is more effective in impeding diffusion of the ions.”

The NIST researchers have demonstrated that the additives can be blended directly into the concrete with current chemical admixtures, but that even better performance is achieved when the additives are mixed into the concrete by saturating absorbent, lightweight sand. Research continues on other materials as engineers seek to improve this finding by reducing the concentration and cost of the additive necessary to double the concrete’s service life.




A non-provisional patent application was filed in September, and the technology is now available for licensing from the U.S. government; the NIST Office of Technology Partnerships can be contacted for further details.

* D.P. Bentz, M.A. Peltz, K.A. Snyder and J.M. Davis. VERDICT: Viscosity Enhancers Reducing Diffusion in Concrete Technology. Concrete International. 31 (1), 31-36, January 2009.

November 25, 2008

Search and rescue robots

The release from today:

Rescue robot exercise brings together robots, developers, first responders

IMAGE: Robots are being trained to map spaces using their sensors. This robot travels through a simulated “wooded area” that has uneven terrain and randomly placed PVC pipes as “trees.” It…


The National Institute of Standards and Technology (NIST) held a rescue robot exercise in Texas last week in which about three dozen robots were tested by developers and first responders in order to develop a standard suite of performance tests to help evaluate candidate mechanical rescuers. This exercise was sponsored by the Department of Homeland Security’s Science and Technology Directorate to develop performance standards for robots for use in urban search and rescue missions.

Urban search and rescue robots assist first responders by performing such tasks as entering partially collapsed structures to search for living victims or to sniff out poisonous chemicals. NIST is developing robot standards for testing in cooperation with industry and government partners.

“It is challenging to develop the test standards as the robots are still evolving,” explained Elena Messina, acting chief of the Intelligent Systems Division, “because standards are usually set for products already in use. But it is critical for developers to be able to compare results, which is not possible without reproducible test environments. So, we have reproducible rough terrain that everyone can build in their labs, whereas you can’t reproduce a rubble pile. This way, developers in Japan can run tests, and people in Chicago can understand what the robot achieved.”

The event took place at Disaster City, Texas, a test facility run by the Texas Engineering Extension Service (TEEX). The facility offers an airstrip, lakes, train wrecks and rubble piles that can be arranged for many types of challenging tests.

Exercises included testing battery capacity by having robots perform figure eights on an undulating terrain and mobility tests in which robots ran through increasingly challenging exercises beginning with climbing steps and escalating to climbing ramps and then making it up steps with unequal gaps. A new mapping challenge introduced at this event tests how accurate a robot-generated map can be—the robot must traverse a simulated “wooded area” that has uneven terrain and PVC pipes for trees, and create a map using its sensors. Researchers came from across the globe to collect data to feed into their mapping algorithms. NIST researchers developing ultra-high-resolution three-dimensional sensors also participated.

Communications and manipulator tests performed and discussed at the November exercise will be submitted to ASTM International as a potential rescue robot test standard.




To see the robots in action, three videos can be viewed at the Disaster City TEEX Web site: www.teexblog.blogspot.com/.

November 13, 2008

Single nanometer ion stream

Unlike the previous release, this one does touch on nanotech.

The release:

Cold atoms could replace hot gallium in focused ion beams

Scientists at the National Institute of Standards and Technology (NIST) have developed a radical new method of focusing a stream of ions into a point as small as one nanometer (one billionth of a meter).* Because of the versatility of their approach—it can be used with a wide range of ions tailored to the task at hand—it is expected to have broad application in nanotechnology both for carving smaller features on semiconductors than now are possible and for nondestructive imaging of nanoscale structures with finer resolution than currently possible with electron microscopes.

Researchers and manufacturers routinely use intense, focused beams of ions to carve nanometer-sized features into a wide variety of targets. In principle, ion beams also could produce better images of nanoscale surface features than conventional electron microscopy. But the current technology for both applications is problematic. In the most widely used method, a metal-coated needle generates a narrowly focused beam of gallium ions. The high energies needed to focus gallium for milling tasks end up burying small amounts in the sample, contaminating the material. And because gallium ions are so heavy (comparatively speaking), if used to collect images they inadvertently damage the sample, blasting away some of its surface while it is being observed. Researchers have tried using other types of ions but were unable to produce the brightness or intensity necessary for the ion beam to cut into most materials.

The NIST team took a completely different approach to generating a focused ion beam that opens up the possibility for use of non-contaminating elements. Instead of starting with a sharp metal point, they generate a small “cloud” of atoms and then combine magnetic fields with laser light to trap and cool these atoms to extremely low temperatures. Another laser is used to ionize the atoms, and the charged particles are accelerated through a small hole to create a small but energetic beam of ions. Researchers have named the groundbreaking device “MOTIS,” for “Magneto-Optical Trap Ion Source.” (For more on MOTs, see “Bon MOT: Innovative Atom Trap Catches Highly Magnetic Atoms,” NIST Tech Beat Apr. 1, 2008.)

“Because the lasers cool the atoms to a very low temperature, they’re not moving around in random directions very much. As a result, when we accelerate them the ions travel in a highly parallel beam, which is necessary for focusing them down to a very small spot,” explains Jabez McClelland of the NIST Center for Nanoscale Science and Technology. The team was able to measure the tiny spread of the beam and show that it was indeed small enough to allow the beam to be focused to a spot size less than 1 nanometer. The initial demonstration used chromium atoms, establishing that other elements besides gallium can achieve the brightness and intensity to work as a focused ion beam “nano-scalpel.” The same technique, says McClelland, can be used with a wide variety of other atoms, which could be selected for special tasks such as milling nanoscale features without introducing contaminants, or to enhance contrast for ion beam microscopy.




* J. L. Hanssen, S. B. Hill, J. Orloff and J. J. McClelland. Magneto-optical trap-based, high brightness ion source for use as a nanoscale probe. Nano Letters 8, 2844 (2008).

Nanoparticles in the home

No, not nanotech, just nanoscale particles released by household devices.

The release:

Nanoparticles in the home: More and smaller than previously detected

Extremely small nanoscale particles are released by common kitchen appliances in abundant amounts, greatly outnumbering the previously detected, larger-size nanoparticles emitted by these appliances, according to new findings* by researchers at the National Institute of Standards and Technology (NIST). So-called “ultrafine particles” (UFP) range in size from 2 to 10 nanometers. They are emitted by motor vehicles and a variety of indoor sources and have attracted attention because of increasing evidence that they can cause respiratory and cardiovascular illnesses.

NIST researchers conducted a series of 150 experiments using gas and electric stoves and electric toaster ovens to determine their impacts on indoor levels of nano-sized particles. Previous studies have been limited to measuring particles with diameters greater than 10 nm, but new technology used in these experiments allowed researchers to measure down to 2 nm particles—approximately 10 times the size of a large atom.

This previously unexplored range of 2 to 10 nm contributed more than 90 percent of all the particles produced by the electric and gas stovetop burners/coils. The gas and electric ovens and the toaster oven produced most of their UFP in the 10 nm to 30 nm range.

The results of this test should affect future studies of human exposure to particulates and associated health effects, particularly since personal exposure to these indoor UFP sources can often exceed exposure to the outdoor UFP.

Researchers will continue to explore the production of UFP by indoor sources. Many common small appliances such as hair dryers, steam irons and electric power tools include heating elements or motors that may produce UFP. People often use these small appliances at close range for relatively long times, so exposure could be large even if the emissions are low.

The experiments were conducted in a three-bedroom test house at NIST that is equipped to measure ventilation rates, environmental conditions and contaminant concentrations.




* L. Wallace, F. Wang, C. Howard-Reed and A. Persily. Contribution of gas and electric stoves to residential ultrafine particle concentrations between 2 and 64 nm: Size distributions and emission and coagulation rates. Environmental Science and Technology, DOI 10.1021/es801402v, published online Oct. 30, 2008.

October 29, 2008

Fast and cheap nanoscale dimensioning

The release from today:

Nanoscale dimensioning is fast, cheap with new NIST optical technique

This schematic shows how a TSOM image is acquired. Using an optical microscope, several images of a 60 nanometer gold particle sample (shown in red) are taken at different focal…

A novel technique* under development at the National Institute of Standards and Technology (NIST) uses a relatively inexpensive optical microscope to quickly and cheaply analyze nanoscale dimensions with nanoscale measurement sensitivity. Termed “Through-focus Scanning Optical Microscope” (TSOM) imaging, the technique has potential applications in nanomanufacturing, semiconductor process control and biotechnology.

Optical microscopes are not widely considered for checking nanoscale (below 100 nanometers) dimensions because of the limitation imposed by the wavelength of light—you can’t get a precise image with a probe three times the object’s size. NIST researcher Ravikiran Attota gets around this, paradoxically, by considering lots of “bad” (out-of-focus) images. “This imaging uses a set of blurry, out-of-focus optical images for nanometer dimensional measurement sensitivity,” he says. Instead of repeatedly focusing on a sample to acquire one best image, the new technique captures a series of images with an optical microscope at different focal positions and stacks them one on top of the other to create the TSOM image. A computer program Attota developed analyzes the image.
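The through-focus stacking step can be sketched in a few lines. Everything here is a toy stand-in: `capture` mimics grabbing one microscope frame, with a Gaussian spot that blurs away from focus, and has nothing to do with any real instrument API.

```python
import numpy as np

def tsom_image(capture, focus_positions):
    """Stack 1-D intensity profiles taken at successive focal positions
    into a 2-D through-focus image (rows = focus, columns = lateral x)."""
    return np.stack([capture(z) for z in focus_positions])

# Toy stand-in for the microscope: a Gaussian spot that blurs with defocus.
x = np.linspace(-1, 1, 64)
def capture(z):
    width = 0.1 + abs(z)          # blur grows away from best focus
    return np.exp(-(x / width) ** 2)

focus = np.linspace(-0.5, 0.5, 21)
img = tsom_image(capture, focus)
print(img.shape)  # (21, 64): one row per focal position
```

The point of the stack is that the way the blur evolves through focus carries dimensional information that no single in-focus frame does.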

Using an optical microscope, several images of a 60 nanometer gold particle sample are taken at different focal positions and stacked together. This computer-created image shows the resultant TSOM image….

While Attota believes this simple technique can be used in a variety of applications, he has worked with two. The TSOM image can compare two nanoscale objects such as silicon lines on an integrated circuit. The software “subtracts” one image from the other. This enables sensitivity to dimensional differences at the nanoscale—line height, width or side-wall angle. Each type of difference generates a distinct signal.

TSOM has also been theoretically evaluated in another quality control application. Medical researchers are studying the use of gold nanoparticles to deliver advanced pharmaceuticals to specific locations within the human body. Perfect size will be critical. To address this application, a TSOM image of a gold nanoparticle can be taken and compared to a library of simulated images to obtain “best-match” images with the intent of determining if each nanoparticle passes or fails.
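The "best match against a library of simulated images" step amounts to a nearest-neighbor search. A minimal sketch with toy 5×5 images, where the candidate diameters and noise level are assumptions for illustration:

```python
import numpy as np

def best_match(measured, library):
    """Return the library key whose simulated TSOM image is closest
    (smallest sum of squared differences) to the measured image."""
    return min(library, key=lambda k: np.sum((library[k] - measured) ** 2))

# Toy library of simulated through-focus images, keyed by diameter in nm.
rng = np.random.default_rng(0)
library = {d: np.full((5, 5), d / 100.0) for d in (58, 60, 62)}
measured = library[60] + rng.normal(0, 0.001, (5, 5))  # noisy 60 nm particle

print(best_match(measured, library))  # matches the 60 nm entry
```

A pass/fail decision then reduces to checking whether the best-matching diameter falls inside the tolerance for the drug-delivery application.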

This new imaging technology requires a research-quality optical microscope, a camera and a microscope stage that can move at preset distances. “The setup is easily under $50,000, which is much less expensive than electron or probe microscopes currently used for measuring materials at the nanoscale,” Attota explains. “This method is another approach to extend the range of optical microscopy from microscale to nanoscale dimensional analysis.” So far, sensitivity to a 3 nm difference in line widths has been demonstrated in the laboratory.




* R. Attota, T.A. Germer and R.M. Silver. Through-focus scanning-optical-microscope imaging method for nanoscale dimensional analysis, Optics Letters 33, 1990 (2008).


September 7, 2008

Nanoclusters of gold are valued catalysts

From the press release:

Electron micrographs showing inactive (left) and active (right) catalysts consisting of gold particles adsorbed on iron oxide. The red circles indicate the presence of individual gold atoms. The yellow circles show the location of subnanometer gold clusters that can effectively catalyze the conversion of carbon monoxide to carbon dioxide. One nanometer is about half the size of a DNA molecule. (Color added for clarity)

Credit: Lehigh University Center for Advanced Materials and Nanotechnology

NIST and partners identify tiny gold clusters as top-notch catalysts

For most of us, gold is only valuable if we possess it in large-sized pieces. However, the “bigger is better” rule isn’t the case for those interested in exploiting gold’s exceptional ability to catalyze a wide variety of chemical reactions, including the oxidation of poisonous carbon monoxide (CO) into harmless carbon dioxide at room temperatures. That process, if industrialized, could potentially improve the effectiveness of catalytic converters that clean automobile exhaust and breathing devices that protect miners and firefighters. For this purpose, nanoclusters—gold atoms bound together in crystals smaller than a strand of DNA—are the size most treasured.

Using a pair of scanning transmission electron microscopy (STEM) instruments for which spherical aberration (a system fault yielding blurry images) is corrected, researchers at the National Institute of Standards and Technology (NIST), Lehigh University (Bethlehem, Pa.) and Cardiff University (Cardiff, Wales, United Kingdom) for the first time achieved state-of-the-art resolution of the active gold nanocrystals adsorbed onto iron oxide surfaces. In fact, the resolution was sensitive enough to visualize individual gold atoms.

The work is reported in the Sept. 5, 2008, issue of Science.

Surface science studies have suggested that there is a critical size range at which gold nanocrystals supported by iron oxide become highly active as catalysts for CO oxidation. However, the theory is based on research using idealized catalyst models made of gold adsorbed on titanium oxide. The NIST/Lehigh/Cardiff aberration-corrected STEM imaging technique allows the researchers to study the real iron oxide catalyst systems as synthesized, identify all of the gold structures present in each sample, and then characterize which cluster sizes are most active in CO conversion.

The research team discovered that size matters a lot—samples ranged from those with little or no catalytic activity (less than 1 percent CO conversion) to others with nearly 100 percent efficiency. Their results revealed that the most active gold nanoclusters for CO conversion are bilayers approximately 0.5-0.8 nanometer in diameter (40 times smaller than the common cold virus) and containing about 10 gold atoms. This finding is consistent with the previous surface science studies done on the gold-titanium oxide models.
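The reported activity criterion can be expressed as a simple filter. The cluster list below is illustrative, not data from the study:

```python
def is_active_cluster(diameter_nm, layers):
    """Flag gold clusters in the size/structure range the study found
    most active for CO oxidation: roughly 0.5-0.8 nm bilayers."""
    return layers == 2 and 0.5 <= diameter_nm <= 0.8

# Hypothetical measured clusters: (diameter in nm, number of atomic layers).
clusters = [(0.3, 1), (0.6, 2), (0.75, 2), (1.2, 3)]
active = [c for c in clusters if is_active_cluster(*c)]
print(active)  # only the two in-range bilayer clusters survive the filter
```

The striking part of the result is how sharp this window is: clusters just outside it showed little or no catalytic activity.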




A.A. Herzing, C.J. Kiely, A.F. Carley, P. Landon and G.J. Hutchings. Identification of active gold nanoclusters on iron oxide supports for CO oxidation. Science, Vol. 321, Issue 5894, Sept. 5, 2008.

June 26, 2008

Quantum images

From KurzweilAI.net:

Physicists Produce Quantum-Entangled Images
PhysOrg.com, June 25, 2008

Researchers from the National Institute of Standards and Technology (NIST) and the University of Maryland (UM) have produced “quantum images,” pairs of information-rich visual patterns whose features are entangled (linked by the laws of quantum physics).


When the two quantum images are matched up and their fluctuations are subtracted, the resulting noise is lower (and so their potential information content is higher) than that of any two classical images.

In addition to promising better detection of faint objects and improved amplification and positioning of light beams, the researchers’ technique for producing quantum images may someday be useful for storing patterns of data in quantum computers and transmitting large amounts of highly secure encrypted information.
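The common-mode subtraction idea can be illustrated classically (the real quantum result beats the shot-noise limit, which no classical simulation can reproduce). In this toy, two images share correlated fluctuations that cancel when the images are subtracted:

```python
import numpy as np

rng = np.random.default_rng(1)
signal = np.zeros((32, 32))  # a blank "scene" for simplicity

# Two images sharing a large correlated fluctuation plus small
# independent noise on each.
common = rng.normal(0, 1.0, signal.shape)
img_a = signal + common + rng.normal(0, 0.1, signal.shape)
img_b = signal + common + rng.normal(0, 0.1, signal.shape)

diff = img_a - img_b  # the common fluctuations cancel
print(np.std(img_a), np.std(diff))  # the residual noise is far smaller
```

In the quantum case the entanglement correlates even the vacuum-level fluctuations, which is what pushes the difference noise below the classical floor.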
Read Original Article>>

April 29, 2008

Nanoassembler prototype announced

From KurzweilAI.net:

US researchers have built a proto-prototype nano assembler
Nanowerk, April 28, 2008

Researchers at the National Institute of Standards and Technology (NIST) have developed an early prototype for a nanoassembler.

The NIST system consists of four Microelectromechanical Systems (MEMS) devices positioned around a centrally located port on a chip into which the starting materials can be placed. Each nanomanipulator is composed of a positioning mechanism with an attached nanoprobe.

By simultaneously controlling the position of each of these nanoprobes, the team can use them to cooperatively assemble a complex structure on a very small scale, using a scanning electron microscope for real-time imaging of the nanomanipulation procedures.
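A toy sketch of the simultaneous-control idea, with nothing taken from NIST's actual control software: each control cycle moves all four probes one bounded step toward their targets, rather than positioning them one at a time.

```python
def step_all(positions, targets, step=1.0):
    """Advance every probe one bounded step toward its target in the
    same control cycle."""
    new = []
    for (x, y), (tx, ty) in zip(positions, targets):
        dx, dy = tx - x, ty - y
        dist = max((dx ** 2 + dy ** 2) ** 0.5, 1e-12)
        scale = min(step / dist, 1.0)  # never overshoot the target
        new.append((x + dx * scale, y + dy * scale))
    return new

# Four probes arranged around a central port, converging on work sites.
probes = [(-10.0, 0.0), (10.0, 0.0), (0.0, -10.0), (0.0, 10.0)]
targets = [(-1.0, 0.0), (1.0, 0.0), (0.0, -1.0), (0.0, 1.0)]
for _ in range(20):
    probes = step_all(probes, targets)
print(probes)  # all four probes arrive at their targets together
```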

The researchers suggest it should be possible to have multiple nanoassemblers working simultaneously to manufacture next-generation nanoelectronics.

Read Original Article>>


March 11, 2008

Tracking nanoparticles in three dimensions

From KurzweilAI.net:

All Done With Mirrors: NIST Microscope Tracks Nanoparticles In 3-D
Photonics Online, Mar. 10, 2008

A new microscope design allows nanotechnology researchers at the National Institute of Standards and Technology (NIST) to track the motions of nanoparticles in solution as they dart around in three dimensions.
Four side views of a nanoparticle floating in solution (left) are reflected up. A microscope above the well sees the real particle (center, right) and four reflections that show the particle’s vertical position. A simple calculation correlates the horizontal and vertical images to determine each particle’s 3-D path.
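The caption's "simple calculation" can be sketched under an idealized geometry (a 45-degree side mirror whose reflected view shifts laterally by the particle's height; the real optics are more involved than this):

```python
def particle_3d(real_xy, reflected_x, mirror_x):
    """Recover (x, y, z) assuming an idealized 45-degree side mirror at
    x = mirror_x: the reflected side view appears shifted past the mirror
    by the particle's height, so z is read off as that lateral offset."""
    x, y = real_xy
    z = reflected_x - mirror_x  # offset in the reflection encodes height
    return (x, y, z)

# Toy numbers (micrometres): particle at x=2, y=3; mirror edge at x=5;
# its reflection appears at x=6, i.e. 1 um past the mirror, so z = 1.
print(particle_3d((2.0, 3.0), 6.0, 5.0))  # (2.0, 3.0, 1.0)
```

Repeating this for each video frame yields the particle's full 3-D trajectory from a single top-down camera.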

The technology may lead to a better understanding of the dynamics of nanoparticles in fluids and, ultimately, process control techniques for “directed self-assembly.” This capitalizes on physical properties and chemical affinities of nanoparticles in solutions to induce them to gather and arrange themselves in desired structures at desired locations.

Potential products include extraordinarily sensitive chemical and biological sensor arrays, and new medical and diagnostic materials based on quantum dots and other nanoscale materials.
Read Original Article>>