David Kirkpatrick

November 12, 2010

Robots with an, ahem, personal touch

Probably got you with the title. Here’s news from the Georgia Institute of Technology.

From the link:

A robot known as “Cody” successfully wiped away blue candy from a test user’s legs and arms without being too forceful, researchers from the Georgia Institute of Technology (led by assistant professor Charlie Kemp) reported at the 2010 IEEE International Conference on Intelligent Robots and Systems (IROS) conference last month.

September 8, 2010

Graphene research may lead to electronics improvement

A fairly radical improvement. Try highly efficient, very-low-heat-producing and smaller electronic devices. I enjoy blogging about nanotech research with real promise for market applications.

From the link:

NIST recently constructed the world’s most powerful and stable scanning-probe microscope, with an unprecedented combination of low temperature (as low as 10 millikelvin, or 10 thousandths of a degree above absolute zero), ultra-high vacuum and high magnetic field. In the first measurements made with this instrument, the team has used its power to resolve the finest differences in the electron energies in graphene, atom-by-atom.

“Going to this resolution allows you to see new physics,” said Young Jae Song, a postdoctoral researcher who helped develop the instrument at NIST and make these first measurements.

And the new physics the team saw raises a few more questions about how the electrons behave in graphene than it answers.

Because of the geometry and electromagnetic properties of graphene’s structure, an electron in any given energy level populates four possible sublevels, called a “quartet.” Theorists have predicted that this quartet of levels would split into different energies when immersed in a magnetic field, but until recently there had not been an instrument sensitive enough to resolve these differences.

“When we increased the magnetic field at extreme low temperatures, we observed unexpectedly complex quantum behavior of the electrons,” said NIST Fellow Joseph Stroscio.

What is happening, according to Stroscio, appears to be a “many-body effect” in which electrons interact strongly with one another in ways that affect their energy levels.

August 11, 2010

Better understanding graphene

Yes, graphene is something of a miracle material (hit this link for my extensive graphene blogging), and yes, it’s proving to be a very vexing material as well. There’s a whole lot of promise, but not so much practice, because graphene is proving to be a very fickle material. Research like this from the Georgia Institute of Technology is particularly important because unlocking the secret life of graphene will allow for increasingly practical applications. Better understanding will lead to better utilization.

The release:

Study of electron orbits in multilayer graphene finds unexpected energy gaps

Electron transport

IMAGE: Stacking of graphene sheets creates regions where the moiré alignment is of type AA (all atoms have neighbors in the layer below), AB (only A atoms have neighbors) or BA…

Researchers have taken one more step toward understanding the unique and often unexpected properties of graphene, a two-dimensional carbon material that has attracted interest because of its potential applications in future generations of electronic devices.

In the Aug. 8 advance online edition of the journal Nature Physics, researchers from the Georgia Institute of Technology and the National Institute of Standards and Technology (NIST) describe for the first time how the orbits of electrons are distributed spatially by magnetic fields applied to layers of epitaxial graphene.

The research team also found that these electron orbits can interact with the substrate on which the graphene is grown, creating energy gaps that affect how electron waves move through the multilayer material. These energy gaps could have implications for the designers of certain graphene-based electronic devices.

“The regular pattern of energy gaps in the graphene surface creates regions where electron transport is not allowed,” said Phillip N. First, a professor in the Georgia Tech School of Physics and one of the paper’s co-authors. “Electron waves would have to go around these regions, requiring new patterns of electron wave interference. Understanding such interference will be important for bi-layer graphene devices that have been proposed, and may be important for other lattice-matched substrates used to support graphene and graphene devices.”

In a magnetic field, an electron moves in a circular trajectory – known as a cyclotron orbit – whose radius depends on the size of the magnetic field and the energy of the electron. For a constant magnetic field, that’s a little like rolling a marble around in a large bowl, First said.

“At high energy, the marble orbits high in the bowl, while for lower energies, the orbit size is smaller and lower in the bowl,” he explained. “The cyclotron orbits in graphene also depend on the electron energy and the local electron potential – corresponding to the bowl – but until now, the orbits hadn’t been imaged directly.”
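
For readers who want rough numbers, here is a quick back-of-the-envelope sketch (my own, not the researchers’) of how orbit energies in graphene depend on magnetic field, using the standard textbook Landau-level formula for massless Dirac electrons and an assumed Fermi velocity of about 1e6 m/s:

import math

# Physical constants (SI units)
HBAR = 1.0546e-34      # reduced Planck constant, J*s
E_CHARGE = 1.602e-19   # elementary charge, C
V_FERMI = 1.0e6        # assumed Fermi velocity in graphene, m/s (textbook value)

def graphene_landau_level_eV(n, b_field_tesla):
    """Energy of the nth Landau level in monolayer graphene, in eV.

    Standard massless-Dirac result: E_n = sgn(n) * v_F * sqrt(2*e*hbar*B*|n|).
    n = 0 gives the zero-energy level unique to graphene.
    """
    sign = (n > 0) - (n < 0)
    energy_joules = sign * V_FERMI * math.sqrt(2 * E_CHARGE * HBAR * b_field_tesla * abs(n))
    return energy_joules / E_CHARGE

for n in (0, 1, 2):
    print(n, round(graphene_landau_level_eV(n, 10.0), 3), "eV at 10 tesla")
# Roughly 0, 0.115 and 0.163 eV: the spacing shrinks as sqrt(n), unlike the
# evenly spaced levels of ordinary (massive) electrons, and the n = 0 level
# is the zero-energy orbit discussed below.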

Placed in a magnetic field, these orbits normally drift along lines of nearly constant electric potential. But when a graphene sample has small fluctuations in the potential, these “drift states” can become trapped at a hill or valley in the material that has closed constant potential contours. Such trapping of charge carriers is important for the quantum Hall effect, in which precisely quantized resistance results from charge conduction solely through the orbits that skip along the edges of the material.

IMAGE: This graphic shows electrons that move along an equipotential, while those that follow closed equipotentials (as in a potential-energy valley) become localized (right). The arrows denote the magnetic field,…

The study focused on one particular electron orbit: a zero-energy orbit that is unique to graphene. Because electrons are matter waves, interference within a material affects how their energy relates to the velocity of the wave – and reflected waves added to an incoming wave can combine to produce a slower composite wave. Electrons moving through the unique “chicken-wire” arrangement of carbon-carbon bonds in the graphene interfere in a way that leaves the wave velocity the same for all energy levels.
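
That “same velocity at all energy levels” statement is just another way of saying graphene’s bands are linear. Here is a tiny comparison (my illustration, not from the paper) of the group velocity in a linear band versus an ordinary parabolic band:

HBAR = 1.0546e-34      # reduced Planck constant, J*s
V_FERMI = 1.0e6        # m/s, assumed graphene Fermi velocity
M_ELECTRON = 9.11e-31  # kg, free-electron mass used for the parabolic comparison

def v_group_linear(k):
    # Graphene-like band: E = hbar * v_F * k, so v = (1/hbar) dE/dk = v_F for every k
    return V_FERMI

def v_group_parabolic(k):
    # Conventional band: E = hbar^2 k^2 / (2m), so v = hbar * k / m grows with energy
    return HBAR * k / M_ELECTRON

for k in (1e8, 5e8, 1e9):   # wavevectors in 1/m
    print(f"k = {k:.0e} 1/m   linear: {v_group_linear(k):.2e} m/s   parabolic: {v_group_parabolic(k):.2e} m/s")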

In addition to finding that energy states follow contours of constant electric potential, the researchers discovered specific areas on the graphene surface where the orbital energy of the electrons changes from one atom to the next. That creates an energy gap within isolated patches on the surface.

“By examining their distribution over the surface for different magnetic fields, we determined that the energy gap is due to a subtle interaction with the substrate, which consists of multilayer graphene grown on a silicon carbide wafer,” First explained.

In multilayer epitaxial graphene, each layer’s symmetrical sublattice is rotated slightly with respect to the next. In prior studies, researchers found that the rotations served to decouple the electronic properties of each graphene layer.

“Our findings hold the first indications of a small position-dependent interaction between the layers,” said David L. Miller, the paper’s first author and a graduate student in First’s laboratory. “This interaction occurs only when the size of a cyclotron orbit – which shrinks as the magnetic field is increased – becomes smaller than the size of the observed patches.”

The origin of the position dependent interaction is believed to be the “moiré pattern” of atomic alignments between two adjacent layers of graphene. In some regions, atoms of one layer lie atop atoms of the layer below, while in other regions, none of the atoms align with the atoms in the layer below. In still other regions, half of the atoms have neighbors in the underlayer, an instance in which the symmetry of the carbon atoms is broken and the Landau level – discrete energy level of the electrons – splits into two different energies.

Experimentally, the researchers examined a sample of epitaxial graphene grown at Georgia Tech in the laboratory of Professor Walt de Heer, using techniques developed by his research team over the past several years.

They used the tip of a custom-built scanning-tunneling microscope (STM) to probe the atomic-scale electronic structure of the graphene in a technique known as scanning tunneling spectroscopy. The tip was moved across the surface of a 100-square nanometer section of graphene, and spectroscopic data was acquired every 0.4 nanometers.

The measurements were done at 4.3 degrees Kelvin to take advantage of the fact that energy resolution is proportional to the temperature. The scanning-tunneling microscope, designed and built by Joseph Stroscio at NIST’s Center for Nanoscale Science and Technology, used a superconducting magnet to provide the magnetic fields needed to study the orbits.
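
A rough number shows why the low temperature matters: the thermal smearing of tunneling spectra scales with k_B*T, and a commonly quoted rule of thumb puts the broadening at roughly 3.5*k_B*T. My own quick arithmetic, for illustration:

K_BOLTZMANN_EV = 8.617e-5   # Boltzmann constant in eV/K

def thermal_broadening_meV(temperature_kelvin, prefactor=3.5):
    """Approximate thermal energy broadening of tunneling spectra, in meV.

    The 3.5*k_B*T prefactor is a common rule of thumb for Fermi-function
    smearing in scanning tunneling spectroscopy.
    """
    return prefactor * K_BOLTZMANN_EV * temperature_kelvin * 1000

print(thermal_broadening_meV(4.3))   # ~1.3 meV at the 4.3 K used here
print(thermal_broadening_meV(300))   # ~90 meV at room temperature, far too coarse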

According to First, the study raises a number of questions for future research, including how the energy gaps will affect electron transport properties, how the observed effects may impact proposed bi-layer graphene coherent devices – and whether the new phenomenon can be controlled.

“This study is really a stepping stone on a long path to understanding the subtleties of graphene’s interesting properties,” he said. “This material is different from anything we have worked with before in electronics.”

###

In addition to those already mentioned, the study also included Walt de Heer, Kevin D. Kubista, Ming Ruan, and Markus Kinderman from Georgia Tech and Gregory M. Rutter from NIST. The research was supported by the National Science Foundation, the Semiconductor Research Corporation and the W.M. Keck Foundation. Additional assistance was provided by Georgia Tech’s Materials Research Science and Engineering Center (MRSEC).

July 15, 2010

Start cowering under your afghans …

… the robots are coming. (Just to let you know, the title refers to an old Saturday Night Live short selling insurance to the elderly for protection from robots.)

The Conference on Artificial Intelligence is making the case for robotics as a major growth industry in the very near future.

From the link:

“Early on there was this dream that robots could be generally intelligent; that they would rival and surpass humans in their abilities to do things,” Leslie Kaelbling, a professor of computer science and engineering at MIT, said at the conference. “The current commercial reality is pretty different.”

A lot of AI research fragmented in directions away from robotics, creating algorithms that underpin business intelligence, finance, Web and other uses. AI got separated from robotics because the machines are a pain: physical and unreliable. However, “They are getting better,” Kaelbling said.

Today, robotics researchers have faster computers and more reliable machinery, and many of the algorithms used in routine robotic tasks have already been built, said Kaelbling, who asked this research community whether it was time to give robotics another try.

December 8, 2009

Nanotech in space

Well, theoretically in space, in the form of improving ion-propulsion systems. In reality, if interstellar travel, or even travel within the solar system beyond Mars, has any hope of feasibility, it’s going to require a major breakthrough in getting from point A (ostensibly Earth) to point B. It’ll be interesting to see where emerging science like nanotech will take us. We’re already seeing actual medical, electronics and other uses for nanotechnology. I do a lot of nanotech blogging because the field is so exciting and still harbors untold potential.

From the first link:

Ion-propulsion systems have propelled a handful of Earth-orbiting and interplanetary spacecraft over the past 50 years. Now researchers at the Georgia Institute of Technology are developing more efficient ion thrusters that use carbon nanotubes for a vital component.

Ion propulsion works by accelerating electrically charged, or ionized, particles to propel a spacecraft. One of the most common ion engines, known as a “Hall Effect” thruster, ionizes gas using electrons trapped in a magnetic field. The resulting ions are then accelerated using the potential maintained between an anode and a cathode. But some of the emitted electrons must also be used to neutralize the ions in the plume emitted from the spacecraft, to prevent the spacecraft from becoming electrically charged. Existing Hall Effect thrusters must use about 10 percent of the spacecraft’s xenon gas propellant to create the electrons needed to both run the engine and neutralize the ion beam.
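
As a rough illustration of why ion engines are so propellant-efficient, here is a back-of-the-envelope sketch (my numbers, not Georgia Tech’s) of the ideal exhaust velocity of a singly charged xenon ion for an assumed 300-volt accelerating potential:

import math

E_CHARGE = 1.602e-19            # C
AMU = 1.6605e-27                # kg
XENON_ION_MASS = 131.29 * AMU   # kg, singly charged xenon ion

def ideal_exhaust_velocity(accel_voltage):
    """Ideal exhaust velocity (m/s) of a xenon ion falling through the
    anode-cathode potential: v = sqrt(2 * q * V / m)."""
    return math.sqrt(2 * E_CHARGE * accel_voltage / XENON_ION_MASS)

print(round(ideal_exhaust_velocity(300.0)), "m/s")
# ~21,000 m/s for an assumed 300 V potential, several times the exhaust
# velocity of chemical rockets, which is why so little propellant is needed.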

The Georgia Tech researchers created a field emission cathode for the thruster using carbon nanotubes. In this type of cathode, electrons are emitted after they tunnel through a potential barrier. The carbon nanotube design is especially efficient because nanotubes are incredibly strong and electrically conductive. “By using carbon nanotubes, we can get all the electrons we need without using any propellant,” says Mitchell Walker, principal investigator of the project and an assistant professor in the High-Power Electric Propulsion Laboratory at Georgia Tech. This means that 10 percent more of the ion thruster’s propellant is available for the actual mission, extending a spacecraft’s lifetime.


Efficient emitters: A micrograph of square arrays of carbon nanotubes on a one centimeter by one centimeter silicon wafer. The arrays are designed for use in an experimental cathode.
Credit: Georgia Institute of Technology

June 5, 2009

Graphene beats copper in IC connections

It’s been a while since I’ve had the chance to blog about graphene, but here is the latest on the carbon nanomaterial.  (Be sure to hit the second link for images.)

The release:

Graphene May Have Advantages Over Copper for Future IC Interconnects

New Material May Replace Traditional Metal at Nanoscale Widths

Atlanta (June 4, 2009) — The unique properties of thin layers of graphite—known as graphene—make the material attractive for a wide range of potential electronic devices. Researchers have now experimentally demonstrated the potential for another graphene application: replacing copper for interconnects in future generations of integrated circuits.

In a paper published in the June 2009 issue of the IEEE journal Electron Device Letters, researchers at the Georgia Institute of Technology report detailed analysis of resistivity in graphene nanoribbon interconnects as narrow as 18 nanometers.

The results suggest that graphene could out-perform copper for use as on-chip interconnects—tiny wires that are used to connect transistors and other devices on integrated circuits. Use of graphene for these interconnects could help extend the long run of performance improvements for silicon-based integrated circuit technology.

“As you make copper interconnects narrower and narrower, the resistivity increases as the true nanoscale properties of the material become apparent,” said Raghunath Murali, a research engineer in Georgia Tech’s Microelectronics Research Center and the School of Electrical and Computer Engineering. “Our experimental demonstration of graphene nanowire interconnects on the scale of 20 nanometers shows that their performance is comparable to even the most optimistic projections for copper interconnects at that scale. Under real-world conditions, our graphene interconnects probably already out-perform copper at this size scale.”

Beyond resistivity improvement, graphene interconnects would offer higher electron mobility, better thermal conductivity, higher mechanical strength and reduced capacitance coupling between adjacent wires.

“Resistivity is normally independent of the dimension—a property inherent to the material,” Murali noted. “But as you get into the nanometer-scale domain, the grain sizes of the copper become important and conductance is affected by scattering at the grain boundaries and at the side walls. These add up to increased resistivity, which nearly doubles as the interconnect sizes shrink to 30 nanometers.”
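
To make that last point concrete, here is a deliberately crude toy model (my illustration, fitted only to the “nearly doubles at 30 nanometers” statement above, not the actual physics used by the researchers) of how size-dependent scattering inflates resistivity as a wire narrows:

COPPER_BULK_RESISTIVITY = 1.7e-8   # ohm*m, room-temperature bulk copper

def toy_copper_resistivity(width_nm, size_constant_nm=30.0):
    """Crude illustrative scaling: rho(w) = rho_bulk * (1 + c / w).

    Choosing c = 30 nm just reproduces the quoted behavior that resistivity
    roughly doubles at a 30 nm linewidth; real treatments (surface and
    grain-boundary scattering models) are considerably more involved.
    """
    return COPPER_BULK_RESISTIVITY * (1 + size_constant_nm / width_nm)

for w in (100, 50, 30, 20):
    print(f"{w} nm wide: {toy_copper_resistivity(w):.2e} ohm*m")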

The research was supported by the Interconnect Focus Center, which is one of the Semiconductor Research Corporation/DARPA Focus Centers, and the Nanoelectronics Research Initiative through the INDEX Center.

Murali and collaborators Kevin Brenner, Yinxiao Yang, Thomas Beck and James Meindl studied the electrical properties of graphene layers that had been taken from a block of pure graphite. They believe the attractive properties will ultimately also be measured in graphene fabricated using other techniques, such as growth on silicon carbide, which now produces graphene of lower quality but has the potential for achieving higher quality.

Because graphene can be patterned using conventional microelectronics processes, the transition from copper could be made without integrating a new manufacturing technique into circuit fabrication.

“We are optimistic about being able to use graphene in manufactured systems because researchers can already grow layers of it in the lab,” Murali noted. “There will be challenges in integrating graphene with silicon, but those will be overcome. Except for using a different material, everything we would need to produce graphene interconnects is already well known and established.”

Experimentally, the researchers began with flakes of multi-layered graphene removed from a graphite block and placed onto an oxidized silicon substrate. They used electron beam lithography to construct four electrode contacts on the graphene, then used lithography to fabricate devices consisting of parallel nanoribbons of widths ranging between 18 and 52 nanometers. The three-dimensional resistivity of the nanoribbons on 18 different devices was then measured using standard analytical techniques at room temperature.
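
For anyone curious how a resistivity number comes out of such a measurement, here is a minimal sketch of the standard conversion from a measured four-point resistance to resistivity, with made-up example dimensions (not values from the paper):

def resistivity_from_four_point(resistance_ohm, width_m, thickness_m, length_m):
    """Convert a four-point resistance to resistivity: rho = R * A / L,
    where A = width * thickness is the ribbon cross-section."""
    return resistance_ohm * (width_m * thickness_m) / length_m

# Hypothetical example values, purely for illustration:
rho = resistivity_from_four_point(
    resistance_ohm=2000.0,   # measured resistance between voltage probes
    width_m=20e-9,           # 20 nm wide ribbon
    thickness_m=1e-9,        # nanometer-scale thickness
    length_m=1e-6,           # 1 micron between voltage probes
)
print(f"{rho:.1e} ohm*m")    # ~4e-8 ohm*m for these made-up numbers,
                             # within a few times bulk copper (1.7e-8 ohm*m)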

The best of the graphene nanoribbons showed conductivity equal to that predicted for copper interconnects of the same size. Because the comparisons were between non-optimized graphene and optimistic estimates for copper, they suggest that performance of the new material will ultimately surpass that of the traditional interconnect material, Murali said.

“Even graphene samples of moderate quality show excellent properties,” he explained. “We are not using very high levels of optimization or especially clean processes. With our straightforward processing, we are getting graphene interconnects that are essentially comparable to copper. If we do this more optimally, the performance should surpass copper.”

Though one of graphene’s key properties is reported to be ballistic transport—meaning electrons can flow through it without resistance—the material’s actual conductance is limited by factors that include scattering from impurities, line-edge roughness and from substrate phonons—vibrations in the substrate lattice.

Use of graphene interconnects could help facilitate continuing increases in integrated circuit performance once features sizes drop to approximately 20 nanometers, which could happen in the next five years, Murali said. At that scale, the increased resistance of copper interconnects could offset performance increases, meaning that without other improvements, higher density wouldn’t produce faster integrated circuits.

“This is not a roadblock to achieving scaling from one generation to the next, but it is a roadblock to achieving increased performance,” he said. “Dimensional scaling could continue, but because we would be giving up so much in terms of resistivity, we wouldn’t get a performance advantage from that. That’s the problem we hope to solve by switching to a different materials system for interconnects.”

March 27, 2009

Nanogenerators

Very cool nanotech. Not sure how close this is to market, but man it’s very cool.

The release:

New nanogenerator may charge iPods and cell phones with a wave of the hand

IMAGE: Pictured is a schematic illustration shows the microfiber-nanowire hybrid nanogenerator, which is the basis of using fabrics for generating electricity.

SALT LAKE CITY, March 26, 2009 — Imagine if all you had to do to charge your iPod or your BlackBerry was to wave your hand, or stretch your arm, or take a walk? You could say goodbye to batteries and never have to plug those devices into a power source again.

In research presented here today at the American Chemical Society’s 237th National Meeting, scientists from Georgia describe technology that converts mechanical energy from body movements or even the flow of blood in the body into electric energy that can be used to power a broad range of electronic devices without using batteries.

“This research will have a major impact on defense technology, environmental monitoring, biomedical sciences and even personal electronics,” says lead researcher Zhong Lin Wang, Regents’ Professor, School of Material Science and Engineering at the Georgia Institute of Technology. The new “nanogenerator” could have countless applications, among them a way to run electronic devices used by the military when troops are far in the field.

The researchers describe harvesting energy from the environment by converting low-frequency vibrations, like simple body movements, the beating of the heart or movement of the wind, into electricity, using zinc oxide (ZnO) nanowires that conduct the electricity. The ZnO nanowires are piezoelectric — they generate an electric current when subjected to mechanical stress. The diameter and length of the wire are 1/5,000th and 1/25th the diameter of a human hair.
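
To turn those fractions into actual dimensions, a quick calculation, assuming a typical human hair diameter of about 80 micrometers (my assumption; real hair runs very roughly 50 to 100 micrometers):

HAIR_DIAMETER_M = 80e-6   # assumed typical human hair diameter, ~80 micrometers

wire_diameter = HAIR_DIAMETER_M / 5000   # "1/5,000th the diameter of a human hair"
wire_length = HAIR_DIAMETER_M / 25       # "1/25th the diameter of a human hair"

print(f"nanowire diameter ~ {wire_diameter * 1e9:.0f} nm")   # ~16 nm
print(f"nanowire length   ~ {wire_length * 1e6:.1f} um")     # ~3.2 micrometers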

In generating energy from movement, Wang says his team concluded that it was most effective to develop a method that worked at low frequencies and was based on flexible materials. The ZnO nanowires met these requirements. At the same time, he says a real advantage of this technology is that the nanowires can be grown easily on a wide variety of surfaces, and the nanogenerators will operate in the air or in liquids once properly packaged. Among the surfaces on which the nanowires can be grown are metals, ceramics, polymers, clothing and even tents.

“Quite simply, this technology can be used to generate energy under any circumstances as long as there is movement,” according to Wang.

To date, he says that there have been limited methods created to produce nanopower despite the growing need by the military and defense agencies for nanoscale sensing devices used to detect bioterror agents. The nanogenerator would be particularly critical to troops in the field, where they are far from energy sources and need to use sensors or communication devices. In addition, having a sensor which doesn’t need batteries could be extremely useful to the military and police sampling air for potential bioterrorism attacks in the United States, Wang says.

While biosensors have been miniaturized and can be implanted under the skin, he points out that these devices still require batteries, and the new nanogenerator would offer much more flexibility.

A major advantage of this new technology is that many nanogenerators can produce electricity continuously and simultaneously. On the other hand, the greatest challenge in developing these nanogenerators is to improve the output voltage and power, he says.

Last year Wang’s group presented a study on nanogenerators driven by ultrasound. Today’s research represents a much broader application of nanogenerators as driven by low-frequency body movement.

 

###

 

The study was funded by the Defense Advanced Research Projects Agency, the Department of Energy, the National Institutes of Health and the National Science Foundation.

The American Chemical Society is a nonprofit organization chartered by the U.S. Congress. With more than 154,000 members, ACS is the world’s largest scientific society and a global leader in providing access to chemistry-related research through its multiple databases, peer-reviewed journals and scientific conferences. Its main offices are in Washington, D.C., and Columbus, Ohio.

 

December 7, 2008

Making headway toward quantum networks

Quantum computing is coming. Get ready.

The release:

New record for information storage and retrieval lifetime advances quantum networks

Quantum memory boost

IMAGE: Ran Zhao and Yaroslav Dudin, graduate students in the Georgia Tech School of Physics, adjust optics in a system used to study quantum memory.

Physicists have taken a significant step toward creation of quantum networks by establishing a new record for the length of time that quantum information can be stored in and retrieved from an ensemble of very cold atoms. Though the information remains usable for just milliseconds, even that short lifetime should be enough to allow transmission of data from one quantum repeater to another on an optical network.

The new record – 7 milliseconds for rubidium atoms stored in a dipole optical trap – is scheduled to be reported December 7 in the online version of the journal Nature Physics by researchers at the Georgia Institute of Technology. The previous record for storage time was 32 microseconds, a difference of more than two orders of magnitude.

“This is a really significant step for us, because conceptually it allows long memory times necessary for long-distance quantum networking,” said Alex Kuzmich, associate professor in the Georgia Tech School of Physics and a co-author of the paper. “For multiple architectures with many memory elements, several milliseconds would allow the movement of light across a thousand kilometers.”

The keys to extending the storage time included the use of a one-dimensional optical lattice to help confine the atoms and selection of an atomic phase that is insensitive to magnetic effects. The research was sponsored by the National Science Foundation, the A.P. Sloan Foundation and the U.S. Office of Naval Research.

IMAGE: A research group from the Georgia Institute of Technology poses with optical equipment used to study quantum memory.

The general purpose of quantum networking or quantum computing is to distribute entangled qubits – two correlated data bits that are either “0” or “1” – over long distances. The qubits would travel as photons across existing optical networks that are part of the global telecommunications system.

Because of loss in the optical fiber that makes up networks, repeaters must be installed at regular intervals – about every 100 kilometers – to boost the signal. Those repeaters will need quantum memory to receive the photonic signal, store it briefly and then produce a photonic signal that will carry the information to the next node, and on to its final destination.
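
That roughly 100-kilometer spacing follows directly from fiber attenuation, and it also shows why a memory lifetime of several milliseconds is the relevant target. A quick sketch, assuming the roughly 0.2 dB/km loss of standard telecom fiber near 1550 nm and a refractive index of about 1.5 (my assumed numbers):

FIBER_LOSS_DB_PER_KM = 0.2    # typical attenuation of telecom fiber near 1550 nm
LIGHT_SPEED_M_PER_S = 3.0e8
FIBER_INDEX = 1.5             # assumed refractive index; light moves at ~2e8 m/s in glass

def photon_survival_probability(distance_km):
    """Fraction of photons surviving a fiber span of the given length."""
    return 10 ** (-FIBER_LOSS_DB_PER_KM * distance_km / 10)

def one_way_travel_time_ms(distance_km):
    return distance_km * 1000 / (LIGHT_SPEED_M_PER_S / FIBER_INDEX) * 1000

print(photon_survival_probability(100))    # ~1% survive a 100 km hop, hence repeaters
print(photon_survival_probability(1000))   # ~1e-20 with no repeaters: hopeless
print(one_way_travel_time_ms(1000))        # ~5 ms for light to cross 1,000 km of fiber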

For their memory, the Georgia Tech researchers used an ensemble of rubidium-87 atoms that is cooled to almost absolute zero to minimize atomic motion. To store information, the entire atomic ensemble is exposed to laser light carrying a signal, which allows each atom to participate in the storage as part of a “collective excitation.”

In simple terms, each atom “sees” the incoming signal – which is a rapidly oscillating electromagnetic field – slightly differently. Each atom is therefore imprinted with phase information that can later be “read” from the ensemble with another laser.

IMAGE: Associate professor Alex Kuzmich and research scientist Stewart Jenkins, both from the Georgia Tech School of Physics, adjust optics in a system used to study quantum memory.

Even though they are very cold, the atoms of the ensemble are free to move in a random way. Because each atom stores a portion of the quantum information and that data’s usefulness depends on each atom’s location in reference to other atoms, significant movement of the atoms could destroy the information.

“The challenge for us in implementing these long-lived quantum memories is to preserve the phase imprinting in the atomic ensemble for as long as possible,” explained Stewart Jenkins, a School of Physics research scientist who participated in the research. “It turns out that is difficult to do experimentally.”

To extend the lifetime of their memory, the Georgia Tech researchers took two approaches. The first was to confine the atoms using an optical lattice composed of laser beams. Because of the laser frequencies chosen, the atoms are attracted to specific locations within the lattice, though they are not held tightly in place.

Because the ensemble atoms are affected by environmental conditions such as magnetism, the second strategy was to use atoms that had been pumped to the so-called “clock transition state” that is relatively insensitive to magnetic fields.

“The most critical aspect to getting these long coherence times was the optical lattice,” Jenkins explained. “Although atoms had been confined in optical lattices before, what we did was to use this tool in the context of implementing quantum memory.”

Other research teams have stored quantum information in single atoms or ions. This simpler approach allows longer storage periods, but has limitations, he said.

“The advantage of using these ensembles as opposed to single atoms is that if we shine into them a ‘read’ laser field, because these atoms have a particular phase imprinted on them, we know with a high degree of probability that we are going to get a second photon – the idler photon – coming out in a particular direction,” Jenkins explained. “That allows us to put a detector in the right location to read the photon.”

Though the work significantly advances quantum memories, practical quantum networks probably are at least a decade away, Kuzmich believes.

“In practice, you will need to make robust repeater nodes with hundreds of memory elements that can be quickly manipulated and coupled to the fiber,” he said. “There is likely to be slow progress in this area with researchers gaining better and better control of quantum systems. Eventually, they will get good enough so we can make a jump to having systems that can work outside the laboratory environment.”

 

###

 

In addition to Kuzmich and Jenkins, the research team included Ran Zhao, Yaroslav Dudin, Corey Campbell, Dzmitry Matsukevich, and Brian Kennedy, a professor in the School of Physics.

September 26, 2008

Hewlett-Packard Laboratories plans exascale data centers

From KurzweilAI.net — The latest in supercomputing news: Hewlett-Packard Laboratories, working with the Georgia Institute of Technology, is planning exascale data centers utilizing farms of petaflop computers.

HP Labs aims at exascale computing
EE Times, Sep. 19, 2008

Hewlett-Packard Laboratories and Georgia Institute of Technology are planning to develop exascale datacenters with farms of petaflop-caliber computers to achieve 1,000-fold increases over the world’s fastest computers, using virtualized multi-core processors with special-purpose chips, like graphics accelerators.

Enhanced large-scale applications include climate modeling, biological simulations, drug discovery, national defense, energy assurance and advanced materials development.

An exaflop is 1,000 petaflops, or 10^18 flops (floating point operations per second). As noted in Ray Kurzweil’s The Singularity is Near, estimates of human brain equivalence range from 10^14 (Moravec) to 10^16 (Kurzweil). – Ed.

 

July 28, 2008

Nanomagnets fighting cancer

From KurzweilAI.net — Nanoparticle-sized magnets specially coated to “catch” ovarian cancer cells are a new cancer-fighting treatment.

 

Magnets Capture Cancer Cells
Technology Review, July 22, 2008

Georgia Institute of Technology researchers have developed magnetic nanoparticles (coated with a specialized targeting peptide molecule) designed to latch onto ovarian cancer cells in mice and drag them out of the abdominal fluid to prevent metastasis.


Nanoparticles (red) on cancer cell

See Also New Nano Weapon against Cancer

 