David Kirkpatrick

October 11, 2010

Congrats to Sully

Filed under: et.al., Media, Politics — David Kirkpatrick @ 10:33 am

Many thanks and congratulations to Andrew Sullivan for reaching ten years of blogging at his Daily Dish. It’s simply one of the best, and most honest, political (and, of course, more) blogs out there. He wears his heart on his sleeve most of the time and every once in a while makes a fairly harsh snap judgment on any number of topics, but one thing Sullivan has always done is remain intellectually curious and open. As he himself has put it more than once, you can watch him change his mind on topics in real time over weeks and months of blog posts. The Daily Dish has long been a daily read for me, and I doubt that changes anytime soon.

October 8, 2010

Watch out for Facebook’s “groups” overhaul

Filed under: Business, et.al., Media, Technology — David Kirkpatrick @ 9:56 am

Once again Facebook creates a PR headache for itself with the changes to Facebook groups. You just might find yourself part of a group you don’t really want to be a member of …

From the link:

That was followed by general confusion, with some reporting that Facebook’s new feature could be used to unilaterally add anyone to a group.

But that isn’t the case. The groups feature now lets users automatically add existing friends to groups, but they can’t do this with people they don’t know.

How did Zuckerberg get added to NAMBLA then? That’s all down to tech blogger Arrington. “I typed in his name and hit enter,” Arrington wrote on TechCrunch. “He’s my Facebook friend, I therefore have the right to add him.”

Arrington added that “as soon as Zuckerberg unsubscribed I lost the ability to add him to any further groups at all, another protection against spamming and pranks.”

A Facebook spokeswoman confirmed that group members can only add their friends to the group. “If you have a friend that is adding you to groups you do not want to belong to, or they are behaving in a way that bothers you, you can tell them to stop doing it, block them or remove them as a friend — and they will no longer ever have the ability to add you to any group,” she wrote in an e-mail. “If you don’t trust someone to look out for you when making these types of decisions on the site, we’d suggest that you shouldn’t be friends on Facebook.”


October 2, 2010

The Geological Society of America goes 3D

I think the title says it all …

The release:

GSA Press Release – October 2010 Geosphere Highlights

Boulder, CO, USA – This month’s themed issue, “Advances in 3D imaging and analysis of geomaterials,” edited by Guilherme A.R. Gualda, Don R. Baker, and Margherita Polacci, features papers from the 2009 AGU Joint Assembly session “Advances in 3-D Imaging and Analysis of Rocks and Other Earth Materials.” Studies include 3-D imaging and analysis techniques for Wild 2 comet material returned from the NASA Stardust mission and the first 3-D X-ray scans of crystals from the Dry Valleys, Antarctica.

Keywords: Voxels, microtomography, fractures, NASA Stardust Mission, Wild 2, aerogel, Dry Valleys, Antarctica, geophysics, microearthquakes, Mexico, zircon dating, database, InSAR.

Highlights are provided below. Review abstracts for this issue at http://geosphere.gsapubs.org/.

Non-media requests for articles may be directed to GSA Sales and Service, gsaservice@geosociety.org.

***************
Introduction: Advances in 3D imaging and analysis of geomaterials
Guilherme A.R. Gualda, Vanderbilt University, Earth & Environmental Sciences, Station B #35-1805, Nashville, Tennessee 37235, USA

Excerpt: Beginning in the 1970s, the availability of computers led to the development of procedures for computer-assisted acquisition and reconstruction of 3-D tomographic data, in particular using X-rays. X-ray tomography is now a mature technique that is used routinely. It has been applied to a wide array of geomaterials, from rocks to fossils to diverse experimental charges, to name a few. The ability to create 3-D maps with millions to billions of volume elements (voxels) created the challenge of processing and analyzing such large amounts of data. While qualitative observations in 3-D yield significant insights into the nature of geomaterials and geological processes, it is in the pursuit of quantitative data that 3-D imaging shows its greatest potential. The continued improvements in computer capabilities have led to ever more sophisticated procedures for 3D image analysis. The papers in this issue encompass a wide range of topics, from applications of established techniques to a variety of materials, the development of new imaging techniques, and the description of improved imaging and analysis techniques.

**********
3D imaging of volcano gravitational deformation by computerized X-ray micro-tomography
M. Kervyn et al., Dept. of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, 9000 Gent, Belgium

Volcanoes are known to be unstable constructs that can deform gravitationally when they build upon weak sedimentary layers. The structures and velocity of deformation depend on the volcano loading and the properties of the underlying layers. These processes can be studied with scaled laboratory experiments in which volcanoes are simulated by a mixture of sand and plaster. Silicone is used to simulate the weak underlying layers. This team from Belgium and France, led by M. Kervyn of Ghent University, presents the results of imaging such experiments with X-rays. The micro-tomography technology used in imaging these experiments enables the virtual reconstruction of the 3-D shape of the deformed experiment. Virtual cross-sections through the experiment provide a new way to characterize the faults and fissures forming within the experimental volcano during its deformation. Results from a range of experiments with different geometrical characteristics provide a better understanding of the impact of such gravitational deformation, currently recorded at several well-known volcanoes on Earth (e.g. Etna, Kilauea), on the construct’s structure at depth and its potential zones of weakness.

**********
Three-dimensional measurement of fractures in heterogeneous materials using high-resolution X-ray computed tomography
Richard A. Ketcham et al., Dept. of Geological Sciences, Jackson School of Geosciences, 1 University Station C1100, The University of Texas at Austin, Austin, Texas 78712-0254, USA

When present, fractures tend to dominate fluid flow through rock bodies, and characterizing fracture networks is necessary for understanding these flow regimes. Specialized CAT scanning has long been an important tool in imaging fractures in 3-D in rock samples. However, a number of factors have reduced the fidelity of such data, including the natural heterogeneity of real rocks and the limited resolution of CAT scanning. Richard A. Ketcham of The University of Texas at Austin and colleagues present new, general methods for overcoming these problems and extracting the best-quality information possible concerning fracture aperture, roughness, and orientation, even in highly heterogeneous rocks. The methods are also general enough that they can be applied to similar situations, such as measuring mineral veins. This work was funded in part by U.S. National Science Foundation grants EAR-0113480 and EAR-0439806.

**********
Laser scanning confocal microscopy of comet material in aerogel
Michael Greenberg and Denton S. Ebel, Dept. of Earth and Planetary Sciences, American Museum of Natural History, Central Park West at 79th Street, New York, New York 10024, USA

The NASA Stardust mission returned extraterrestrial material from the comet Wild 2 — the first solid sample-return mission since the Apollo era. Particles from the tail of Wild 2 were captured in aerogel, a low-density, translucent silica foam, at a relative velocity of 6.1 km per second. Upon impact into the aerogel, particles from the tail of the comet were fragmented, melted, and ablated, creating cavities, or tracks — each of which is unique to the original particle before capture. Michael Greenberg and Denton S. Ebel of the American Museum of Natural History present nondestructive 3-D imaging and analysis techniques for comet material returned from the NASA Stardust mission. The methods described in this paper represent the highest resolution 3-D images of Stardust material to date. The procedures described here will easily extend to other translucent samples in the geosciences.

**********

Quantum dots may lead to ultraefficient solar cells

This sounds promising.

From the link (emphasis mine):

Although researchers have steadily increased the amount of electricity that solar cells can produce, they face fundamental limits because of the physics involved in converting photons to electrons in semiconductor materials. Now researchers at the University of Wyoming have demonstrated that by using novel nanomaterials called quantum dots, it might be possible to exceed those limits and produce ultraefficient solar cells.

The theoretical limitation of solar cells has to do with the widely varying amounts of energy from photons in sunlight. The amount varies depending on the color of the light. No matter how energetic the incoming photons are, however, solar cells can only convert one photon into one electron with a given amount of energy. Any extra energy is lost as heat. Scientists have hypothesized that quantum dots, because of their unusual electronic properties, could convert some of this extra energy into electrons. They’ve calculated that this approach could increase the theoretical maximum efficiency of solar cells by about 50 percent.

Solar dots: A micrograph shows lead-sulfide quantum dots, each about five nanometers across, coating an electrode of titanium dioxide.
Credit: Science
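The one-photon-one-electron limit the excerpt describes is easy to see numerically. Here’s a quick sketch of my own (not from the article): photon energy is E = hc/λ, so bluer photons carry more energy, and in a conventional cell anything above the band gap becomes heat. The 1.1 eV gap below is roughly silicon’s, used purely for illustration.

```python
# Sketch (my own, not from the article): why high-energy photons waste energy
# in a conventional solar cell. Photon energy E = h*c / wavelength; any energy
# above the semiconductor band gap is lost as heat after one photon -> one electron.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of a photon of the given wavelength, in electron-volts."""
    return H * C / (wavelength_nm * 1e-9) / EV

band_gap_ev = 1.1  # roughly silicon's band gap, for illustration

for name, nm in [("red", 650), ("green", 530), ("blue", 450)]:
    e = photon_energy_ev(nm)
    wasted = max(0.0, e - band_gap_ev)
    print(f"{name}: {e:.2f} eV photon, {wasted:.2f} eV lost as heat")
```

A blue photon wastes well over an electron-volt per conversion here, which is the extra energy quantum dots are hypothesized to recover.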

September 30, 2010

Metamaterials and warp drives

Filed under: et.al., Science — David Kirkpatrick @ 2:20 pm

It’s almost time to just file metamaterials under “that science fiction stuff.” Usually you hear about metamaterials around these parts in posts about actual invisibility cloaking technology, and here’s one about metamaterials and warp drives. Metamaterials — turning science fiction into science fact …

From the link:

That means physicists can use metamaterials to simulate the universe itself and all the weird phenomena of general relativity. We’ve looked at various attempts to recreate black holes, the Big Bang and even multiverses.

But there’s another thing that general relativity appears to allow: faster than light travel. In 1994, the Mexican physicist Miguel Alcubierre realised that while relativity prevents faster-than-light travel relative to the fabric of spacetime, it places no restriction on the speed at which regions of spacetime can move relative to each other.

That suggests a way of building a warp drive. Alcubierre imagined a small volume of flat spacetime in which a spacecraft sits, surrounded by a bubble of spacetime that shrinks in the direction of travel, bringing your destination nearer, and stretches behind you. He showed that this shrinking and stretching could enable the bubble–and the spaceship it contained–to move at superluminal speeds.

Today, Igor Smolyaninov at the University of Maryland, points out that if these kinds of bubbles are possible in spacetime, then it ought to be possible to simulate them inside a metamaterial.
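For reference, the warp-bubble geometry the excerpt describes comes from Alcubierre’s 1994 paper; a standard form of his line element (sketched here from memory, so treat the details with appropriate care) is:

```latex
% Alcubierre's warp-bubble metric (1994), in units with c = 1.
% The ship sits at x_s(t); the shaping function f(r_s) is ~1 inside
% the bubble and falls to 0 far away from it.
ds^2 = -dt^2 + \bigl(dx - v_s\, f(r_s)\, dt\bigr)^2 + dy^2 + dz^2,
\qquad v_s = \frac{dx_s}{dt},
\qquad r_s = \sqrt{(x - x_s)^2 + y^2 + z^2}
```

Inside the bubble, where f ≈ 1, spacetime is flat and the ship feels no acceleration; all of the contraction ahead and expansion behind is carried by the gradient of f at the bubble wall.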

September 29, 2010

Data mining Twitter

Filed under: Business, Media, Technology — David Kirkpatrick @ 8:11 pm

A report from inside the Twitterverse.

From the link:

Twitter messages might be limited to 140 characters each, but all those characters can add up. In fact, they add up to 12 terabytes of data every day.

“That would translate to four petabytes a year, if we weren’t growing,” said Kevin Weil, Twitter’s analytics lead, speaking at the Web 2.0 Expo in New York. Weil estimated that users would generate 450 gigabytes during his talk. “You guys generate a lot of data.”

This wealth of information seems overwhelming but Twitter believes it contains a lot of insights that could be useful to it as a business. For example, Weil said the company tracks when users shift from posting infrequently to becoming regular participants, and looks for features that might have influenced the change. The company has also determined that users who access the service from mobile devices typically become much more engaged with the site. Weil noted that this supports the push to offer Twitter applications for Android phones, iPhones, Blackberries, and iPads. And Weil said Twitter will be watching closely to see if the new design of its website increases engagement as much as the company hopes it will.
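Weil’s figures check out with some quick arithmetic of my own (the talk-length estimate is my inference, not his):

```python
# Sanity-checking the quoted Twitter figures: 12 TB/day of new data.
TB_PER_DAY = 12

per_year_pb = TB_PER_DAY * 365 / 1000   # terabytes -> petabytes
per_hour_gb = TB_PER_DAY * 1000 / 24    # terabytes -> gigabytes

print(f"{per_year_pb:.1f} PB/year")     # ~4.4 PB, matching "four petabytes a year"
print(f"{per_hour_gb:.0f} GB/hour")

# Weil's 450 GB "during his talk" implies a talk of roughly:
talk_minutes = 450 / per_hour_gb * 60
print(f"~{talk_minutes:.0f} minutes")
```

So 450 GB corresponds to a talk a bit under an hour long, which is about right for a conference session.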

September 27, 2010

China, already cracking the rare earth metal whip

Filed under: Business, Politics, Technology — David Kirkpatrick @ 1:57 pm

I know I’m over half a week late on this (and yes, I’m aware I haven’t blogged in well over a week — been crazy around these parts of late), but since I covered the topic earlier this month I thought it was interesting it’s already hit the front pages.

The issue is China essentially controlling the world’s supply of 17 rare earth metals — critical for the manufacture of electronics and military parts, to name two key examples — and how that power might be wielded. I blogged that everyone fretting about Chinese ownership of U.S. Treasuries was completely misplacing their concern. That advice has already been borne out now that China has used that control as a political bludgeon against Japan. I’m betting this is just the opening kickoff of a very serious game of political football. (Couldn’t help the metaphor there, I’m still pretty excited the NFL season is in full swing.)

From the second link:

Sharply raising the stakes in a dispute over Japan’s detention of a Chinese fishing trawler captain, the Chinese government has blocked exports to Japan of a crucial category of minerals used in products like hybrid cars, wind turbines and guided missiles.

Chinese customs officials are halting shipments to Japan of so-called rare earth elements, preventing them from being loaded aboard ships at Chinese ports, industry officials said on Thursday.

September 17, 2010

The best malware ever?

Filed under: Business, Technology — David Kirkpatrick @ 11:15 am

This is a development that can only be described as frightening.

From the link:

The Stuxnet worm is a “groundbreaking” piece of malware so devious in its use of unpatched vulnerabilities, so sophisticated in its multi-pronged approach, that the security researchers who tore it apart believe it may be the work of state-backed professionals.

“It’s amazing, really, the resources that went into this worm,” said Liam O Murchu, manager of operations with Symantec’s (SYMC) security response team.

“I’d call it groundbreaking,” said Roel Schouwenberg, a senior antivirus researcher at Kaspersky Lab. By comparison, other notable attacks, like the one dubbed “Aurora” that hacked Google’s (GOOG) network and those of dozens of other major companies, were child’s play.

O Murchu and Schouwenberg should know: They work for the two security companies that discovered Stuxnet exploited not just one zero-day Windows bug, but four, an unprecedented number for a single piece of malware.

Stuxnet, which was first reported in mid-June by VirusBlokAda, a little-known security firm based in Belarus, gained notoriety a month later when Microsoft (MSFT) confirmed that the worm was actively targeting Windows PCs that managed large-scale industrial-control systems in manufacturing and utility firms.

September 16, 2010

NASA’s LRO finds diversity in the moon’s past

Here’s a release hot from the inbox. (I’m in light blogging mode for the middle of this week due to multiple projects, so I’m taking the easy way out here. Of course presenting the entire release is standard procedure with this blog anyway, so, um, enjoy!)

The release:

NASA’s LRO Exposes Moon’s Complex, Turbulent Youth

GREENBELT, Md., Sept. 16 /PRNewswire-USNewswire/ — The moon was bombarded by two distinct populations of asteroids or comets in its youth, and its surface is more complex than previously thought, according to new results from NASA’s Lunar Reconnaissance Orbiter (LRO) spacecraft featured in three papers appearing in the Sept. 17 issue of Science.

(Logo: http://photos.prnewswire.com/prnh/20081007/38461LOGO)

In the first paper, lead author James Head of Brown University in Providence, R.I., describes results obtained from a detailed global topographic map of the moon created using LRO’s Lunar Orbiter Laser Altimeter (LOLA). “Our new LRO LOLA dataset shows that the older highland impactor population can be clearly distinguished from the younger population in the lunar ‘maria’ — giant impact basins filled with solidified lava flows,” says Head. “The highlands have a greater density of large craters compared to smaller ones, implying that the earlier population of impactors had a proportionally greater number of large fragments than the population that characterized later lunar history.”

Meteorite impacts can radically alter the history of a planet. The moon, Mars, and Mercury all bear scars of ancient craters hundreds or even thousands of miles across. If Earth was subjected to this assault as well — and there’s no reason to assume our planet was spared — these enormous impacts could have disrupted the initial origin of life. Large impacts that occurred later appear to have altered life’s evolution. The approximately 110-mile-diameter, partially buried crater at Chicxulub, in the Yucatan Peninsula of Mexico, is from an impact about 65 million years ago that is now widely believed to have led or contributed to the demise of the dinosaurs and many other life forms.

Scientists trying to reconstruct the meteorite bombardment history of Earth face difficulty because impact craters are eroded by wind and water, or destroyed by the action of plate tectonics, the gradual movement and recycling of the Earth’s crust. However, a rich record of craters is preserved on the moon, because it has only an extremely thin atmosphere — a vacuum better than those typically used for experiments in laboratories on Earth. The moon’s surface has no liquid water and no plate tectonics. The only source of significant erosion is other impacts.

“The moon is thus analogous to a Rosetta stone for understanding the bombardment history of the Earth,” said Head. “Like the Rosetta stone, the lunar record can be used to translate the ‘hieroglyphics’ of the poorly preserved impact record on Earth.”

Even so, previous lunar maps had different resolutions, viewing angles, and lighting conditions, which made it hard to consistently identify and count craters. Head and his team used the LOLA instrument on board LRO to build a map that highlights lunar craters with unprecedented clarity. The instrument sends laser pulses to the lunar surface, measures the time that it takes for them to reflect back to the spacecraft, and then with a very precise knowledge of the orbit of the LRO spacecraft, scientists can convert this information to a detailed topographic map of the moon, according to Head.
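The time-of-flight conversion Head describes reduces to altitude = c × Δt / 2 (the pulse travels down and back). A minimal sketch, with illustrative numbers of my own rather than anything from the release:

```python
# Sketch of laser altimetry as described above: the instrument times a laser
# pulse's round trip to the surface, and half that travel time (times the
# speed of light) is the distance to the terrain below the spacecraft.
C = 299_792_458.0  # speed of light, m/s

def altitude_m(round_trip_s):
    """Distance to the surface implied by a pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A pulse returning after ~333.6 microseconds implies ~50 km to the surface,
# about LRO's mapping-orbit altitude.
round_trip = 2 * 50_000 / C
print(f"{altitude_m(round_trip) / 1000:.1f} km")
```

Repeating this measurement along the orbit, with the spacecraft’s position known precisely, is what builds up the topographic map.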

Objects hitting the moon can be categorized in different “impactor populations,” where each population has its own set of characteristics. Head also used the LOLA maps to determine the time when the impactor population changed. “Using the crater counts from the different impact basins and examining the populations making up the superposed craters, we can look back in time to discover when this transition in impactor populations occurred. The LRO LOLA impact crater database shows that the transition occurred about the time of the Orientale impact basin, about 3.8 billion years ago. The implication is that this change in populations occurred around the same time as the large impact basins stopped forming, and raises the question of whether or not these factors might be related. The answers to these questions have implications for the earliest history of all the planets in the inner solar system, including Earth,” says Head.

In the other two Science papers, researchers describe how data from the Diviner Lunar Radiometer Experiment instrument on LRO are showing that the geologic processes that forged the lunar surface were complex as well. The data have revealed previously unseen compositional differences in the crustal highlands, and have confirmed the presence of anomalously silica-rich material in five distinct regions.

Every mineral, and therefore every rock, absorbs and emits energy with a unique spectral signature that can be measured to reveal its identity and formation mechanisms. For the first time ever, LRO’s Diviner instrument is providing scientists with global, high-resolution infrared maps of the moon, which are enabling them to make a definitive identification of silicate minerals commonly found within its crust. “Diviner is literally viewing the moon in a whole new light,” says Benjamin Greenhagen of NASA’s Jet Propulsion Laboratory in Pasadena, Calif., lead author of one of the Diviner Science papers.

Lunar geology can be roughly broken down into two categories – the anorthositic highlands, rich in calcium and aluminium, and the basaltic maria, which are abundant in iron and magnesium. Both of these crustal rocks are what geologists deem “primitive”; that is, they are the direct result of crystallization from lunar mantle material, the partially molten layer beneath the crust.

Diviner’s observations have confirmed that most lunar terrains have spectral signatures consistent with compositions that fall into these two broad categories. However they have also revealed that the lunar highlands may be less homogenous than previously thought.

In a wide range of terrains, Diviner revealed the presence of lunar soils with compositions more sodium rich than that of the typical anorthosite crust. The widespread nature of these soils reveals that there may have been variations in the chemistry and cooling rate of the magma ocean which formed the early lunar crust, or they could be the result of secondary processing of the early lunar crust.

Most impressively, in several locations around the moon, Diviner has detected the presence of highly silicic minerals such as quartz, potassium-rich, and sodium-rich feldspar — minerals that are only ever found in association with highly evolved lithologies (rocks that have undergone extensive magmatic processing).

The detection of silicic minerals at these locations is a significant finding for scientists, as they occur in areas previously shown to exhibit anomalously high abundances of the element thorium, another proxy for highly evolved lithologies.

“The silicic features we’ve found on the moon are fundamentally different from the more typical basaltic mare and anorthositic highlands,” says Timothy Glotch of Stony Brook University in Stony Brook, N.Y., lead author of the second Diviner Science paper. “The fact that we see this composition in multiple geologic settings suggests that there may have been multiple processes producing these rocks.”

One thing not apparent in the data is evidence for pristine lunar mantle material, which previous studies have suggested may be exposed at some places on the lunar surface. Such material, rich in iron and magnesium, would be readily detected by Diviner.

However, even in the South Pole Aitken Basin (SPA), the largest, oldest, and deepest impact crater on the moon — deep enough to have penetrated through the crust and into the mantle — there is no evidence of mantle material.

The implications of this are as yet unknown. Perhaps there are no such exposures of mantle material, or maybe they occur in areas too small for Diviner to detect.

However, it’s likely that if the impact that formed this crater did excavate any mantle material, it has since been mixed with crustal material from later impacts inside and outside SPA. “The new Diviner data will help in selecting the appropriate landing sites for potential future robotic missions to return samples from SPA. We want to use these samples to date the SPA-forming impact and potentially study the lunar mantle, so it’s important to use Diviner data to identify areas with minimal mixing,” says Greenhagen.

The research was funded by NASA’s Exploration Systems Missions Directorate at NASA Headquarters in Washington. LRO was built and is managed by NASA’s Goddard Space Flight Center in Greenbelt, Md. LOLA was built by NASA Goddard. David E. Smith from the Massachusetts Institute of Technology and NASA Goddard is the LOLA principal investigator. The Diviner instrument was built and is managed by NASA’s Jet Propulsion Laboratory in Pasadena, Calif. UCLA is the home institution of Diviner’s principal investigator, David Paige.

For images and more information about LRO, visit:

http://www.nasa.gov/lro

Source: NASA

Web Site:  http://www.nasa.gov/

September 14, 2010

Broadband in the U.S. is overpriced

Filed under: Business, Technology — David Kirkpatrick @ 2:55 pm

Not too surprising given the near monopoly status of the industry.

From the link:

The reasons for the stagnation of U.S. broadband are multifactorial, but one of the authors, Shane Greenstein, argues that the 2003 decision allowing the broadband industry to regulate itself has caused much of the stagnation.

(For perspective, check out how much faster most of Europe and Asia is than the U.S., when it comes to broadband.)

Greenstein says that by now, broadband companies should have paid off almost all the costs associated with building out their infrastructure.

“We are approaching the end of the first buildout, so competitive pressures should have led to price drops by now, if there are any. Like many observers, I expected to see prices drop by now, and I am surprised they have not,” Greenstein told Kellogg Insight, a house organ for the university.

This means that broadband companies are now operating their broadband as almost “pure profit,” devoting only a small fraction of subscriber revenues to maintenance.

Without new entries on the market — most urban areas have at most two different broadband suppliers to choose from, the phone company and the cable company — Greenstein argues there is no incentive to lower prices.

September 13, 2010

In advance of favorable midterm elections …

Filed under: Business, Politics — David Kirkpatrick @ 5:21 pm

… the “party of ‘no'” says, “maybe.”

Good news on the tax front. This at least hints the GOP isn’t willing to blow up tax cuts for the huge majority of taxpayers just to side with the top two percent (or thereabouts) of households.

From the link:

The top Republican in the House of Representatives offered a hint of compromise on the divisive issue of taxes on Sunday, saying he would support extending tax cuts for the middle class even if cuts for the wealthy are allowed to expire.

Representative John Boehner said President Barack Obama’s proposal to renew lower tax rates for families making less than $250,000 but let the lower rates for wealthier Americans expire was “bad policy” — but he will support it if he must.

“If the only option I have is to vote for some of those tax reductions, I’ll vote for it,” Boehner said on CBS’s “Face the Nation” program.

“If the only option I have is to vote for those at 250 and below, of course I’m going to do that,” he said. “But I’m going to do everything I can to fight to make sure that we extend the current tax rates for all Americans.”

Singularity University’s Graduate Studies Program student projects

Via KurzweilAI.net — I blogged about today’s webinar last week, and here’s a summary of the student projects from this year’s Singularity University.

From the first link:

Singularity University webinar today: sneak preview

September 13, 2010 by Editor

Former astronaut Dan Barry, M.D., PhD, faculty head of Singularity University, will join Singularity University co-founders Dr. Ray Kurzweil and Dr. Peter H. Diamandis on Monday, September 13, at 9:30am PT/12:30pm ET, in a live video webinar briefing to unveil this summer’s Graduate Studies Program student projects.

The projects aim to impact a billion people within ten years.

A Q&A session will follow the briefing. The briefing is free and is open to media and the public — visit http://briefing.singularityu.org/ to register.

Here are some of the team projects to be profiled in the webinar.

Achieving the benefits of space at a fraction of the cost

The space project teams have developed imaginative new solutions for space and spinoffs for Earth. The AISynBio project team is working with leading NASA scientists to design bioengineered organisms that can use available resources to mitigate harsh living environments (such as lack of air, water, food, energy, atmosphere, and gravity) – on an asteroid, for example, and also on Earth.

The SpaceBio Labs team plans to develop methods for doing low-cost biological research in space, such as 3D tissue engineering and protein crystallization.

The Made in Space team plans to bring 3D printing to space to make space exploration cheaper, more reliable, and fail-safe (“send the bits, not the atoms”).  For example, they hope to replace some of the $1 billion worth of spare parts and tools that are on the International Space Station.

The Cheap Access to Space team is working with NASA Ames and CalTech engineers and scientists on a radical space propulsion system using beamed microwave energy to dramatically reduce the cost of a space launch by a factor of ten.

Solving key problems for a billion people on Earth

Back on Earth, a number of teams are working on solving global problems of waste, energy, hunger, and water.

The three Upcycle teams have developed synergistic solutions to eliminate waste and reduce energy use.

The Fre3dom team is planning to bring 3D printing to the developing world to allow local communities to make their own much-needed spare parts using bioplastics.

The BioMine team is developing environmentally regenerative, safe, efficient and scalable biological methods for the extraction of metals from electronic waste. This is a multidisciplinary team with technical expertise ranging from synthetic biology and chemical engineering to computer science and biotech IP, and they are leveraging exponential advances in bioengineering, functional genomics, bioinformatics and computational modeling.

The i2cycle team focuses on developing global industrial ecosystems by upcycling one manufacturer’s waste (such as glass and ceramics) into raw material for another manufacturer (such as manufacturing tiles), conserving resources and energy in the process.


The AmundA team is developing a Web-based tool that offers data such as electricity demand and energy resources to guide suppliers in finding optimum, lower-cost energy generation solutions. They hope to help 1.5 billion potential customers in the developing world gain access to electricity.

The H2020 team is building an intelligent, web-based platform to provide information on water to people. For example, they will use smart phones to crowd-source data about water problems,  such as pollution or shortages, in communities at the “bottom of the pyramid,” and will use AI to match problems with solutions.

The Naishio (“no salt” in Japanese) team, inspired by lecturers such as Dean Kamen, plans to use nanofilters to achieve very low cost and compact, but high-volume desalination. They have designed a filtration cube measuring just 6.5 inches per side that could produce 100,000 gallons of purified water per day.
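To put that claim in more familiar units, here’s my own back-of-the-envelope conversion (not from the team’s materials):

```python
# Converting the quoted Naishio figure: 100,000 US gallons/day from a
# 6.5-inch cube, expressed as a continuous flow rate.
GALLON_L = 3.785  # liters per US gallon
SECONDS_PER_DAY = 86_400

gallons_per_day = 100_000
liters_per_day = gallons_per_day * GALLON_L
liters_per_second = liters_per_day / SECONDS_PER_DAY

print(f"{liters_per_day:,.0f} L/day ~= {liters_per_second:.1f} L/s")
```

That works out to roughly 4.4 liters every second passing through a box smaller than a basketball, which gives a sense of how aggressive the nanofilter throughput would have to be.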

The Food for Cities program is planning to grow all the vegetables you need in a box barely larger than your refrigerator, using “aeroponics,” which could feed a billion people healthy food at low cost.

And the Know (Knowledge, Opportunity, Network for Women) team seeks to empower young women across the world by providing them with mentors and resources.

Full disclosure: writer and KurzweilAI editor Amara D. Angelica is an advisor to Singularity University.

September 12, 2010

Mapping the internet

New research out of the San Diego Supercomputer Center and the Cooperative Association for Internet Data Analysis (CAIDA) at the University of California, San Diego, in collaboration with the Universitat de Barcelona in Spain and the University of Cyprus, maps the hidden geometry of the internet.

The release, ahem, article:

SDSC Collaboration Aims to Create First Accurate Geometric Map of the Internet

September 09, 2010

By Jan Zverina

The San Diego Supercomputer Center and Cooperative Association for Internet Data Analysis (CAIDA) at the University of California, San Diego, in a collaboration with researchers from Universitat de Barcelona in Spain and the University of Cyprus, have created the first geometric “atlas” of the Internet as part of a project to prevent our most ubiquitous form of communication from collapsing within the next decade or so.

In a paper published this week in Nature Communications, CAIDA researcher Dmitri Krioukov, along with Marián Boguñá (Universitat de Barcelona) and Fragkiskos Papadopoulos (University of Cyprus), describe how they discovered a latent hyperbolic, or negatively curved, space hidden beneath the Internet’s topology, leading them to devise a method to create an Internet map using hyperbolic geometry. In their paper, Sustaining the Internet with Hyperbolic Mapping, the researchers say such a map would lead to a more robust Internet routing architecture because it simplifies path-finding throughout the network.

“We compare routing in the Internet today to using a hypothetical road atlas, which is really just a long encoded list of road intersections and connections that would require drivers to pore through each line to plot a course to their destination without using any geographical, or geometrical, information which helps us navigate through the space in real life,” said Krioukov, principal investigator of the project.

Now imagine that a road – or in the case of the Internet, a connection – is closed for some reason and there is no geographical atlas to plot a new course, just a long list of connections that need to be updated. “That is basically how routing in the Internet works today – it is based on a topological map that does not take into account any geometric coordinates in any space,” said Krioukov, who, with his colleagues at CAIDA, has been managing a project called Archipelago, or Ark, that constantly monitors the topology of the Internet, or the structure of its interconnections.

Like many experts, however, Krioukov is concerned that existing Internet routing, which relies on only this topological information, is not really sustainable. “It is very complicated, inefficient, and difficult to scale to the rapidly growing size of the Internet, which is now accessed by more than a billion people each day. In fact, we are already seeing parts of the Internet become intermittently unreachable, sinking into so-called black holes, which is a clear sign of instability.”

Krioukov and his colleagues have developed an in-depth theory that uses hyperbolic geometry to describe a negatively curved shape of complex networks such as the Internet. This theory appears in the paper Hyperbolic Geometry of Complex Networks, published by Physical Review E today. In their Nature Communications paper, the researchers employ this theory, Ark’s data, and statistical inference methods to build a geometric map of the Internet. They show that routing using such a map would be superior to the existing routing, which is based on pure topology.

Instead of perpetually accessing and rebuilding a reference list of all available network paths, each router in the Internet would know only its hyperbolic coordinates and the coordinates of its neighbors so it could route in the right direction, only relaying the information to its closest neighbor in that direction, according to the researchers. Known as “greedy routing”, this process would dramatically increase the overall efficiency and scalability of the Internet. “We believe that using such a routing architecture based on hyperbolic geometry will create the best possible levels of efficiency in terms of speed, accuracy, and resistance to damage,” said Krioukov.

However, the researchers caution that actually implementing and deploying such a routing structure in the Internet might be as challenging as, if not more challenging than, discovering its hidden space. “There are many technical and non-technical issues to be resolved before the Internet map that we found would be the map that the Internet uses,” said Krioukov.

The research was funded in part by the National Science Foundation, along with Spain’s Direcção Geral de Ensino Superior (DGES), the Generalitat de Catalunya, and Cisco Systems. The Internet mapping paper as published in Nature Communications can be found here. The Physical Review E paper can be found here.

Sunday NFL football …

Filed under: Sports — Tags: , , , — David Kirkpatrick @ 10:34 am

… is back!

Thursday’s game was great (especially since the Saints won) and the Monday doubleheader will be even better, but there’s nothing like National Football League games on Sunday.

Stem cell therapy potential — here comes the science

Stem cell research is back all over the news again with court rulings and counter rulings making the subject either okay, or not okay, for federal funding. It’s a crazy debate to my mind because stem cell research has the potential to improve the health of many, many people and it’s a philosophical crime for it to be held hostage to the mythology of theocons. And even if the research is held back in the United States by a lack of government money, it will go on around the world, pushing the U.S. that much farther behind the cutting edge of medical research.

It’s a hot topic all the time, especially right now, but what really is the potential of stem cell research? Helpfully, here’s news out of Elsevier Health Sciences with some expert opinion on the subject.

From the first link, the release:

What Progress Has Been Made, What Is Its Potential?

New York, NY, September 9, 2010 – The use of stem cells for research and their possible application in the treatment of disease are hotly debated topics. In a special issue of Translational Research published this month, an international group of medical experts presents an in-depth and balanced view of the rapidly evolving field of stem cell research and considers the potential of harnessing stem cells for therapy of human diseases including cardiovascular diseases, renal failure, neurologic disorders, gastrointestinal diseases, pulmonary diseases, neoplastic diseases, and type 1 diabetes mellitus.

Personalized cell therapies for treating and curing human diseases are the ultimate goal of most stem cell-based research. But apart from the scientific and technical challenges, there are serious ethical concerns, including issues of privacy, consent and withdrawal of consent for the use of unfertilized eggs and embryos. “Publication of this special issue could not have been more timely, given the recent federal district court injunction against federal support for human embryonic stem cell research,” said Dr. Jeffrey Laurence, M.D., Professor of Medicine at Weill Cornell Medical College and Editor in Chief of Translational Research. “This court order stops all pending federal grants and contracts, as well as their peer review, suspending over 20 major research programs and over $50 million in federal funding for them,” he noted. As Dr. Francis Collins, NIH director, stated, “This decision has the potential to do serious damage to one of the most promising areas of biomedical research, just at the time when we were really gaining momentum.”

Through a series of authoritative articles authors highlight basic and clinical research using human embryonic and adult stem cells. Common themes include preclinical evidence supporting the potential therapeutic use of stem cells for acute and chronic diseases, the challenges in translating the preclinical work to clinical applications, as well as the results of several randomized clinical trials. Authors stress that considerable preclinical work is needed to test the potential of these approaches for translation to the clinical setting.

In considering the potential for clinical applications, some common challenges and questions persist. The issue focuses on critical questions such as whether the use of any stem cell population will increase the risk of cancer in the recipient and whether the goal of stem cell therapy is to deliver cells that can function as organ-specific cells.

Writing in a commentary on advances and challenges in translating stem cell therapies for clinical diseases, Michael A. Matthay, MD, Cardiovascular Research Institute, University of California San Francisco, notes that “the progress that has been achieved in the last 30 years in using allogeneic and autologous hematopoietic stem cells for the effective treatment of hematologic malignancies should serve as a model of how clinical applications may yet be achieved with embryonic stem cells, induced pluripotent stem cells, endothelial progenitor cells, and mesenchymal stem cells. Although several challenges exist in translating stem cell therapy to provide effective new treatments for acute and chronic human diseases, the potential for developing effective new cell-based therapies is high.”

KEY POINTS:

Bone marrow and circulating stem/progenitor cells for regenerative cardiovascular therapy
Mohamad Amer Alaiti, Masakazu Ishikawa, and Marco A. Costa
Despite initial promising pilot studies, only small improvements in a few clinical outcomes have been seen using stem cell therapies to treat heart disease in the acute or chronic setting. But new research, and a multitude of new pilot studies, may alter this scenario.

New therapies for the failing heart: trans-genes versus trans-cells
Vincenzo Lionetti, and Fabio A. Recchia
This review presents key aspects of cardiac gene therapy and stem cell therapy for the failing heart. Recent discoveries in stem cell biology may revitalize gene therapy and, vice versa.

Endothelial lineage cell as a vehicle for systemic delivery of cancer gene therapy
Arkadiusz Z. Dudek
Rather than focusing on the cancer cell itself, attention to blood vessels feeding the cancerous cells, lined by endothelial cells, presents a new avenue of cancer therapy. The author discusses recent evidence that endothelial progenitor cells may be useful in treating primary and metastatic tumors. Targeted cancer gene therapy using endothelial lineage cells to target tumor sites and produce a therapeutic protein has proven feasible.

Pluripotent stem cell-derived natural killer cells for cancer therapy
David A. Knorr, and Dan S. Kaufman
The potential value, as well as the challenges, of using human embryonic stem cells and induced pluripotent stem cells to provide platforms for new cell-based therapies to treat malignant diseases are discussed.

Translation of stem cell therapy for neurological diseases
Sigrid C. Schwarz, and Johannes Schwarz
Early clinical work to develop cell-based therapy for neurologic disorders such as Parkinson’s disease is discussed.

Stem cell technology for the treatment of acute and chronic renal failure
Christopher J. Pino, and H. David Humes
The authors cover the relative potential and success to date of embryonic or induced pluripotent stem cells as therapies for regenerating functional kidney tissue.

Stem cell approaches for the treatment of type 1 diabetes mellitus
Ryan T. Wagner, Jennifer Lewis, Austin Cooney, and Lawrence Chan
The authors provide a thorough discussion of the potential of using either embryonic stem cells or induced pluripotent stem cells to generate functional islet cells, the cells of the pancreas which normally make insulin, but fail to do so in severe forms of diabetes.

Intestinal stem cells and epithelial–mesenchymal interactions in the crypt and stem cell niche
Anisa Shaker, and Deborah C. Rubin
Both preclinical and early clinical trials have been carried out with allogeneic bone marrow-derived mesenchymal stem cells to treat steroid refractory acute and chronic inflammatory bowel diseases, particularly Crohn’s disease.

Stem cells and cell therapy approaches in lung biology and diseases
Viranuj Sueblinvong, and Daniel J. Weiss
Cell-based therapies with embryonic or adult stem cells have emerged as potential novel approaches for several devastating and otherwise incurable lung diseases, including emphysema, pulmonary fibrosis, pulmonary hypertension, and acute respiratory distress syndrome.

###

The articles appear in Translational Research, The Journal of Laboratory and Clinical Medicine, Volume 156, Issue 3 (September 2010) entitled Stem Cells: Medical Promises and Challenges, published by Elsevier. The entire issue will be available online via Open Access for a 3-month period beginning September 20, 2010 at www.translationalres.com.

September 11, 2010

Remember

Filed under: et.al., Politics — Tags: , , , , , — David Kirkpatrick @ 11:46 am

That image is not easy to look at, but I think it’s important to remember what it meant for America and to protect this date from demagogues and politicization. This event brought international terrorism to U.S. soil in a manner that dwarfed all previous attempts, and in consequence it, at least for a while, united all citizens of America.

The aftermath of what happened to U.S. polity and policy after 9/11 can be debated, but for a few weeks in September 2001, this was truly a nation united.

Here is a bit from a comment on a blog post of mine from February 2, 2008 (the post is from January 31) on how I felt on September 11, 2001:

Sure, that Tuesday morning I was blindingly angry. I was woken in a vacation condo on the beach in Panama City Beach, Florida, to hear that the World Trade Center towers had both been struck by planes. When the media began reporting celebrations in Afghanistan I immediately thought of bin Laden (I didn’t think of al Qaeda per se, but I was aware of bin Laden pre-9/11). My next thought was that we should nuke that country back from its then (and now) Middle Ages society to the Stone Age, or maybe to a time before humans walked in Afghanistan.

That was my heart. I feel no less strongly about Islamic terrorism today than I did at that moment. I do know I think the US did very well for itself before 9/11, and to me nothing occurred that warrants changing our fundamental approach to the world.

September 10, 2010

Graphene could speed up DNA sequencing

I’ve blogged on this topic before (and on this very news bit in the second post from the link), but this just reiterates the versatility of graphene and why the material has so many scientists, researchers and entrepreneurs so excited.

From the second link:

By drilling a tiny pore just a few nanometers in diameter, called a nanopore, in the graphene membrane, they were able to measure the exchange of ions through the pore and demonstrated that a long DNA molecule can be pulled through the graphene nanopore just as a thread is pulled through the eye of a needle.

“By measuring the flow of ions passing through a nanopore drilled in graphene we have demonstrated that the thickness of graphene immersed in liquid is less than 1 nm thick, or many times thinner than the very thin membrane which separates a single animal or human cell from its surrounding environment,” says lead author Slaven Garaj, a Research Associate in the Department of Physics at Harvard. “This makes graphene the thinnest membrane able to separate two liquid compartments from each other. The thickness of the membrane was determined by its interaction with water molecules and ions.”

If you still have your wisdom teeth …

… you may be carrying a little personal stem cell bank around in your mouth.

Deceptive robots

Via KurzweilAI.net — Not too sure if I like this idea. Seems like we’re already heading down the path of breaking Asimov’s robotic laws with a lot of milbots in development and practice.

From the link:

“We have developed algorithms that allow a robot to determine whether it should deceive a human or other intelligent machine and we have designed techniques that help the robot select the best deceptive strategy to reduce its chance of being discovered,” said Ronald Arkin, a Regents professor in the Georgia Tech School of Interactive Computing.

The results of robot experiments and theoretical and cognitive deception modeling were published online on September 3 in the International Journal of Social Robotics. Because the researchers explored the phenomenon of robot deception from a general perspective, the study’s results apply to robot-robot and human-robot interactions. This research was funded by the Office of Naval Research.

In the future, robots capable of deception may be valuable for several different areas, including military and search and rescue operations. A search and rescue robot may need to deceive in order to calm or receive cooperation from a panicking victim. Robots on the battlefield with the power of deception will be able to successfully hide and mislead the enemy to keep themselves and valuable information safe.

“Most social robots will probably rarely use deception, but it’s still an important tool in the robot’s interactive arsenal because robots that recognize the need for deception have advantages in terms of outcome compared to robots that do not recognize the need for deception,” said the study’s co-author, Alan Wagner, a research engineer at the Georgia Tech Research Institute.
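As described, the decision has two parts: whether the situation warrants deception at all, and which deceptive strategy is least likely to be discovered. Here's a deliberately toy caricature of that structure (my own reading of the press release, not the published algorithm or its actual criteria):

```python
def should_deceive(in_conflict: bool, deception_changes_outcome: bool) -> bool:
    """Toy gate: only consider deceiving when there is a conflict with the
    other agent and deception would actually change the outcome."""
    return in_conflict and deception_changes_outcome

def pick_strategy(strategies: list[str], discovery_risk: dict[str, float]) -> str:
    """Pick the deceptive strategy with the lowest estimated chance of
    being discovered by the other agent."""
    return min(strategies, key=lambda s: discovery_risk[s])
```

The real work in the paper is presumably in modeling the other agent well enough to estimate those inputs; the skeleton above only shows why "recognizing the need for deception" and "selecting the strategy" are separate problems.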

A glimmer of economic hope …

Filed under: Business — Tags: , , , , , — David Kirkpatrick @ 12:33 pm

… new jobless benefits claims are down. Of course, with unemployment this high there just aren’t that many jobs left to lose to create the newly jobless.

Good news from the U.S. Court of Appeals

Federally funded stem cell research is back in business. Of course it’s stupid this is even an issue, much less a political football. I wrote out, and deleted, two sentences of snark about christianist theocons, but maybe those thoughts are better left to your imagination. Let’s just say I think the groups pushing against stem cell research are a serious threat to my life, liberty and pursuit of happiness and everyone would be better off if they could just form their own society on an island somewhere and institute whatever manner of holy book law they wanted to live under.

From the link:

A federal appeals court here ruled Thursday that federal financing of embryonic stem cell research could continue while the court considers a judge’s order last month that banned the government from underwriting the work.

The ruling by the United States Court of Appeals could save research mice from being euthanized, cells in petri dishes from starving and scores of scientists from a suspension of paychecks, according to arguments the Obama administration made in the case.

It could also allow the National Institutes of Health to provide $78 million to 44 scientists whose research the agency had previously agreed to finance.

The stay also gives Congress time to consider legislation that would render the ban, and the court case behind it, largely moot, a prospect that some embattled Democrats have welcomed. Despite staunch opposition by some critics, embryonic stem cell research is popular, and a legislative fight on the issue could prove a tonic for Democrats battling a tough political environment.

Single ions crossing a nano bridge

Filed under: Science — Tags: , , , , , — David Kirkpatrick @ 11:11 am

Don’t see any current practical applications — aside from desalination — for this right now (but with a proof-of-concept in hand, I bet this’ll be leveraged in new research), but it is impressively cool.

From the link:

In the Sept. 10 issue of Science, MIT researchers report that charged molecules, such as the sodium and chloride ions that form when salt is dissolved in water, can not only flow rapidly through carbon nanotubes, but also can, under some conditions, do so one at a time, like people taking turns crossing a bridge. The research was led by associate professor Michael Strano.

The new system allows passage of much smaller molecules, over greater distances (up to half a millimeter), than any existing nanochannel. Currently, the most commonly studied nanochannel is a silicon nanopore, made by drilling a hole through a silicon membrane. However, these channels are much shorter than the new nanotube channels (the nanotubes are about 20,000 times longer), so they only permit passage of large molecules such as DNA or polymers — anything smaller would move too quickly to be detected.

Strano and his co-authors — recent PhD recipient Chang Young Lee, graduate student Wonjoon Choi and postdoctoral associate Jae-Hee Han — built their new nanochannel by growing a nanotube across a one-centimeter-by-one-centimeter plate, connecting two water reservoirs. Each reservoir contains an electrode, one positive and one negative. Because electricity can flow only if protons — positively charged particles, which make up the electric current — can travel from one electrode to the other, the researchers can easily determine whether ions are traveling through the nanotube.

September 9, 2010

Lasing nanoparticles around the room

Filed under: Science — Tags: , , , , , — David Kirkpatrick @ 1:30 pm

Via KurzweilAI.net — This is a pretty astounding feat.

From the link:

Researchers from Australian National University have developed the ability to move particles over distances of up to 1.5 meters, using a hollow laser beam to trap light-absorbing particles in a “dark core.” The particles are then moved up and down the beam of light, which acts like an optical “pipeline.”

“When the small particles are trapped in this dark core very interesting things start to happen,” said Professor Andrei Rode. “As gravity, air currents, and random motions of air molecules around the particle push it out of the center, one side becomes illuminated by the laser while the other lies in darkness. This creates a tiny thrust, known as a photophoretic force, that effectively pushes the particle back into the darkened core. In addition to the trapping effect, a portion of the energy from the beam and the resulting force pushes the particle along the hollow laser pipeline.”

Practical applications for this technology include directing and clustering nanoparticles in air, micro-manipulation of objects, sampling of atmospheric aerosols, and low-contamination/non-touch handling of sampling materials for transport of dangerous substances and microbes in small amounts, he said.

More info: Australian National University news

The public is a bit wary of synthetic biology

I’m a boundary-pusher in scientific research — I love nanotechnology, stem cell research, genetic research, robotics applications and, of course, the promise of synthetic biology. This poll finds only one-third of surveyed adults want to see the field banned until it’s better understood, but a majority do want to see more government oversight.

The release:

The Public Looks At Synthetic Biology — Cautiously

WASHINGTON, DC: Synthetic biology—defined as the design and construction of new biological parts, devices, and systems or re-design of existing natural biological systems for useful purposes—holds enormous potential to improve everything from energy production to medicine, with the global market projected to reach $4.5 billion by 2015. But what does the public know about this emerging field, and what are their hopes and concerns? A new poll of 1,000 U.S. adults conducted by Hart Research Associates and the Synthetic Biology Project at the Woodrow Wilson Center finds that two-thirds of Americans think that synthetic biology should move forward, but with more research to study its possible effects on humans and the environment, while one-third support a ban until we better understand its implications and risks. More than half of Americans believe the federal government should be involved in regulating synthetic biology.

“The survey clearly shows that much more attention needs to be paid to addressing biosafety and biosecurity risks,” said David Rejeski, Director of the Synthetic Biology Project. “In addition, government and industry need to engage the public about the science and its applications, benefits, and risks.”

The poll findings reveal that the proportion of adults who say they have heard a lot or some about synthetic biology has almost tripled in three years (from 9 percent to 26 percent). By comparison, self-reported awareness of nanotechnology increased from 24 percent to 34 percent during the same three-year period.

Although the public supports continued research in the area of synthetic biology, it also harbors concerns, including 27 percent who have security concerns (concerns that the science will be used to make harmful things), 25 percent who have moral concerns, and a similar proportion who worry about negative health consequences for humans. A smaller portion, 13 percent, worries about possible damage to the environment.

“The survey shows that attitudes about synthetic biology are not clear-cut and that its application is an important factor in shaping public attitudes towards it,” said Geoff Garin, President of Hart Research. Six in 10 respondents support the use of synthetic biology to produce a flu vaccine. In contrast, three-fourths of those surveyed have concerns about its use to accelerate the growth of livestock to increase food production. Among those for whom moral issues are the top concern, the majority views both applications in a negative light.

The findings come from a nationwide telephone survey of 1,000 adults and have a margin of error of ± 3.1 percentage points. This is the fifth year that Hart Research Associates has conducted a survey to gauge public opinion about nanotechnology and/or synthetic biology for the Woodrow Wilson International Center for Scholars.
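That ±3.1-point figure is just the standard worst-case sampling margin of error for a sample of 1,000 at roughly 95% confidence, and it's easy to verify:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Sampling margin of error at ~95% confidence (z = 1.96).
    p = 0.5 is the worst case, since it maximizes p * (1 - p)."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1000)  # ~0.031, i.e. about +/- 3.1 percentage points
```

Note this covers sampling error only; question wording and telephone-survey nonresponse add uncertainty no formula captures.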

###

The report can be found at: www.synbioproject.org

The Woodrow Wilson International Center for Scholars of the Smithsonian Institution was established by Congress in 1968 and is headquartered in Washington, D.C. It is a nonpartisan institution, supported by public and private funds and engaged in the study of national and world affairs.

Latest Beige Book outlook not so bright

Filed under: Business, Politics — Tags: , , , , , — David Kirkpatrick @ 11:15 am

The Great Recession, the near-depression, economic downturn — whatever you want to label the economy of the last few years, it all comes down to this: it’s not good, it hasn’t gotten appreciably better for Main Street, and tangible recovery doesn’t even seem visible on the horizon. So it’s another fall of keeping the chin up and tightening the belt a little bit more.

From the link:

The mixed picture is in line with government data released last month that showed U.S. gross domestic product, the broadest measure of economic activity, was much weaker in the second quarter than previously estimated.

The nation’s GDP was revised sharply lower to an annual growth rate of 1.6% in the three months ending in June. The initial reading had been for a 2.4% growth rate in the period.

Fed chairman Ben Bernanke acknowledged in a speech late last month that the U.S. economic recovery has lost considerable steam. But he said the central bank is prepared to use “unconventional measures” to boost the economy if the outlook were to “deteriorate significantly.”

In its Aug. 10 policy statement, the Fed announced plans last month to begin reinvesting proceeds from securities in its $2 trillion portfolio into U.S. Treasurys. The central bank had bought billions worth of government debt two years ago to keep interest rates low on home and other consumer loans. But minutes from the August meeting subsequently showed that Fed officials were unusually divided over the policy.

Full expensing of capital equipment in 2011

Now this is a tax break businesses of all sizes can get behind.

From the link:

White House Press Secretary Robert Gibbs said the president will announce a plan to offer businesses the chance to deduct 100 percent of the cost of qualifying capital equipment purchases made in 2011, double the amount allowed in the Emergency Economic Stabilization Act of 2008 (Pub. L. No. 110-343) and proposed for the small business bill (H.R. 5297) that is currently stalled in the Senate.

“We’re saying that for 2011, we believe that 50 [percent] should go to 100 [percent]. It builds off of an effort to get capital off the sidelines and into the economy,” Gibbs told reporters.
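The cash-flow difference between 50 percent and 100 percent expensing is easy to illustrate (hypothetical purchase price and tax rate, purely for illustration; this ignores regular depreciation on any non-expensed basis, and is obviously not tax advice):

```python
def year_one_tax_savings(cost: float, expense_share: float, tax_rate: float) -> float:
    """Immediate tax savings from expensing a share of an equipment
    purchase in year one (illustrative sketch only)."""
    return cost * expense_share * tax_rate

# Hypothetical $200,000 equipment purchase at a 35% marginal rate:
savings_50 = year_one_tax_savings(200_000, 0.50, 0.35)   # ~$35,000 saved now
savings_100 = year_one_tax_savings(200_000, 1.00, 0.35)  # ~$70,000 saved now
```

The total deduction is the same either way over the asset's life; full expensing simply moves the tax benefit into year one, which is the "capital off the sidelines" effect Gibbs describes.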

From the same link, here’s another excellent pro-business proposal:

The president also is pushing to make the research and development tax credit a permanent feature of the tax code, rather than continue to ask Congress to pass it as a temporary extension every year.

Making the R&D credit permanent and expanding it would cost about $100 billion over 10 years—a key reason Congress has not already done so—but Obama has argued that the change will benefit the economy by reducing business uncertainty about the credit’s future.

Update — here’s more on the R&D tax credit from Robert Atkinson, president of the Information Technology & Innovation Foundation:

It is welcome news that President Obama will ask Congress to expand and make the research and development (R&D) tax credit permanent. This will better enable the U.S. to compete globally and make it clear that the United States has finally gotten off the sidelines in the fight for global economic competitiveness.

While expanding the credit from 14 percent to 17 percent, as has been reported, makes sense, ITIF thinks an even more generous credit makes even better sense. ITIF estimates that expanding the credit from 14 percent to 20 percent would create 162,000 jobs in the short to moderate run and an additional, but unspecified, number of jobs in the longer run, many of them high-skill, high-wage jobs.

September 8, 2010

Graphene research may lead to electronics improvement

A fairly radical improvement. Try highly efficient, very-low-heat-producing, smaller electronic devices. I enjoy blogging about nanotech research with real promise for market applications.

From the link:

NIST recently constructed the world’s most powerful and stable scanning-probe microscope, with an unprecedented combination of low temperature (as low as 10 millikelvin, or 10 thousandths of a degree above absolute zero), ultra-high vacuum, and high magnetic field. In the first measurements made with this instrument, the team has used its power to resolve the finest differences in the electron energies in graphene, atom-by-atom.

“Going to this resolution allows you to see new physics,” said Young Jae Song, a postdoctoral researcher who helped develop the instrument at NIST and make these first measurements.

And the new physics the team saw raises a few more questions about how the electrons behave in graphene than it answers.

Because of the geometry and electromagnetic properties of graphene’s structure, an electron in any given energy level populates four possible sublevels, called a “quartet.” Theorists have predicted that this quartet of levels would split into different energies when immersed in a magnetic field, but until recently there had not been an instrument sensitive enough to resolve these differences.

“When we increased the magnetic field at extreme low temperatures, we observed unexpectedly complex quantum behavior of the electrons,” said NIST Fellow Joseph Stroscio.

What is happening, according to Stroscio, appears to be a “many-body effect” in which electrons interact strongly with one another in ways that affect their energy levels.

The latest on the White House proposed tax cuts and infrastructure spending

Filed under: Business, Politics — Tags: , , , , — David Kirkpatrick @ 1:20 pm

Can Obama pull off these fiscal moves this year? Opinions are a mixed bag, but the quick answer is probably not.

From the link:

Tax experts and economists offered mixed reviews about the feasibility of the Obama administration’s attempt to pass additional tax cuts now with the legislative year winding down, even as the president declined Sept. 3 to specify what proposals his administration will advance.

On Aug. 30, the president announced that he will propose a series of targeted tax cuts and infrastructure investments in the coming days and weeks, some of which will be new.

“I will be addressing a broader package of ideas next week,” the president told reporters at the White House Sept. 3. “We are confident that we are moving in the right direction, but we want to keep this recovery moving stronger and accelerate the job growth that’s needed so desperately all across the country.”

Singularity University to announce session breakthroughs September 13

Via KurzweilAI.net — I blogged about one of the breakthroughs yesterday, and the university’s leaders are going to announce the entire group next Monday.

From the first link:

Singularity University to Unveil Breakthrough Solutions for ‘Global Grand Challenges’ at Sept. 13 Briefing

September 8, 2010 by Editor

This summer, 80 students from 35 nations were challenged to apply innovations in exponentially advancing technologies to solve some of the world’s “grand challenges,” with a focus on the food, water, energy, upcycling, and space industries.

On Monday, September 13, at 9:30am PT/12:30pm ET, in a webinar briefing, Singularity University co-founders Dr. Ray Kurzweil, Dr. Peter H. Diamandis, and faculty head Dr. Dan Barry will unveil for the first time multiple solutions in each problem space, each aiming to impact a billion people within ten years.

A Q&A session will follow the briefing, which is open to media and the public, but space is limited. You can visit http://briefing.singularityu.org/ to register for the webinar.

Singularity University (SU) is an interdisciplinary university whose mission is to assemble, educate and inspire a cadre of leaders who strive to understand and facilitate the development of exponentially advancing technologies in order to address humanity’s grand challenges. With the support of a broad range of leaders in academia, business and government, SU hopes to stimulate groundbreaking, disruptive thinking and solutions aimed at solving some of the planet’s most pressing challenges. SU is based at the NASA Ames campus in Silicon Valley. For more information, go to www.singularityu.org and follow SU on Twitter and Facebook.
