David Kirkpatrick

December 4, 2010

History sniffing, one more online privacy issue

I have to admit I had never heard of history sniffing before reading this story. Makes me doubly glad I use Chrome for my browser.

From the link:

The Web surfing history saved in your Web browser can be accessed without your permission. JavaScript code deployed by real websites and online advertising providers uses browser vulnerabilities to determine which sites you have and have not visited, according to new research from computer scientists at the University of California, San Diego.

The researchers documented JavaScript code secretly collecting users’ browsing histories through “history sniffing” and sending that information across the network. While history sniffing and its potential implications for privacy violation have been discussed and demonstrated, the new work provides the first empirical analysis of history sniffing on the real Web.

“Nobody knew if anyone on the Internet was using history sniffing to get at users’ private browsing history. What we were able to show is that the answer is yes,” said UC San Diego computer science professor Hovav Shacham.

The researchers from the UC San Diego Jacobs School of Engineering presented this work in October at the 2010 ACM Conference on Computer and Communications Security (CCS 2010) in a paper entitled “An Empirical Study of Privacy-Violating Information Flows in JavaScript Web Applications”.

History Sniffing

History sniffing takes place without your knowledge or permission and relies on the fact that browsers display links to sites you’ve visited differently than ones you haven’t: by default, visited links are purple, unvisited links blue. History sniffing JavaScript code running on a Web page checks to see if your browser displays links to specific URLs as blue or purple.
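The inference itself is simple enough to sketch. Below is a Python model of the logic; a real attack runs as JavaScript in the page, reading the computed color of hidden links, which the hypothetical `style_oracle` stands in for here.

```python
# A sketch of the history-sniffing inference, in Python for clarity.
# A real attack runs as JavaScript, reading the computed style of
# hidden <a> elements; style_oracle is a hypothetical stand-in.
VISITED = "rgb(128, 0, 128)"    # default purple for visited links
UNVISITED = "rgb(0, 0, 238)"    # default blue for unvisited links

def sniff_history(candidate_urls, style_oracle):
    """Return the candidate URLs the browser styles as visited."""
    return [url for url in candidate_urls if style_oracle(url) == VISITED]

# Stand-in for a victim browser whose history holds one banking site.
fake_history = {"https://bank-a.example"}

def demo_oracle(url):
    return VISITED if url in fake_history else UNVISITED

probes = ["https://bank-a.example", "https://bank-b.example"]
print(sniff_history(probes, demo_oracle))  # only the visited bank leaks
```

The attacker never reads your history directly; it only learns, probe by probe, which of its own guesses the browser renders purple.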

History sniffing can be used by website owners to learn which competitor sites visitors have or have not been to. History sniffing can also be deployed by advertising companies looking to build user profiles, or by online criminals collecting information for future phishing attacks. Learning what banking site you visit, for example, suggests which fake banking page to serve up during a phishing attack aimed at collecting your bank account login information.



September 12, 2010

Mapping the internet

Research out of the San Diego Supercomputer Center and Cooperative Association for Internet Data Analysis (CAIDA) at the University of California, San Diego, in collaboration with the Universitat de Barcelona in Spain and the University of Cyprus.

The release, ahem, article:

SDSC Collaboration Aims to Create First Accurate Geometric Map of the Internet

September 09, 2010

By Jan Zverina

The San Diego Supercomputer Center and Cooperative Association for Internet Data Analysis (CAIDA) at the University of California, San Diego, in collaboration with researchers from Universitat de Barcelona in Spain and the University of Cyprus, have created the first geometric “atlas” of the Internet as part of a project to prevent our most ubiquitous form of communication from collapsing within the next decade or so.

In a paper published this week in Nature Communications, CAIDA researcher Dmitri Krioukov, along with Marián Boguñá (Universitat de Barcelona) and Fragkiskos Papadopoulos (University of Cyprus), describe how they discovered a latent hyperbolic, or negatively curved, space hidden beneath the Internet’s topology, leading them to devise a method to create an Internet map using hyperbolic geometry. In their paper, Sustaining the Internet with Hyperbolic Mapping, the researchers say such a map would lead to a more robust Internet routing architecture because it simplifies path-finding throughout the network.

“We compare routing in the Internet today to using a hypothetical road atlas, which is really just a long encoded list of road intersections and connections that would require drivers to pore through each line to plot a course to their destination without using any geographical, or geometrical, information which helps us navigate through the space in real life,” said Krioukov, principal investigator of the project.

Now imagine that a road – or in the case of the Internet, a connection – is closed for some reason and there is no geographical atlas to plot a new course, just a long list of connections that need to be updated. “That is basically how routing in the Internet works today – it is based on a topological map that does not take into account any geometric coordinates in any space,” said Krioukov, who, with his colleagues at CAIDA, has been managing a project called Archipelago, or Ark, that constantly monitors the topology of the Internet, or the structure of its interconnections.

Like many experts, however, Krioukov is concerned that existing Internet routing, which relies on only this topological information, is not really sustainable. “It is very complicated, inefficient, and difficult to scale to the rapidly growing size of the Internet, which is now accessed by more than a billion people each day. In fact, we are already seeing parts of the Internet become intermittently unreachable, sinking into so-called black holes, which is a clear sign of instability.”

Krioukov and his colleagues have developed an in-depth theory that uses hyperbolic geometry to describe a negatively curved shape of complex networks such as the Internet. This theory appears in the paper Hyperbolic Geometry of Complex Networks, published today in Physical Review E. In their Nature Communications paper, the researchers employ this theory, Ark’s data, and statistical inference methods to build a geometric map of the Internet. They show that routing using such a map would be superior to the existing routing, which is based on pure topology.

Instead of perpetually accessing and rebuilding a reference list of all available network paths, each router in the Internet would know only its hyperbolic coordinates and the coordinates of its neighbors so it could route in the right direction, only relaying the information to its closest neighbor in that direction, according to the researchers. Known as “greedy routing”, this process would dramatically increase the overall efficiency and scalability of the Internet. “We believe that using such a routing architecture based on hyperbolic geometry will create the best possible levels of efficiency in terms of speed, accuracy, and resistance to damage,” said Krioukov.
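As a toy illustration of greedy routing, here is a Python sketch using the standard hyperbolic law of cosines for distances between points given in polar coordinates. The three-node network and its coordinates are made up for the example, not drawn from Ark data.

```python
import math

def hyperbolic_distance(a, b):
    """Distance between points a = (r, theta) and b = (r, theta)
    in the hyperbolic plane, via the hyperbolic law of cosines."""
    (r1, t1), (r2, t2) = a, b
    dtheta = math.pi - abs(math.pi - abs(t1 - t2) % (2 * math.pi))
    arg = (math.cosh(r1) * math.cosh(r2)
           - math.sinh(r1) * math.sinh(r2) * math.cos(dtheta))
    return math.acosh(max(arg, 1.0))  # guard against float round-off

def greedy_route(graph, coords, src, dst):
    """Forward to whichever neighbor is closest (in hyperbolic
    distance) to the destination; no routing tables needed."""
    path = [src]
    while path[-1] != dst:
        here = path[-1]
        nxt = min(graph[here],
                  key=lambda n: hyperbolic_distance(coords[n], coords[dst]))
        if (hyperbolic_distance(coords[nxt], coords[dst])
                >= hyperbolic_distance(coords[here], coords[dst])):
            return None  # local minimum: greedy delivery failed
        path.append(nxt)
    return path

# A hub near the disk center (small r) and two peripheral nodes:
coords = {"A": (0.1, 0.0), "B": (2.0, 0.0), "C": (2.0, 3.0)}
graph = {"A": ["B", "C"], "B": ["A"], "C": ["A"]}
print(greedy_route(graph, coords, "B", "C"))  # ['B', 'A', 'C'] via the hub
```

The point of the geometry: node B holds no map of the network, yet forwarding toward the coordinate of C naturally routes through the well-connected hub.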

However, the researchers caution that actually implementing and deploying such a routing structure in the Internet might be as challenging as, if not more challenging than, discovering its hidden space. “There are many technical and non-technical issues to be resolved before the Internet map that we found would be the map that the Internet uses,” said Krioukov.

The research was in part funded by the National Science Foundation, along with Spain’s Direcção Geral de Ensino Superior (DGES), Generalitat de Catalunya, and by Cisco Systems. The Internet mapping paper as published in Nature Communications can be found here. The Physical Review E paper can be found here.

April 22, 2010

Is graphene pliable?

Looks like it’s more pliable than carbon nanotubes. This attribute is key to using the material in electronic devices such as actuators, valves in labs-on-a-chip and electronic paper.

From the link:

Physicists at UC San Diego and Boston University think so. In a paper published in the journal Physical Review B, the scientists say the propensity of graphene—a single layer of carbon atoms arranged in a honeycomb lattice—to stick to itself and form carbon “nanoscrolls” could be controlled electrostatically to form a myriad of new devices.

Unlike carbon nanotubes—cylindrical molecules of pure carbon with novel properties that have become the focus of much attention for new applications in electronics and materials development—nanoscrolls retain open edges and have no caps.

“As a result, nanoscrolls can change their shape and their inner and outer diameters, while nanotubes cannot,” said Michael Fogler, an associate professor of physics at UCSD and the first author of the paper.

March 3, 2010

Preserving digital knowledge

This is a much larger issue than most people realize, and I’ve blogged on this exact topic just over a month ago. The problem is that changing formats and advancing hardware can render data unreadable — not because the data file is corrupted (though that’s a real issue as well), but because there’s no device left that can access the data on an archaic storage medium. I agree with the opening sentence of this release — it is one of the most pressing challenges of the information age.

The release:

Blue Ribbon Task Force Report:
Preserving Our Digital Knowledge
Base Must be a Public Priority

Dollars Won’t Do It Alone: Deluge of Digital Data Needs Economically Sustainable Plans

February 26, 2010

By Jan Zverina

Addressing one of the most urgent societal challenges of the Information Age – ensuring that valued digital information will be accessible not just today, but in the future – requires solutions that are at least as much economic and social as technical, according to a new report by a Blue Ribbon Task Force.

The Final Report from the Blue Ribbon Task Force on Sustainable Digital Preservation and Access, called “Sustainable Economics for a Digital Planet: Ensuring Long-term Access to Digital Information”, is the result of a two-year effort focusing on the critical economic challenges of preserving an ever-increasing amount of information in a world gone digital. The full report is available online
at http://brtf.sdsc.edu/biblio/BRTF_Final_Report.pdf

“The Data Deluge is here. Ensuring that our most valuable information is available both today and tomorrow is not just a matter of finding sufficient funds,” said Fran Berman, vice president for research at Rensselaer Polytechnic Institute, and co-chair of the Task Force. “It’s about creating a ‘data economy’ in which those who care, those who will pay, and those who preserve are working in coordination.”

The challenge in preserving valuable digital information – consisting of text, video, images, music, sensor data, etc. generated throughout all areas of our society – is real and growing at an exponential pace. A recent study by the International Data Corporation (IDC) found that a total of 3,892,179,868,480,350,000,000 (roughly 3.9 sextillion, or 3.9 billion trillion) new digital information bits were created in 2008. In the future, the digital universe is expected to double in size every 18 months, according to the IDC report.
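A quick back-of-the-envelope check puts that bit count in more familiar units:

```python
bits_2008 = 3_892_179_868_480_350_000_000   # new bits created in 2008 (IDC)

bits_per_exabyte = 8 * 10**18               # an exabyte is 10^18 bytes
print(bits_2008 / bits_per_exabyte)         # ≈ 486.5 exabytes

# Doubling every 18 months compounds to roughly 10x every five years:
print(2 ** (60 / 18))                       # ≈ 10.1
```

So 2008’s output alone was nearly half a zettabyte, and at the projected doubling rate that figure grows about tenfold every five years.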

While much has been written on the digital preservation issue as a technical challenge, the Blue Ribbon Task Force report focuses on the economic aspect; i.e. how stewards of valuable, digitally-based information can pay for preservation over the longer term. The report provides general principles and actions to support long-term economic sustainability; context-specific recommendations tailored to specific scenarios analyzed in the report; and an agenda for priority actions and next steps, organized according to the type of decision maker best suited to carry that action forward. Moreover, the report is intended to serve as a foundation for further study in this critical area.

In addition to releasing its report, the Task Force earlier this month announced plans for a one-day symposium to provide a forum for discussion on economically sustainable digital preservation practices. The symposium, to be held April 1 in Washington D.C., will include a spectrum of national leaders from the Executive Office of the President of the United States, the Academy of Motion Picture Arts and Sciences, the Smithsonian Museum, Nature Magazine, Google, and other organizations for whom digital information is fundamental for success.

Value, Incentives, and Roles & Responsibilities
The report of the Blue Ribbon Task Force focuses on four distinct scenarios, each having ever-increasing amounts of preservation-worthy digital assets in which there is a public interest in long-term preservation: scholarly discourse, research data, commercially-owned cultural content (such as digital movies and music), and collectively-produced Web content (such as blogs).

“Valuable digital information spans the spectrum from official e-documents to some YouTube videos. No one economic model will cost-effectively support them all, but all require cost-effective economic models,” said Berman, who was director of the San Diego Supercomputer Center at the University of California, San Diego, before joining Rensselaer last year.

The report categorizes the economics of digital preservation into three “necessary conditions” closely aligned with the needs of stakeholders: recognizing the value of data and selecting materials for longer-term preservation; providing incentives for decision makers to preserve data directly or provide preservation services for others; and articulating the roles and responsibilities among those involved in the preservation process. The report further aligns those conditions with the basic economic principle of supply and demand, and warns that without well-articulated demand for access to preserved digital assets, there will be no supply of preservation services.

“Addressing the issues of value, incentives, and roles and responsibilities helps us understand who benefits from long-term access to digital materials, who should be responsible for preservation, and who should pay for it,” said Brian Lavoie, research scientist at OCLC and Task Force co-chair. “Neglecting to account for any of these conditions significantly reduces the prospects of achieving sustainable digital preservation activities over the long run.”

Task Force Recommendations
The Blue Ribbon panel report cites several specific recommendations for decision makers and stakeholders to consider as they seek economically sustainable preservation practices for digital information. While the report covers these recommendations in detail, below is a summary listing key areas of priority for near-term action:

Organizational Action

  • develop public-private partnerships, similar to ones formed by the Library of Congress
  • ensure that organizations have access to skilled personnel, from domain experts to legal and business specialists
  • create and sustain secure chains of stewardship between organizations over the long term
  • achieve economies of scale and scope wherever possible

Technical Action

  • build capacity to support stewardship in all areas
  • lower the costs of preservation overall
  • determine the optimal level of technical curation needed to create a flexible strategy for all types of digital material

Public Policy Action

  • modify copyright laws to enable digital preservation
  • create incentives and requirements for private entities to preserve on behalf of the public (financial incentives, handoff requirements)
  • sponsor public-private partnerships
  • clarify rights issues associated with Web-based materials

Education and Public Outreach Action

  • promote education and training for 21st century digital preservation (domain-specific skills, curatorial best practices, core competencies in relevant science, technology, engineering, and mathematics knowledge)
  • raise awareness of the urgency to take timely preservation actions

The report concluded that sustainable preservation strategies are not built all at once, nor are they static.

“The environment in which digital preservation takes place can be very dynamic,” said OCLC’s Brian Lavoie. “Priorities change, policies change, stakeholders change. A key element of a robust sustainability strategy is to anticipate the effect of these changes and take steps to minimize the risk that long-term preservation goals will be impacted by short-term disruptions in resources, incentives, and other economic factors. If we can do this, we will have gone a long way toward ensuring that society’s valuable digital content does indeed survive.”

About the Blue Ribbon Task Force on Sustainable Digital Preservation and Access
The Blue Ribbon Task Force on Sustainable Digital Preservation and Access was launched in late 2007 by the National Science Foundation and The Andrew W. Mellon Foundation, in partnership with the Library of Congress, the Joint Information Systems Committee of the United Kingdom, the Council on Library and Information Resources, and the National Archives and Records Administration. The Task Force was commissioned to explore the economic sustainability challenge of digital preservation and access. An interim report discussing the economic context for preservation, Sustaining the Digital Investment: Issues and Challenges of Economically Sustainable Digital Preservation, is available at the Task Force website, http://brtf.sdsc.edu. Please visit the website for more information about the Task Force and its upcoming symposium, called A National Conversation on the Economic Sustainability of Digital Information, to take place April 1, 2010 in Washington D.C. A similar symposium will be held in the United Kingdom on May 6, 2010, at the Wellcome Collection Conference Centre, in London. Space is limited so early registration is advised. More information is available at

http://www.jisc.ac.uk/whatwedo/programmes/preservation/BRTFUKSymposium.aspx

February 26, 2010

Active video games are good for the elderly

I’ve blogged about the utility of Nintendo’s Wii gaming system before, and here’s new research showing that exergames, that is, video games that combine gaming with exercise, can stave off depression in older adults. I love the Wii and heartily recommend the Wii Fit Plus for all ages as a fun way to engage in light to moderate exercise. It’s great for improving your core strength, muscle tone and balance. (Here’s a link to Amazon for the Wii Fit Plus with Balance Board.)

The release, from the second link:

Video games may help combat depression in older adults

IMAGE: Dilip V. Jeste, M.D., is a researcher at the University of California, San Diego.


Research at the Sam and Rose Stein Institute for Research on Aging at the University of California, San Diego School of Medicine suggests a novel route to improving the symptoms of subsyndromal depression (SSD) in seniors through the regular use of “exergames” – entertaining video games that combine game play with exercise. In a pilot study, the researchers found that use of exergames significantly improved mood and mental health-related quality of life in older adults with SSD.

The study, led by Dilip V. Jeste, MD, Distinguished Professor of psychiatry and neurosciences at UCSD School of Medicine, Estelle and Edgar Levi Chair in Aging, and director of the UC San Diego Sam and Rose Stein Institute for Research on Aging, appears in the March issue of the American Journal of Geriatric Psychiatry.

SSD is much more common than major depression in seniors, and is associated with substantial suffering, functional disability, and increased use of costly medical services. Physical activity can improve depression; however, fewer than five percent of older adults meet physical activity recommendations.

“Depression predicts nonadherence to physical activity, and that is a key barrier to most exercise programs,” Jeste said. “Older adults with depression may be at particular risk for diminished enjoyment of physical activity, and therefore, more likely to stop exercise programs prematurely.”

In the study, 19 participants with SSD ranging in age from 63 to 94 played an exergame on the Nintendo Wii video game system during 35-minute sessions, three times a week. After some initial instruction, they chose one of the five Nintendo Wii Sports games to play on their own – tennis, bowling, baseball, golf or boxing.

Using the Wii remote – a wireless device with motion-sensing capabilities – the seniors used their arm and body movements to simulate actions engaged in playing the actual sport, such as swinging the Wii remote like a tennis racket. The participants reported high satisfaction and rated the exergames on various attributes including enjoyment, mental effort, and physical limitations.

“The study suggests encouraging results from the use of the exergames,” Jeste said. “More than one-third of the participants had a 50-percent or greater reduction of depressive symptoms. Many had a significant improvement in their mental health-related quality of life and increased cognitive stimulation.”

Jeste said feedback revealed some participants started the study feeling nervous about how they would perform in the exergames and the technical aspects of game play. However, by the end of the study, most participants reported that learning and playing the videogames was satisfying and enjoyable.

“The participants thought the exergames were fun, they felt challenged to do better and saw progress in their game play,” Jeste said. “Having a high level of enjoyment and satisfaction, and a choice among activities, exergames may lead to sustained exercise in older adults.” He cautioned, however, that the findings were based on a small study, and needed to be replicated in larger samples using control groups. He also stressed that exergames carry potential risks of injury, and should be practiced with appropriate care.

###

Additional authors include Dori Rosenberg, Jennifer Reichstadt, Jacqueline Kerr and Greg Norman, UCSD Department of Family and Preventive Medicine; and Colin A. Depp, Ipsit V. Vahia and Barton W. Palmer, UCSD Department of Psychiatry.

The study was funded in part by grants from the National Institute of Mental Health, the UCSD Sam and Rose Stein Institute for Research on Aging, and the Department of Veterans Affairs.

December 14, 2009

Thirty-four gigabytes of data

Filed under: et.al., Media, Technology — David Kirkpatrick @ 4:56 pm

That’s how much the average American personally consumes each day. Information in the form of data-rich video gets the lion’s share of the blame.

From the link:

An average American digests a whopping 34 gigabytes of information outside of work every day, according to a new study from the University of California, San Diego. The UCSD researchers estimate we each ingest about 100,500 words daily from various forms of media. In all, it’s about 350 percent more data than we were swallowing down just three decades ago.
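A back-of-the-envelope calculation shows why video, not words, accounts for nearly all of those bytes. The bytes-per-word figure below is a rough assumption (about five characters plus a space):

```python
gb_per_day = 34                  # average per-person daily consumption
words_per_day = 100_500
bytes_per_word = 6               # rough assumption: ~5 characters plus a space

text_bytes = words_per_day * bytes_per_word
total_bytes = gb_per_day * 10**9
print(text_bytes / total_bytes)  # ≈ 1.8e-05: text is a tiny sliver of the bytes
```

All 100,500 words come to well under a megabyte; the other 33.99+ gigabytes are almost entirely audiovisual.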

November 21, 2009

Carbon nanotube supercapacitors

Flawed carbon nanotubes may lead to supercapacitors.

From the link:

Most people would like to be able to charge their cell phones and other personal electronics quickly and not too often. A recent discovery made by UC San Diego engineers could lead to carbon nanotube-based supercapacitors that could do just this.

In recent research, published in , Prabhakar Bandaru, a professor in the UCSD Department of Mechanical and Aerospace Engineering, along with graduate student Mark Hoefer, has found that artificially introduced defects in nanotubes can aid the development of supercapacitors.

“While batteries have large capacity, they take a long time to charge, while electrostatic capacitors can charge quickly but typically have limited capacity. However, supercapacitors/electrochemical capacitors incorporate the advantages of both,” Bandaru said.
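The trade-off Bandaru describes can be sketched with the textbook relations E = ½CV² and a charge time of a few RC time constants. The component values below are invented for illustration; they are not from the research.

```python
def stored_energy_j(capacitance_f, voltage_v):
    """Energy stored in any capacitor: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

def charge_time_s(resistance_ohm, capacitance_f):
    """Roughly 99% charged after about five RC time constants."""
    return 5 * resistance_ohm * capacitance_f

# Invented example values, for illustration only:
print(stored_energy_j(100e-6, 5.0))   # conventional capacitor: ≈ 0.00125 J
print(stored_energy_j(100.0, 2.7))    # supercapacitor: ≈ 364.5 J
print(charge_time_s(0.01, 100.0))     # supercapacitor: fully charged in ≈ 5 s
```

The supercapacitor here stores hundreds of thousands of times more energy than the conventional capacitor, yet still charges in seconds rather than the hours a battery needs.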

Of course, I mostly ran this post as an excuse to feature this awesome image of a carbon nanotube. Earlier this week I featured an incredible image of graphene. We’re getting some simply amazing looks into the atomic world right now. And it’ll only get better.

Carbon nanotubes could serve as supercapacitor electrodes with enhanced charge and energy storage capacity (inset: a magnified view of a single carbon nanotube).

Credit: UC San Diego

October 30, 2008

A molecular clock

This is a cool story from Technology Review.

From the link:

A Fast, Programmable Molecular Clock

The bacteria-based timepiece could be used as a biosensor for changing environmental conditions.
Wednesday, October 29, 2008
By Emily Singer

UC San Diego bioengineers have created the first stable, fast, and programmable genetic clock that reliably keeps time by the blinking of fluorescent proteins inside E. coli cells. The clock’s blink rate changes when the temperature, energy source, or other environmental conditions change. Shown here is a microfluidic system capable of controlling the environmental conditions of the E. coli cells with great precision–one of the keys to this advance.
Credit: UC San Diego Jacobs School of Engineering

A molecular timepiece that ticks away the time with a flash of fluorescent protein could provide the basis for novel biosensors. The clock, or synthetic gene oscillator, is a feat of synthetic biology–a fledgling field in which researchers engineer novel biological “parts” into organisms.

To create the clock, scientists genetically engineered a molecular oscillator composed of multiple gene promoters, which turn genes on in the presence of certain chemicals, and genes themselves, one of which codes for a fluorescent protein. When expressed in E. coli bacteria, the feedback system turns the fluorescent gene on and off at regular intervals.
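The blinking rests on delayed negative feedback: a protein that represses its own production, with a delay, perpetually overshoots and undershoots instead of settling. Here is a minimal Python sketch of that design principle; the model and its parameters are illustrative, not the published E. coli circuit.

```python
def simulate(beta=10.0, n=4, tau=3.0, dt=0.01, t_end=80.0):
    """Euler-integrate dx/dt = beta / (1 + x(t - tau)^n) - x:
    production repressed by the protein's own delayed level,
    plus first-order decay."""
    steps = int(t_end / dt)
    delay = int(tau / dt)
    x = [0.1] * (delay + 1)          # history buffer for the delayed term
    for _ in range(steps):
        x_delayed = x[-delay - 1]
        dx = beta / (1.0 + x_delayed ** n) - x[-1]
        x.append(x[-1] + dt * dx)
    return x

trace = simulate()
half = trace[len(trace) // 2:]       # discard the initial transient
blinks = sum(1 for a, b in zip(half, half[1:]) if a < 2.0 <= b)
print(blinks)                        # several upward threshold crossings
```

Because repression reacts to where the protein level *was* rather than where it is, the level keeps cycling between high and low, and each cycle is one "blink" of the reporter.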

May 15, 2008

Nanowire solar cells and black holes

From KurzweilAI.net: nanotech that may boost solar efficiency, and black holes that may have an escape hatch of sorts.

Nanowires may boost solar cell efficiency, engineers say
PhysOrg.com, May 14, 2008

University of California, San Diego electrical engineers have created experimental solar cells spiked with nanowires that could lead to highly efficient thin-film solar cells of the future.

 

Physicists Demonstrate How Information Can Escape From Black Holes
PhysOrg.com, May 14, 2008

Physicists at Penn State and the Raman Research Institute in India have discovered a mechanism by which information can be recovered from black holes.

They suggest that singularities do not exist in the real world. “Information only appears to be lost because we have been looking at a restricted part of the true quantum-mechanical space-time,” said Madhavan Varadarajan, a professor at the Raman Research Institute. “Once you consider quantum gravity, then space-time becomes much larger and there is room for information to reappear in the distant future on the other side of what was first thought to be the end of space-time.”

 