David Kirkpatrick

September 12, 2010

Mapping the internet

Research out of the San Diego Supercomputer Center and Cooperative Association for Internet Data Analysis (CAIDA) at the University of California, San Diego, in collaboration with the Universitat de Barcelona in Spain and the University of Cyprus.

The release, ahem, article:

SDSC Collaboration Aims to Create First Accurate Geometric Map of the Internet

September 09, 2010

By Jan Zverina

The San Diego Supercomputer Center and Cooperative Association for Internet Data Analysis (CAIDA) at the University of California, San Diego, in a collaboration with researchers from Universitat de Barcelona in Spain and the University of Cyprus, have created the first geometric “atlas” of the Internet as part of a project to prevent our most ubiquitous form of communication from collapsing within the next decade or so.

In a paper published this week in Nature Communications, CAIDA researcher Dmitri Krioukov, along with Marián Boguñá (Universitat de Barcelona) and Fragkiskos Papadopoulos (University of Cyprus), describe how they discovered a latent hyperbolic, or negatively curved, space hidden beneath the Internet’s topology, leading them to devise a method to create an Internet map using hyperbolic geometry. In their paper, Sustaining the Internet with Hyperbolic Mapping, the researchers say such a map would lead to a more robust Internet routing architecture because it simplifies path-finding throughout the network.

“We compare routing in the Internet today to using a hypothetical road atlas, which is really just a long encoded list of road intersections and connections that would require drivers to pore through each line to plot a course to their destination without using any geographical, or geometrical, information which helps us navigate through the space in real life,” said Krioukov, principal investigator of the project.

Now imagine that a road – or in the case of the Internet, a connection – is closed for some reason and there is no geographical atlas to plot a new course, just a long list of connections that need to be updated. “That is basically how routing in the Internet works today – it is based on a topological map that does not take into account any geometric coordinates in any space,” said Krioukov, who, with his colleagues at CAIDA, has been managing a project called Archipelago, or Ark, that constantly monitors the topology of the Internet, or the structure of its interconnections.

Like many experts, however, Krioukov is concerned that existing Internet routing, which relies on only this topological information, is not really sustainable. “It is very complicated, inefficient, and difficult to scale to the rapidly growing size of the Internet, which is now accessed by more than a billion people each day. In fact, we are already seeing parts of the Internet become intermittently unreachable, sinking into so-called black holes, which is a clear sign of instability.”

Krioukov and his colleagues have developed an in-depth theory that uses hyperbolic geometry to describe a negatively curved shape of complex networks such as the Internet. This theory appears in the paper Hyperbolic Geometry of Complex Networks, published in Physical Review E today. In their Nature Communications paper, the researchers employ this theory, Ark’s data, and statistical inference methods to build a geometric map of the Internet. They show that routing using such a map would be superior to the existing routing, which is based on pure topology.

Instead of perpetually accessing and rebuilding a reference list of all available network paths, each router in the Internet would know only its hyperbolic coordinates and the coordinates of its neighbors so it could route in the right direction, only relaying the information to its closest neighbor in that direction, according to the researchers. Known as “greedy routing”, this process would dramatically increase the overall efficiency and scalability of the Internet. “We believe that using such a routing architecture based on hyperbolic geometry will create the best possible levels of efficiency in terms of speed, accuracy, and resistance to damage,” said Krioukov.
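
To make the greedy idea concrete, here is a toy sketch (my own illustration, not code from the paper — the node names, coordinates and links are invented). Each node knows only its own polar coordinates (r, θ) in the hyperbolic plane and those of its neighbors, and hands a packet to whichever neighbor sits closest to the destination under the hyperbolic law of cosines:

```python
import math

def hyperbolic_distance(a, b):
    """Distance between two points given as (r, theta) in native polar
    coordinates of the hyperbolic plane, via the hyperbolic law of cosines."""
    r1, t1 = a
    r2, t2 = b
    dtheta = math.pi - abs(math.pi - abs(t1 - t2) % (2 * math.pi))
    x = (math.cosh(r1) * math.cosh(r2)
         - math.sinh(r1) * math.sinh(r2) * math.cos(dtheta))
    return math.acosh(max(x, 1.0))

def greedy_route(graph, coords, src, dst):
    """Each hop forwards to the neighbor closest to the destination.
    Returns the path, or None if a node has no neighbor strictly closer
    than itself (a local minimum, i.e. greedy delivery fails)."""
    path, current = [src], src
    while current != dst:
        next_hop = min(graph[current],
                       key=lambda n: hyperbolic_distance(coords[n], coords[dst]))
        if (hyperbolic_distance(coords[next_hop], coords[dst])
                >= hyperbolic_distance(coords[current], coords[dst])):
            return None
        path.append(next_hop)
        current = next_hop
    return path

# Toy three-node network with made-up coordinates.
coords = {"A": (0.5, 0.0), "B": (1.0, 1.0), "C": (1.5, 2.0)}
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
print(greedy_route(graph, coords, "A", "C"))  # ['A', 'B', 'C']
```

If a node has no neighbor closer to the destination than itself, the packet is stuck at a local minimum and greedy delivery fails; the researchers’ pitch is that a well-fitted hyperbolic map keeps such dead ends rare.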

However, the researchers caution that actually implementing and deploying such a routing structure in the Internet might be as challenging as, if not more challenging than, discovering its hidden space. “There are many technical and non-technical issues to be resolved before the Internet map that we found would be the map that the Internet uses,” said Krioukov.

The research was funded in part by the National Science Foundation, along with Spain’s Direcção Geral de Ensino Superior (DGES), Generalitat de Catalunya, and by Cisco Systems. The Internet mapping paper as published in Nature Communications can be found here. The Physical Review E paper can be found here.

September 3, 2010

Balancing national security and privacy on the internet

An interesting breakdown on the current state of online privacy versus national security.

From the link:

In the wake of revelations that the US military network was compromised in 2008, and that US digital interests are under a relatively constant threat of attack, the Pentagon is establishing new cyber security initiatives to protect the Internet. The Pentagon strategy–which is part digital NATO, part digital civil defense, and part Big Brother–may ruffle some feathers and raise concerns that the US Internet is becoming a military police state.

The mission of the United States Department of Defense is to provide military forces needed to deter war and protect the security of the nation. The scope of that mission includes emerging threats and the need to deter cyber war and protect the digital security of the nation as well. To fulfill that mission in an increasingly connected world, and with a rising threat of digital attack, the Pentagon wants to expand its sphere of influence.

This really is a tough issue. Certainly you want the nation to be safe, but at the same time the internet is largely a borderless “pseudo-nation” and clamping down too hard — not unlike the great firewall of China — can stifle much of what makes the net great. No easy answers here, but dramatically increasing the power of the government — particularly the military — over the private sector is not an acceptable solution.

August 18, 2010

The long arm of the internet reaches 5B devices

Filed under: Business, Media, Technology — David Kirkpatrick @ 1:28 pm

Yes, that header is correct — this month will see the five billionth device connected to the world via the internet. Something to think about there. From the early days of ARPANET up to today’s World Wide Web full of commercialization, social media, viral video and everything else you can track down in the online world, human communication has gone through an honest revolution. A revolution I doubt very many of us would want to see rolled back.

From the first link:

Sometime this month, the 5 billionth device will plug into the Internet. And in 10 years, that number will grow by more than a factor of four, according to IMS Research, which tracks the installed base of equipment that can access the Internet.

On the surface, this second tidal wave of growth will be driven by cell phones and new classes of consumer electronics, according to an IMS statement. But an even bigger driver will be largely invisible: machine-to-machine communications in various kinds of smart grids for energy management, surveillance and public safety, traffic and parking control, and sensor networks.

Earlier this year, Cisco forecast equally steep growth rates in personal devices and overall Internet traffic. [See “Global IP traffic to increase fivefold by 2013, Cisco predicts“]

Today, there are over 1 billion computers that regularly connect to the Internet. That class of devices, including PCs and laptops and their associated networking gear, continues to grow.

July 9, 2010

Humanity as a giant superorganism

Filed under: et.al., Media, Technology — David Kirkpatrick @ 1:21 pm

Via KurzweilAI.net — Are we turning into the Borg? (just kidding, there.)

Technology is weaving humans into electronic webs that resemble big brains — corporations, online hobby groups, far-flung N.G.O.s, suggests author Robert Wright. “And I personally don’t think it’s outlandish to talk about us being, increasingly, neurons in a giant superorganism; certainly an observer from outer space, watching the emergence of the Internet, could be excused for looking at us that way…. If we don’t use technology to weave people together and turn our species into a fairly unified body, chaos will probably engulf the world — because technology offers so much destructive power that a sharply divided human species can’t flourish.”

March 15, 2010

Sorry, but this is just stupid

Filed under: et.al., Media, Politics, Technology — David Kirkpatrick @ 5:32 pm

Sounds like the Norwegians are looking to save on prize money with this nomination.

From the link:

The Norwegian Nobel Institute yesterday announced there are 237 nominees for the 2010 Nobel Peace Prize. Though the institute doesn’t normally disclose who made the list, an official did confirm to Computerworld that it includes the Internet.

That’s right. The Internet was nominated for the illustrious prize by the Italian edition of Wired magazine, according to the institute.

December 8, 2009

Google goes real-time …

… by adding Twitter and Friendfeed to search results, plus making updates in seconds rather than minutes.

From the link:

It seems that Google’s recent deal with Twitter is already bearing fruit. Today, Google announced that in response to English searches it will now return a “latest results” section that will include posts from Twitter and Friendfeed, along with seconds-old headlines from newspapers and blogs.

It’ll be interesting to see how well this content will supplement Google’s regular results, which change at a much slower pace. There isn’t much room in a 140-character Twitter post to provide the context that search engines typically use to judge relevancy.

October 30, 2009

A cloud computing primer

Filed under: Politics, Technology — David Kirkpatrick @ 1:05 pm

I’ve done plenty of blogging about cloud computing, but as the buzzword gets more and more mainstream, more people become curious. This article lays out the basics, pros and cons of cloud computing for anyone looking for a quick primer.

From the second link:

What exactly are we talking about? The “cloud” is an IT term for the Internet, and cloud computing, or cloud integration, means storing and having access to your computer data and software on the Internet, rather than running it on your personal computer or office server. In fact, if you use programs such as Gmail or Google docs (GOOG), you may not realize you are already doing cloud computing.

Part of the confusion is that the terminology is rather vaporous, particularly for non-tech-savvy types, including many small business owners. And it does represent a major shift in how businesses and individuals use and store digital information. We’ll go through some pros and cons that may help you decide whether this is right for your firm.

October 26, 2009

Happy birthday web browser

Well, technically, happy birthday almost two weeks ago on October 13. The browser turns 15. Yep, the web browser — that digital tool so old it’s losing teeth and has hair growing out its ears — couldn’t even get a driver’s license if it were a person. Innovation is fast and furious and little things like this bring that point home every once in a while.

First came ARPANET back in the late 1960s, which led to the internet, which in turn led to the more user-friendly subset known as the World Wide Web, with its easy-to-use GUIs and the dawn of the age of the web browser. And now we’re about to be browsing sites written in HTML5.

From the very first link:

The Web browser turns 15 on Oct. 13, 2009 — a key milestone in the history of the Internet. That’s when the first commercial Web browser — eventually called Netscape Navigator — was released as beta code. While researchers including World Wide Web inventor Tim Berners-Lee and a team at the National Center for Supercomputing Applications created Unix browsers between 1991 and 1994, Netscape Navigator made this small piece of desktop software a household name. By allowing average users to view text and images posted on Web sites, Netscape Navigator helped launch the Internet era along with multiple browser wars, government-led lawsuits and many software innovations.

September 29, 2009

Congress, the federal government and internet security

Filed under: Media, Politics, Technology — David Kirkpatrick @ 10:19 pm

I’m sympathetic to the reality of cyberattacks against the government, but I’m guessing it’s needless to say I’m against any form of government control over internet traffic.

From the link:

There is no kill switch for the Internet, no secret on-off button in an Oval Office drawer.

Yet when a Senate committee was exploring ways to secure computer networks, a provision to give the president the power to shut down Internet traffic to compromised Web sites in an emergency set off alarms.

Corporate leaders and privacy advocates quickly objected, saying the government must not seize control of the Internet.

Lawmakers dropped it, but the debate rages on. How much control should federal authorities have over the Web in a crisis? How much should be left to the private sector? It does own and operate at least 80 percent of the Internet and argues it can do a better job.

“We need to prepare for that digital disaster,” said Melissa Hathaway, the former White House cybersecurity adviser. “We need a system to identify, isolate and respond to cyberattacks at the speed of light.”

So far at least 18 bills have been introduced as Congress works carefully to give federal authorities the power to protect the country in the event of a massive cyberattack. Lawmakers do not want to violate personal and corporate privacy or squelch innovation. All involved acknowledge it isn’t going to be easy.

September 1, 2009

The internet turns forty

People carry on about how it’s well past the year 2000 and just exactly where is the future we all imagined — flying cars, jet packs, the works.

Well, think about what someone from 1985 would say about pretty much everyone carrying tiny devices that combine cordless phones, mini-televisions, the internet, etc. Put their jaw back in place and sit them down at a hoary old desktop computer with a broadband connection. The computer might look somewhat similar, but even a user of the internet (probably a scientist or academic) from that year would be bowled over by the sheer volume of information, rich media and connectivity available today.

From the link:

Goofy videos weren’t on the minds of Len Kleinrock and his team at UCLA when they began tests 40 years ago on what would become the Internet. Neither was social networking, for that matter, nor were most of the other easy-to-use applications that have drawn more than a billion people online.

Instead the researchers sought to create an open network for freely exchanging information, an openness that ultimately spurred the innovation that would later spawn the likes of YouTube, Facebook and the World Wide Web.

There’s still plenty of room for innovation today, yet the openness fostering it may be eroding. While the Internet is more widely available and faster than ever, artificial barriers threaten to constrict its growth.

Call it a mid-life crisis.

A variety of factors are to blame. Spam and hacking attacks force network operators to erect security firewalls. Authoritarian regimes block access to many sites and services within their borders. And commercial considerations spur policies that can thwart rivals, particularly on mobile devices like the iPhone.

“There is more freedom for the typical Internet user to play, to communicate, to shop — more opportunities than ever before,” said Jonathan Zittrain, a law professor and co-founder of Harvard’s Berkman Center for Internet & Society. “On the worrisome side, there are some longer-term trends that are making it much more possible (for information) to be controlled.”

Few were paying attention back on Sept. 2, 1969, when about 20 people gathered in Kleinrock’s lab at the University of California, Los Angeles, to watch as two bulky computers passed meaningless test data through a 15-foot gray cable.

February 15, 2009

The internet and social research

I’ve blogged on this topic in the past, and I find the idea and practice of using the World Wide Web for research purposes very interesting. It seems there would be some significant hurdles in terms of scientific rigor, but it’s still pretty cool and quite possibly a very powerful tool in the social research toolbox.

A release from yesterday:

Internet emerges as social research tool

Panel discusses use of the Web in social science study

[IMAGE: Thomas Dietz, director of the Environmental Science and Policy Program and assistant vice president for environmental research, Michigan State University]

CHICAGO — For the past two decades, the Internet has been used by many as an easy-to-use tool that enables the spread of information globally. Increasingly, the Web is moving beyond its use as an electronic “Yellow Pages” and online messaging platform to a virtual world where social interaction and communities can inform social science and its applications in the real world.

“Although social scientists, engineers and physical scientists have studied the World Wide Web as an entity in and of itself for some time, there is now a growing group of social scientists who are learning how to use the World Wide Web as a tool for research rather than as a subject of research,” said Thomas Dietz, Michigan State University researcher and director of the university’s Environmental Science and Policy Program.

Today, at the American Association for the Advancement of Science annual meeting in Chicago, a panel of scientists organized by Dietz planned to examine various aspects of using the World Wide Web as a tool for research.

University of Michigan political science professor Arthur Lupia was to kick off the session by discussing how new virtual communities are improving surveys and transforming social science.

“Lupia is one of the world’s leaders related to survey research on the Web,” Dietz said. “His focus is on learning to use the Web as a way of soliciting people’s opinions and getting factual information from them via online surveys.”

Adam Henry, a doctoral fellow in the Sustainability Science Program at Harvard University’s Center for International Development, was scheduled next to discuss measuring social networks using the World Wide Web.

“Henry is developing very innovative ways to identify networks that are actual face-to-face relationships by tracking evidence streams on the Web,” Dietz said. “In other words, it’s not simply about who’s connected to whom on Facebook or Twitter, but who’s doing research with whom in the real world. It’s using the virtual world to identify things that are going on in the real world rather than using the virtual world simply to look at the virtual world.”

William Bainbridge, program director for the National Science Foundation’s Human-Centered Computing Cluster, was to round out the presentation with a discussion on the role of social science in creating virtual worlds.

“Bainbridge is studying group formation and social change over time in virtual worlds such as ‘World of Warcraft’ and ‘Second Life’ to inform and build on what sociologists have studied for 150 years,” Dietz said. “He contends that virtual worlds are excellent laboratories for observing and prototyping new social forms that can later be applied to the outside world.”

Following the presentations, National Science Foundation sociology director Patricia White was to discuss implications of this research related to the future of social science.

###

– by Val Osowski

Michigan State University has been advancing knowledge and transforming lives through innovative teaching, research and outreach for more than 150 years. MSU is known internationally as a major public university with global reach and extraordinary impact. Its 17 degree-granting colleges attract scholars worldwide who are interested in combining education with practical problem solving.

For MSU news on the Web, go to news.msu.edu.

November 18, 2008

NASA tests new deep space communication

Successfully, and based on the structure of the internet. Pretty cool stuff.

The release from about ten minutes ago:

NASA Successfully Tests First Deep Space Internet

PASADENA, Calif., Nov. 18 /PRNewswire-USNewswire/ — NASA has successfully tested the first deep space communications network modeled on the Internet.

Working as part of a NASA-wide team, engineers from NASA’s Jet Propulsion Laboratory in Pasadena, Calif., used software called Disruption-Tolerant Networking, or DTN, to transmit dozens of space images to and from a NASA science spacecraft located about 20 million miles from Earth.

“This is the first step in creating a totally new space communications capability, an interplanetary Internet,” said Adrian Hooke, team lead and manager of space-networking architecture, technology and standards at NASA Headquarters in Washington.

NASA and Vint Cerf, a vice president at Google Inc., in Mountain View, Calif., partnered 10 years ago to develop this software protocol. The DTN sends information using a method that differs from the normal Internet’s Transmission-Control Protocol/Internet Protocol, or TCP/IP, communication suite, which Cerf co-designed.

The Interplanetary Internet must be robust to withstand delays, disruptions and disconnections in space. Glitches can happen when a spacecraft moves behind a planet, or when solar storms and long communication delays occur. The delay in sending or receiving data from Mars ranges from three-and-a-half to 20 minutes at the speed of light.
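
That range squares roughly with a back-of-the-envelope light-travel calculation (the 55 and 400 million km figures below are my ballpark Earth–Mars distances at closest approach and near conjunction, not numbers from the release):

```python
# One-way light-travel time between Earth and Mars at two rough distances.
SPEED_OF_LIGHT_KM_S = 299_792.458

for distance_km in (55e6, 400e6):  # ~closest approach and ~near conjunction
    minutes = distance_km / SPEED_OF_LIGHT_KM_S / 60
    print(f"{distance_km / 1e6:.0f} million km -> {minutes:.1f} minutes one way")
# roughly 3.1 and 22.2 minutes, in line with the quoted range
```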

Unlike TCP/IP on Earth, the DTN does not assume a continuous end-to-end connection. In its design, if a destination path cannot be found, the data packets are not discarded. Instead, each network node keeps the information as long as necessary until it can communicate safely with another node. This store-and-forward method, similar to basketball players safely passing the ball to the player nearest the basket, means information does not get lost when no immediate path to the destination exists. Eventually, the information is delivered to the end user.
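
Here is a minimal sketch of that store-and-forward behavior (illustrative only — this is not JPL’s DTN software, and the node names and link table are made up): a bundle addressed elsewhere waits in a local queue whenever no contact is available, rather than being dropped the way an IP router drops packets it cannot route.

```python
from collections import deque

class DTNNode:
    """Toy store-and-forward node: bundles addressed elsewhere wait in a
    local queue until a contact is available, instead of being dropped."""

    def __init__(self, name):
        self.name = name
        self.stored = deque()   # bundles waiting for a usable link
        self.delivered = []     # bundles addressed to this node

    def receive(self, bundle):
        if bundle["dst"] == self.name:
            self.delivered.append(bundle)
        else:
            self.stored.append(bundle)  # keep custody, do not discard

    def forward(self, links):
        """Try to pass stored bundles along currently available links.
        `links` maps a destination to a next-hop DTNNode and may be empty
        while a spacecraft is occulted or a solar storm cuts the link."""
        waiting = deque()
        while self.stored:
            bundle = self.stored.popleft()
            next_hop = links.get(bundle["dst"])
            if next_hop is None:
                waiting.append(bundle)      # no contact yet, keep storing
            else:
                next_hop.receive(bundle)    # contact window: hand it off
        self.stored = waiting

lander, orbiter, earth = DTNNode("lander"), DTNNode("orbiter"), DTNNode("earth")
lander.receive({"dst": "earth", "data": "image-001"})  # originate a bundle
lander.forward({})                   # no contact: the bundle just waits
lander.forward({"earth": orbiter})   # contact window opens: relay via orbiter
orbiter.forward({"earth": earth})    # orbiter later forwards it to Earth
print(earth.delivered)               # [{'dst': 'earth', 'data': 'image-001'}]
```

Real DTN bundles carry custody transfer, lifetimes and much more; the point here is just that a broken contact stalls delivery rather than killing it.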

“In space today, an operations team must manually schedule each link and generate all the commands to specify which data to send, when to send it, and where to send it,” said Leigh Torgerson, manager of the DTN Experiment Operations Center at JPL. “With standardized DTN, this can all be done automatically.”

Engineers began a month-long series of DTN demonstrations in October. Data were transmitted using NASA’s Deep Space Network in demonstrations occurring twice a week. Engineers use NASA’s Epoxi spacecraft as a Mars data-relay orbiter. Epoxi is on a mission to encounter Comet Hartley 2 in two years. There are 10 nodes on this early interplanetary network. One is the Epoxi spacecraft itself and the other nine, which are on the ground at JPL, simulate Mars landers, orbiters and ground mission-operations centers.

This month-long experiment is the first in a series of planned demonstrations to qualify the technology for use on a variety of upcoming space missions. In the next round of testing, a NASA-wide demonstration using new DTN software loaded on board the International Space Station is scheduled to begin next summer.

In the next few years, the Interplanetary Internet could enable many new types of space missions. Complex missions involving multiple landed, mobile and orbiting spacecraft will be far easier to support through the use of the Interplanetary Internet. It also could ensure reliable communications for astronauts on the surface of the moon.

The Deep Impact Networking Experiment is sponsored by the Space Communications and Navigation Office in NASA’s Space Operations Mission Directorate in Washington. NASA’s Science Mission Directorate and Discovery Program in Washington provided experimental access to the Epoxi spacecraft. The Epoxi mission team provided critical support throughout development and operations.

Source: NASA

Web Site:  http://www.nasa.gov/

October 30, 2008

Data mining internet users

The release from The Netherlands Organisation for Scientific Research:

Two Dutch researchers analyse striking behaviour of websurfers

27 October 2008

What behaviour do website visitors exhibit? Do they buy a specific product mainly on Mondays? Do they always return at a certain time of day? Being able to recognise and make use of such patterns is lucrative business for companies. Edgar de Graaf discovered that interesting patterns often contain a time aspect. Jeroen De Knijf developed methods to detect relevant patterns quicker.

In subject jargon it is called data mining: looking for interesting relationships within large quantities of data. Many data-mining programs produce a flood of potentially interesting patterns: as a user, how can you then find what you are looking for? Furthermore, the files are not always set up for such search actions, as is the case on the Internet or for instance in bioinformatics. It usually concerns semi-structured files: they often contain, for example, hyperlinks to other files, and contain (partial) information in a range of formats, such as text, images and sound.

MISTA project

Edgar de Graaf and Jeroen De Knijf both worked within the NWO-funded MISTA project (Mining in Semi-Structured Data) on methods to find patterns more quickly and effectively within large quantities of semi-structured data. De Graaf discovered that some patterns are interesting because they occur in quick succession. Other patterns are striking because, for example, they occur weekly. According to De Graaf, this time aspect merits further investigation.

The patterns can best be presented visually so that the user can find the information sought at a single glance. To realise this, De Graaf described various ways of presenting different types of information.

Wikipedia compressed

De Knijf demonstrated that the number of patterns can be drastically reduced by allowing the user to indicate in advance the minimum requirements that a pattern must satisfy. This allows the data-mining program to find the interesting patterns much faster.
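
De Knijf’s methods operate on semi-structured data, but the underlying idea — let the user state a minimum requirement up front so candidate patterns are pruned during the search instead of filtered afterwards — is the same one behind classic frequent-itemset mining. A hedged, Apriori-style sketch (my illustration, not the MISTA code):

```python
from itertools import combinations

def frequent_patterns(transactions, min_support):
    """Apriori-style sketch: only itemsets meeting the user-supplied minimum
    support survive each level, so the search space is pruned as it grows."""
    items = {i for t in transactions for i in t}
    current = [frozenset([i]) for i in items]
    frequent = {}
    while current:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        kept = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(kept)
        # next-level candidates are built only from patterns that survived
        current = list({a | b for a, b in combinations(list(kept), 2)
                        if len(a | b) == len(a) + 1})
    return frequent

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
print(frequent_patterns(transactions, min_support=3))
# only the single items reach support 3; no pair does, so the search stops early
```

Raising the minimum support shrinks both the output and the work: an itemset that fails the threshold can never have a frequent superset, so whole branches of the search are never generated.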

A second method De Knijf devised to reduce the number of results is the compression of the entire collection of documents (for example, Wikipedia pages) into a single document. By building accurate models that only make use of the compressed document, De Knijf was able to demonstrate that this summary does indeed contain the essential information from the entire collection.

The research was funded from the Open Competition 2003 of NWO Physical Sciences. 

September 22, 2008

The internet is not making us more dumb

Filed under: Media, Technology — David Kirkpatrick @ 1:48 pm

From KurzweilAI.net — this article directly challenges Nicholas Carr’s Atlantic piece from the July/August issue that opined the internet is dumbing down its users. I blogged on that bit here.

I fall on the side of this New York Times article. Google, the internet in general and other online tools are a great boon, not a hindrance for users. Collectively all these powerful applications and databases are liberating.

But don’t get me started on “text speak.”

Technology Doesn’t Dumb Us Down. It Frees Our Minds.
New York Times, Sep. 20, 2008

Over the course of human history, writing, printing, computing and Googling have only made it easier to think and communicate, says Times writer Damon Darlin, challenging an article in The Atlantic magazine called “Is Google Making Us Stupid?”

August 14, 2008

Wikipedia to enter web search space

Filed under: Business, Media, Technology — David Kirkpatrick @ 6:12 pm

Currently 90% of all web searches are conducted through Google, Yahoo and Microsoft. Jimmy Wales, founder of Wikipedia, wants to broaden the search marketplace, and take on some internet giants in the process.

From the PhysOrg.com link:

Wales said Wikia Search will run on an open platform, similar to the principles behind Wikipedia, the popular online encyclopedia in which entries can be made and edited by anyone with an Internet connection.

“All of the existing search engines are proprietary black boxes,” said Wales. “You have no idea how things are ranked and what’s going on.”

With Wikia Search, users “can participate in meaningful ways” when they browse the Internet, he said.

February 22, 2008

Rotten Neighbor dot com

Filed under: et.al., Media — David Kirkpatrick @ 3:28 pm

I’m with Boing Boing’s Mark Frauenfelder. Nothing good can possibly come of Rotten Neighbor dot com.

Here are some sample posts chosen by Mark (he stripped the names):

hot neighbor, San Diego, CA

hot guy ignores me. i tried being sweet. i tried being assertive. i tried downright full on suduction. he just blows off my advances like he dose not even know i am hitting on him, but really i know he is just mentally torturing me, that is why i am warning you all. i should just get over him but he is sooo hot!

The **** Family, Frisco, TX

**** and **** need to leave Frisco with their tail between their legs. they have caused two homeowners to go into foreclosure because they won’t pay the rent. the people have caused many people heartache and frisco would be better off without them. Go back to Tarrant County **** ****!

drug addicts, Akron, OH

a guy named **** lives here with a poor excuse for a spouse & 2 unfortunate children. he’s a bad meth abuser & smokes crack. keep your children away from this house

Horrible Neighbors Everywhere, Glastonbury Center, CT

Too many whining and complaining *****y neighbors throughout building. They need to mind their own business and go about their lives… Seriously, if you don’t like condo living then get the f— out!

noisey during sex, Honolulu, HI

there is some lady in my building who has loud sex on saturday afternoons. she lives on the pool side of our building and it is embarrassing hearing her loud moans while the kids are down at the pool.

mean old lady with no life!!, Austin, TX

this women has nothing better to do then yell at myself and y famil, thinks she owns the block, tries to get us in trouble with the police, city hall, etc. you name it she has tried it!! the women has no life, hateful, nasty, even does things to little kids!! physco!! we are her whole mission in life and that is really sad!!

home wreaker

a 22 year b**ch lives here, she’ll lay down with anyone, especially if he’s married, established, and as old as her father. keep clear

From the site’s “About” page:

RottenNeighbor.com is here to help. It’s the first real estate search engine of its kind, helping you find troublesome neighbors before you sign the paperwork on your new house, condo or apartment. RottenNeighbor is the largest site anywhere in the world covering the neighbor space, and we’re certain you’ll agree that the value it offers is unmatched anywhere else!

Use RottenNeighbor to:

* Get access to detailed maps of states, counties, cities and neighborhoods, all searchable by zip code

* Find important neighborhood information by searching the site’s user-provided data

* Help others by uploading your own good or rotten neighbors to our database
