
A Sidewalk Disappearing Act

This story is from the category Display Technology



Date posted: 15/08/2010

In 2008, responding to privacy concerns, Google started to blur the faces of people caught by car-mounted cameras and shown in its Google Street View mapping service. Researchers in California believe they have now come up with a better solution--software that automatically removes any trace that a person was in a scene.

The approach protects people's privacy while also providing a cleaner street-level view, says Arturo Flores, a computer-science graduate student in the Artificial Intelligence Group at the University of California, San Diego. "Even with face blurring, it is still possible to identify a person," Flores says. Clothing, body shape, and height, combined with a location, can be enough to recognize someone, he says.

Google's Street View vans use nine roof-mounted cameras to take regular shots of the scene around them. These are then stitched together to produce a near-seamless panoramic view. But automatically removing people from thousands of varied images, each showing different scenes, is a challenge.

Flores's software first has to detect any pedestrians in a scene. This is done using a standard object-recognition algorithm called implicit shape model (ISM), which was developed at the Swiss Federal Institute of Technology. "The idea is to find a rough contour of pedestrians," says Bastian Leibe, a codeveloper of ISM who is now at RWTH Aachen University. Because there is so much variability in human appearance, the algorithm takes a probabilistic approach--looking for similarities between the shapes in images and hundreds of images of pedestrians that it has been trained to recognize.
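At its core, ISM relies on a learned codebook: local appearance features each cast weighted votes for where a pedestrian's center would be, and a strong cluster of votes signals a detection. The sketch below shows only that voting idea in miniature, not the real ISM pipeline; the codebook entries, feature names, and weights are invented for illustration.

```python
from collections import defaultdict

def vote_for_centers(detections, codebook, bin_size=10):
    """Accumulate Hough-style votes for an object center.

    detections: list of (feature_id, x, y) local features found in the image.
    codebook:   maps feature_id -> list of (dx, dy, weight) center offsets
                learned from training images (toy values here).
    Returns the (x, y) bin with the highest accumulated vote weight.
    """
    votes = defaultdict(float)
    for feat, x, y in detections:
        for dx, dy, weight in codebook.get(feat, []):
            # Each feature votes for the object center it implies.
            cx, cy = (x + dx) // bin_size, (y + dy) // bin_size
            votes[(cx, cy)] += weight
    return max(votes, key=votes.get) if votes else None

# Toy codebook: a "head" feature implies the body center ~40px below it,
# a "foot" feature implies it ~40px above.
codebook = {"head": [(0, 40, 1.0)], "foot": [(0, -40, 1.0)]}
detections = [("head", 50, 10), ("foot", 52, 92)]
print(vote_for_centers(detections, codebook))  # both votes land in bin (5, 5)
```

In the full algorithm, consistent votes from many independent patches are what make the detector robust to the variability in clothing and pose that the probabilistic approach is designed to handle.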

Once a pedestrian has been identified and cut from an image, the hole left behind has to be filled in. Flores's software does this by using photographs captured before and after the image in question by Google's Street View vans. These images show a view of the background from slightly different angles--the algorithm can reorient the background and stitch it into the space left behind by the missing pedestrian.
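The fill-in step can be sketched as: align a neighboring frame to the current one, then copy its pixels into the masked region. This is a deliberately minimal NumPy version; it stands in for the real reorientation with a simple horizontal translation and assumes the alignment offset is already known.

```python
import numpy as np

def fill_from_neighbor(frame, mask, neighbor, shift):
    """Fill masked (pedestrian) pixels in `frame` from a neighboring shot.

    frame, neighbor: HxW image arrays taken moments apart by the moving van.
    mask:  boolean HxW array, True where the pedestrian was cut out.
    shift: horizontal pixel offset that aligns `neighbor` to `frame`
           (a stand-in for the real perspective reorientation).
    """
    aligned = np.roll(neighbor, shift, axis=1)  # crude alignment by translation
    result = frame.copy()
    result[mask] = aligned[mask]  # paste background into the hole
    return result

# Tiny example: a one-pixel "pedestrian" (value 9) occludes a gradient
# background [0, 1, 2, 3]. The van moved one pixel, so the next shot
# shows the scene shifted: [1, 2, 3, 4].
frame = np.array([[0, 1, 9, 3]])
mask = frame == 9
neighbor = np.array([[1, 2, 3, 4]])
filled = fill_from_neighbor(frame, mask, neighbor, shift=1)
print(filled)  # -> [[0 1 2 3]]
```

The artifacts Flores describes below follow naturally from this scheme: if the detector misses part of a pedestrian, or no neighboring frame shows the occluded background, there is nothing correct to paste in.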

Flores and his advisor, Serge Belongie, recently presented the work at the IEEE International Workshop on Mobile Vision in Chicago. Flores says some real-world images are simply too unusual for the software to process properly. Odd artifacts have been left behind in some shots, such as dogs on a leash without their owner, or pairs of shoes apparently abandoned on the sidewalk.

The system also struggles to generate a background when a pedestrian is walking in the same direction as the Google van, says Flores. "It just isn't possible to get an unobstructed view of the background," he says.

See the full Story via external site: www.technologyreview.com
