 New Augmented Reality Position Tech: Beginning to blend?

This story is from the category Display Technology

Date posted: 03/02/2005

Live TV outside broadcasts that combine real action and computer-generated images could become possible for the first time, thanks to camera navigation technology now under development.

Real-Time Camera Localisation in Real Environments is a three-year project by Dr Ian Reid and Dr Andrew Davison of Oxford University's Department of Engineering Science. Funded by the Engineering and Physical Sciences Research Council (EPSRC), the project has dedicated nearly £255,000 to the task of combining live signal feeds with CGI to create augmented reality images.

The work is opening up the prospect of outdoor sporting, musical or other TV coverage that blends the excitement of being live with the spectacular visual impact that computer graphics can create. It can also be applied to consumer reality augmentation: for example, AR games in which creatures clamber over the furniture in your room, yet never sink into it, no matter where you turn the camera.

The system works out in real time where a camera is and how it is moving, while simultaneously constructing a detailed visual map of its surroundings. This enables computer graphics to be overlaid accurately onto live pictures as soon as they are produced. Previously, blending live action with computer-generated images had only been possible in controlled studio environments.
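The overlay step can be illustrated with a short, hypothetical sketch: once the camera's pose has been estimated, a virtual 3D point can be projected into the live frame using standard pinhole-camera geometry, so that graphics stay locked to the real scene. The matrices and coordinates below are invented for illustration and are not taken from the Oxford system.

```python
# Minimal sketch (not the Oxford team's code): given an estimated camera
# pose and intrinsics, project a virtual 3D point into the live frame so
# overlaid graphics stay registered with the real scene.
import numpy as np

def project_point(point_world, R, t, K):
    """Project a 3D world point into pixel coordinates.

    point_world : (3,) point in world coordinates (metres)
    R, t        : estimated camera rotation (3x3) and translation (3,)
    K           : 3x3 camera intrinsic matrix
    """
    p_cam = R @ point_world + t          # world -> camera coordinates
    if p_cam[2] <= 0:                    # behind the camera, not visible
        return None
    p_img = K @ (p_cam / p_cam[2])       # perspective division + intrinsics
    return p_img[:2]                     # pixel (u, v)

# Illustrative values only: a 640x480 camera looking down the world z-axis.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                            # identity rotation
t = np.zeros(3)                          # camera at the world origin
virtual_creature = np.array([0.2, 0.0, 2.0])      # 2 m in front of the lens
print(project_point(virtual_creature, R, t, K))   # -> roughly [370., 240.]
```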

To perform this localisation, a video camera is first swept over a scene, as demonstrated with a room featuring wall protrusions, lots of floor clutter and plenty of pictures on the walls to provide texture distraction.

The camera is connected directly to a small source of processing power - a laptop computer. This analyses the images it receives using bespoke software developed by the researchers.
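As a rough illustration of such a capture-and-analyse loop (the researchers' bespoke software is not public), the sketch below grabs frames from an attached camera and detects corner-like features in each one using OpenCV; ordinary feature detection stands in here for the project's per-frame localisation and map update.

```python
# Minimal sketch of a capture-and-analyse loop, assuming OpenCV is installed.
# This is a stand-in, not the researchers' bespoke localisation software.
import cv2

cap = cv2.VideoCapture(0)                # laptop-attached camera
orb = cv2.ORB_create(nfeatures=500)      # fast corner-like feature detector

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints = orb.detect(gray, None)   # candidate landmarks in this frame
    # A real localisation system would match these against its map and
    # update the camera pose estimate here, in real time.
    vis = cv2.drawKeypoints(frame, keypoints, None, color=(0, 255, 0))
    cv2.imshow("tracked features", vis)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```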

Dr Andrew Davison was quoted as saying: "This localisation and mapping technology turns a camera into a flexible, real-time position sensor. It has all kinds of potential applications."

Indeed, the system has many potential uses, from TV actors seeing their computer-generated colleagues standing next to them, to AR gaming and everyday life viewed through augmented reality.

Robotic vision systems may also benefit from this technology, as it shows them where they can and cannot navigate within a 3D area.
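As a purely illustrative sketch of that idea (not code from the project), the snippet below treats the mapped landmarks as obstacles and checks whether a candidate robot position keeps a safe clearance from all of them; the coordinates and the clearance value are assumptions.

```python
# Minimal sketch, not from the project: once a visual map exists, a robot
# can test whether a candidate position is clear of mapped obstacles.
import numpy as np

mapped_landmarks = np.array([            # 3D points recovered by the mapper
    [1.0, 0.0, 2.0],                     # values made up for illustration
    [1.2, 0.1, 2.1],
    [-0.5, 0.0, 3.0],
])

def is_navigable(position, landmarks, clearance=0.5):
    """Return True if `position` keeps at least `clearance` metres
    from every mapped landmark."""
    distances = np.linalg.norm(landmarks - position, axis=1)
    return bool(np.all(distances > clearance))

print(is_navigable(np.array([0.0, 0.0, 1.0]), mapped_landmarks))  # True
print(is_navigable(np.array([1.1, 0.0, 2.0]), mapped_landmarks))  # False
```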




See the full Story via external site: www.eurekalert.org


