Soon you might be able to watch a combination of live action and computer-generated imagery in live TV broadcasts, thanks to a camera-navigation technology now under development.

Harnessing techniques from mathematics, computing and engineering, the new system is being developed at Oxford University with funding from the Engineering and Physical Sciences Research Council (EPSRC).

The technique is expected to open up the prospect of outdoor sporting, musical or other TV coverage that blends the excitement of being live with the spectacular visual impact that computer graphics can create. It could also be applied at the consumer level, for example visualising interior design ideas by adding virtual furniture to the view of a room from a hand-held camera as it moves.

The system, being developed under the supervision of Dr Ian Reid and Dr Andrew Davison of Oxford University’s Department of Engineering Science, works out in real time where a camera is and how it is moving, while simultaneously constructing a detailed visual map of its surroundings. This enables computer graphics to be overlaid accurately onto live pictures as soon as they are produced. Previously, blending live action with computer-generated images was possible only in controlled studio environments.

“This localisation and mapping technology turns a camera into a flexible, real-time position sensor. It has all kinds of potential applications,” Andrew said.

The system comprises a mobile video camera connected to a laptop computer, which analyses the incoming images using software developed by the researchers. As the camera moves, the system picks out visual landmarks as reference points and builds a map of their 3D locations, against which it measures its own position.
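The underlying idea of fixing a camera's position against mapped landmarks can be illustrated with a deliberately simplified sketch. The Oxford system is a real-time monocular SLAM pipeline, which is far more involved; the toy example below merely shows the localisation step in 2D, recovering a position from distances to three already-mapped landmarks by trilateration. All names and numbers here are illustrative assumptions, not part of the researchers' software.

```python
import math

def locate(landmarks, distances):
    """Estimate a 2D position from distances to three known landmarks.

    Simple linearised trilateration: subtracting the first range
    equation from the other two yields a 2x2 linear system in (x, y).
    Illustration only -- not the actual Oxford SLAM algorithm.
    """
    (x1, y1), (x2, y2), (x3, y3) = landmarks
    d1, d2, d3 = distances
    # Coefficients of the linearised equations.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    # Solve the 2x2 system by Cramer's rule.
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Three landmarks the camera has already mapped, and measured
# distances to each (generated here from a known true position).
landmarks = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
true_pos = (1.0, 2.0)
distances = [math.dist(true_pos, lm) for lm in landmarks]
print(locate(landmarks, distances))  # recovers (1.0, 2.0)
```

In the real system the landmark map is itself being estimated at the same time as the camera pose, from bearings in a single moving image stream rather than known distances, which is what makes the problem hard and the real-time solution notable.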

Along with TV and video applications, the technology under development could provide low-cost, high-performance navigation for domestic robots. It could also be incorporated into video games or wearable computing, for example in dangerous environments, where it could confirm the wearer’s location and overlay relevant guidance onto their view of the surroundings. (ANI)