Google created AI that just needs a few snapshots to make 3D models of its surroundings

The algorithm needs only a few perspectives to figure out what objects look like.

Google’s new artificial intelligence algorithm can figure out what objects look like from any angle, without ever having observed them from that angle.

After viewing a scene from just a few different perspectives, the Generative Query Network was able to piece together an object’s appearance, including how it would look from angles the algorithm had never observed, according to research published today in Science. And it did so without any human supervision or hand-labeled training data. That could save engineers a lot of time as they prepare increasingly advanced algorithms, and it could also extend the reach of machine learning, giving robots (military or otherwise) greater awareness of their surroundings.
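To make the observe-then-query idea concrete, here is a minimal, purely illustrative Python sketch: a handful of (image, viewpoint) pairs are encoded and summed into a single scene representation, which a generative renderer can then query from a camera pose the model never saw. The function names, the placeholder bodies, and the array shapes are assumptions for illustration only, not the paper’s actual architecture.

```python
import numpy as np

# Illustrative sketch of a GQN-style observe-then-query loop.
# encode_observation and render_from_viewpoint stand in for learned
# neural networks; their real architectures are not shown here.

def encode_observation(image, viewpoint):
    """Map one (image, camera pose) pair to a fixed-size scene code."""
    # Placeholder: a learned convolutional encoder would go here.
    return np.concatenate([image.mean(axis=(0, 1)), viewpoint])

def render_from_viewpoint(scene_code, query_viewpoint):
    """Predict what the scene looks like from an unseen camera pose."""
    # Placeholder: a learned generative renderer would go here.
    return np.zeros((64, 64, 3))

# A few snapshots of the same scene, each paired with its camera pose ...
observations = [(np.random.rand(64, 64, 3), np.random.rand(7)) for _ in range(3)]

# ... are encoded and aggregated into one scene representation ...
scene_code = sum(encode_observation(img, vp) for img, vp in observations)

# ... which can then be queried from a viewpoint the model never observed.
novel_view = render_from_viewpoint(scene_code, np.random.rand(7))
```

The key design point the paper highlights is that both components are trained together from raw observations alone, so no human has to label what the scene contains.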
