MIT’s new AI for robots can ‘feel’ an object just by seeing it


Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed an AI that can “feel” objects just by seeing them, and vice versa: it can predict how an object would feel to touch just by looking at it, and it can also generate a visual representation of an object purely from the tactile data gathered by touching it. Yunzhu Li, a CSAIL PhD student and lead author of the paper describing the system, said the model can help robots handle real-world objects better: By looking at the scene, our model can imagine the…
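The excerpt doesn’t detail how the model works, but the core idea is cross-modal prediction: learning paired mappings between a visual input and a tactile reading. Below is a minimal, hypothetical sketch of that idea in PyTorch. It is not the CSAIL team’s actual method; the toy encoder-decoder architecture, tensor shapes, and plain MSE reconstruction loss are all illustrative assumptions.

```python
# Toy sketch of cross-modal prediction (illustrative only, not the paper's
# architecture): two small convolutional encoder-decoders, one mapping an
# RGB image to a predicted tactile map ("feel by seeing") and one mapping
# a tactile map back to an image ("see by feeling").
import torch
import torch.nn as nn

def encoder_decoder(in_ch: int, out_ch: int) -> nn.Module:
    """Downsample to a latent feature map, then upsample to the target modality."""
    return nn.Sequential(
        nn.Conv2d(in_ch, 32, kernel_size=4, stride=2, padding=1),   # 64 -> 32
        nn.ReLU(),
        nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),      # 32 -> 16
        nn.ReLU(),
        nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),      # 16 -> 32
        nn.ReLU(),
        nn.ConvTranspose2d(32, out_ch, kernel_size=4, stride=2, padding=1),  # 32 -> 64
        nn.Sigmoid(),
    )

# Vision -> touch: 3-channel image in, 1-channel tactile pressure map out.
vision_to_touch = encoder_decoder(in_ch=3, out_ch=1)
# Touch -> vision: the reverse direction.
touch_to_vision = encoder_decoder(in_ch=1, out_ch=3)

# Random tensors standing in for a batch of paired (image, tactile) examples.
image = torch.rand(8, 3, 64, 64)
touch = torch.rand(8, 1, 64, 64)

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    list(vision_to_touch.parameters()) + list(touch_to_vision.parameters()),
    lr=1e-3,
)

# One training step: each network learns to reconstruct the other modality.
opt.zero_grad()
loss = loss_fn(vision_to_touch(image), touch) + loss_fn(touch_to_vision(touch), image)
loss.backward()
opt.step()
print(f"paired reconstruction loss: {loss.item():.4f}")
```

A real system would train on recorded image/tactile pairs rather than random tensors, and would typically use a richer generative objective than per-pixel MSE, which tends to produce blurry predictions.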

This story continues at The Next Web

