In her book The Blind Assassin, Canadian author Margaret Atwood writes that “touch comes before sight, before speech. It’s the first language and the last, and it always tells the truth.”
While our sense of touch gives us a channel to feel the physical world, our eyes help us immediately understand the full picture of these tactile signals.
Robots that have been programmed to see or feel can’t use these signals nearly as interchangeably. To bridge this sensory gap, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a predictive artificial intelligence (AI) that can learn to see by touching, and learn to feel by seeing.