Friday, April 4, 2025

Ontogenetics and Olfaction


To help AIs understand the world, researchers put them in a robot
Feb 2025, Ars Technica

“The inspiration for our model came from developmental psychology. We tried to emulate how infants learn and develop language.”

Researchers also tried teaching an AI using a video feed from a GoPro strapped to a human baby. The problem is that babies do far more than associate items with words when they learn: they touch everything, grasp things, manipulate them, throw stuff around, and in this way they learn to think and plan their actions in language. An abstract AI model couldn't do any of that, so Vijayaraghavan's team gave one an embodied experience: their AI was trained in an actual robot that could interact with the world.

(The writeup for this article by Jacek Krywko for Ars Technica is very good.)
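To make that concrete, here's a toy sketch in PyTorch of what "grounding language in a robot body" might look like. This is not the authors' code or architecture (the names and dimensions below are made up); it just shows the core move: camera frames, joint angles, and word tokens all feed one shared recurrent state, with heads that predict the next motor command and the next word.

```python
# A toy sketch, not the paper's architecture: three input streams
# (vision, proprioception, language) fused into one recurrent state,
# so word meanings get tied to what the robot sees and does.
import torch
import torch.nn as nn

class EmbodiedGrounding(nn.Module):
    def __init__(self, vocab_size=32, joint_dim=7, hidden=128):
        super().__init__()
        # Vision: compress an RGB frame into a feature vector.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, hidden),
        )
        self.proprio = nn.Linear(joint_dim, hidden)    # joint angles
        self.embed = nn.Embedding(vocab_size, hidden)  # word tokens
        # One shared state binds the three streams over time.
        self.rnn = nn.GRU(hidden * 3, hidden, batch_first=True)
        self.motor_head = nn.Linear(hidden, joint_dim)   # next action
        self.word_head = nn.Linear(hidden, vocab_size)   # next word

    def forward(self, frames, joints, words):
        B, T = words.shape
        v = self.vision(frames.flatten(0, 1)).view(B, T, -1)
        h, _ = self.rnn(torch.cat(
            [v, self.proprio(joints), self.embed(words)], dim=-1))
        return self.motor_head(h), self.word_head(h)

model = EmbodiedGrounding()
frames = torch.rand(2, 5, 3, 64, 64)   # 5-step camera sequences
joints = torch.rand(2, 5, 7)           # joint angles per step
words = torch.randint(0, 32, (2, 5))   # instruction token ids
motors, logits = model(frames, joints, words)
print(motors.shape, logits.shape)      # (2, 5, 7) (2, 5, 32)
```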

This is the idea. But go beyond just an RGB camera: add proprioception, and the emotions that go along with it, and we'll have artificial smelling entities.
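And a purely speculative extension in the direction this post is pointing: bolt a chemosensor stream and an affect signal onto the same sketch. Everything here (the e-nose vector, the valence scalar) is my own hypothetical naming, not anything from the paper.

```python
# Purely speculative: extend the sketch above with a chemosensor
# ("e-nose") stream and a scalar affect/valence signal. These inputs
# and names are invented for illustration, not part of the published model.
class OlfactoryGrounding(EmbodiedGrounding):
    def __init__(self, smell_dim=64, **kw):
        super().__init__(**kw)
        hidden = self.proprio.out_features
        self.smell = nn.Linear(smell_dim, hidden)  # e-nose readings
        self.affect = nn.Linear(1, hidden)         # valence per step
        # Widen the recurrent input for the two new streams.
        self.rnn = nn.GRU(hidden * 5, hidden, batch_first=True)

    def forward(self, frames, joints, words, smells, valence):
        B, T = words.shape
        v = self.vision(frames.flatten(0, 1)).view(B, T, -1)
        streams = [v, self.proprio(joints), self.embed(words),
                   self.smell(smells), self.affect(valence)]
        h, _ = self.rnn(torch.cat(streams, dim=-1))
        return self.motor_head(h), self.word_head(h)
```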

via Okinawa Institute of Science and Technology: Science Robotics, 2025. DOI: 10.1126/scirobotics.adp0751

