IBM has launched a framework that gives Watson a physical shell and lets developers put it to use in various ways. Developers can integrate IBM's AI into all kinds of things, including spaces, assembly lines, robots, connected cars, wearables, walls, digital avatars, and objects. The framework has support for a wide variety of sensors. Through these bodies, Watson can see, hear and even smell. There is support for infrared and sonar, and Watson can detect vibrations or track changes in temperature.
These sensors form the “input” through a physical body for the artificial intelligence, which can be configured to process the information in various ways. The output can be through gestures, actuators, voice, scent emitters, light sequences, navigation or other purpose-built capabilities. This allows Watson to be embodied. The implementation uses a Unity 3D application and extends its capabilities into the real world.
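To make that sense-and-act loop concrete, here is a minimal Python sketch of how a reading from one sensor might be handed to a cognitive service and turned into actuator output. Every class and function name here is a hypothetical placeholder for illustration; none of them come from Project Intu's actual API.

```python
# Hypothetical sketch of the sense -> think -> act loop described above.
# All names are placeholders, not part of Project Intu or the Watson SDKs.

import time


class TemperatureSensor:
    """Hypothetical sensor wrapper; a real body would read from hardware."""

    def read(self) -> float:
        return 22.5  # placeholder reading in degrees Celsius


class SpeechOutput:
    """Hypothetical actuator wrapper; a real body might drive a speaker or a gesture."""

    def say(self, text: str) -> None:
        print(f"[speaker] {text}")


def interpret(reading: float) -> str:
    """Stand-in for a call to a cognitive service that turns raw sensor data
    into a decision or utterance."""
    if reading > 30.0:
        return "It is getting hot in here."
    return "Temperature is comfortable."


def run_embodiment_loop(sensor: TemperatureSensor, output: SpeechOutput) -> None:
    # Poll the sensor, hand the reading to the "brain", and act on the result.
    for _ in range(3):
        reading = sensor.read()
        response = interpret(reading)
        output.say(response)
        time.sleep(1.0)


if __name__ == "__main__":
    run_embodiment_loop(TemperatureSensor(), SpeechOutput())
```

In a real deployment, `interpret` would call out to a hosted Watson service rather than apply a fixed threshold, and the loop would run continuously instead of three times.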
The whole initiative is called “Project Intu”. A wide range of platforms is supported, including Raspberry Pi, Windows, macOS and Linux. The framework allows for integration of cognitive capabilities such as conversations, translations, image captioning, searches, or speech-to-text into meat-space things. For those who want to take a deep dive, Project Intu is available on GitHub, and developers will need to use the Intu Gateway.
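For the cloud-backed capabilities, an integration usually amounts to sending captured data to a hosted service over REST and acting on the response. The sketch below assumes a placeholder endpoint, credential, and response format rather than the actual Watson API; a real integration would go through the Intu Gateway and the documented service interfaces or SDKs.

```python
# Minimal sketch of calling a cloud-hosted cognitive service from a device.
# The endpoint URL, API key, and response field are placeholders, not the real
# Watson API.

import requests

TRANSLATE_ENDPOINT = "https://example.com/translate"  # placeholder URL
API_KEY = "your-api-key-here"                          # placeholder credential


def translate(text: str, target_language: str = "es") -> str:
    """Send text to a (hypothetical) translation endpoint and return the result."""
    response = requests.post(
        TRANSLATE_ENDPOINT,
        json={"text": text, "target": target_language},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["translation"]  # placeholder response field


if __name__ == "__main__":
    print(translate("Hello from an embodied Watson device"))
```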