IBM has launched a framework that gives Watson a physical shell and lets developers use it in various ways. IBM's AI can be embedded into all kinds of things, including spaces, assembly lines, robots, connected cars, wearables, walls, digital avatars, and everyday objects. The framework supports a wide variety of sensors. Through these bodies, Watson can see, hear and even smell. There is support for infrared and sonar, and Watson can detect vibrations or track changes in temperature.
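Project Intu's core framework is written in C++, so a short sketch in that language is a reasonable way to picture how a sensor might feed readings into the system. Everything below (the Reading struct, the SensorBase class, the publish callback) is a hypothetical stand-in for illustration, not the actual Intu SDK interface:

```cpp
// Minimal sketch of a sensor feeding readings into an Intu-style framework.
// All names here (Reading, SensorBase, publish) are hypothetical stand-ins.
#include <functional>
#include <iostream>
#include <string>

// A reading tagged with the sensor type it came from.
struct Reading {
    std::string sensorType;  // e.g. "infrared", "sonar", "vibration"
    double value;            // raw measurement
};

// Base class a custom sensor would derive from; the framework would poll it
// and forward readings to whatever cognitive pipeline is subscribed.
class SensorBase {
public:
    explicit SensorBase(std::string type) : m_type(std::move(type)) {}
    virtual ~SensorBase() = default;
    virtual double sample() = 0;  // read the hardware

    // Push one reading to a subscriber (the framework, in a real system).
    void publish(const std::function<void(const Reading&)>& sink) {
        sink({m_type, sample()});
    }

private:
    std::string m_type;
};

// Example: a temperature sensor that would wrap a real driver.
class TemperatureSensor : public SensorBase {
public:
    TemperatureSensor() : SensorBase("temperature") {}
    double sample() override { return 21.5; }  // stub; real code reads hardware
};

int main() {
    TemperatureSensor temp;
    // The "framework" here is just a lambda that logs the reading.
    temp.publish([](const Reading& r) {
        std::cout << r.sensorType << " = " << r.value << "\n";
    });
}
```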
These sensors form the “input” that a physical body feeds to the artificial intelligence, which can be configured to process the information in various ways. The output can take the form of gestures, actuator movements, voice, scent emitters, light sequences, navigation or other purpose-built capabilities. In effect, Watson becomes embodied. The implementation uses a Unity 3D application, and extends the capabilities of the application into the real world.
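To make that input-to-output flow concrete, here is a minimal sketch of how an interpreted intent could be routed to a physical output channel. The intent names and the Actuator alias are illustrative assumptions, not part of Project Intu:

```cpp
// Sketch of the input -> interpret -> output flow described above.
// Intent names and the Actuator alias are illustrative only.
#include <functional>
#include <iostream>
#include <map>
#include <string>

// One output channel: a gesture, a voice line, a light sequence, etc.
using Actuator = std::function<void()>;

int main() {
    // Map interpreted intents to physical outputs.
    std::map<std::string, Actuator> outputs = {
        {"greet",    [] { std::cout << "wave arm + say hello\n"; }},
        {"alert",    [] { std::cout << "flash red light sequence\n"; }},
        {"navigate", [] { std::cout << "drive to waypoint\n"; }},
    };

    // Pretend the cognitive layer classified a sensor event as "alert"
    // (e.g. a sudden vibration spike).
    std::string intent = "alert";
    if (auto it = outputs.find(intent); it != outputs.end()) {
        it->second();
    }
}
```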
The whole initiative is called “Project Intu”. A wide range of platforms is supported, including Raspberry Pi, Windows, Mac and Linux. The framework allows cognitive capabilities such as conversation, translation, image captioning, search and speech-to-text to be integrated into meatspace things. For those who want to take a deep dive, Project Intu is available on GitHub, and developers will need to use the Intu Gateway.
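The cognitive capabilities themselves are ordinary Watson services reachable over REST. As a hedged illustration, the libcurl snippet below posts a WAV file to the Bluemix-era Watson Speech to Text endpoint; the URL and username/password basic auth reflect the service roughly as it stood at Intu's release and have since changed, and the file name and credentials are placeholders:

```cpp
// Hedged sketch: posting a WAV file to the Watson Speech to Text REST API
// via libcurl. Endpoint and basic-auth scheme are the Bluemix-era ones;
// "utterance.wav" and "username:password" are placeholders.
#include <curl/curl.h>
#include <cstdio>

int main() {
    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    FILE* audio = fopen("utterance.wav", "rb");  // placeholder sample file
    if (!audio) { curl_easy_cleanup(curl); return 1; }
    fseek(audio, 0, SEEK_END);
    long size = ftell(audio);
    rewind(audio);

    struct curl_slist* headers =
        curl_slist_append(nullptr, "Content-Type: audio/wav");

    curl_easy_setopt(curl, CURLOPT_URL,
        "https://stream.watsonplatform.net/speech-to-text/api/v1/recognize");
    curl_easy_setopt(curl, CURLOPT_USERPWD, "username:password");  // service creds
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POST, 1L);
    curl_easy_setopt(curl, CURLOPT_READDATA, audio);        // read body from file
    curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, size);

    // The default write callback prints the JSON transcript to stdout.
    CURLcode res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        fprintf(stderr, "request failed: %s\n", curl_easy_strerror(res));

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    fclose(audio);
    return res == CURLE_OK ? 0 : 1;
}
```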
