Project specification

Huggable Robot

Huggable is a robotic companion capable of active relational and affective touch-based interactions with a person.

MIT

Specifications

Degrees of freedom (DOF), total: 12
    Head: 3
    Shoulder (each): 2
    Elbow (each): 1
    Waist: 1
    Muzzle: 1
    Ear (each): 1
Actuators: 12
Capacitive touch sensors: 12
Computation unit: 1

Overview

The robot is designed to function both as a fully autonomous robot and as a semi-autonomous robot avatar. In the semi-autonomous case, the Huggable robot is remotely controlled by a human operator.

In the head of the robot are a variety of microphones, two cameras in the eyes, and a speaker in the mouth. In the body, the robot features an inertial measurement unit, passive potentiometers in the hips and ankles for joint angle position detection, and an embedded PC with wireless networking.

The robot features a high number of somatic sensors (electric field, temperature, and force) over the entire surface of the robot, underneath a soft silicone skin and fur fabric covering. 

The Huggable consists of a series of body regions: the arms, the legs, the head, and the body. The body contains an embedded PC, somatic sensory processing circuit boards, batteries, and motor driver circuit boards.

The neck and shoulder mechanisms allow for active touch behaviors such as orienting towards touch, nuzzling, and hugging. The eyebrow and ear mechanisms are used for expression of internal state. Additional degrees of freedom may be added in the future such as body posture or other facial degrees of freedom.

Expressivity

The Huggable platform has twelve degrees of freedom (DOFs) for animate and expressive motion: three for the head, two for each shoulder, one for each elbow, one for the waist, one for the muzzle, and one for each ear. The head can rotate, nod, and tilt. The arms can rotate and lift at the shoulders and bend at the elbows. The ears and the waist can move forward and backward. The muzzle moves up and down when the robot talks.

Huggable is capable of manual look-at and point-at behaviors. Given an (x, y) coordinate command over wireless communication, the robot can look at or point at a specific location.
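As a rough illustration, such a request could be packaged as a small datagram like the sketch below. The message fields, address, and function names are assumptions made for illustration, not the Huggable's published teleoperation interface.

# Illustrative sketch only: the message format, robot address, and function
# names are assumptions, not the Huggable's actual protocol.
import json
import socket

ROBOT_ADDR = ("192.168.1.50", 9000)  # hypothetical robot IP and port

def send_gaze_command(behavior: str, x: float, y: float) -> None:
    """Send a look-at or point-at command as a small JSON datagram."""
    if behavior not in ("look_at", "point_at"):
        raise ValueError("behavior must be 'look_at' or 'point_at'")
    msg = json.dumps({"behavior": behavior, "x": x, "y": y}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, ROBOT_ADDR)

# Example: ask the robot to look at coordinate (160, 120).
send_gaze_command("look_at", 160, 120)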

Computation

Huggable uses an Android smartphone for its computation. The phone's internal sensors include a microphone, a camera, and an accelerometer. The phone's wireless communication is used to stream data from these internal sensors to a monitoring device or a teleoperation interface.
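A minimal sketch of what such a sensor stream could look like is shown below, assuming a newline-delimited JSON link over TCP. The endpoint, sample rate, and field names are hypothetical, and the placeholder accelerometer function stands in for the phone's real sensor API.

# Minimal sketch of streaming the phone's internal sensor readings to a
# monitoring/teleoperation interface. Endpoint, rate, and field names are
# assumptions for illustration; the real system runs on the Android phone.
import json
import socket
import time

MONITOR_ADDR = ("192.168.1.10", 9001)  # hypothetical monitoring station

def read_accelerometer() -> tuple[float, float, float]:
    """Placeholder for the phone's accelerometer reading."""
    return (0.0, 0.0, 9.81)

def stream_sensors(rate_hz: float = 10.0) -> None:
    """Send one JSON sample per line at roughly rate_hz samples per second."""
    with socket.create_connection(MONITOR_ADDR) as sock:
        while True:
            ax, ay, az = read_accelerometer()
            sample = {"t": time.time(), "accel": [ax, ay, az]}
            sock.sendall((json.dumps(sample) + "\n").encode("utf-8"))
            time.sleep(1.0 / rate_hz)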

Sensors

The robot is equipped with twelve capacitive touch sensors distributed over its body (front head, back head, right ear, left ear, left arm inside, left arm outside, right arm inside, right arm outside, right side, left side, left leg, and right leg) and two pressure sensors on its paws. The capacitive sensors provide Boolean on/off outputs, and the pressure sensors output an analog signal indicating how strongly a child is holding the robot's paws. The haptic sensor data are communicated via an IOIO board to the Android smartphone, where they are used either for the robot to react intelligently to touch or to be displayed for a remote operator controlling the robot from a distance.
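The sketch below shows one way a single haptic sample could be represented on the phone side. The region names follow the list above; the data structure, paw field names, and squeeze threshold are illustrative assumptions rather than the system's actual representation.

# Illustrative packaging of one haptic sample: twelve Boolean capacitive
# touch readings plus two analog paw pressure values. The dataclass and
# threshold are assumptions; the real readings arrive via the IOIO board.
from dataclasses import dataclass

CAPACITIVE_REGIONS = [
    "front_head", "back_head", "right_ear", "left_ear",
    "left_arm_inside", "left_arm_outside", "right_arm_inside", "right_arm_outside",
    "right_side", "left_side", "left_leg", "right_leg",
]

@dataclass
class HapticSample:
    touched: dict[str, bool]        # capacitive on/off state per region
    paw_pressure: dict[str, float]  # analog readings for the two paws

    def strong_squeeze(self, threshold: float = 0.8) -> bool:
        """Hypothetical check for a firm squeeze of either paw."""
        return any(p > threshold for p in self.paw_pressure.values())

# Example: the left arm is touched and the left paw is squeezed firmly.
sample = HapticSample(
    touched={r: r == "left_arm_inside" for r in CAPACITIVE_REGIONS},
    paw_pressure={"left_paw": 0.9, "right_paw": 0.1},
)
print(sample.strong_squeeze())  # True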

References

Describes the design of the Huggable robot and the design decisions made during development. Includes results from early pilot user studies that show the effects of those choices, along with insights from real-world deployment for future work.

S. Jeong, K. Dos Santos, S. Graca, et al. Published in IDC '15, 2015.

Master's thesis that describes the hardware and software system in detail, followed by a description of the experimental study design comparing the impact of three different interventions.

S. Jeong. Thesis, 2014.

Outlines the development of the Huggable robot as a semi-autonomous robot avatar for remote interaction such as family communication and education. Six important elements are highlighted that allow the robot to function as a richly embodied communication channel.

J. Ki Lee, et al. Published in Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, August 2008.

Describes how the semi-autonomous robot avatar version of the Huggable can be used as a research tool to help determine how robotic companions for eldercare applications should be designed. Four research scenarios are presented in which the Huggable can be used.

W. Dan Stiehl, J. Ki Lee, R. Toscano, et al. Published in AAAI Fall Symposium: AI in Eldercare, 2008.

Focuses on the redesign of the head of the Huggable robot, which poses numerous design challenges, such as silent and back-drivable transmissions, in order to mimic the compliance of real creatures.

N. Akraboff. 2008.

Bachelor's thesis describing the specific design challenges of making the robot emotionally pleasant to interact with, bringing value to care providers, and creating a computationally flexible platform that allows researchers to explore other applications for the Huggable robot.

L. Lalla. Thesis, 2008.

Describes the novel aspects of the Huggable's design, including the sensitive skin for affective interaction and voice coil actuators, and presents results from classifying different kinds of social tactile interaction.

W. Dan Stiehl, J. Lieberman, C. Breazeal, et al. Published in IEEE International Workshop on Robot and Human Interactive Communication, August 2005.

The website gives a detailed overview of the project and its motivation. Describes the interface and lists the contributors to the project.

robotic.media.mit.edu/portfolio/huggabl

Tags

haptic feedback, healthcare, rehabilitation
