Huggable Robot
Huggable is a robotic companion capable of active relational and affective touch-based interactions with a person.
MIT
Specifications
Degrees of Freedom (DOF) Total | 12 |
Head | 3 |
Shoulder (each) | 2 |
Elbow (each) | 1 |
Waist | 1 |
Muzzle | 1 |
Ear (each) | 1 |
Actuators | 12 |
Capacitive touch sensors | 12 |
Computation unit | 1 |
Overview
The robot is designed to function both as a fully autonomous robot and as a semi-autonomous robot avatar. In the semi-autonomous case, the Huggable robot is remotely controlled by a human operator.
The head houses an array of microphones, two cameras in the eyes, and a speaker in the mouth. The body contains an inertial measurement unit, passive potentiometers in the hips and ankles for joint angle position detection, and an embedded PC with wireless networking.
The robot features a large number of somatic sensors (electric field, temperature, and force) distributed over its entire surface, underneath a soft silicone skin and a fur fabric covering.
The Huggable consists of a series of body regions: the arms, the legs, the head, and the body. The body contains an embedded PC, somatic sensory processing circuit boards, batteries, and motor driver circuit boards.
The neck and shoulder mechanisms allow for active touch behaviors such as orienting towards touch, nuzzling, and hugging. The eyebrow and ear mechanisms are used to express the robot's internal state. Additional degrees of freedom, such as body posture or further facial movements, may be added in the future.
Expressivity
The Huggable platform has twelve degrees of freedom (DOF) to perform animate and expressive motions: three for the head, two for each shoulder, one for each elbow, one for the waist, one for the muzzle, and one for each ear. The head can rotate, nod, and tilt. The arms can rotate and lift at the shoulders and can bend at the elbows. The ears and the waist can move forward and backward. The muzzle moves up and down when the robot talks.
Huggable is capable of manual look-at and point-at behaviors. Given an (x, y) coordinate command sent over the wireless link, the robot can look at or point at the corresponding location.
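As an illustration, the minimal sketch below (in Java) shows one way such a command could be mapped to head motion: a normalized image coordinate from the eye camera is converted into pan/tilt angle offsets. The field-of-view values and joint limits are assumptions for illustration, not values from the Huggable documentation.

```java
/**
 * Minimal look-at sketch: maps a normalized image coordinate (x, y in [0, 1],
 * origin at the top-left of the eye-camera frame) to incremental pan/tilt
 * angles that would center that point in the image. The FOV values and joint
 * limits below are illustrative assumptions, not Huggable specifications.
 */
public final class LookAtMapper {
    private static final double H_FOV_DEG = 60.0;  // assumed horizontal field of view
    private static final double V_FOV_DEG = 45.0;  // assumed vertical field of view
    private static final double PAN_MIN = -90, PAN_MAX = 90;   // assumed neck limits
    private static final double TILT_MIN = -30, TILT_MAX = 45; // assumed neck limits

    /** Returns {panDelta, tiltDelta} in degrees. */
    public static double[] toPanTilt(double x, double y) {
        double panDelta = (x - 0.5) * H_FOV_DEG;   // right of center -> pan right
        double tiltDelta = (0.5 - y) * V_FOV_DEG;  // above center -> tilt up
        return new double[] {
            clamp(panDelta, PAN_MIN, PAN_MAX),
            clamp(tiltDelta, TILT_MIN, TILT_MAX)
        };
    }

    private static double clamp(double v, double lo, double hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    public static void main(String[] args) {
        // A point slightly left of and above the image center.
        double[] d = toPanTilt(0.25, 0.30);
        System.out.printf("pan %+.1f deg, tilt %+.1f deg%n", d[0], d[1]);
    }
}
```

The same angle pair could drive a point-at gesture with the arm instead of the neck; only the joint limits would differ.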
Computation
Huggable uses an Android smartphone for its computation. The phone is equipped with internal sensors, including a microphone, a camera, and an accelerometer. The phone's wireless communication is used to stream data from these internal sensors to a monitoring device or a teleoperation interface.
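A minimal sketch of what that streaming could look like on the Android side is given below, using the platform's standard SensorManager API and UDP. The monitor host, port, and binary packet layout are assumptions for illustration; the Huggable's actual wire protocol is not documented here.

```java
// Streams the phone's accelerometer to a monitoring host over UDP.
// MONITOR_HOST, MONITOR_PORT, and the packet layout are assumed values.
// Requires android.permission.INTERNET in the manifest.
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class SensorStreamActivity extends Activity implements SensorEventListener {
    private static final String MONITOR_HOST = "192.168.1.10"; // assumed operator PC
    private static final int MONITOR_PORT = 9000;              // assumed port

    private SensorManager sensorManager;
    private HandlerThread ioThread;
    private DatagramSocket socket;
    private InetAddress monitorAddress;

    @Override protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        ioThread = new HandlerThread("sensor-io");
        ioThread.start();
    }

    @Override protected void onResume() {
        super.onResume();
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        // Deliver callbacks on the background thread so the UDP send below
        // never runs on the UI thread.
        sensorManager.registerListener(this, accel,
                SensorManager.SENSOR_DELAY_GAME, new Handler(ioThread.getLooper()));
    }

    @Override public void onSensorChanged(SensorEvent event) {
        try {
            if (socket == null) { // lazily opened on the background thread
                socket = new DatagramSocket();
                monitorAddress = InetAddress.getByName(MONITOR_HOST);
            }
            // Pack timestamp + x/y/z into a 20-byte datagram (assumed format).
            ByteBuffer buf = ByteBuffer.allocate(8 + 3 * 4);
            buf.putLong(event.timestamp);
            buf.putFloat(event.values[0]).putFloat(event.values[1]).putFloat(event.values[2]);
            socket.send(new DatagramPacket(buf.array(), buf.capacity(),
                    monitorAddress, MONITOR_PORT));
        } catch (Exception ignored) {
            // Drop the sample if the link hiccups; the stream is best-effort.
        }
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    @Override protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
        if (socket != null) { socket.close(); socket = null; }
    }

    @Override protected void onDestroy() {
        super.onDestroy();
        ioThread.quitSafely();
    }
}
```

The same pattern extends to the microphone and camera streams, which would typically use compressed transports rather than raw datagrams.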
Sensors
The robot is equipped with twelve capacitive touch sensors distributed over its body (front head, back head, right ear, left ear, left arm inside, left arm outside, right arm inside, right arm outside, right side, left side, left leg, and right leg) and two pressure sensors on its paws. The capacitive sensors provide Boolean on/off outputs, while the pressure sensors output an analog signal indicating how strongly a child is holding the robot's paws. The haptic sensor data are communicated via an IOIO board to the Android smartphone, where they are used either by the robot to react intelligently to touch or displayed to the remote operator controlling the robot from a distance.
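A hedged sketch of how the Android side could poll those channels through the IOIO library is shown below. The calls openDigitalInput, openAnalogInput, and read are the standard IOIO-for-Android API, while the pin numbers, sensor ordering, and polling rate are assumptions for illustration.

```java
// Polls the 12 capacitive (Boolean) channels and 2 paw pressure (analog)
// channels via the IOIO board. Pin assignments are assumed, not documented.
import ioio.lib.api.AnalogInput;
import ioio.lib.api.DigitalInput;
import ioio.lib.api.exception.ConnectionLostException;
import ioio.lib.util.BaseIOIOLooper;

public class TouchSensorLooper extends BaseIOIOLooper {
    private final DigitalInput[] touch = new DigitalInput[12]; // capacitive pads
    private final AnalogInput[] paw = new AnalogInput[2];      // pressure sensors

    @Override protected void setup() throws ConnectionLostException {
        for (int i = 0; i < touch.length; i++) {
            touch[i] = ioio_.openDigitalInput(1 + i); // assumed pins 1-12
        }
        paw[0] = ioio_.openAnalogInput(31);           // assumed left paw pin
        paw[1] = ioio_.openAnalogInput(32);           // assumed right paw pin
    }

    @Override public void loop() throws ConnectionLostException {
        try {
            boolean[] touched = new boolean[touch.length];
            for (int i = 0; i < touch.length; i++) {
                touched[i] = touch[i].read();         // on/off contact state
            }
            float leftGrip = paw[0].read();           // 0.0-1.0 grip strength
            float rightGrip = paw[1].read();
            // Hand the readings to the behavior system or the teleoperation UI.
            Thread.sleep(20);                         // ~50 Hz polling
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();       // shut down cleanly
        }
    }
}
```

In a complete app this looper would be returned from IOIOActivity.createIOIOLooper(), which manages the board connection lifecycle.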
References
Describes the design of the Huggable robot and the design decisions made during the development phase, including results from early pilot user studies that show the effects of those choices, and insights from real-world deployment that inform future work.
Master's thesis that describes the hardware and software systems in detail, followed by a description of an experimental study design comparing the impact of three different interventions.
Outlines the development of the Huggable robot as a semi-autonomous robot avatar for remote interactions such as family communication and education. Six important elements are highlighted that allow the robot to function as a richly embodied communication channel.
Describes how the semi-autonomous robot avatar version of the Huggable can be used as a research tool to help determine how robotic companions for eldercare applications should be designed. Four research scenarios are presented in which the Huggable can be used.
Focuses on the redesign of the head of the Huggable robot, which poses numerous design challenges, such as silent and back-drivable transmissions that mimic the compliance of real creatures.
Bachelor's thesis describing the specific design challenges of making the robot emotionally pleasant to interact with, bringing value to care providers, and creating a computationally flexible platform that lets researchers explore other applications for the Huggable robot.
Describes the novel aspects of its design including the design of the sensitive skin for affective interaction, voice coil actuators, and results from classifying different kinds of social tactile interaction.
The website gives a detailed overview of the project and its motivation. Describes the interface and lists the contributors to the project.
Tags
haptic feedback, healthcare, rehabilitation