Skin-integrated electronics usher in a new age of human-machine interface for robotic virtual reality

Demonstrated by researchers in Hong Kong, this closed-loop human-machine interface (CL-HMI) system lets you take control of a robot via Bluetooth, Wi-Fi, or the Internet — and feel what it feels.

05 Feb, 2022

The future of healthcare could be robotic, thanks to novel skin-integrated movement sensors with tactile feedback.

There was a time when whole-body tele-operation of a virtual construct was the work of science fiction, as fans of the 1992 classic The Lawnmower Man will recall. Now, though, a real-world version of the same technology could be key to delivering easily operated robotics for tasks including capturing biological samples from patients — protecting medical staff in the era of the SARS-CoV-2 pandemic.

In a paper published in Science Advances, a team from Chinese universities and the Hong Kong Center for Cerebro-cardiovascular Health Engineering have detailed a closed-loop human-machine interface (HMI) system which allows the user to wrap themselves in skin-integrated electronics for control of a virtual or robotic doppelgänger over Bluetooth, Wi-Fi, or the wider Internet — complete with full-body tactile feedback.

A question of functionality

Human-machine interfaces have been around as long as there have been machines, but the current state-of-the-art in control of virtual reality constructs and robotics is, the authors of the paper claim, lacking. “Most of them rely on cumbersome machines and rigid electronics,” they write, “that have constraints in terms of wearability, comfortability, and limited functionalities.”

The solution: Flexible closed-loop human-machine interfaces (CL-HMIs), built around skin-integrated electronics that capture a broad array of body-movement data, designed specifically with robotic VR control in mind.

The team’s hardware is based around elastomeric silicone, designed to adhere to the skin in a comfortable and flexible fashion, with a layer of copper traces linked to chip-scale integrated circuits and other components for sensing, control, and communication. Custom-designed soft sensors complete the device, while another layer of silicone on top encapsulates everything.

Image: the circuit layout and real-world examples of the soft human-machine interface device, shown bent, stretched, and curved. These soft, flexible sensors could be key to natural, accurate control of robotic systems in virtual reality.

Worn on the skin, the electronics package monitors movement by sensing how the wearer’s limbs bend; the same soft sensors, fitted to the robot being controlled, act there as pressure sensors. A series of vibratory actuators relay the signals received from the robot’s pressure sensors to the wearer — giving the operator, the authors claim, the linear feedback required to respond to rising pressure and modulate, for instance, a gripper’s closing force.
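That closed loop — a robot-side pressure reading driving an operator-side vibration motor — can be sketched in a few lines. This is a minimal illustration only: the linear mapping and the pressure range are assumptions, not figures from the paper, and `pressure_to_vibration` is a hypothetical name.

```python
def pressure_to_vibration(pressure_kpa: float,
                          p_min: float = 0.0,
                          p_max: float = 50.0) -> float:
    """Map a robot-side pressure reading (kPa) to a vibration-motor
    duty cycle in [0.0, 1.0].

    A linear mapping is assumed here purely for illustration; the
    paper describes linear haptic feedback but not this exact scale.
    """
    duty = (pressure_kpa - p_min) / (p_max - p_min)
    # Clamp so out-of-range sensor readings never over/under-drive the motor.
    return max(0.0, min(1.0, duty))
```

In use, the operator would feel the vibration intensity rise in step with grip pressure — e.g. a mid-range reading of 25 kPa maps to a 50% duty cycle — and ease off the gripper accordingly.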

A prototype of the device, worn on a volunteer’s arm, proved capable of operating for 68 minutes even while driving the haptic-feedback vibrators at full capacity. In long-term testing, the volunteer showed that the package could remain mounted on an arm for a full eight hours without detaching — though the individual bending sensors lasted between three hours 20 minutes and just under six hours.

Wireless control

A key feature of the prototype as developed is its ability to communicate in three different ways: Bluetooth, Wi-Fi, and remotely via the Internet. Bluetooth offers the most rapid response times, measured at under 4µs, but the shortest range; Wi-Fi boosts the range considerably but increases the response time to 350µs, still well below the 550µs of a typical human reaction time. Finally, in Internet mode, the sensor prototype can be located miles away from the receiving system, at the cost of response times measured in milliseconds rather than microseconds.
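The trade-off between the three modes is essentially range versus latency, so a controller might pick the fastest transport that still reaches the robot. The sketch below is purely illustrative — the latency figures are the ones quoted above, while the range figures and the `pick_transport` helper are assumptions, not part of the published system.

```python
# Latencies (µs) quoted in the article; ranges (m) are rough assumptions.
LATENCY_US = {"bluetooth": 4, "wifi": 350, "internet": 5000}
RANGE_M = {"bluetooth": 10, "wifi": 100, "internet": float("inf")}

def pick_transport(distance_m: float, max_latency_us: float):
    """Return the lowest-latency transport that both covers the
    distance and meets the latency budget, or None if none qualifies."""
    candidates = [t for t in LATENCY_US
                  if RANGE_M[t] >= distance_m
                  and LATENCY_US[t] <= max_latency_us]
    if not candidates:
        return None
    return min(candidates, key=lambda t: LATENCY_US[t])
```

With a 550µs budget (the human-reaction-time figure above), a nearby robot would get Bluetooth, a same-building robot Wi-Fi, and a robot miles away would only be reachable by relaxing the latency requirement into Internet territory.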

To put the prototype through its paces, the team performed a series of experiments, beginning with asking a volunteer to drive a remote-controlled car via a CL-HMI mounted on their fingers — one finger for speed, one finger for direction. This was then extended to control of a prosthetic hand offering seven degrees of freedom (DoF), linked to seven bending sensors on the user’s fingers, wrist, and elbow.
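The RC-car demo amounts to mapping two bend-sensor angles onto a speed and a steering command. A minimal sketch of such a mapping follows; the scaling constants, the half-bent-means-straight-ahead convention, and the `bend_to_command` helper are all assumptions for illustration, not details from the paper.

```python
def bend_to_command(speed_bend_deg: float,
                    steer_bend_deg: float,
                    max_bend_deg: float = 90.0,
                    max_speed: float = 1.0,
                    max_steer_deg: float = 30.0):
    """Map two finger-bend angles to a (speed, steering) command pair.

    One finger's bend sets speed (straight finger = stopped, fully
    bent = full speed); the other sets steering, with a half-bent
    finger treated as straight ahead.
    """
    speed = max_speed * min(1.0, max(0.0, speed_bend_deg / max_bend_deg))
    # Re-centre the steering finger: 0..max_bend maps to -max..+max degrees.
    steer = max_steer_deg * ((steer_bend_deg / max_bend_deg) * 2.0 - 1.0)
    steer = max(-max_steer_deg, min(max_steer_deg, steer))
    return speed, steer
```

The same idea scales to the seven-DoF prosthetic hand: one bend sensor per joint, each mapped onto its own actuator range.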

Image: an experiment in which a robot arm squeezes rubber cubes and sorts them by hardness, using tactile feedback at the controller end. Pressure sensors on the robot feed back to vibration motors, giving the operator control while grasping.

At the same time, the robotic hand was fitted with the same soft sensors in pressure-sensing mode, their readings relayed to the operator through the vibration motors. Using this feedback, the volunteer was able to grasp a balloon without damaging it and to sort visually identical rubber cubes by hardness.
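The cube-sorting task works because a stiffer cube pushes back harder at the same grip closure, so pressure becomes a proxy for hardness. In the experiment the ranking was done by the operator’s own haptic judgement; the sketch below merely illustrates the underlying principle, and the `sort_by_hardness` helper is hypothetical.

```python
def sort_by_hardness(cube_pressures: dict) -> list:
    """Rank cubes hardest-first by the pressure (kPa) each exerts
    back when gripped to the same closure — a stiffness proxy.

    cube_pressures maps a cube identifier to its pressure reading.
    """
    return sorted(cube_pressures, key=cube_pressures.get, reverse=True)
```

For example, readings of {"A": 12.0, "B": 30.5, "C": 7.1} would rank B as hardest and C as softest.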

The researchers have a broader vision still, however: tying the CL-HMI system into virtual reality robotics. “Combining the skin-integrated CL-HMI with visual information,” they write, “this two-mode system creates a[n] advanced technology, referred [to] as intelligent robotic VR, that exhibits huge potential in remote control technology.”

A 13-DoF humanoid robot, the team proposes, could be constructed with pressure sensors in the “forearms, upper arms, thighs, thigh sides, tummy, and crura” in addition to the hands, linked to an operator using a full-body CL-HMI combined with a virtual reality headset connected to cameras in the robot’s eyes.

Images: a controller, fitted with the human-machine interface hardware, operates a robotic arm to take a saliva sample from a volunteer, and a compact humanoid robot to squat, clean, and tuck a patient into bed. The team believes it would be possible to extend the system to humanoid robotics for a range of tasks, including medical work.

“The user can remotely control an intelligent robot to conduct various complicated tasks,” the team explains, “including squatting, walking, cleaning [a] room, and nursing patients, via the robotic VR system in Wi-Fi mode.”

The team’s work has been published in the journal Science Advances under open-access terms; a provisional patent application relating to the work has been filed by the City University of Hong Kong.

Reference

Yiming Liu, Chunki Yiu, Zhen Song, Ya Huang, Kuanming Yao, Tszhung Wong, Jingkun Zhou, Ling Zhao, Xingcan Huang, Sina Khazaee Nejad, Mengge Wu, Dengfeng Li, Jiahui He, Xu Guo, Junsheng Yu, Xue Feng, Zhaoqian Xie, Xinge Yu: Electronic skin as wireless human-machine interfaces for robotic VR, Science Advances Vol. 8, Issue 2. DOI 10.1126/sciadv.abl6700.

A freelance technology and science journalist and author of best-selling books on the Raspberry Pi, MicroPython, and the BBC micro:bit, Gareth is a passionate technologist with a love for both the cutting edge and more vintage topics.
