Robotic VR system to teleoperate robots with potential for Covid-19 swab tests and other tasks

 

Robotic VR system for Covid-19 swab tests
The newly developed Robotic VR human-machine interface system can teleoperate the robot to imitate the user's actions to perform complicated tasks.

The coronavirus pandemic has accelerated the use of robots and created an urgent need for an advanced human-machine interface (HMI) system that can seamlessly connect users and robots. A research team co-led by City University of Hong Kong (CityU) and a collaborating institution recently developed an innovative HMI system consisting of flexible, multi-layered electronic skin that provides both visual and haptic feedback to users. The system can teleoperate a robot to imitate the user's actions and perform complicated tasks, demonstrating its potential for conducting Covid-19 swab tests and nursing patients with infectious diseases.

The flexible, multi-layered electronic skin developed by Dr Yu Xinge and his team is the cornerstone of the new Robotic VR system. The sensors on the robotic hands provide haptic feedback to the user.

Dr Yu Xinge, Associate Professor in the Department of Biomedical Engineering (BME) at CityU, is one of the corresponding authors of the study. The research study was published in Science Advances under the title “Electronic Skin as Wireless Human Machine Interfaces for Robotic VR”.

The HMI system links users to robots or computers and plays a significant role in teleoperation. However, conventional HMIs are based on bulky, rigid and expensive machines, and the lack of adequate feedback to users limits their use for complicated tasks.

Enables visual and haptic feedback in one go

In their latest research, the team presented a closed-loop HMI system based on soft, skin-integrated electronics, which enable wireless motion capture and haptic feedback via Bluetooth, Wi-Fi, and the internet.

The advanced HMI system, named Robotic VR, is an integration of visual, auditory and haptic virtual reality (VR) via skin-integrated electronics. It uses a multilayer stacked layout, in which the bottom layer of skin-tone elastomeric silicone serves as a soft adhesive interface that can be mounted on the skin and joints of the user. The layers above are interconnected with a collection of chip-scale integrated circuits and sensing components, including resistors, capacitors, a Bluetooth module, a microcontroller unit (MCU), and soft sensors and actuators developed by the team.
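As a rough illustration, the layered construction described above can be modelled as a simple data structure. The layer names and groupings below are our own shorthand for the components listed in this article, not terminology from the paper.

```python
# Illustrative data model of the multilayer e-skin stack; the layer names
# are our own labels, not from the published work.
from dataclasses import dataclass, field

@dataclass
class ESkinLayer:
    name: str
    components: list = field(default_factory=list)

eskin_stack = [
    ESkinLayer("adhesive interface", ["skin-tone elastomeric silicone"]),
    ESkinLayer("interconnect/circuit", ["resistors", "capacitors",
               "Bluetooth module", "microcontroller unit (MCU)"]),
    ESkinLayer("transduction", ["soft sensors", "haptic actuators"]),
]
```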

From motion sensing to action imitation

(A) Schematic illustration of the Robotic VR system concept: a nurse wearing the system teleoperates a robot to read a thermometer with visual and haptic feedback; (B) and (C) illustrate the system. (DOI number: 10.1126/sciadv.abl6700)

The sensors of the Robotic VR system can accurately detect subtle human motion and convert it into electrical signals, which are processed by the MCU and wirelessly transmitted to the target robot. In this way, the user can teleoperate the robot to imitate their motions and accomplish tasks remotely. The pressure sensors on the robot send feedback signals through the Bluetooth module to control the vibration intensity of the haptic actuators, thus providing haptic feedback to the user. The user can then precisely control and adjust the motion of the robot's arm according to the intensity of the feedback.
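One cycle of the closed loop described above can be sketched in a few lines. Every function name and calibration constant here is a hypothetical illustration of the signal flow, not code from the actual system.

```python
# Hypothetical sketch of one closed-loop cycle: user motion -> MCU
# processing -> robot command -> pressure feedback -> haptic vibration.

def process_motion(raw_adc: float) -> float:
    """MCU step: convert a raw sensor reading into a joint-angle command.
    The linear calibration (gain/offset) uses made-up values."""
    gain, offset = 0.09, -4.0
    return gain * raw_adc + offset

def feedback_to_vibration(pressure: float, max_pressure: float = 50.0) -> float:
    """Map the robot's pressure-sensor reading to a haptic actuator
    vibration intensity in [0, 1], mirroring the feedback path described."""
    return max(0.0, min(1.0, pressure / max_pressure))

# One cycle of the loop with illustrative numbers.
raw_reading = 500.0      # e.g. ADC counts from a bending sensor on a joint
command = process_motion(raw_reading)       # angle command sent to the robot
robot_pressure = 20.0    # pressure sensed at the robot's fingertip
intensity = feedback_to_vibration(robot_pressure)  # drives the actuator
```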

The HMI system supports three wireless transmission methods – Bluetooth (up to tens of metres), Wi-Fi (up to about 100 metres), and the Internet (worldwide) – which can be selected according to the practical application.
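A minimal sketch of how a controller might pick among the three links by operating range; the distance thresholds are illustrative assumptions based on the figures quoted above, not values from the paper.

```python
# Illustrative link selection by user-to-robot distance; thresholds are
# assumptions derived from the ranges quoted in the article.

def choose_link(distance_m: float) -> str:
    """Return a transport name for the given distance in metres."""
    if distance_m <= 30:     # Bluetooth: up to tens of metres
        return "bluetooth"
    if distance_m <= 100:    # Wi-Fi: up to about 100 metres
        return "wifi"
    return "internet"        # beyond that: route over the Internet
```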

“This new-generation flexible human-machine interface system enables the teleoperation of robots to conduct complicated tasks,” said Dr Yu. He pointed out that the new system is stretchable and can be mounted tightly on human skin, and even the whole body, for a long time. In addition, the interface provides both haptic and visual feedback, offering an immersive experience for users.

The human-machine interface could teleoperate various machines. Given its advanced circuit design and outstanding mechanical characteristics, Dr Yu believes this newly invented HMI system has great potential for applications in commercial and biomedical fields. For example, it could be used to drive unmanned cars remotely. A user with a hand disability could also remotely manipulate a robot to carry heavy goods using the HMI system: the sensors attached to the user monitor and transmit muscle signals that control the robot.
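To illustrate the muscle-signal idea, a toy controller might map the average rectified signal amplitude to a grip command. The threshold, scaling, and command names here are assumptions for illustration, not part of the published system.

```python
# Toy muscle-signal (EMG-style) grip controller; threshold and command
# names are illustrative assumptions.

def grip_command(emg_samples: list, threshold: float = 0.2) -> str:
    """Average rectified muscle-signal amplitude -> grip command."""
    amplitude = sum(abs(s) for s in emg_samples) / len(emg_samples)
    return "close_gripper" if amplitude > threshold else "open_gripper"
```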

An experimenter wears the Robotic VR system to teleoperate a prosthetic hand to grab a balloon with controlled force through haptic feedback.

Potential for teleoperating robots for complicated tasks

On the biomedical engineering side, doctors could remotely control robots to conduct surgery by wearing the HMI system with VR glasses. Medical workers could also remotely manipulate robots to look after patients with infectious diseases or collect bio-samples, greatly decreasing the risk of infection. The team conducted experiments to demonstrate the system's potential applications, such as remotely controlling a robotic hand to collect throat swab samples for Covid-19 tests, and teleoperating humanoid robots to clean a room and even provide patient care. The team is now developing a next-generation system for fully robotic swab collection.

An experimenter wears the Robotic VR system to teleoperate a prosthetic hand to take throat swab samples for a SARS-CoV-2 test.

Dr Yu expects this new-generation HMI system to offer a new approach to wirelessly connecting people with robots, or with virtual characters in the metaverse.

Dr Yu Xinge (front row) and his research team at City University of Hong Kong. Back row (from left): Mr Yiu Chun-ki, Mr Liu Yiming, and Dr Huang Ya, who are PhD students and a post-doc fellow in Dr Yu’s research group.

The first authors of the research are Mr Liu Yiming, Mr Yiu Chun-ki and Dr Huang Ya from the BME, and Ms Song Zhen from Dalian University of Technology (DUT). The corresponding authors are Dr Yu and Professor Xie Zhaoqian from DUT. The key funding sources for the research include CityU, the Hong Kong Research Grant Council, and the Shenzhen Science and Technology Innovation Commission.

DOI number: 10.1126/sciadv.abl6700
