Some say the next step in human evolution will be the integration of technology with flesh. Now, researchers have used virtual reality to test whether humans can feel embodiment — the sense that something is part of one’s body — toward prosthetic “hands” that resemble a pair of tweezers. They report June 6 in the journal iScience that participants felt just as strong a sense of embodiment for the tweezer-hands as for a virtual human hand, and were also faster and more accurate with them when completing motor tasks in virtual reality.
“For our biology to merge seamlessly with tools, we need to feel that the tools are part of our body,” says first author and cognitive neuroscientist Ottavia Maddaluno, who conducted the work at the Sapienza University of Rome and the Santa Lucia Foundation IRCCS with Viviana Betti. “Our findings demonstrate that humans can experience a grafted tool as an integral part of their own body.”
Previous studies have shown that tool use induces plastic changes in the human brain, as does the use of anthropomorphic prosthetic limbs. However, an open scientific question is whether humans can embody bionic tools or prostheses that don’t resemble human anatomy.
To investigate this possibility, the researchers used virtual reality to conduct a series of experiments on healthy participants. In the virtual reality environment, participants had either a human-like hand or a “bionic tool” resembling a large pair of tweezers grafted onto the end of their wrist. To test their motor ability and dexterity, participants were asked to pop bubbles of a specific color (by pinching them with their tweezers or between their index finger and thumb). For this simple task, the researchers found that participants were faster and more accurate at popping virtual bubbles when they had tweezer-hands.
Next, the team used a test called the “cross-modal congruency task” to compare implicit or unconscious embodiment for the virtual hand and bionic tool. During this test, the researchers applied small vibrations to the participants’ fingertips and asked them to identify which fingers were stimulated. At the same time, a flickering light was displayed on the virtual reality screen, either on the same finger as the tactile stimulus or on a different finger. By comparing the participants’ accuracy and reaction times during trials with matched and mismatched stimuli, the researchers were able to assess how distracted they were by the visual stimulus.
“This is an index of how much of a mismatch there is in your brain between what you feel and what you see,” says Maddaluno. “But this mismatch could only happen if your brain thinks that what you see is part of your own body; if I don’t feel that the bionic tool that I’m seeing through virtual reality is part of my own body, the visual stimulus should not give any interference.”
In both cases, participants were faster and more accurate at identifying which of their real fingers were stimulated during trials with matched tactile and visual stimuli, indicating that participants felt a sense of embodiment toward both the virtual human hand and the tweezer-hands.
However, the difference between matched and mismatched trials was bigger when participants had tweezer-hands rather than human hands, indicating that the non-anthropomorphic prosthesis produced an even greater sense of embodiment. The researchers speculate that this is due to the tweezer-hands’ relative simplicity compared with a human-like hand, which might make them easier for the brain to compute and accept.
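The comparison above can be made concrete with a short sketch. This is not the authors’ analysis code; it simply illustrates the standard way a cross-modal congruency effect is quantified — mean reaction time on mismatched trials minus mean reaction time on matched trials — with hypothetical reaction times. A larger effect means the visual distractor interfered more, which the study reads as stronger embodiment.

```python
def congruency_effect(matched_rts, mismatched_rts):
    """Return mean RT(mismatched) - mean RT(matched), in the input's units.

    A larger positive value indicates greater visual interference with
    the tactile judgment, i.e., a stronger cross-modal congruency effect.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(mismatched_rts) - mean(matched_rts)

# Hypothetical reaction times (milliseconds) for one participant.
# These numbers are invented for illustration only.
hand_cce = congruency_effect([420, 435, 410], [455, 470, 460])
tweezer_cce = congruency_effect([400, 415, 405], [480, 495, 485])

# In the study's logic, a larger effect in the tweezer condition
# would indicate greater embodiment of the tweezer-hands.
print(hand_cce, tweezer_cce)
```

Under this toy data, the tweezer condition shows the larger congruency effect, mirroring the pattern the researchers describe.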
“In terms of the pinching task, the tweezers are functionally similar to a human hand, but simpler, and simple is also better computationally for the brain,” says Maddaluno.
They note that it could also relate to the “uncanny valley” hypothesis: the virtual human hands may have been similar enough to real hands, yet subtly different, to feel eerie and so fall short of full embodiment.
In addition to the tweezer-hands, the researchers also tested a wrench-shaped bionic tool and a virtual human hand holding a pair of tweezers. They found evidence of embodiment in all cases, but the participants had higher embodiment and were more dexterous when the tweezers were grafted directly onto their virtual wrists than when they held them in their virtual hand.
Participants also displayed a higher sense of embodiment for the bionic tools when they had the opportunity to explore the virtual reality environment before undertaking the cross-modal congruency test. “During the cross-modal congruency task, participants had to stay still, whereas during the motor task, they actively interacted with the virtual environment, and these interactions in the virtual environment induce a sense of agency,” says Maddaluno.
Ultimately, the researchers say that this study could inform robotics and prosthetic limb design. “The next step is to study if these bionic tools could be embodied in patients that have lost limbs,” says Maddaluno. “And we also want to investigate the plastic changes that this kind of bionic tool can induce in the brains of both healthy participants and amputees.”