Feel What This Robot Feels Through Tactile Expressions

Started by Monirul Islam, May 19, 2018, 10:54:16 AM

We humans think we're pretty clever with all of the different ways that we have of communicating with each other. We vocalize, we have expressive faces, we gesture. It seems like plenty of modes of communication, but we're missing out on a few that are routine for animals, including texture: Animals can express emotional states through skin changes, like when cats cause their hair to stand up, or when a blowfish inflates itself and gets all pokey. We can't make textural expressions like these (although it does sometimes happen to us involuntarily), but we can often do a reasonable job of interpreting them when we see them: Anything that grows spikes, for example, probably prefers not to be touched.

Guy Hoffman's Human-Robot Collaboration & Companionship (HRC2) Lab at Cornell University is working on a new robot that's designed to investigate this concept of textural communication, which really hasn't been explored in robotics all that much. The robot uses a pneumatically powered elastomer skin that can be dynamically textured with either goosebumps or spikes, which should help it communicate more effectively, especially if what it's trying to communicate is, "Don't touch me!"

The robot has two texture modules, one on each side, which are designed to be gripped while interacting with it. Each texture module consists of multiple texture units arranged in a grid, all of which actuate at the same time. The texture units are made of hollowed-out elastomers connected to each other through a network of internal channels that can be filled with pressurized air. Adding air causes the texture units to inflate, popping up above the surface of the texture module to form shapes that can be both seen and felt.

Cornell robot with inflatable spikes
Image: Cornell
The researchers equipped their prototype robot with two different kinds of texture units: Goosebumps, which form rounded domes when inflated, and spikes, which include rigid little pokey bits. The spikes can't be completely flattened, but by reversing the pneumatic pressure, they can be sucked back down into the structure of the elastomer. All of the similar units in each module are connected to each other, such that a single module can be as spiky or as goosebumpy as you like.
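
To make the actuation scheme concrete, here is a minimal sketch of how the shared-channel design described above might be modeled in software. Everything in it (the class and method names, the 50 kPa scale, the toy pressure-to-height model) is an illustrative assumption, not a detail from the paper.

```python
# Hypothetical sketch of the texture-module actuation scheme described above.
# Names and the pressure-to-height model are illustrative, not the authors'
# implementation.

from dataclasses import dataclass
from enum import Enum


class UnitType(Enum):
    GOOSEBUMP = "goosebump"  # forms a rounded dome when inflated
    SPIKE = "spike"          # rigid point; retracts under negative pressure


@dataclass
class TextureModule:
    """One grip-side module: a grid of identical units on a shared air channel."""
    unit_type: UnitType
    rows: int
    cols: int
    pressure_kpa: float = 0.0  # one value drives every unit in the module

    def set_pressure(self, kpa: float) -> None:
        # All units share the internal channel network, so they actuate together.
        self.pressure_kpa = kpa

    def protrusion(self) -> float:
        # Toy monotone model of inflation height, normalized to [0, 1];
        # retracted spikes sit slightly below the module surface.
        if self.unit_type is UnitType.SPIKE and self.pressure_kpa < 0:
            return -0.2  # spikes can't flatten, but can be sucked below the surface
        return max(0.0, min(1.0, self.pressure_kpa / 50.0))  # saturates at 50 kPa


# The prototype's two grips: goosebumps on one side, spikes on the other.
left = TextureModule(UnitType.GOOSEBUMP, rows=4, cols=4)
right = TextureModule(UnitType.SPIKE, rows=4, cols=4)
right.set_pressure(40.0)   # inflate the spikes: "don't touch me"
right.set_pressure(-10.0)  # reverse the pressure: retract spikes into the elastomer
```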

Since humans don't have a lot of experience with tactile communication, the researchers had to figure out a way of translating goosebumps and spikes into something that humans could understand. That's part of the reason the texture modules in this social robot prototype are mounted on either side of a little screen with an expressive face: it helps the researchers figure out how well textures can be mapped onto emotions, with and without supplementary emotional expression from the robot. That's one of the things they're working on next.

Photo: Colin Ackerman
Tactile expression in the animal kingdom that even humans can understand. This octopus is saying, "Go away," by growing spikes.
A paper on this research was presented at the International Conference on Soft Robotics (RoboSoft); for more details, we spoke via email with first author Yuhan Hu and Professor Guy Hoffman.

IEEE Spectrum: Why is nonverbal communication important for robotics?

Yuhan Hu and Guy Hoffman: It's important because nonverbal behavior plays a central role in human communication. Research in human-robot interaction shows that a robot's ability to use nonverbal behavior to communicate affects its potential to be useful to people, and can also have psychological effects. Nonverbal behaviors can also help a robot be perceived as more familiar and less machine-like.

Why do you think there hasn't been more experimentation with tactile expression as a way for robots to communicate?

Most social robotics research focuses on facial expressions, large limb movements, posture, or tone of voice. These are the most prominent channels through which humans express themselves nonverbally, so in a way they are the path of highest familiarity for social robot designers. Skin texture change is more apparent in other animals. However, it should be noted that humans also exhibit skin texture changes in response to external or internal stimuli, goosebumps for example.

Compared to the more traditional nonverbal channels, there has also been much less research mapping skin texture changes to emotions, whereas facial expressions and body movements have a rich history of emotion mapping, from the arts to the biological and social sciences.

Was there anything in particular that inspired you to develop this system?

We were inspired mostly by animals, which prominently display changes to their skin texture when they are facing an external or internal stimulus: for example, cats raising their fur, the protrusion of spikes in many animals, and birds ruffling their feathers.

It is worth pointing out that Charles Darwin, who meticulously mapped nonverbal behaviors and emotions in his book "The Expression of the Emotions in Man and Animals," already noted:

"Hardly any expressive movement is so general as the involuntary erection of the hairs, feathers and other dermal appendages; for it is common throughout three of the great vertebrate classes."

This includes fish, reptiles, birds, and mammals, which indicates that it is an even more basic expression of emotion than facial expressions (arguably mostly used by mammals) and gestures.

Another reason we think this is useful is that skin changes operate on two channels at once: they can be perceived both visually and haptically. This could offer new kinds of interactions between humans and robots.

While some animals use textural changes to actively express themselves, humans generally do not. Can you explain how you are able to translate textural changes into emotional expressions that humans can reliably identify?

Humans are an integral part of the animal kingdom and co-evolved with many other species. We can therefore predict that we have an ability to read some of the signals sent by other species: for example, we are less likely to get close to a spiky blowfish, and we may read distress in a ruffled bird. Thinking about robots as more than just human replacements, we could use this new communication channel to signal to humans whether the robot is in a positive state or not.

We are currently looking at the dynamics of the texture changes, including velocity, frequency, and amplitude, as well as spatial "rhythms" such as repetitive patterns. These map naturally onto human experiences: for example, our heart rate and breathing fall to lower frequencies when we are in a calmer state.

Right now we are running experiments to test whether those texture expressions can be consistently mapped to emotions and reliably understood by users.
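
As a rough illustration of how dynamics such as frequency and amplitude could drive a texture expression, here is a minimal sketch in Python. The emotion labels, parameter values, and 50 kPa pressure scale are invented assumptions for illustration; they are not the mappings being tested by the researchers.

```python
import math

# Hypothetical (frequency in Hz, amplitude in [0, 1]) pairs, echoing the idea
# that calmer states map to slower, gentler texture changes. These values are
# invented for illustration, not taken from the study.
EMOTION_DYNAMICS = {
    "calm":     (0.2, 0.3),  # slow, shallow "breathing" of the texture
    "excited":  (1.5, 0.8),
    "agitated": (3.0, 1.0),  # fast, full-height pulsing
}


def pressure_signal(emotion: str, t: float, max_kpa: float = 50.0) -> float:
    """Pneumatic pressure command at time t (in seconds) for a given emotion."""
    freq_hz, amplitude = EMOTION_DYNAMICS[emotion]
    # Raised sinusoid in [0, 1], scaled by amplitude and the pressure range.
    phase = 0.5 * (1.0 + math.sin(2.0 * math.pi * freq_hz * t))
    return max_kpa * amplitude * phase


# Sample one second of the "calm" waveform at 10 Hz.
samples = [pressure_signal("calm", t / 10.0) for t in range(10)]
```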

Are there potential applications that this kind of communication might be ideal for? Where do you think this technology would be most useful?

We hope that the tactile and more "evolutionarily basic" nature of texture change could, first of all, enhance the effectiveness of a robot's emotional expression, perhaps by generating subconscious or unconscious interactions. For example, if we see a sad face on a robot, we may be able to read the robot's state, but that sadness may have less of an emotional effect on us than the sensation of sharp spikes under our palms.

We could also see applications where this kind of social expressiveness is useful when visual and auditory channels are not available. In military applications, for example, a silent and invisible mode of communicating between humans and robots may be necessary. Visibility may be blocked due to environmental conditions such as dust and smoke in emergency situations. Or this could open up an emotional communication channel with robots for individuals with visual impairments.

What are you working on next? Are there other kinds of tactile shapes you'd like to explore?

Our main effort right now is to map emotions to tactile expressions and to compare this channel with other existing modalities, like facial expressions and gestures. We are also working on the mechanical design problem of integrating the fluidic actuators into a social robot form factor. We would then like to explore new texture shapes and generalize this design process using computational methods.

Source: https://spectrum.ieee.org/automaton/robotics/robotics-hardware/feel-what-this-robot-feels-through-tactile-expressions