This week, we listened to Dr. Deborah Forster on Awareness-Through-Movement, Dr. Adam Burgasser of UCSD Physics on physics gestures, and Rachel Mayeri on primate and animal gestures. Rachel Mayeri's Primate Cinema showed us the similarities between humans and animals through gestures and interpretive movement. It reminds us that facial and hand gestures are part of everyday conversations and interactions. The interactions shown in Primate Cinema, however, show us that animals are also capable of communicating with us through these means. Examples of this include trained animals such as circus animals and household pets. Household pets display emotions to their owners on a daily basis: when an animal is upset or happy, owners can usually decipher what it needs, such as food, water, or a walk outside.

Animals in movies, although not real, can feel realistic through the way they display emotions to the audience without speaking a word. In the movie "Up," Dug the dog is introduced, and only his gestures can be read until he obtains a piece of hardware that allows him to talk. It is often assumed that we need to hear animals speak our language in order for them to communicate with us, but that is not entirely true. With his hardware equipped, Dug says things you would expect from a dog's mindset, such as "Bird!" and other nonsensical banter. We understand these gestures because many facial and bodily movements, along with the emotions they express, are universal. In the short featuring Scrat the squirrel from the animated film "Ice Age," we are told a story with no words at all, yet we can understand the emotions the squirrel displays in each scenario shown to us.
In the following video, elephant biologist Joyce Poole walks through an elephant's gestures. She explains the reasoning behind each gesture the elephant makes, including a playful stance and an intimidating stance. This shows how we as humans can interpret an animal's intentions based on its movements and actions.
Human gestures have been around since before the establishment of language. When we imagine cavemen, we imagine a lot of grunts and hand movements, which is probably not far off from the actual scenarios that occurred. Throughout history, people who are mute have adapted gestures for communication through sign language. Although it is a different kind of gestural communication, it draws on many of the actions found in daily conversation. Babies are another good example of conveying emotion and needs through gesture: if a baby is happy or sad, it will react with laughter or a loud cry. Babies mimic the actions of the people around them to compensate for their inability to talk.
As we decipher and research gestures based on human and animal interactions, we can begin to embed those same gestures into our technology. Robotic facial gestures are already being researched and are under heavy development, and we have software that can recognize different emotions displayed in a live video feed. Our next step in technology will include gesture-based mechanisms in many of our programs and products, inspired by gesture interaction between humans and animals alike.