In light of our discussion tomorrow on the dark side of social media, I decided to focus my blog today on the dark side of digital technology. What are the effects of being constantly surrounded by these intelligent devices, which are constantly listening to us? Specifically, what are the effects on children developing and growing up with these new advanced devices that are increasingly penetrating every aspect of our lives?
In September, the toy company Mattel received a lot of backlash over a new product it was developing: an interactive AI gadget for children. The device was called Aristotle, and it was “designed to comfort, entertain, teach, and assist” (Alexa, Are You Safe For My Kids?). Furthermore, it would function to displace central parenting obligations, like calming a crying baby or reading a bedtime story, so that the child forms a genuine attachment with the Aristotle device. The device would also collect information and data from the child and store it in the cloud to continue improving its machine learning, like every AI system does. However, many people were concerned about such a product; a petition with 15,000 signatures called on Mattel to kill the project, and the toy manufacturer complied.
Many people clearly thought this device crossed the creepy-cool line, but it sounds very familiar to certain intelligent personal assistants that are in over 39 million homes. Google Homes and Alexas are always listening to their surroundings, but both only record conversations when their trigger words are spoken. All these recordings are sent up to the cloud so the device’s AI can improve by learning how to help you better. With both smart speakers, you have access to all of the device’s recordings and the ability to delete conversations from the company’s cloud storage.
So, what is so radically different and creepy about Mattel’s Aristotle project that separates it from the Google Home and Alexa? Are children not constantly interacting with these smart speakers at home? Are kids not forming attachments to these devices? Are interactions with Google Home and Alexa beneficial or unsafe for children?
Child psychologists Shen and Rachel Severson have published studies about children’s relationships with AI devices, and those relationships do have some consequences.
An interesting finding from this research is that adults’ interactions with smart speakers are affecting the ways children believe it is appropriate to interact with other human beings. Social cognitive theory states that observation and imitation are frequent sources of new behaviors; in other words, humans instinctively imitate the actions of others. This is especially true for developing children, whose conceptions of correct social interaction are based on how they see their parents interact. Many people engage with their smart speakers by crudely shouting out demands, which the device always responds to in a polite manner. It might not seem like a big deal to be rude to a machine, but young kids “attribute human characteristics to the device, thinking that Alexa has feelings and emotions. Some kids may even think there is an actual human inside the device” (Alexa, Are You Safe For My Kids?). Therefore, parents’ interactions with their smart speakers can directly influence their children’s social awareness. So, if parents are trying to teach their child to say please and thank you, they too should go the extra mile and be polite and show a little gratitude to their AI devices.
On the other hand, children can also greatly benefit from having these devices in their homes: an outlet to explore their curiosity and ask questions even when their parents are not around. Children are naturally motivated to investigate the world around them, and these robots are a great source for children to learn facts and information from.
Hence, like all technological revolutions, these devices naturally come with pros and cons, both creepy and cool aspects. However, being aware of the social implications that these new technologies may have on ourselves and on those more susceptible than us is essential to interacting with these innovations and knowing when to set limitations.