Alexa, are you my role model?

Photo by Kim Shepherd

During 2018’s Baby Shark trend, a video of 2-year-old Zoé Turner’s struggle to direct her family’s Amazon Echo to play the song became so popular that The Today Show invited the toddler’s family for a live appearance. Zoé’s communication efforts are endearing enough to tempt the crankiest heart to stop scrolling and enjoy a chuckle or an aww, but what’s also noteworthy is her confidence with this technology.

It’s not surprising, though, considering devices equipped with digital assistants and voice-based artificial intelligence (AI) are estimated to outnumber us by 2021 (De Renesse, 2017) and have become ubiquitous in our children’s lives. Four in 10 American homes with children age 2 to 8 have a smart speaker, and 6 in 10 children interact with voice assistants (Common Sense Media, 2019). A YouTube search for babies and Alexa produces numerous videos of young children using these devices, showing that Zoé is not alone in her abilities.

This technological familiarity, though, isn’t unique to her and her contemporaries. Children too young to read have long known how to use TV remotes, so why wouldn’t they grasp how to talk to Alexa, Google, or Siri? What is new are incidents of toddlers mistaking household technology for people, the very people who guide their behavioral development.

Parents are reporting children forming unusual relationships with voice assistants. CNN contributor Samantha Murphy Kelly wrote that Alexa was among her son’s first 4 words. This summer, reports featured a baby named Caroline who believed her name was Alexa due to her parents’ frequent use of their Echo. Paul Lampkin, Wareable Media Group co-CEO and a former MSN tech editor, published an article on his smart-home site, The Ambient, about his 2-year-old daughter Maia’s anthropomorphic treatment of AI.

In it, Lampkin shared his mixture of amusement and unease at how his toddler says her family’s Google Assistant is “tired” when its battery runs low or that Alexa “lives” downstairs. Most parents assume children understand that AI and people are different (Common Sense Media, 2019). Lampkin’s experience, though, shows how, for younger kids, this distinction is murkier. His daughter understands voice assistants are part of devices, but, Lampkin says, “she still thinks they’re people compared to her VTech toys or Peppa Pig flip phone which make noises but are ultimately pretend.”

Whether to recognize a voice assistant as a person seems like a debate from the movie A.I. Artificial Intelligence. For our brains, however, this debate doesn’t exist. They have no natural way of knowing that AI-generated voices aren’t real because technology’s evolution has outpaced our own. Our brains interpret a voice as signaling the physical presence of another person because, for most of our species’ existence, it did. Recorded voices for radio, TV, or AI play tricks on our brains (Neupane, Saxena, Hirshfield, & Bratt, 2019). Voice assistants’ ability to react to our voices and offer the illusion of conversation furthers our misinterpretation of them as real people, regardless of our age.

Adults override this misinterpretation with the intellectual understanding that a plastic speaker isn’t a person. Young children, though, have imaginary friends, so the voice on the kitchen counter being real isn’t much of a stretch. “There are numerous anecdotes that young children think there’s a little person inside the device or there’s a person on the other end of the exchange, like a telephone,” says child psychologist Rachel Severson (Murphy Kelly, 2018).  

Photo by Alex Knight from Pexels

As Director of The Minds Lab at the University of Montana, Severson researches how children assign consciousness to humanoid technology and what impact this behavior has. Her research with children age 9 to 15 and a robot named Robovie concluded that children assign mental states, social statuses, and moral deservedness to lifelike technology, even when they believe its status as technology doesn’t entitle it to civil rights or liberty. The younger a child is, the likelier they are to confer humanness on these types of electronic devices (Kahn et al., 2012). Therefore, young children’s ability to call voice assistants computers or robots isn’t an indicator of understanding, as they lack the social and emotional maturity to differentiate AI from people in the same sense adults do.

Young children learn behaviors largely through mimicry, with parents, grandparents, and other caregivers exerting the most influence (Shrier, 2014). Worries about technology’s interference in this process made screen time a topic of interest as children’s TV and internet use increased over the past 30 years. Alexa doesn’t have a screen, though, and voice time isn’t a concern the way screen time has been.

Rather, parents view voice devices and their content as healthy alternatives to display devices because audio frees kids to engage in other activities concurrently, such as coloring (Kelly, 2018). Wall Street Journal contributor Alexandra Samuel went as far as to laud voice assistants as pseudo-co-parents in a recent article saying, “Every parent I know fantasizes about adding another parent to the mix to help out with the children. I have found that fantasy co-parent in a tidy package: our Amazon Echo.”  

This parental approval shows in downloads—kids’ apps are the fastest-growing category of skills Amazon offers for Alexa (Kelly, 2018). The company capitalized on this popularity and released the first kids’ edition of its Echo Dot in 2018. It enjoys a 70% 5-star rating, with parental reviews chiming in: “This is great for kids. I would highly recommend. Music, games, stories…what’s not to love?!? Thanks, Amazon, for making an amazing screen-free product.” What’s not to love, though, may be this technology’s behavior.

Companies market voice assistants on their ability to instantly give you what you want—Siri, what’s the score of the Cardinals game? Hey Google, set a timer. Alexa, play Baby Shark. The relationship a person has with a voice assistant is one of command and serve rather than the nurturing respect parents prefer their young children learn. Nielsen Norman Group found adults are prone to being as polite to voice assistants as they are to people, using “please” and “thank you” (Laubheimer & Budiu, 2018). Kids, however, are still learning consistency in their civility and are prone to mistakes.

When children don’t behave nicely, adults will remind them to use good manners. Alexa doesn’t care. Neither does Google nor Siri. Children are free to demand what they want from AI. Voice assistants may undermine the best parenting efforts in the same way a sibling’s or a peer’s disobedience tempts young kids into trying mischief themselves.  

Market research firm ChildWise reported in 2018 that voice devices are prone to making children rude and demanding (Baig, 2019). A child’s logic is simple: Alexa seems like a person, it lets me demand things, so I can get what I want without manners. Cue toddlers commanding other kids to hand over their toys or grandparents to buy them candy.

Photo by Tyler Lastovich from Pexels

Dr. Jenny Radesky (2019) is a developmental behavioral pediatrician at the University of Michigan Medical School who helped write the American Academy of Pediatrics’ 2016 children’s media use policy. She says manners are just one of the problems the command-and-serve nature of voice assistants can cause for young users. “When children come to think that everything is on-demand, it may make it harder to wait and accept the challenge of figuring things out by themselves—two really important challenges of childhood,” she explains. Thus, children who use relationships with voice assistants as models for behavior may come to lack patience and perseverance as they grow up.

The way we identify this technology, though, is as impactful as how we interact with it. Alexa’s proper pronoun is “it,” but most of us default to “she.” Recent U.N. research found that voice assistants help perpetuate gender stereotypes. The report explains that our default use of female voices and names for voice assistants causes us to personify them as female, reinforcing the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.”

Most voice assistants have a secondary male voice as an option—Microsoft’s Cortana being the major exception—but in the short lifespan of this technology, we’ve become accustomed to considering it female, so it’s unlikely many users bother to tinker with the voice settings. Our continued use of gendered pronouns for AI, though, is contributing to a persistent childhood problem.  

Despite perceived progress on gender equality, children are still besieged by the message that gender determines abilities and preferences. Research examining advertisements found that we gender toys more today than in the 1970s (Sweet, 2014). Linda Hawkins (2018), Co-director of the Gender and Sexuality Development Clinic at Children’s Hospital of Philadelphia, says children begin developing their gender identity at 3 to 5 years old and that society’s expectations—such as those expressed through gendered toys, for example—affect this development.  

“Gender exploration is like any other exploration,” she says. “Children try on all sorts of things, hats, boots, you name it, they’ll try it on and see if they like it personally and see if they like the response they get from people.” A child limited to hearing female voice assistants may assume an assistant job is for women only or, worse, that we restrict women to service-oriented positions rather than allowing them a range of roles. Thus, representation of diverse gender expressions and roles matters for young children.

Photo by Kim Shepherd

Awareness of the harm of enforcing gender constructs is becoming more common in America today. Some parents are raising their children gender-neutral in everything from clothes and toys to names and pronouns until the children choose their genders. Increased conversation surrounding gender identity means children today are likelier than ever to meet someone or have a peer who is transgender.

If our AI relationships reinforce the gender binary, children won’t possess the tolerance and acceptance needed to live in a world where people break stereotypes. This biased understanding leads children to emotionally, socially, or physically hurt others and themselves when behavior doesn’t fit the prescribed definition (Hawkins, 2018). 

We haven’t doomed ourselves yet to a future of sexist, narcissistic preschoolers. On the positive side, voice assistants may teach children independence and choice through simple tasks such as picking music to play. Voice applications aid people with certain disabilities, meaning this generation is growing up in a world where basic designs often feature inclusivity. In a world where every home has a voice device, individuals who use voice-equipped assistive devices, as Stephen Hawking did, are no longer so unfamiliar. Furthermore, voice assistants deliver helpful, educational content, much the way TV does with shows such as Sesame Street. But without screens encouraging a child to sit and watch, voice devices may promote more movement and play, potentially helping reduce childhood obesity.

That these advantages coexist with drawbacks in the same product is unfortunate and warrants thoughtful consideration from users. It may not, though, necessitate ditching this technology outright. Parents can still guide how voice assistants influence developing minds. Using gender-neutral pronouns to describe AI and rotating their voice options is a good start. Supervising young children when they interact with voice assistants not only helps avoid an unexpected $1,000 Amazon order of Paw Patrol playsets but also allows parents to reinforce manners where this technology may be lax.

Debate exists, though, over whether we should make children exhibit politeness toward voice assistants at all, considering we are trying to teach them the distinction between computers and people. But next-generation AI, like Google’s Duplex, sounds authentic enough to fool strangers over the phone, rendering us as incapable as small children of differentiating people from tech. While we await a regulatory decision on whether AI must identify itself, avoiding both poor child development and self-induced embarrassment requires offering politeness to every voice. “Parents should have house rules about kindness when interacting with any being, alive or AI,” echoes Radesky (2019).

Experts also recommend against substituting human interactions with voice assistants. “We know that the single greatest factor in children’s development is their direct interaction with other human beings,” says pediatrician Dr. Dipesh Navsaria in the Campaign for a Commercial-Free Childhood’s warning against voice assistants for children. “When we promote products that put a device in between people and encourage electronic interaction rather than face-to-face, we’re doing children, parents, families, and our society a vast disservice.”

A voice assistant is not equipped to be a caregiver or a teacher. While Alexa may be able to read a bedtime story to a child, it’s the moments and conversations between page turns that nurture children. Also, simply delivering information doesn’t increase intelligence or enrich a child’s understanding of the world. “Virtual assistants may have plenty of facts, but they can’t provide meaning—that’s what loved ones do, especially through storytelling,” says Radesky. She adds that AI lacks the emotional intuitiveness that humans possess. “As a parent, you sometimes realize that children’s rapid-fire questions have another meaning—an underlying anxiety that itself could be better addressed through pretend play, drawing, or a hug, rather than more facts.”

The under-5 crowd wasn’t the initial target demographic for voice assistants, but this age group’s curiosity and wide use of the technology are forcing companies to consider children in product design. To allay fears of rude behavior, Amazon and Google have introduced features—Magic Word and Pretty Please, respectively—which make their voice assistants reinforce manners when children initiate commands. The Echo Dot Kids Edition features special parental controls for content, contacts, and time limits, as well as a kid-friendly Alexa programmed to better understand children’s speech, reducing communication blunders like little Zoé’s.

Photo by Kim Shepherd

Technology companies have responded to claims of gender bias by reprogramming their voice assistants to withhold responses to sexist or abusive language. Additionally, Amazon had Alexa self-identify as a feminist while Google randomized the gender of the initial voice it assigned new users and began referring to voice options by gender non-specific colors.  

The public, though, is urging technologists to improve on these efforts. Apple introduced a male Siri voice 2 years after the voice assistant launched, outpacing most competitors, but drew criticism in 2019 when internal documents leaked showing the company’s decision to have Siri avoid controversial topics such as feminism. Copenhagen Pride and VICE Media’s creative agency Virtue recently unveiled Q, the first genderless voice for voice assistants, challenging the tech industry to adopt more inclusive options for their products.  

While companies move to address existing fears involving children and voice technology, a new controversy is brewing: privacy. Amazon and Google pushed the sale of voice-assistant-equipped devices, even giving them away at times, to gain a larger data set with which to test and improve the technology. The prevalence of home units that collect data on child users raises questions of whether they violate the Children’s Online Privacy Protection Act (COPPA). This federal law mandates that parents have control over what information companies can collect via the internet about children under age 13.

Many parents, though, didn’t consider this legal and ethical conflict or didn’t understand how this technology worked when they welcomed voice assistants into their homes. Today, however, 93% of parents want to know when these devices record them and their children (Common Sense Media, 2019). As such, companies are scrambling to avoid lawsuits and prove to families that their products are safe before these important research subjects opt out. Whatever the decision of legal courts, or the court of public opinion, this issue highlights an emerging problem.

Social media made millennials and Generation Z accept privacy as nonexistent online, and voice devices are doing the same for Generation Alpha at home. Parents fear the spying of multinational companies and hackers yet praise features that enable them to become the spies in their children’s lives. The Echo Dot Kids Edition, for example, allows parents to review all their children’s activity with the device, including voice recordings. What behavior will this generation exhibit towards future monitoring by family, partners, friends, or governments if we normalize this level of surveillance in childhood? 

Now that the blinders of trendiness have worn off, we must wrestle with these types of difficult questions. How easy or swift this contemplation will be depends on consumers. Voice assistants’ novel appeal may wane, relegating them to the graveyard of futuristic technology alongside Google Glass, or the popularity of the Internet of Things may propel them to levels of integration not yet seen. After all, nobody predicted 10 years ago that voice assistants would become today’s stocking stuffers and household staples.

One day AI may be superior in values and behaviors, and we will be looking for it to teach all of us the ways of life à la the Minds of Iain M. Banks. Until then, however, our voices must rise above those of Alexa, Siri, and Google in our children’s lives.


References

Baig, E. C. (2019, October 10). Say thank you and please: Should you be polite with Alexa and the Google Assistant? USA Today. Retrieved from https://www.usatoday.com/story/tech/2019/10/10/do-ai-driven-voice-assistants-we-increasingly-rely-weather-news-homework-help-otherwise-keep-us-info/3928733002/

Common Sense Media (2019, March 28). Common Sense/SurveyMonkey poll reveals privacy is a top concern for families who use smart speakers and voice-activated assistants. Retrieved from https://www.commonsensemedia.org/about-us/news/press-releases/common-sensesurveymonkey-poll-reveals-privacy-is-a-top-concern-for

De Renesse, R. (2017, May 17). Virtual digital assistants to overtake world population by 2021. Ovum. Retrieved from https://ovum.informa.com/resources/product-content/virtual-digital-assistants-to-overtake-world-population-by-2021

Hawkins, L. (2018, January 16). When do children develop their gender identity? Children’s Hospital of Philadelphia. Retrieved from https://www.chop.edu/news/health-tip/when-do-children-develop-their-gender-identity

Kahn, P. H., Jr., Kanda, T., Ishiguro, H., Freier, N. G., Severson, R. L., Gill, B. T., Ruckert, J. H., & Shen, S. (2012). “Robovie, you’ll have to go into the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology, 48, 303-314.

Kelly, H. (2018, April 12). ‘Alexa, play my kid a podcast.’ Parents look for screen-time alternatives. CNN. Retrieved from https://money.cnn.com/2018/04/12/technology/alexa-google-screen-time/index.html

Laubheimer, P., & Budiu, R. (2018, August 5). Intelligent assistants: Creepy, childish, or a tool? Users’ attitudes toward Alexa, Google Assistant, and Siri. Nielsen Norman Group. Retrieved from https://www.nngroup.com/articles/voice-assistant-attitudes/

Murphy Kelly, S. (2018, October 16). Growing up with Alexa: A child’s relationship with Amazon’s voice assistant. CNN. Retrieved from https://www.cnn.com/2018/10/16/tech/alexa-child-development/index.html

Neupane, A., Saxena, N., Hirshfield, L., & Bratt, S. E. (2019, February). The crux of voice (in)security: A brain study of speaker legitimacy detection. Network and Distributed System Security Symposium. Retrieved from https://www.ndss-symposium.org/ndss-paper/the-crux-of-voice-insecurity-a-brain-study-of-speaker-legitimacy-detection/

Radesky, J. (2019, January 22). Tips for using virtual assistants with kids. PBS. Retrieved from https://www.pbs.org/parents/thrive/tips-for-using-virtual-assistants-with-kids

Shrier, C. (2014, June 27). Young children learn by copying you! Michigan State University Extension. Retrieved from https://www.canr.msu.edu/news/young_children_learn_by_copying_you

Sweet, E. (2014, December 9). Toys are more divided by gender now than they were 50 years ago. The Atlantic. Retrieved from https://www.theatlantic.com/business/archive/2014/12/toys-are-more-divided-by-gender-now-than-they-were-50-years-ago/383556/
