Life has become increasingly digital, and that includes the toys children play with.


Many electronic toys include internet connections or artificial intelligence software that learns about your child and personalizes play. These toys can offer features parents and kids want, but they also bring new types of risk -- namely, they can gather a lot of data about children and may share it with other companies, making your child's data more likely to be exposed during a breach or hack.1


Here's a deeper look into the ways parents, caregivers and anyone with a young one in their life can navigate this new market and make smart decisions about smart toys.


What are smart toys?


Smart toys have high-tech features such as a Wi-Fi or Bluetooth connection, along with microphones, cameras or sensors. Some even integrate artificial intelligence programs. Common types of smart toys include dolls, robots and interactive games.


Some smart toys have conversations with kids, using an internet connection to transmit a child's words to outside servers, where the manufacturer can use speech recognition or artificial intelligence technology to prompt the toy to talk back. Other smart toys have features like facial recognition, allowing a toy to recognize and greet a person by name.


What does that have to do with a child's data?


Some smart toys may collect a child's location; others can receive and store voice recordings. For example, the manufacturer of the software for the Fuzzible Friends toy (which connects to Amazon's Alexa voice assistant) states in its privacy policy that it may receive transcripts of a child's interactions.2 If your child shares their age with the toy, that detail would be included in the transcript.


Smart toy manufacturers may partner with other firms to process and store data, or may share your child's data with other companies. The result is that kids may disclose a lot of information to a toy they consider a friend, not realizing it's a company on the other end doing the listening.


Is there a guarantee that this data will be kept safe?


Unfortunately, no -- if hacked, connected toys can be used to eavesdrop on kids. We found that a conversational doll, My Friend Cayla, had an unsecured Bluetooth connection, enabling anyone nearby to use the doll as a microphone and potentially talk to children.3 The FBI has even warned parents that toys with microphones may collect conversations that happen within earshot, even when a toy isn't being played with.4


Then there's the risk of a large-scale data breach that exposes the data of millions of kids at once. In 2015, the largest-ever hack of kids' information exposed the names, birthdays, genders and, in some cases, photos and voice recordings of millions of children.5


What other risks should I be aware of?


There are a number of other things to keep in mind about smart toys, whether you're considering one as a gift for a loved one or simply care about data security and digital safety in our high-tech world. These include:


  • Costly in-app purchases that can be easy (and enticing) for a child to click on.
  • Toys with a companion website where kids can download potentially age-inappropriate content from other users.
  • The use of fictional characters to tout brands and products to a child -- also known as "stealth marketing."

Please feel free to share this information with anyone you know who may find it useful. And you can learn more about smart toys and other potentially hazardous products on our website.


Thank you,


Faye Park


Public Interest Research Group



1. R.J. Cross, "Smart Decisions about Smart Toys," PIRG, December 29, 2022.

2. R.J. Cross, "Smart Decisions about Smart Toys," PIRG, December 29, 2022.

3. R.J. Cross, "Smart Decisions about Smart Toys," PIRG, December 29, 2022.

4. "Consumer Notice: Internet-Connected Toys Could Present Privacy and Contact Concerns for Children," U.S. Federal Bureau of Investigation, July 17, 2017.

5. "VTech hack: Data of 6.4M kids exposed," CNBC, December 2, 2015.