Is AI Safe for Kids?
Artificial intelligence (AI) is one of the most divisive topics of our day. The complex, somewhat opaque nature of generative AI, in its various incarnations, makes it a hard concept for adults to grasp. If you are old enough to remember the arrival of the internet, then you may be able to draw some parallels to that learning experience. If not, don’t worry. Read on to discover where AI is taking us, and how this both benefits children and puts them at risk. By learning more about AI, we can take better steps to educate children born into this brave new world on how to use AI to their benefit, without becoming overly reliant on technology.
In case you blinked and missed it, welcome to the salad days of the AI revolution. We are learning as we go, steered by the companies behind generative AI. As adults, we may choose to adapt to the new AI-augmented world, or to reject it, possibly to our own peril.
On one hand, we are intrigued, even seduced, by the benefits of automation and content generation, which promise to endlessly amuse us as well as revolutionize the way we work.
On the other hand, ethical concerns about accountability and transparency plague our embrace of AI. AI stirs up fears of job displacement and of the machines taking over. Ultimately, many fear the existential risk of humanity creating an intelligence that surpasses our own and displaces our species.
For children, born into a world where AI will increasingly impact all areas of their lives, the question at hand is greater than “What is the danger of AI for kids?” AI is here to stay, and children are already using it. As AI grows up alongside our kids, our duty is to address the ethical considerations needed to protect the rights of children born in “Generation AI.”
Children and AI
It is a common and commendable parental reflex to want to examine the dangers of AI. Keeping children safe as they grow up with AI is something that responsible parents and educators must address. So, what is artificial intelligence for kids, and what is this AI danger lurking in the computer or even in a toy?
Information-collecting dolls?
Children are experts at inventing personalities and stories for inanimate objects like teddy bears or twigs, and at interacting with them. How about smart toys? Smart, AI-augmented toys offer adaptive, responsive play experiences and can actively educate children. Much of these toys’ capability to interact relies on collecting data. While some smart toy manufacturers pride themselves on ethical data collection, other dolls have not played so nicely.
While the recent Barbie movie is in the limelight, let’s not forget Hello Barbie, the world’s first “interactive doll.” Released back in 2015 and since discontinued, Hello Barbie was an internet-connected doll that recorded what children said, sent the recording to a server, and responded with a prerecorded message. Hello Barbie presented ethical and legal issues regarding privacy and data collection, the storage and security of that data, and the hackability of the doll’s microphone.
AI as an educational tool
Joking aside, AI-enhanced toys will evolve in line with the growth of AI technology, and may fill increasingly important roles in both play and education. AI has great potential in personalized learning tools, which promise to improve educational outcomes across the globe. Another revolutionary aspect of AI is its ability to enable accessibility. This takes many forms, from AI bots providing mental health support, to smartphones acting as language translators or text-to-speech tools, for example.
Parents vs AI
AI has a strong potential to undermine parents’ authority. In a world where speedy information is king, children might trust AI-generated information more than their parents’ words. This could, in some instances, lead to conflict, especially if the information provided by AI contradicts the views of parents, teachers, and other adults in positions of responsibility.
AI can add bias
AI can also promote bias, both through content generation and via its use as a tool to sort through student applications. Educational institutions using machine learning to process applications may unknowingly introduce errors that, if left unchecked, produce discriminatory outcomes. Beyond bias, harmful content, location detection, and identity protection and fabrication are all potential dangers that come part and parcel with the AI revolution.
Clearly, letting children hand all of their questions and problems over to AI is not going to result in a generation of well-rounded individuals with decisive, autonomous minds. We don’t yet know what AI does to the developing brains of children. Psychological implications of the AI revolution will surely come to light over the next decades.
AI for kids can bring benefits to many aspects of their lives, but as adults, we need to make sure that their relationship with “the machine” is a healthy one. The Kids Online Safety Act, reintroduced in May 2023, puts the responsibility on platforms to mitigate the spread of potentially harmful content to minors. While this controversial bill does not cover AI specifically, certain aspects, including the requirement for platforms to disclose personalized recommendation systems, will make interesting reading for concerned parents.
The future is now, and it is important to educate children about the responsible use of AI. Explaining how AI works is a good place to start, followed by demonstrating it through active, hands-on engagement. The limitations and biases of AI should also be explained to children, in language they can understand. As life becomes more and more automated, it is vital to nourish creativity and critical thinking. While AI offers solutions to many problems, it also creates a slew of new ones. Educate yourself and your children, and give them the independence to make up their own minds about AI.
Edmund is an English copywriter based in New Taipei City, Taiwan. He is a widely published writer and translator with two decades of experience in the field of bridging linguistic and cultural gaps between Chinese and English.