Exploring Children’s Notions of Digital Good/Bad

Dylan Yamada-Rice
Feb 23, 2024

This blog relates to an ESRC-funded project that is exploring children’s attitudes towards notions of digital good/bad. The team is an interdisciplinary group of researchers spanning the arts and social sciences (myself, John Potter, Angus Main, Eleanor Dare, Steve Love and Richard Nash). Within this disciplinary mix I call myself an Experience Design Researcher, which essentially means that I like to use art- and design-based methods as part of the research process, working with children in ways that might be more engaging and meaningful for them than traditional methods such as interviews or surveys; in turn, I hope, the research workshops become a kind of “experience”. In particular, I believe quite strongly that play is a form of communication in its own right, one of the strongest in childhood, and the methods I use are designed with this at their core.

As a team we have worked together over the course of a number of projects centred on one or more emerging technologies, exploring what these might mean for children’s entertainment, play, education and/or health. This latest project is still emerging through a series of public engagement networking events in which we seek out core themes to include in our later research workshops with children, but already there is a strong interest in what Artificial Intelligence (AI) might mean for children.

The first of the public engagement network events was loosely centred on perspectives on the technologies themselves, whilst the next will consider how we can gain children’s perspectives and educate them (formally or informally) about aspects of emerging technologies beyond simply how to use them.

The purpose of the public-facing networking events has been to draw out key themes from beyond the project’s core team, to understand what people are thinking about in this space, and then to use these themes as the basis for the workshops we will do with children later in the project to gather their own opinions and ideas.

The remainder of this blog post draws out some key topics that arose from the first networking event which are (1) AI in Relation to Specific Technologies, (2) Who Decides Tech’s History (and Why?), (3) Tech and the Climate Crisis, (4) Big Tech Objectives, (5) How Tech Works, (6) Would Kids Design Tech Differently? and (7) Surveillance. These are discussed next.

AI in Relation to Specific Tech

In terms of salient points arising from the first networking event, we were lucky enough to be joined by Octavia Reeves from the Ada Lovelace Institute. She talked about her research in this area and the need to explore AI not as a sweeping generalisation, but in relation to specific technologies. For example, an Ada Lovelace Institute study found that people had both positive and negative views of AI depending on the technologies or situations in which it might be used.

This led me to think that we might also want to explore children’s views of AI in relation to specific technologies. For example, if they were sick, would children prefer to see a human doctor, or would they be OK with being diagnosed by AI?

In relation to this, and the other key themes I will go on to mention, I have begun creating a series of visual responses to points raised in the first networking event. These (such as the one on the left above, but also those on the slides below) build on my understanding of thinking and knowing through making and drawing; eventually we might also include some of them in the workshops with children as a way to get them thinking about these issues.

Who Decides Tech History?

James Edward Marks from PlayLa.bz reminded us that science fiction and speculative design in these forms can often lead to actuality.

This acted as a wider reminder of the need to think about technologies within the context of history: not to look at them as completely new, but rather to question what they are trying to replace in children’s lives, who is developing them, and why. So one of the questions we might seek to answer is: do children know the histories behind technologies?

This fits quite well with ideas in Crary’s book 24/7, which considers the development of dual-purpose technology, of which AI is one example, designed to serve both military and civilian needs.

Is there a situation in which technology developed for military purposes could also be considered a social good? For me there are definitely big tensions, particularly at a time when we are faced with what is happening in Gaza. Bearing in mind that children pick up on politics in their playground play, might there be anything here we can explore with children, or is the topic simply too sensitive?

Another point James Edward Marks brought up was the climate crisis. I know from other research I have done that children are suffering from climate anxiety, so when James mentioned the work of Take the Jump, which offers a series of practical guidelines for things we can do to help with the climate crisis, this point also resonated with me.

Tech and the Climate Crisis

Specifically, I was interested in the guideline about keeping products for at least seven years. This made me wonder: in spaces of disadvantage, where we tend to think children are hampered by having older technologies, could this be reframed more positively in the context of the climate emergency, given that all of us need to use technologies for longer? Could these children become trailblazers, or is this too idealistic? Older technologies could definitely be repurposed for newer needs, and thus perhaps we don’t have to go along with big tech’s push for us all to desire newer and faster tech. I wonder what children think about this idea.

This links to a lot of what I have been reading about the Anthropocene, in relation to my teaching within the digital arts, but also to looking at AI as a form of “intelligence” (although of course it is computation and not intelligent at the moment). If we focus only on human intelligence, does that mean we can ignore animal and plant intelligence? We know that, especially at this moment in the climate crisis, we need to look beyond the human.

Big Tech Objectives

Eleanor Dare from X||dinry Stories and the University of Cambridge suggested that we need to look at the motives that lie behind AI and who stands to gain. Thus, it becomes important not to think of AI as a kind of magical system, but to try to get kids to unpack who owns these companies, who is responsible for their development, and what their motives are. Even for adults these can be quite big questions, but what about for kids? Can they understand some of these issues? Are they interested in them?

These ideas link with those in Broussard’s book More than a Glitch, which looks at the biases built into technologies.

The quote above about Kodak is shocking, especially when we know it is just one of many inbuilt prejudices that mean technologies help a few rather than all, and it makes me ask how we get children to imagine alternative design futures. Connected to this, Angus Main from the Royal College of Art outlined his Countermeasures project, one finding of which was the need to educate children about how technology works. In that project he had children drawing around technology, imagining what was inside it, and he found they had a mismatch of ideas that combined accurate information with imagination and bits of old technology such as wires.

How Tech Works

Where are children’s ideas drawn from? Why do they include outdated knowledge? Is it a deliberate ploy by tech companies? In a paper Angus and I co-wrote, we mention how Apple deliberately uses the word “magic” as a ploy to make people think that technology is magical and thus beyond comprehension. How could kids be educated to be critical of the terms used by tech companies?

The Countermeasures project stems from Angus’ wider work on digital sensors and the data they collect. Of course a lot of great work on sensors, data and privacy is being undertaken by other academics such as Sonia Livingstone and her team.

I was trying to think about how we can break down some of the ideas in this area into smaller pieces for children by using “What if” type scenarios, such as the one shown in the comic I drew below.

This seemed to link with the talk by George Simms, a doctoral student at the University of Plymouth, who uses Crip and feminist theory to seek alternative ways of viewing technology. Once children have knowledge about emerging tech and what lies behind it, would they want to design technology differently? Could they design it in a way that adults don’t? Would they wish it could do different things, and if so, how can we empower them to get there? Because I am situated in an art space, I am very interested in the development side rather than just focusing on kids as consumers: how can we encourage creativity, making and critical thinking so that children believe they could develop things differently, that they could make their own, more local social networks, for example, and don’t have to rely on Instagram?

Would Kids Design Tech Differently?

This links with ideas by Peter Frase in Four Futures, which demonstrates that the future is not set in stone and that the past could have been radically different if we had opted to take free time over money in the age of automation. So let’s instil in children the idea that they can define the future in a radically different way from what currently exists. Here’s hoping.

Surveillance

This brings us to the final theme, which came from a talk by Thomas Enemark Lundtofte of the University of Southern Denmark. Thomas raised the idea that there might be a gap between what children and adults want from technology. He gave an example in relation to surveillance apps: adults tend to put these apps on kids’ phones, and they are often even a motivation for giving children a phone in the first place, as a way to keep them safe, but children actually want phones to be sociable with friends. This served as a reminder that the research could consider the gaps between what adults want for children and what children want for their own lives.

Here’s to the start of framing our workshops with kids. I’ll write in the future about what we learn.
