I build with AI. I also have serious concerns about AI and childhood. Here is how I hold both.
- Mieke from Nuri Tales

The week we shipped a significant new feature in Nuri Tales, I also spent three evenings reading research papers about the risks of AI-generated content for children's imaginative development. I found myself underlining passages that made me uncomfortable. Passages that complicated the thing I was building.
I want to talk about that discomfort. Not because it makes me look thoughtful — though I am aware it might — but because I think the discomfort is the point. I think anyone building AI products for children who does not feel this tension is either not reading enough or not being honest enough.

So here, as plainly as I can manage, is how I hold both truths at once.
The concerns I carry
My first concern is about imagination. There is a legitimate question — not yet fully answered by research — about what happens to a child's imaginative life when they are repeatedly presented with AI-generated visual and narrative content. Human imagination develops, in part, through the effort of constructing inner worlds from incomplete external prompts. A description in a book gives the child almost nothing visually and demands that they generate the rest. That generative effort is not incidental. It is, researchers like Paul Harris at Harvard have argued, one of the primary ways children develop the capacity to understand minds other than their own.
When AI generates the image, the character's face, the forest's atmosphere, the fox's exact expression — it removes some of that generative demand. What does that do to the imaginative muscle over time? Honest answer: we do not fully know yet. That uncertainty should make every designer in this space more careful, not less.
My second concern: mediation
My second concern is about what gets mediated away. Some of the most important parenting moments are unmediated — a parent struggling in real time to find the right words, fumbling toward something true, imperfect but genuine. There is something in that struggle that a child receives. They see the adult working. They see that the relationship is worth the effort.
I worry about technology — including what I am building — that makes those moments too smooth. That offers a perfect story so readily that the parent's own imperfect words never have a chance to reach the child. The technology is supposed to support the connection. If it starts to replace the effort of connection, something has gone wrong.
"Technology is at its best when it disappears and leaves only the human moment behind. The moment it becomes the point — rather than the bridge — we have built the wrong thing."
Why I build anyway
I build anyway because of a distinction I believe is real and important, even if it is not yet fully captured in regulation or public discourse.
There is a meaningful difference between AI designed to extract attention and AI designed to serve a specific human purpose and then step back. The first is what powers most of the platforms my children encounter. The second is what I am trying to build.
Nuri Tales does not have a recommendation algorithm. It does not have a variable reward loop. It does not optimise for time spent in the app. It generates a story, the parent reads it with their child, and then the app is closed. The technology is supposed to make a specific moment better and then get out of the way. That is a fundamentally different design philosophy from a platform that wants your child's attention for as long as possible.
Is it perfect? No. Does it carry risks I cannot fully predict? Yes. But the alternative — leaving families without tools for the moments that are genuinely hard, while engagement-optimised content floods every other hour of their child's day — does not feel like the safer choice.
The standard I hold myself to
Every feature we build at Nuri Tales must answer one question before we ship it: does this bring parents and children closer together, or does it come between them?
If the answer is that it brings them closer — that it makes a difficult conversation easier, that it gives a child a character to help them process something real, that it supports the parent in arriving at bedtime with presence rather than exhaustion — then it belongs in the product.
If the answer is that it makes the app stickier, that it increases session length, that it makes the parent feel like the technology is doing the parenting — then it does not belong there, however commercially attractive it might be.
I will not always get that judgment right. But it is the question I keep asking. And I think it is the right question. Technology is at its best when it disappears and leaves only the human moment behind. The moment it becomes the point — rather than the bridge — we have built the wrong thing.
This is the philosophy behind every design decision at Nuri Tales. We share it publicly so that you can hold us to it — and so that the standard, however imperfect, is at least visible.
Research reference: Paul Harris, Harvard Graduate School of Education — imagination and theory of mind. Harris's research demonstrates that children's engagement in imaginative play and narrative construction is a significant driver of theory of mind development — the capacity to understand others' mental states.