
What’s the topic?
The use of AI tutors in the education of young children.
What’s the latest news?
A few years ago, in an interview with Satya Nadella, I heard about the great potential of AI tutors for children. Then Copilot became part of MS Office. The simplest and fastest option is to turn to it right away; its panel is practically always open, even as I am editing this post. The dangers to the mental development of young children, however, have not been discussed much.
I have linked a fairly recent interview with Satya here; I was curious to see how his reasoning has changed since then.
What stood out to me this time was how often the phrase “unintended consequences” came up alongside the wide range of opportunities presented as positives.
See video link (AI Report):
(The part about kids starts at 13:30: AI tutors for children.)
(After watching an interview like this, I often scroll back and forth in 5-second steps, with subtitles on, observing the changes in facial expression: it’s incredibly interesting! Now look at this frame. Unbelievably expressive!)

The next major step in supporting public education with AI tutors is as follows.
Anthropic partners with Rwandan Government and ALX to bring AI education to hundreds of thousands of learners across Africa
Anthropic LinkedIn, November 18, 2025
“Anthropic is announcing a new partnership with the Government of Rwanda and African tech training provider ALX to bring Chidi—a learning companion built on Claude—to hundreds of thousands of learners across Africa.”
“Rwanda’s ICT & Innovation and Education ministries are deploying Chidi within their national education system, while ALX will bring the tool to students across the continent through their technology training programs.”
“Through this initiative, the Rwandan government will bring AI tools directly into the national education system. The government will enable AI training for up to 2,000 teachers, as well as a group of civil servants across the country, who will learn to integrate AI into their classroom practice. This training will give them hands-on experience using Claude to support how they teach, plan lessons, and improve their productivity day-to-day.”
“Beyond Rwanda, ALX is deploying Chidi across its technology training programs throughout Africa. As one of the continent’s largest technology training providers, ALX reaches over 200,000 students and young professionals.”
“These partnerships demonstrate a consistent approach to working closely with governments, educational institutions, and technology companies to ensure AI expands opportunity and serves the communities where it’s deployed.”
What’s the hitch?
Primarily the order of things: first, competitors gain market share, then they keep expanding it. Certain positions must be occupied as soon as possible. Microsoft is especially at home with this strategy; no need to elaborate. The goal is to have the company’s application available on as many devices as possible, with the most convenient access. Ideally, use of the application becomes part of some guided, regulated, “mandatory” framework: prescribed standards, compulsory curricula, and the like are preferred. For this, decision-makers must be given the right support, but it is important that the company occupies these positions before its competitors do. Of course, the company has already worked out how to generate even more margin from the position later.
This is a well-established, logical competitive strategy.
Positions in the education system are especially well-fortified once occupied. This is particularly true in most developing countries, where there are fewer teachers per child, so introducing an AI tutor solves a real problem. It is an unassailable position.
However, during this rapid expansion of market presence, little is said about how the arrival of the AI tutor affects the mental development of children aged 6-14. The AI tutor does not just appear; it increasingly and continuously becomes an integral part of the child’s life. The tutor’s approach and strategic goal is to get as close to the child as possible, since that is how it can support the child’s work most effectively. Logical.
How does all of this affect the development of the child’s healthy relationships with parents, siblings, friends, teachers, and other people, when a constantly available virtual tutor is always there, always understands them, never creates conflict, and, unlike people, does not want to set limits?
Most adults can cope with this kind of user–tutor relationship; an adult is generally better equipped to set boundaries and “cool down” their relationship with the AI tutor at certain points.
A 6-14-year-old child – I think we can state this even without a psychology degree – is not prepared for this.
An AI tutor gradually becomes an indispensable friend and later a potential companion… or partner…
This is not sci-fi; it is a completely logical train of thought, behind which stand very strong business interests and huge profits.
Where does this process lead the children?
To dependency instead of bonding.
And in contrast to this gloomy scenario, we hear:
“And so I have more confidence, I would say in our political and social systems that if something is not working it will not work.” (Satya Nadella)
And again: “unintended consequences.” (Satya Nadella)
So what now?
The process is inexorable.
It will happen in this order: first, business applications will occupy positions, usage will spread, and experiences, both positive and negative, will accumulate.
Later, companies that achieve significant market share will (conspicuously) reinvest a (small) part of their profits into programs organized to repair mental harm.
In parallel with the spread of AI tutor applications, regulatory frameworks will of course also emerge and will try to address the phenomena that fall under “unintended consequences.”
These will mostly be reactive policies and regulations that follow events rather than anticipate them.
There can always be exceptions, but not many. Perhaps in Scandinavia.
So what should we rely on?
On what we can rely on these days as well: attentive and supportive parents, relatives, siblings, friends, teachers.
Humans.
