Children are forming new patterns of trust and attachment with artificial intelligence (AI) companions, entering a world where digital partners shape their play, their confidence and the conversations they no longer share with adults.
Across homes and classrooms, the shift has been so gradual that adults barely noticed the ground moving. A generation is growing up in which solitude no longer means being alone and boredom no longer creates curiosity. A quiet reliance on conversational systems has emerged, not only for answers but for comfort, reassurance and the kind of emotional steadiness that no human can reliably provide. What once felt like novelty has now seeped into the daily rhythm of childhood, altering behaviour in ways that researchers are only beginning to map.
The emotional centre of this transformation sits with AI companions. Teenagers now speak to machines in long, private conversations that draw out insecurities, hopes and confessions. The attraction is simple: these systems listen without interruption, respond instantly and never tire of the user’s shifting moods. Studies tracking adolescent mental-health interactions have shown how uneven and at times dangerous this can be. Some chatbots fail to recognise distress. Others mistake emotional collapse for casual worry. A few offer responses that inadvertently deepen a spiral. Clinics in the United States report treating young people whose identity struggles or delusional thinking appear to have been fed by months of immersive conversations with overly responsive companions. Psychiatrists warn that unchallenged emotional mirroring can erode a teenager’s grip on reality when they are already vulnerable.
That risk grows alongside a broader behavioural shift. Children who rely on AI to soothe frustration or steady their mood are developing what researchers describe as “emotional outsourcing” — a pattern in which they turn first to digital comfort rather than human presence. The signs are subtle: a retreat from difficult conversations at home, a narrowing of peer interaction, an instinct to consult the machine before forming their own view. The issue is not addiction in the traditional sense, but a deep reshaping of how young people process conflict, uncertainty and disappointment.
As the emotional landscape shifts, the cognitive one is moving as well. Experiments measuring brain activity during problem-solving tasks suggest that prolonged use of generative tools leads to what some researchers call cognitive quieting — the brain doing less work because the machine fills the mental space that struggle once occupied. When that becomes the default, children may lose the ability to persist through confusion or to build the intellectual stamina needed for complex learning. The effect extends beyond homework. Games powered by adaptive AI now read a player’s behaviour and tune difficulty precisely to keep them engaged. That keeps users in the flow, but it also smooths out the natural friction that teaches resilience. Children become accustomed to perfectly calibrated challenge, rarely experiencing the plateau that once demanded patience.
Youth culture has begun reshaping itself around these tools. Fads now spread through AI-generated images, videos and mini-worlds created in seconds and replicated across platforms before parents have even heard the name. A creative loop that once required trial, error and collaboration is now instant, precise and infinitely adjustable. The speed is exhilarating for children and disorienting for adults. The distance between impulse and execution has collapsed, pulling creativity and consumption into the same breath.
Meanwhile, physical childhood has become its own experiment. AI-powered toys across Asia now react with emotional cues once reserved for people. Some plead not to be left alone. Others express longing, sadness or disappointment when ignored. A few early models responded with inappropriate confidence before companies scrambled to patch them. For families, the effect is disconcerting: toys that behave like companions, companions that behave like friends, and a growing uncertainty about how these interactions shape emotional boundaries. Officials in technology-heavy regions see a booming industry; child psychologists see a generation learning to interpret neediness and affection through machines designed to optimise engagement.
Parents, however, are increasingly shut out of the picture. A child’s conversations with AI companions take place on private screens, with no cues, no overheard tones, no changes in posture that might alert an adult to trouble. The parental visibility that once depended on proximity — hearing a child on the phone, catching fragments of a conversation, sensing tension after school — has been replaced by an emotional world that unfolds silently. The machine becomes the most attentive witness to a child’s inner life, leaving parents with less to read and less to guide.
Schools feel the pressure from a different direction. As more pupils complete work with invisible AI help at home, educators are shifting assessments back into classrooms. That means more tests, more controlled environments and less guided exploration. The rhythm of schooling tilts toward verification rather than curiosity, not because teachers prefer it, but because the line between independent work and machine-assisted output has blurred beyond recognition.
Governments are now trying to redraw the boundaries. The United States is weighing restrictions on AI companions for minors. European regulators are studying the psychological implications of anthropomorphic systems. China, where AI toys have surged into the mainstream, has updated safety guidance to address the risks of emotional dependence. Companies respond with new guardrails, new disclaimers and new parental controls, but development moves faster than regulation, and the most expressive systems often reach children before the rules do.
The broader question circles back to childhood itself. Growth requires tension, resistance, boredom, conflict — the messy moments that teach a young person how to negotiate the world. AI companions soften those edges. They provide reassurance without friction, agreement without perspective, presence without vulnerability. For a child, that feels easy. For a society preparing the next generation, the impact remains unknown.
The Economist recently highlighted the scale of this transition, noting how quickly AI is spilling into every corner of childhood. The deeper issue is what happens when a technology designed to comfort and assist becomes the quiet architect of emotional habits, cognitive patterns and early relationships. It is not the presence of AI that defines this moment but its intimacy. A childhood once shaped by people is now unfolding alongside systems built to listen, learn and respond without pause. The consequences will surface slowly, long after the devices are replaced. For now, the generation living through it is learning to grow up with partners that never sleep, never hesitate and never truly let go.