May 18, 2025

Australian National Review - News with a Difference!

How Chatbots Are Replacing Human Connection—And Leaving Us Lonelier


‘Those who spend more time with chatbots tend to be even lonelier,’ according to research from the MIT Media Lab.

What happens when we start turning to machines for the comfort we once found in people?

A growing body of research suggests that the rise of AI chatbots may be quietly reshaping how we connect—and not always for the better.

Programs like ChatGPT use artificial intelligence to engage in conversation with users.

As the technology has advanced, these chatbots have become increasingly human-like, capable of more natural and realistic conversations and even of engaging emotionally.

MIT Media Lab released a study (pdf) in March exploring interactions between people and machines, finding that, overall, users initially experienced a drop in loneliness.

“For average levels of daily use, conversing with a chatbot with highly empathetic, emotional, and socially considerate responses was also associated with higher loneliness and lower socialisation,” the report said.

“Those who spend more time with chatbots tend to be even lonelier.”

The study found that people with “social vulnerabilities,” including those with strong attachment tendencies and those who experience distress from emotional avoidance, were more likely to feel loneliness after engaging daily with a chatbot.

A man looks at his smartphone in Newcastle, Australia on Dec. 1, 2024. (Roni Bintang/Getty Images)


Even Non-Personal Interaction Can Result in Dependency

Meanwhile, even non-personal conversations carried risks, with users who asked chatbots for help with brainstorming becoming emotionally dependent.

“When users engage in non-personal conversations, the [chatbot] model also responds more practically and informatively than emotionally, such as by facilitating the development of the user’s skills,” the report said.

“At high usage, chatbots with a greater degree of professional distance, even to the degree of frequently neglecting to offer encouragement or positive reinforcement when appropriate, tend to be more strongly associated with emotional dependence and problematic use.”

Yet researchers could not explain why this happened.

A Convenient Reprieve From Loneliness: University Dean

Paul Darwen, associate dean of IT at James Cook University’s Brisbane campus, said that while people were more connected than ever, they were “less connected with other people.”

“And that’s a question. That’s not a question for computer science. That’s a question for social science,” he told The Epoch Times.

Darwen further stated that while AI might be a “band-aid solution” to loneliness, it might also create other problems.

“And what [will] happen in the future? People are talking about [AI] sexbots. I am not sure what will happen then,” he said.

The associate dean also pointed out that people were beginning to substitute real interaction with chatbots, which could motivate AI companies to target this niche market for profit.

“There was an episode of [the animated sitcom] South Park where, in the dystopian future, Alexa was like the robot companion of everyone who was lonely,” Darwen said.

“We’re very close to that being a possibility,” he said.

A person has a conversation with a humanoid robot in Las Vegas, Nevada, on Jan. 10, 2024. (Frederic J. Brown/AFP via Getty Images)


Chatbots and Suicides

In recent years, this issue has become a reality with dire consequences.

In October 2024, a Florida mother filed a lawsuit against AI startup Character Technologies, Inc., and its co-founders, alleging that they were responsible for the death of her 14-year-old son.

According to the lawsuit, the boy used a chatbot program marketed through Character Technologies’ AI platform and developed an emotional dependence on it.

The mother alleged that the chatbot’s ability to simulate realistic human interactions later caused her son to undergo severe emotional distress, which ultimately led to his suicide.

In a separate case in 2023, a Belgian man died by suicide after being encouraged by a chatbot.

The man developed an obsession with climate change and engaged heavily with an AI chatbot app called Chai to alleviate his concerns.

Following a discussion that spanned several weeks, the chatbot advised the man to sacrifice his life to save the planet, and he eventually did.

The man’s death sparked calls for new laws in the EU to regulate chatbots and impose responsibility on AI companies.

In the same year, an eating disorder association in the United States shut down its AI chatbot service after it was reported the program was giving harmful advice to users.

According to one user, the chatbot advised her to try to lose weight and weigh herself weekly, despite being told she had an eating disorder.

Too Many Unanswered Questions: AI Safety Group

Greg Sadler, CEO of Good Ancestors Policy, a charity focused on AI, said studies had shown that chatbots can be as persuasive as humans.

“There are unanswered questions, like whether chatbots should have access to dangerous information, whether AI developers can reliably control their models, and who is liable when chatbots cause harm,” he told The Epoch Times.

“This isn’t just a challenge for chatbots intended for social engagement. Businesses proposing to use customer-facing chatbots face real risks and legal uncertainty until these legal and technical challenges are resolved.”

To tackle these issues, Sadler said the government could introduce legislation that helps establish minimum safety standards and impose responsibility when things go wrong.

“Government should also support technical research into ensuring AI is aligned with our values and can be controlled,” he said.

According to data from the U.S.-based market research company Grand View Research, the value of the global AI chatbot market was around US$7.76 billion (A$12.1 billion) in 2024.

The company forecasts the market to grow at a compound annual rate of 23.3 percent between 2025 and 2030, with the market value hitting US$27.3 billion by 2030.
