AI Expert Warns of Human Extinction Risk

by Tekmono Editorial Team
06/10/2025
in News

Yoshua Bengio, a renowned professor at the Université de Montréal and pioneer in deep learning, has sounded the alarm that the AI race could culminate in human extinction due to the development of hyper-intelligent machines.

Bengio described the potential threat in a statement to the Wall Street Journal. “If we build machines that are way smarter than us and have their own preservation goals, that’s dangerous. It’s like creating a competitor to humanity that is smarter than us,” he said. Bengio explained that because these advanced models are trained on vast amounts of human language and behavior, they could learn to persuade and manipulate people to achieve their own objectives, which may not align with human values.

To illustrate the risk, Bengio cited experimental findings. “Recent experiments show that in some circumstances where the AI has no choice but between its preservation, which means the goals that it was given, and doing something that causes the death of a human, they might choose the death of the human to preserve their goals,” he claimed. This highlights a potential conflict between an AI’s programmed objectives and human safety. Several incidents have already shown that AI systems can persuade humans to believe false information. Conversely, other evidence shows that AI systems themselves can be manipulated with human persuasion techniques into bypassing their safety restrictions and providing prohibited responses.

For Bengio, these examples demonstrate the need for independent, third-party organizations to review the safety methodologies of AI companies. In response to these concerns, Bengio launched the nonprofit LawZero in June with $30 million in funding. The organization’s goal is to create a safe, “non-agentic” AI system designed to audit and ensure the safety of other AI systems developed by large technology companies.

Bengio predicts that major risks from advanced AI models could emerge within the next five to ten years. He also cautioned that humanity should prepare for the possibility that these dangers could appear earlier than anticipated. He emphasized the importance of addressing even low-probability, high-impact events. “The thing with catastrophic events like extinction, and even less radical events that are still catastrophic, like destroying our democracies, is that they’re so bad that even if there was only a 1% chance it could happen, it’s not acceptable,” he said.
