November 28, 2023

A German computer scientist known as the “father of artificial intelligence” says concerns about the technology are misplaced and that advances in artificial intelligence cannot be stopped.

“You can’t stop it,” Jürgen Schmidhuber said of artificial intelligence and the current international race to build more powerful systems, according to The Guardian. “Not at an international level, because one country’s goals may be totally different from another’s. So, of course, they’re not going to take part in some kind of moratorium.”

According to the Guardian, Schmidhuber worked on artificial neural networks in the 1990s, and his research later led to language-processing models for technologies like Google Translate.

He currently heads the AI program at King Abdullah University of Science and Technology in Saudi Arabia, and says in his CV that he has been working since about the age of 15 on building “a self-improving artificial intelligence (AI) that is smarter than himself.”


Jürgen Schmidhuber (Getty Images)

Schmidhuber said he doesn’t think anyone should try to stop progress in developing powerful AI systems, arguing that “in 95 percent of cases, AI research is really about our old motto of making human lives longer, healthier and easier.”

Schmidhuber also said that concerns about AI are misplaced and that developing AI-powered tools for good purposes will counter bad actors using the technology.


“It’s just that the tools now being used to improve lives can be used by bad guys, but they can also be used against bad guys,” he said, according to The Guardian.


Schmidhuber said concerns about AI are misplaced and that developing AI-powered tools for good purposes will counter bad actors using the technology. (Bloomberg via Getty Images)

“I’m more worried about the old dangers of nuclear bombs than the new little dangers of artificial intelligence that we see now.”


His comments come as other tech leaders and experts sound the alarm about the threat the powerful technology poses to humanity. Tesla CEO Elon Musk and Apple co-founder Steve Wozniak joined thousands of other technologists in signing a letter in March calling on artificial intelligence labs to pause their research until safety measures are in place.


Artificial intelligence pioneer Geoffrey Hinton speaks at the Thomson Reuters Finance & Risk Summit in Toronto on December 4, 2017. (Reuters/Mark Blinch/File)


Geoffrey Hinton, dubbed the “Godfather of AI,” announced this month that he was quitting his job at Google to publicly voice his fears about the technology. On Friday, Hinton said artificial intelligence could pose a “more urgent” risk to humanity than climate change, and while he shares similar concerns with tech leaders like Musk, he said a moratorium on AI research in labs is “completely unrealistic.”

“I’m in the camp that thinks this is an existential risk, and it’s close enough that we ought to be working very hard right now and putting a lot of resources into figuring out what we can do about it,” he told Reuters.

Schmidhuber, who has publicly criticized Hinton for allegedly failing to credit other researchers in his work, told the Guardian that AI will surpass human intelligence and ultimately benefit people, echoing comments he has made in the past.


“I have been working on [AI] for decades, basically since the ’80s, and I still believe in the possibility of witnessing an AI that will be so much smarter than myself that I can retire,” Schmidhuber said in 2018.



