
Science On Tap checks out risks, rewards of AI

Rules needed now to prevent danger, says Schaeffer
CHATGPT DID NOT WRITE THIS — Up to 100 people will be at an Edmonton bar this June 28 for a Science On Tap talk on artificial intelligence. TELUS WORLD OF SCIENCE EDMONTON/Supplied

EDMONTON - Killer robots probably won’t blow up your house tomorrow, but only if we put laws in place today to stop them from being used that way, says a University of Alberta professor.

Up to 100 science fans from throughout the Edmonton region will be at Fargos Restaurant and Lounge this June 28 for a panel discussion on the facts and fantasies of artificial intelligence.

The talk is part of the Telus World of Science Edmonton’s Science On Tap series, which aims to bring science into less formal locations such as bars. This is the first time the event has been held outside of the science centre since the start of the COVID-19 pandemic, noted staff scientist Cate Collins.

AI has exploded in popularity, with programs such as ChatGPT and Midjourney able to generate text, images, and video that are virtually indistinguishable from those made by humans.

“I know programmers who are using it to write code for them,” Collins said of ChatGPT, adding that her co-workers have used it on grant applications. (The website for this month’s Science On Tap event features a poem written by ChatGPT.)

But researchers have also raised concerns about AI’s potential to spread lies and run autonomous weapons. Last May 30, some 350 leading AI experts signed a statement calling AI a societal-scale risk capable of wiping out humanity, one that should be treated with the same seriousness as a pandemic or nuclear war.

Collins said this month’s talk is meant to give people a chance to learn more about the potential impacts of AI from the experts. Featured speakers include University of Alberta philosopher Geoffrey Rockwell, Nidhi Hegde of the Alberta Machine Intelligence Institute, and U of A computing science professor Jonathan Schaeffer.

Need for regulation

Schaeffer, who is known for his work with AI and games such as poker, said AI could bring many benefits to humanity. It could make writers more productive, help doctors spot more tumours in cancer patients, and reduce traffic deaths by acting as a driver that will never drive drunk.

But like nuclear energy, AI could cause serious problems if misused, Schaeffer continued. AI programs don’t distinguish between truth and lies, and can crank out fictitious results (as an American lawyer found out this year after the cases ChatGPT cited for him turned out not to exist). They can also have built-in biases based on their training data — train a facial recognition program on mostly light-skinned people, and it ends up being bad at spotting dark-skinned ones.

AI has produced impressive results but is not yet at the point where it can think and act on its own, Schaeffer said.

“The AI is not going to turn around and say, ‘You know what? I feel like I should wipe out a city today.’ It can’t do that. It only does what you program it to do.”

Schaeffer said it’s better to think of today’s AI as “augmented intelligence” — something that assists, but does not replace, human thought. You can use a program like ChatGPT to write a rough draft, but you still need to thoroughly check its work for accuracy.

Schaeffer said it is important for governments to put laws in place today before AI programs reach the point where they can think and act independently. He pointed to Europe’s new data protection regulations as a good start.

“Unless you put some ground rules in place, people will do whatever they want,” he said.

“The consequences could be terrible.”

Visit telusworldofscienceedmonton.ca for details on the talk.




About the Author: Kevin Ma

Kevin Ma joined the St. Albert Gazette in 2006. He writes about Sturgeon County, education, the environment, agriculture, science and aboriginal affairs. He also contributes features, photographs and video.


