
OpenAI chief seeks to calm fears on job losses

2023-05-27 01:48

The boss of OpenAI, the firm behind the massively popular ChatGPT bot, said on Friday that his firm's technology would not destroy the job market as he sought to calm fears about the march of artificial intelligence (AI).

Sam Altman, on a global tour to charm national leaders and powerbrokers, said in Paris that AI would not -- as some have warned -- wipe out whole sectors of the workforce through automation.

"This idea that AI is going to progress to a point where humans don't have any work to do or don't have any purpose has never resonated with me," he said.

Asked about the media industry, where several outlets already use AI to generate stories, Altman said ChatGPT should instead be seen as giving a journalist 100 assistants to help them research and come up with ideas.

ChatGPT burst into the spotlight late last year, demonstrating an ability to generate essays, poems and conversations from the briefest of prompts.

Microsoft later laid out billions of dollars to support OpenAI and now uses the firm's technology in several of its products -- sparking a race with Google, which has made a slew of similar announcements. 

Altman, a 38-year-old emerging star of Silicon Valley, has received rapturous welcomes from leaders everywhere from Lagos to London. 

Earlier this week, though, he appeared to annoy the European Union by hinting that his firm could leave the bloc if it regulates too severely.

He insisted to a group of journalists on the sidelines of the Paris event that the headlines were not fair and he had no intention of leaving the bloc -- rather, OpenAI was likely to open an office in Europe in the future.

- 'Exhausting' -

The success of ChatGPT -- which has been used by politicians to write speeches and proved itself capable of passing tough exams -- has thrust Altman into a global spotlight.

"Years from now, reflecting on this will feel very special... but it is also quite exhausting and I hope life calms down," he said.

OpenAI was formed in 2015 with investors including Altman and billionaire Twitter owner Elon Musk, who left the firm in 2018 and has repeatedly bashed it in recent months.

Musk, who has his own AI ambitions, has said he came up with the name OpenAI, invested $100 million in it, felt betrayed when the company shifted from a non-profit to a for-profit model, and believes Microsoft now effectively runs the company.

"I disagree with almost all of that, but I will try to avoid a food fight here," said Altman. "There's got to be more important things than whatever he's going on about." 

Instead, he wanted to focus on the mission of OpenAI, which he said was to "maximise the benefits" to society of AI and particularly Artificial General Intelligence (AGI) -- the much-vaunted future where machines will master all sorts of tasks, not just one.

He conceded that definitions of AGI were "fuzzy" and there was no agreement, but said his definition was when machines could make major scientific breakthroughs.

"For me, if you can go figure out the fundamental theory of physics and answer it all, I'll call you AGI," he said.

A major criticism of his products is that the firm does not publish the sources it uses to train its models.

Beyond copyright issues, critics argue that users should know who is responsible for answering their questions, and whether those replies drew on material from offensive or racist webpages.

But Altman argued the bottom line was that critics wanted to know whether the models themselves were racist.

"How it does on a racial bias test is what matters there," he said, deflecting the idea that he should publish the sources. 

He said the latest model, GPT-4, was "surprisingly non-biased".
