OpenAI hires crisis managers to tackle election interference
OpenAI is recruiting people to manage the risk of artificial intelligence around elections as more than half the world’s population heads to the polls this year.
The ChatGPT creator is advertising two new roles, for an election programme manager and a rapid response lead. They will “identify election-related risks and design, coordinate and implement mitigation plans”, according to the job ads, which offer a salary range of $190,000 to $280,000.
Business leaders and politicians gathered in Davos have homed in on the complications that increasingly sophisticated artificial intelligence, through misinformation and bias, poses for the democratic process.
The World Economic Forum kicked off with a survey warning that election disruption from AI posed “the biggest global risk in 2024”.
The job ads came as Sam Altman, OpenAI’s chief executive, said at the conference: “I believe that America is gonna be fine, no matter what happens in this election. I believe that AI is going to be fine, no matter what happens in this election, and we will have to work very hard to make it so.”
The ad for the rapid response lead said: “We are seeking a leader who can independently develop our playbook and drive internal coordination around high-stakes incidents, including critical moments around elections and civil unrest.”
OpenAI is trying to get on the front foot as technology companies have been increasingly drawn into the debate about fairness in elections.
Meta, for example, was heavily criticised for its delay in removing Donald Trump’s profile from its platform after the last US election, when it was used to co-ordinate action among his supporters that culminated in the violent storming of the Capitol building in Washington in 2021.
In a recent blog post, OpenAI said: “As we prepare for elections in 2024 across the world’s largest democracies, our approach is to continue our platform safety work by elevating accurate voting information, enforcing measured policies, and improving transparency.”