Exclusive: Google promises changes to research oversight after internal revolt

(Reuters) – Alphabet Inc’s Google will change its procedures before July for reviewing its scientists’ work, according to a town hall recording heard by Reuters, part of an effort to quell internal turmoil over the integrity of its artificial intelligence (AI) research.

FILE PHOTO: Google’s name is displayed outside the company’s London office in Britain, November 1, 2018. REUTERS/Toby Melville

In remarks at a staff meeting last Friday, Google Research executives said they were working to regain trust after the company fired two prominent women and rejected their work, according to an hour-long recording, the contents of which were confirmed by two sources.

Teams are already trialing a questionnaire that will assess projects for risk and help scientists navigate reviews, Maggie Johnson, the research unit’s director of operations, said at the meeting. This initial change will roll out by the end of the second quarter, and most papers will not require extra vetting, she said.

Reuters reported in December that Google had introduced a “sensitive topics” review for studies involving dozens of issues, such as China or bias in its services. Internal reviewers had demanded that at least three AI papers be modified to avoid casting Google’s technology in a negative light, Reuters reported.

Jeff Dean, the Google senior vice president who oversees the division, said on Friday that the “sensitive topics” review “is and was confusing” and that he had tasked a senior research director, Zoubin Ghahramani, with clarifying the rules, according to the recording.

Ghahramani, a University of Cambridge professor who joined Google in September from Uber Technologies Inc, said during the town hall, “We need to be comfortable with the discomfort” of self-critical research.

Google declined to comment on Friday’s meeting.

An internal email seen by Reuters offered new details of Google researchers’ concerns, showing exactly how Google’s legal department had modified one of the three AI papers, titled “Extracting Training Data from Large Language Models.” (bit.ly/3dL0oQj)

The email, dated February 8, from a co-author of the paper, Nicholas Carlini, went to hundreds of colleagues, seeking to draw their attention to what he called “deeply insidious” edits by company lawyers.

“Let’s be clear here,” the roughly 1,200-word email said. “When we as academics write that we have a ‘concern’ or find something ‘worrying’ and a Google lawyer requires us to change it to sound nicer, that is very much Big Brother stepping in.”

The required edits, according to his email, included “negative-to-neutral” swaps, such as changing the word “concerns” to “considerations” and “hazards” to “risks.” Lawyers also demanded the deletion of references to Google’s technology; of the authors’ finding that AI had leaked copyrighted content; and of the words “breach” and “confidential,” the email said.

Carlini did not respond to requests for comment. Google, in response to questions about the email, disputed its contention that lawyers were trying to control the paper’s tone. The company said it had no issues with the topics the paper investigated, but found some legal terms used imprecisely and conducted a thorough edit as a result.

RACIAL EQUITY AUDIT

Google last week also named Marian Croak, a pioneer in internet audio technology and one of Google’s few Black vice presidents, to consolidate and manage 10 teams that study issues such as racial bias in algorithms and technology for people with disabilities.

Croak said at Friday’s meeting that it would take time to address the concerns among AI ethics researchers and mitigate the damage to Google’s brand.

“Please hold me fully accountable for trying to turn this situation around,” she said on the recording.

Johnson added that the AI organization is bringing in a consulting firm to conduct a wide-ranging racial-equity impact assessment. The department’s first-of-its-kind audit would lead to recommendations “that will be very difficult,” she said.

Tensions in Dean’s division had deepened in December after Google dismissed Timnit Gebru, co-leader of its ethical AI research team, following her refusal to retract a paper on language-generating AI. Gebru, who is Black, accused the company at the time of reviewing her work differently because of her identity and of marginalizing employees from underrepresented backgrounds. Nearly 2,700 employees signed an open letter in support of Gebru. (bit.ly/3us5kj3)

During the town hall, Dean elaborated on what scholarship the company would support.

“We want responsible AI and ethical AI research,” Dean said, giving the example of studying the technology’s environmental costs. But it is problematic to cite data that is off “by almost a factor of a hundred” while ignoring more accurate statistics, as well as Google’s efforts to cut emissions, he said. Dean had previously criticized Gebru’s paper for not including important findings on environmental impact.

Gebru defended her paper’s citation. “It’s a terrible look for Google to come out so defensively against a paper that was cited by so many of its peer institutions,” she told Reuters.

Employees continued to post about their frustrations on Twitter over the past month, as Google investigated and then fired ethical AI co-leader Margaret Mitchell for moving electronic files outside the company. Mitchell said on Twitter that she had acted “to raise concerns about race and gender inequity, and to speak up about Google’s problematic dismissal of Dr. Gebru.”

Mitchell had contributed to the paper that led to Gebru’s departure, and a version published online last month without Google affiliation listed “Shmargaret Shmitchell” as a co-author. (bit.ly/3kmXwKW)

Asked for comment, Mitchell said through a lawyer that she was disappointed by Dean’s criticism of the paper and that her name had been removed at the company’s direction.

Reporting by Paresh Dave and Jeffrey Dastin; Editing by Jonathan Weber and Lisa Shumaker
