Google Suspends Engineer Who Claimed AI Bot Had Become Sentient
Blake Lemoine said he concluded the Google AI he interacted with, dubbed LaMDA, was a person "in his capacity as a priest, not a scientist."
June 13, 2022
(Bloomberg) -- Blake Lemoine, a software engineer on Google’s artificial intelligence development team, has gone public with claims of encountering “sentient” AI on the company’s servers after he was suspended for sharing confidential information about the project with third parties.
The Alphabet Inc. unit placed the researcher on paid leave early last week over claims he breached the firm’s confidentiality policy, he said in a Medium post titled “May be fired soon for doing AI ethics work.” In the post, he drew a connection to prior members of Google’s AI ethics group, such as Margaret Mitchell, who were eventually dismissed by the company in a similar fashion after raising concerns.
The Washington Post on Saturday ran an interview with Lemoine, in which he said he concluded the Google AI he interacted with was a person, “in his capacity as a priest, not a scientist.” The AI in question, dubbed LaMDA, or Language Model for Dialogue Applications, is used to generate chatbots that interact with human users by adopting various personality tropes. Lemoine said he tried to conduct experiments to prove his conclusion, but was rebuffed by senior executives when he raised the matter internally.
“Some in the broader AI community are considering the long-term possibility of sentient or general AI, but it doesn’t make sense to do so by anthropomorphizing today’s conversational models, which are not sentient,” Google spokesperson Brian Gabriel said in response. “Our team, including ethicists and technologists, has reviewed Blake’s concerns per our AI Principles and have informed him that the evidence does not support his claims.”
Asked about Lemoine’s suspension, the company said it does not comment on personnel matters.