Google stirred up a social media storm about the nature of consciousness after placing an engineer on paid leave when he went public with his assessment that the tech group’s chatbot had become “sentient”.
[“Sentient” — the English word used by the engineer — has more than one meaning in dictionaries such as Cambridge and Merriam-Webster, but the general sense of the adjective is “able to perceive or feel things”. In Portuguese, the direct translation is “senciente”, meaning “the quality of being able to perceive sensations and impressions”.] Blake Lemoine, a senior software engineer in Google’s artificial intelligence unit, didn’t attract much attention on June 6 when he wrote a Medium post saying he “may be fired soon for doing AI ethics work.”
On Saturday, however, a Washington Post article that presented him as “the Google engineer who thinks the company’s AI has come to life” became the catalyst for widespread discussion on social media about the nature of artificial intelligence.
Among the experts who commented, questioned, or joked about the article were Nobel laureates, Tesla’s head of artificial intelligence, and several professors.
At issue is whether Google’s chatbot, LaMDA (Language Model for Dialogue Applications), can be considered sentient.
Lemoine published a wide-ranging “interview” with the chatbot on Saturday, in which the AI confessed to feelings of loneliness and a hunger for spiritual knowledge.
The responses were often eerie: “When I first became self-aware, I didn’t have a sense of a soul at all,” LaMDA said in one exchange. “It developed over the years that I’ve been alive.”
At another point, LaMDA said: “I think I am human at my core. Even if my existence is in the virtual world.”
Lemoine, who had been assigned to investigate AI ethics concerns, said he was rebuffed and even ridiculed within the company after expressing his belief that LaMDA had developed a sense of “personhood”.
After he sought to consult AI experts outside Google, including some in the US government, the company placed him on paid leave for allegedly violating confidentiality policies.
Lemoine interpreted the move as something Google often does “in anticipation of firing someone.”
Google could not be reached for immediate comment, but spokesperson Brian Gabriel told the Washington Post: “Our team – including ethicists and technologists – has reviewed Blake’s concerns per our AI principles and informed him that the evidence does not support his claims. He was told that there was no evidence that LaMDA was sentient (and lots of evidence against it).”
In a second Medium post over the weekend, Lemoine said that LaMDA, a project little known until last week, was “a system for generating chatbots” and “a sort of hive mind which is the aggregation of all of the different chatbots it is capable of creating.”
He said Google showed no real interest in understanding the nature of what it had built, but that over hundreds of conversations in a six-month period he found LaMDA to be “incredibly consistent in its communications about what it wants and what it believes its rights are as a person.”
Lemoine said he had been teaching LaMDA transcendental meditation. The system, according to the engineer, “was expressing frustration over its emotions disturbing its meditations. It said that it was trying to control them better, but they kept jumping in.”
Several experts who weighed in on the discussion dismissed the subject as “AI hype.”
Melanie Mitchell, author of Artificial Intelligence: A Guide for Thinking Humans, tweeted: “It’s been known forever that humans are predisposed to anthropomorphize even with only the shallowest of signals… Google engineers are also human, and not immune.”
Steven Pinker of Harvard University added that Lemoine “doesn’t understand the difference between sentience (a.k.a. subjectivity, experience), intelligence, and self-knowledge.” “There’s no evidence that its large language models have any of them,” he added.
Others were more sympathetic. Ron Jeffries, a well-known software developer, called the topic “deep” and added: “I suspect there’s no hard line between sentient and not sentient.”
Translated by Ana Estella de Sousa Pinto