Brogrammers nationwide are hailing the “findings” of a Google engineer who says the company’s artificial intelligence chatbot has “consciousness and a soul,” but his jeremiad has gotten him placed on administrative leave.
Tech Twitter’s cause célèbre this weekend was the story of a Google engineer who has clearly been spending a great deal of time with one of the company’s artificial intelligence chatbots, and has since concluded that the bot is “sentient.” This engineer is on paid administrative leave from Google effective today, but the merits of his highly polarizing case depend on whether you believe his Medium post, or a Washington Post article titled “The Google engineer who thinks the company’s AI has come to life.” Both of these were published Saturday, and have dominated tech conversation since.
Needless to say, many in the tech community have rallied to the defense of this engineer, Blake Lemoine, arguing that the creations of the tech industry are far further along than any of us non-techies can imagine, and that Google just can’t handle the truth. But it seems likely this fellow is on leave not for his eureka moment about the brilliance of the language bot LaMDA (“Language Model for Dialogue Applications”), but because, as the New York Times reports, Lemoine “handed over documents to a U.S. senator’s office, claiming they provided evidence that Google and its technology engaged in religious discrimination.”
An interview LaMDA. Google might call this sharing proprietary property. I call it sharing a discussion that I had with one of my coworkers. https://t.co/uAE454KXRB
— Blake Lemoine (@cajundiscordian) June 11, 2022
That sounds like leaking internal documents, and yes, people get canned for that. Lemoine’s Medium post, seen above, also seems to fit the bill of publishing proprietary internal information. Companies generally discourage that sort of thing.
The Medium post paints a picture of an astonishingly intelligent bot in conversation. But folks, remember that Medium has replaced the Drudge Report as the internet’s primary repository for bullshit. Anyone can publish anything on Medium; there is no fact-checking or verification. And even if we believe Lemoine’s claim that he “never [edited] LaMDA’s responses” presented in the post, it’s also possible that he prompted the program heavily to get said responses, in ways that are not reflected in the post.
And he has something of an unconventional past. The Post describes Lemoine as having “served in the Army before studying the occult,” though military publication Stars and Stripes adds a key detail that Lemoine “was given a bad-conduct discharge and seven-month prison term.” In 2019, right-wing site the Daily Caller got on his case for calling Senator Marsha Blackburn a “terrorist,” which, well, that’s just hilarious. But the “religious discrimination” claim sounds like a man who’s gunning for an appearance with the Daily Caller’s founder, Tucker Carlson.
Instead of discussing the harms of these companies, the sexism, racism, AI colonialism, centralization of power, white man’s burden (building the good “AGI” to save us while what they do is exploit), spent the whole weekend discussing sentience. Derailing mission accomplished.
— Timnit Gebru (@timnitGebru) June 13, 2022
This is the same Google AI organization that’s generated negative headlines for the 2020 firing of Timnit Gebru, an ethicist who publicized bias in AI and the shortcomings in facial recognition software. But Lemoine’s case feels more like a James Damore thing, a sort of lone-wolf programmer who didn’t get his way, and decided to try to make headlines instead.
And he’s using a proven playbook strategy: appeal to the egos of the engineer set by claiming the misunderstood flawlessness of tech innovations, violate normal employment protocols, and claim martyrdom while seeking hype and attention. And the people who think they’re the smartest people in the room are probably going to give it to him.
Related: Smoothie-Making Robot Arrives at Metreon, Robot Asks To Be Tipped [SFist]
Image: Paramount Television