"A certain danger lurks there": how the inventor of the first chatbot turned against AI
"Some subjects have been very hard to convince that Eliza (with its present script) is not human," Weizenbaum wrote. In a follow-up article that appeared the next year, he was more specific: one day, he said, his secretary requested some time with Eliza. After a few moments, she asked Weizenbaum to leave the room. "I believe this anecdote testifies to the success with which the program maintains the illusion of understanding," he noted....
For Weizenbaum, judgment involves choices that are guided by values. These values are acquired through the course of our life experience and are necessarily qualitative: they cannot be captured in code. Calculation, by contrast, is quantitative. It uses a technical calculus to arrive at a decision. Computers are only capable of calculation, not judgment. This is because they are not human, which is to say, they do not have a human history: they were not born to mothers, they did not have a childhood, they do not inhabit human bodies or possess a human psyche with a human unconscious, and so do not have the basis from which to form values...
The later Weizenbaum was increasingly pessimistic about the future, much more so than he had been in the 1970s. Climate change terrified him. Still, he held out hope for the possibility of radical change. As he put it in a January 2008 article for Süddeutsche Zeitung: "The belief that science and technology will save the Earth from the effects of climate breakdown is misleading. Nothing will save our children and grandchildren from an Earthly hell. Unless: we organise resistance against the greed of global capitalism..."
https://www.theguardian.com/technology/2023/jul/25/joseph-weizenbaum-inventor-eliza-chatbot-turned-against-artificial-intelligence-ai
There is SO MUCH between the ellipses. Out of curiosity, wondering what AI would make of this, I used TLDR on BingBot and got this not-too-bad, though very superficial, result:
Unlike GPT, Bing gives sources. All of the superscripted refs are to the Guardian Article. Other sources, unreferenced in the synopsis, are also listed.
Long ago I wrote a GWBASIC version of Eliza. Even though I knew precisely how the responses were generated, as long as I took care not to break the input, the resulting output gave quite an eerie feeling of an actual conversation.
For those with interest, GitHub has the source code of Weizenbaum's program, implemented in Java OOP by Charles Hayden. Fascinating to explore. As for myself, I need to "step away from the computer" and get going in the Real World this morning!!
https://github.com/codeanticode/eliza
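The pattern-matching trick behind that eerie effect can be sketched in a few lines. This is a hypothetical illustration in Python, not Weizenbaum's MAD-SLIP script, Hayden's Java port, or my old GWBASIC code; the keywords, templates, and reflection table are my own inventions:

```python
import re

# Eliza-style response generation, in miniature: spot a keyword pattern,
# reflect pronouns in the captured fragment, drop it into a canned template.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

FALLBACK = "Please go on."  # the generic nudge when nothing matches

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the echo sounds natural."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(line: str) -> str:
    """Return the first matching rule's template, filled with the
    reflected capture; otherwise the generic fallback."""
    for pattern, template in RULES:
        m = pattern.search(line)
        if m:
            return template.format(reflect(m.group(1).rstrip(".!?")))
    return FALLBACK
```

For example, `respond("I am sad about my job")` yields "How long have you been sad about your job?" There is no understanding anywhere in that loop, which is exactly Weizenbaum's point; but type carefully, keep the input inside the patterns, and the illusion holds.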
usonian
(13,540 posts)I harken back to the famous Hammerbacher quote:
In 2011, Jeff Hammerbacher, a former Facebook data team leader, riffing on Ginsberg, bemoaned: "The best minds of my generation are thinking about how to make people click ads. That sucks." Of all the things to optimize, a generation had chosen manipulating attention.
In the 1980s, AI was a toy. Somehow it returned with a vengeance in recent times as machine learning, and this time it had enormous data sets to work on.
The uses and abuses are hot topic items today.
As for climate change: "Two Words"
Reposting this:
Even if someone convinced the cult that antifa, George Soros, Barbie and Satan himself were stoking the fires, nothing changes until the money that binds politicians and boys in black nightgowns is excised from politics and the courts.
Big oil owns damn near everything and everyone who could possibly contain Big oil. Until government is eliminated, which seems to be one of their strategies, they own it.
America's moral compass largely points to money. Or is that MoreOil compass.
Putin and MBS, who are paupers without oil, bought an entire government.
Citizens United against civilization.
It's a choice: greed or sharing with others and caring for others.
Most choose poorly.