General Discussion
You don't have to use AI (Becca Rothfeld, staffer at the New Yorker, former nonfiction book critic at WaPo)
https://afeteworsethandeath.substack.com/p/you-dont-have-to-use-ai-snip-
For the past few years, I've tried not to think about AI too much, and shockingly, I've succeeded. It's been more than half a decade since I've taught at a university (although I hope to do so again someday), so I haven't had to scheme to keep my students from outsourcing the exquisite work of thinking to a glorified search engine – and AI is worse than useless for a critic, so I haven't been the least bit tempted to use it in my line of work. I'm not really one for optimizing away the various inconveniences in my daily life, preferring to luxuriate in inefficiencies, so I haven't turned to it for scheduling or the like.
Of course, it's long seemed obvious to me (and, I presume, basically everyone I care about or respect) that the use of AI in most humanistic endeavors is beyond the pale. Even if it were good at writing prose or doing philosophy – and thus far it isn't – to use AI to write or philosophize would be to render those activities futile. In some of the sciences, some of the time, the point is the outcome: the vaccine, the medicine, the finding, the technology. In the humanities, the point is the process. The point of writing is to make something beautiful or interesting; the point of reading a book or a philosophy paper is, at least in large part, to make contact with another human mind that has strained to make something beautiful or interesting, whether or not the human mind has succeeded or failed. There would be no reason to read a novel that an AI had written, even if AI got to a point where it could write well, as perhaps it will if it keeps cannibalizing the greatest fruits of human endeavor with impunity. Writing a book with AI would be like driving a car to a marathon finish line, then claiming the title. Even considering using it to write evinces a complete misunderstanding of the enterprise. I can't stress it enough: if these claims are not obvious to you, "you are not in my world and not of my flesh," to paraphrase a remark of Stanley Cavell's in "Aesthetic Problems of Modern Philosophy." I hope you are ashamed of yourself and your dwindling humanity, and I think it should be legally required for people to say when/if they've used AI in the process of writing something so that I can avoid it at all costs.
-snipping to get to her comments on what was said by an Anthropic executive being interviewed by Ezra Klein-
At some point in the interview, Klein asks him about the pitfalls of engaging with relentlessly affirmative and ingratiating chatbots – Klein, by the way, does a very good job of holding Clark to account throughout – and Clark replies that AI needn't be used for self-affirmation. In fact, he explains, he uses it to help him occupy other people's perspectives: "I've used these A.I. systems to basically say: Hey, I'm in conflict with someone at Anthropic. I'm really annoyed. Could you ask me some questions about that person and how they're feeling to try to help me better think about the world from their perspective?"
He didn't even pause before confessing that he'd used AI in this way and didn't appear to think it was a cause for shame or contrition. aaaaggghhhHHHHHHHHHH!!!!!!!!!!!!!!?? I don't know what's worse, using AI in this fashion, or thinking that using AI in this fashion is such a normal, acceptable, and forgivable thing to do that you should admit to having done so on a podcast in a national newspaper. Imagine having a fight with a colleague, friend, lover, etc., and learning that the colleague, friend, or lover with whom you have a human relationship, whom you trust to approach you in human terms, is outsourcing the ethical labor of relating to you charitably to a machine???? I mean, I would simply leave my husband on the spot if he ever did this. No questions, no divorce filings. I would open the door, grab my dog, put on my running shoes, and sprint as far as possible in the other direction. What a despicable violation of the basic ethical contract linking one person to another!
-snip-
Found this after a friend on another platform, an expert on IP theft and copyright infringement, recommended it.
Wikipedia article on Rothfeld: https://en.wikipedia.org/wiki/Becca_Rothfeld
Her essay in the New Yorker on the closing of the Washington Post's books section:
https://www.newyorker.com/books/page-turner/the-death-of-book-world
1 reply
Original post by highplainsdem, Saturday
SheltieLover (79,666 posts)
1. Excellent article!