I've been thinking about three articles I've read recently. The New York Times has run two interesting pieces about the writing of fiction using AI (large language models). The first was about Coral Hart, a romance writer who, under pseudonyms, cranked out some 200 romance novels last year using AI. Today's article was a thinkpiece about the dangers of AI-created (or assisted) fiction and the harm it does to the author-reader bond. The New Yorker ran a piece last month which wondered what will happen if people actually like AI-written fiction--the large language models will only get better at what they do. Related to this, I belong to a Facebook group which is often used by authors to promote their work, and the group admins have banned any promotion for books with AI text or cover art.
First, let me say that I, like many people, am left a bit queasy by the use of AI to create any kind of art. I hate "AI slop," especially when it is presented as nonfiction or as neutral information. I've recently read many articles and short pieces about movies, books, and the theatre that were clearly written by AI. There are stylistic giveaways, but what bothers me more is that many of them include false or misleading information. (In the most egregious piece I've seen, a post featuring a photo of Carol Channing and Lucille Ball meeting backstage referred to a play that Ball was in that never actually existed.) Because I am an arts and entertainment buff, I can identify the errors quickly, but readers less versed in the topic probably can't, in the same way that I would not notice erroneous information in an article about football or opera. In addition, I have never used generative AI to create anything and don't see myself doing so anytime soon.
I understand some of the concerns of creators. For example, if AI cover art comes to be used extensively, artists would lose work. The same goes for musicians in terms of AI-produced music, and for the people who write the short essays I refer to above. But I'm less certain about the harm AI would do in fiction. Having dipped into the occasional gay romance novel (M/M), a genre I'm discovering I don't really like, I've noticed that many readers are looking for very specific things in the books they read. They want only happy endings, or only sad endings, or a coming-out story that's not too traumatic, or two firemen falling in love. They want lots of sex, or, more often, not a lot of sex, just affection and cuddles. Vampires in an apocalyptic setting. To quote a recent request I saw, one reader wants stories in which "the protagonist falls in love and gets his heart ruthlessly broken [and] other adventurous elements." I've seen similar requests on Goodreads from readers of straight romances, fantasies, and mysteries.
In my 60 years of reading books, it never dawned on me to have such specific requests when it comes to reading matter. This seems to be related to the social media echo chambers we live in, where we can read only exactly what we want to hear. Obviously, I pick and choose books based on a number of criteria that involve genre, plotting, style, characters, and authors. I like mysteries and classic novels and WWII spy stories; I avoid fiction about sports or finance or teenagers. But I would never limit myself the way that many contemporary readers do. It feels like such a programmatic, mechanical way to read. So maybe AI fiction, which I assume is programmatic and a bit mechanical, is the perfect solution for such readers.
As far as I know, I have never read AI-written fiction, but the male-male romances I've read recently might as well have been. I understand that genres have conventions--for example, I prefer my mysteries solved at the end in a roomful of suspects. But the recent romances I've read are so predictable, and often, in trying to avoid triggering vulnerable readers, they are dull and vanilla, with characters who don't remain in my imagination an hour after I finish the book. My point here isn't really to bash the genre but to note that AI might do just as good a job as flesh-and-blood authors at providing entertainment and satisfaction for readers, specifically for genre readers. Someone will still make money. Coral Hart's AI books aren't bestsellers, but last year she sold some 50,000 books and made six figures in income. I can imagine an author seeing the request I quoted above for ruthless heart-breaking adventure, feeding keywords into an AI, getting a novel spat out that would probably satisfy that reader, and making a little money. While the author-reader bond is important to some readers, it isn't to everyone, as the requests I've read make clear. And I assume that authors who use AI would change and edit and add to the AI output.
The current case of the book "Shy Girl," which was cancelled by its American publisher when AI content was found, nicely highlights two points. First, the author claims that an editor added the AI content, which means she must not have actually read her own edited book. Duh. Second, the problem arose only when the book was picked up by a mainstream publisher. I think, right now at least, it's in the world of self- and indie publishing that AI is showing up most obviously. I would not support nonfiction written by AI, though AI assistance is happening as we speak. But I come around to my title above, and to the content of the New Yorker essay: if readers like the fiction they read that is mostly or wholly AI generated, what's the problem? I suspect there is a sense that we've been cheated when we read an AI novel, which hearkens back to the author-reader bond, in the same way we are displeased by accusations of plagiarism. But what if, someday, a large language model writes a masterpiece?
To close with a related anecdote, I recently bought a book, a self-published M/M romance called Deep Dish. The cover, clearly AI generated, featured a very hot man and, to be honest about being shallow, that's why I bought it. Also, the idea of reading the adventures of a gay pizza boy was appealing. The book was terrible, the most amateurish piece of writing I've ever read (and I used to teach college writing). The guy delivers pizza after pizza after pizza, and nothing happens except he gets tips, or doesn't get tips. A side romance develops that I didn't care an iota about. It was also poorly edited. AI could not have delivered a worse book. But that cover... Thank you, AI.