Please tell me, if you were reading something intended as a romance and it started to talk about fiery embraces of the lips, you’d laugh. Please tell me you’d glaze over if someone told you that the lions at the zoo look big and powerful. Please tell me you can’t look at artificial writing and mistake it for real. At the very least, try reading it aloud.

As a matter of personal research, I searched online for a free AI content generator that would put out 300 words or so in response to whatever parameters I might feed in. “I want a poem about the moon,” I typed in, “with use of rhyme and metaphor.”

The invisible robot delivered twenty lines of—no, I cannot call it poetry. Verse. Flat, technically correct but laughably inconsequential lines of something that tried not to be doggerel but horribly failed. In a matter of seconds, the AI entity had scoured its source material for all the commonest, most obvious moon imagery, and juggled “silver light”, “glowing orb”, “beacon” and “wisdom” with rhymes like balm/calm, unknown/shown, gift/lift and so on.

Maybe an undiscriminating reader might not flinch too hard at phrases like “gentle guide” and “dear lunar friend” (ew, and ew). Many a would-be poet has committed offences just like these and worse: wrenched rhyme, trite message, clichéd imagery, bland and meaningless positivity—and many a friend or family member has praised their shallowest conceits and woodenest attempts at versification. I’m sure an AI could compete with a human writer who knows little of their craft and has nothing original to say.

But does this mean that an AI content-generator can compete with humans who know how to write? I tried again. I asked it to write me an opinion piece against the use of AI content-writers. It delivered a jumble of all the standard concerns about AI writing, mixed in together to the point of self-contradiction. It threw around words like “ethics” and “belief”, but failed to express any useful line of reasoning.
It read like a collage of random sentences sorted into a five-paragraph template that arrived at a generalised “conclusion” devoid of nuance or proviso. Is this what’s supposed to make human thought-pieces obsolete?

I won’t claim that AI has nothing to add to literary endeavour. Spellcheckers and grammar apps can help to eliminate most (not all!) sentence-level errors. And, conceivably, an automated program that throws together random words or phrases could now and then come up with an association of ideas that might be worth developing… by a human poet who knows on a very nuanced level what works and what doesn’t. What I DON’T believe is that a roomful of AIs will ever come close to the works of Shakespeare, nor even invent a single new phrase that falls trippingly off the tongue.

“I mean, it’s not BAD WRITING,” people comment when they see an AI’s attempts. Well, let me beg to differ. Every sample I’ve seen so far is very, very bad writing. It may be grammatically correct and structurally unsurprising. It may list a series of concepts it’s picked up elsewhere. It may successfully plod through a sequence of events, carrying the plot of a story like a conveyor belt in a factory. But the protagonist of an AI-produced novel will never “take over” and bend the plot by refusing to act out of character. The story will be inorganic, forgettable. Metaphorically speaking, God has not breathed into its nostrils the breath of life, and it has not become a living soul. And I’m betting—betting my credibility as a reader and a writer—that it never will.

Of course AI writing will improve, improve and improve. Chewing its way through bigger and bigger files of existing material in each genre, it will aim for an ever-more-unexceptional average of whatever’s already been written. It will echo more and more eerily the tropiest human-written tripe, but it will never pass what I choose to call the Gemma Test.

Let me tell you about Gemma.
She is a nice little girl who always does what is expected. She does her best in school; she rather likes doing writing, because her writing is the neatest in the class. If you take Gemma to the zoo and then ask her to write about it, she’ll write that the lions are big. If the word “powerful” happens to appear on the classroom display (“Uplevelling Our Writing”), she’ll stick it in too. She will never be the one to venture that the lions sleep close in together like all the different moods inside your head. She will do her best to use those things called Metaphors when the teacher adds them to the “Uplevelling” wall chart, but she’ll never explore them, divert them, subvert them. Her writing stays firmly within the box.

She will obediently state that the Moon is a silver coin. She will never suppose that the Moon is the weeping ghost of an Earth that could have been. She won’t physically bully the classmate who said that, but she’ll Stop Being Friends with her, and the rest of the Borg will finish her off.

So here’s the Gemma Test: Does the passage sound like Gemma wrote it? Would Gemma understand it? Would she stay friends with you after she’d read it?

Gemma, by the way, is completely fictional. Any resemblance, etc. etc. She’s merely the figment of a thought experiment in recognising whether a piece of writing has personality, attitude, originality. Humans’ writing may be derivative, but AIs cannot not be derivative.

Why then are so many authors afraid that AI material could replace their work? If we write like Gemma, then maybe we are replaceable. But every lion in my head is roaring that I am not. That there is and always will be writing that an AI could not have written. That AIs cannot create the same things we create. That an AI cannot do me.
We don’t need a Butlerian Jihad (thanks all the same, Frank Herbert, inimitable as you are and always will be). There is no machine in the true likeness of the human mind. There is no robot Shakespeare.

You’re only replaceable if your writing consistently fails the Gemma Test: if your stories are all tell-not-show; if you never use any literary device deeper than a secondhand metaphor; if you cannot sense the best way to use alliteration or literary allusion; if your characters and their dialogue never quite come alive. If you stay inside the box.

But if you will learn your craft and write as only a human can, you cannot be replaced by a robot.
3 Comments
Corinne Murphy
27/12/2023 02:37:40 am
I avoided it for some time, but recently had a sort of conversation with ChatGPT. There is definitely a use for it for non-writers, I think. But, for anyone else, it’s a bit like peeking over a young child’s shoulder at his homework assignment. When I queried my bot about being AI and opinions… well, (she? he?) IT got a teensy bit defensive… Erk! Is that even possible?
Fiona M Jones
27/12/2023 03:06:05 am
Yes, I can definitely see that there's going to be "more efficient... content creation". But since that content is going to be nothing more than a mashup of whatever's already been fed into it, there is no potential for anything truly original--and what AI does produce makes for painful reading as well. Here's Margaret Atwood's take on it: https://thewalrus.ca/margaret-atwood-ai/?utm_source=pocket-newtab-en-gb --and, like me, she reckons AI could replace hacky writers who have nothing to add, but it cannot replace true creatives with original ideas.
Author: Fiona M Jones is a creative writer living in Scotland. Her short fiction, CNF, poetry and educational content are published all over the world, and one of her stories gained a star rating in Tangent Online's "Recommended Reading" list for 2020. You can follow Fiona's work through @FiiJ20 on Facebook and Twitter.