Some things I object to on principle. Things that are just . . . wrong. Things that shouldn’t be. Things that cheapen or degrade ourselves or others.
This week’s example hit me—appropriately—as I was scrolling my social media feed, a place where devaluation and degradation are becoming the norm rather than the exception.
Normally, I can shrug off such assaults. Performative outrage, shameless puffery, sycophantic fawning, lickspittle tirades, blatant misinformation, misguided memes: these take up a growing fraction of the non-advert portion of my feed, but it was the advert portion that got my dander up.
To the extent that I am capable, I have turned off ad-tracking. This doesn’t stop me from seeing adverts, but at least it eliminates (okay, reduces) the creepy, Big Brother-esque, “we’re watching you” feeling I get when I ask my wife where the hammer is and then see an advert for ball-peens on Facebook. Sometimes, though, just sometimes, I’m presented with an advert that is somewhat relevant.
This time it was an advert for Jasper. “Artificial intelligence makes it fast & easy to create content!” I was informed. This software has “read 10% of the internet” and would help me create blog posts, social media interaction, and marketing copy up to “10x faster.” It would even help me write a novel that is “original and plagiarism free [sic].”
—[shudder]—
Usually I do not engage with adverts (except by mistake, via clumsily executed clicks) as this only provides fodder for the tracking I try to avoid. In this case, though, I was overcome by a looky-loo train-wreck revulsion/attraction impulse to investigate some of the literally thousands of comments appended to the post.
Let me pause for a moment, as my state of mind, whilst preparing for a dive down this rabbit hole, is pertinent.
I appreciate a well-crafted phrase or sentence, revel in a paragraph that takes me on a little journey, and marvel at novels filled with allusions, metaphors, contextual layers, and well-orchestrated plotlines. Conversely, whenever I read poorly written prose—be it long form or in a short news article—prose that cries out for an editor (“An editor! An editor! My kingdom for an editor!”), I die a little inside.
Yet there I was, faced with the prospect of novels published not only without the benefit of an editor, but without the benefit of a writer.
With the burgeoning of algorithms and “artificial intelligence” (quoted here, because it’s not a true intelligence, artificial or otherwise), there are dozens of products and services like Jasper, all of which tout the same credibility-stretching boasts. Write a novel! In the style of your favorite author! In a language you do not know! In mere hours!
To be honest, this kind of algorithmic assist could only improve some of the novels I’ve read, but in general, it’s the end-state of our own dumbing-down. If this becomes the norm, quality no longer carries currency. All we need now is another service that will read this dreck for us (because I sure don’t want to suffer that way).
Eventually, I perused a few hundred of the comments that people (I’m assuming they were actual people) made on the Jasper advert, and I was shocked. All of the comments I read—and I mean all of them—were derisive, often with replies to comments that piled the ridicule higher and higher.
So, there is hope for us. We may not have the collective gumption to oust autocrats, defend democracy, treat women as fully actualized humans, or deal with a planet that’s on fucking fire, but at least I know that a large portion of us think that reading a book written by an algorithm is a stupid, laughable idea.
I’m surprised so many people’s comments were actually derisive. There is indeed hope. Then again, if someone thought it a good idea to have AI write a book for him/her, he/she wouldn’t be wise to be too open about it. At least this points to some inherent shame in masquerading as creative when, in fact, the only effort is in deciding on a genre and pressing a button. I wonder how many AI articles and books we’re already reading, though, without being aware…
Yes. There’s an inauthenticity baked into the cake, here, that I think people were reacting to. That and an inherent “I can’t be bothered” type of laziness. However, I’m sure we’re reading more AI output than we know.