
AI Content Generation Is Overrated, And We Should Accept This

This is an opinion column. Opinions don't represent a Splaitor.com point of view. All opinions expressed here are the contributor's own.

AI is overrated, and that's a fact we have to accept. 2023 saw a real AI boom, but did it make sense? Not as much as it may seem.

I thought generative AI was incredible when I first got access to it. You could give it a task, and it would carry it out: for example, generating text for you. And that text seems worth reading, but only until you actually immerse yourself in what you're reading.

Any generative AI, all nuances aside, is just a blender loaded with millions of texts, with a model "trained" on those examples of how to write. Again, details aside, it works by guessing a sequence of words: the AI receives a query and selects words that build a coherent text, based on what appeared most often in its training data.
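The "guess the next word" idea can be illustrated with a deliberately tiny sketch. This is not how a real LLM works internally (those use neural networks trained on billions of documents); the corpus and the bigram-counting approach here are purely illustrative assumptions, but they show the same principle of picking the word that most often followed the previous one in the training text.

```python
from collections import defaultdict

# Toy corpus standing in for "millions of texts" (purely illustrative).
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word followed which in the training text.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(prev):
    """Guess the word that most often followed `prev` during training."""
    followers = counts[prev]
    return max(followers, key=followers.get) if followers else None

# Generate text by repeatedly guessing the most likely next word.
word, out = "the", ["the"]
for _ in range(4):
    word = next_word(word)
    if word is None:
        break
    out.append(word)

print(" ".join(out))  # stitches together a plausible-looking sequence
```

The output reads like a sentence because the model parrots sequences seen in training, which is exactly the "blender" problem: nothing new is ever produced, only recombinations of the input.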

And that makes generative AI very overrated. Another thing that struck me in 2023 was the Red Ventures story. The company seemed to have found a gold mine in using AI to generate articles.

And it seems to make sense to some "effective managers," who see it as an opportunity to cut costs on writers while producing more content. It's not just about Red Ventures; it's about the industry as a whole. The approach looks very reasonable at first glance: less cost, more benefit.

Is content king?

The problem is that generative AI is unsuitable for journalistic reporting, complex and unique articles, or opinion pieces. Suppose you try to write a complex article with AI. Either the content will be so inaccurate, mediocre, and general that revising it isn't worth the time AI saved you (every editor knows this well; sometimes it is easier to write a good text from scratch than to fix a poorly written one), or you won't bother refining it at all, in which case you'll get text roughly equal in quality to a content-farm copywriter's.

Even now, the most novice authors, with no understanding of what they write about, rarely take content from a single source and rewrite it, slightly rephrasing it and swapping words here and there. That is outright plagiarism, and almost no one but the most inept "authors" does it. Instead, a "copywriter" will open the first 4-5 links in Google, read what is written, and rewrite it in their own words. Important details often get lost along the way, making the article too simple, too general, and thus wholly uninformative.

I have encountered such authors many times in my career. Sometimes I parted ways with them; sometimes I edited their work to make it readable and more useful. I can say with certainty that it is much easier to part ways with such an author than to spend hours editing the source material.

It would seem there is one niche where AI-generated content can be king: SEO-optimized articles that farm clicks from search. Usually, a searcher has a clear query and wants to find something that fulfills it.

Here, unlike with more complex forms of content, the text doesn't have to be especially unique, because the information is often common knowledge. So many people think you can just run an AI and have it churn out text in a few minutes, which can then be monetized through clicks: a native ad that bombards users when they reach the site, plus additional earnings from clicks and affiliate links.

This looks like a win-win model, especially if you have a highly authoritative website. Google loves such sites and gives them good positions in search results, implying that authority is the key to quality. But the thing is, it's a road to nowhere, especially for the creators of such content.

In an audio recording obtained by Futurism, Red Ventures, the company behind brands like CNET, Bankrate, ZDNET, The Points Guy, and more, explicitly discusses how "the future is AI" and how it will be able to use generative AI in its business. And it all sounds terrifying. Not because I'm afraid that "machines will replace writers." Rather, I'm afraid Red Ventures and hundreds of other companies are rowing in the wrong direction. By that logic, we might as well hire freelance writers from low-wage countries to cut costs.

According to Glassdoor (which is always imprecise and should be treated with skepticism, since it's based on anonymous reports submitted by users), the average salary of a CNET writer is $74,000 a year. That impressive sum could be cut almost sevenfold if the company recruited writers from lower-wage countries like India. The main reason big brands don't do this is that it doesn't bring the desired results.

Content is king

The main problem is that good content is always much deeper than what AI can generate. It may seem to some that AI generation gives a decent result, especially if it is tweaked a bit after generation: adding pictures, headlines, some facts, quotes, and links.

But this is decidedly insufficient, and such content will almost always be superficial. Alternatively, you can feed ChatGPT or any other AI model a ready-made article written by someone else. Chances are it will rewrite the piece without losing the meaning. But the truth is that the result will be near-complete plagiarism, because your AI-derived article is just a paraphrase of the original.

And that says only one thing: AI is overrated. Although, riding the wave of its success, we think AI is the future (and it probably is), as with many other innovations, its real application likely lies on a different plane. Andrei Nimin put this well when he argued that AI would not replace programmers. The same goes for content creators.

In most cases, what you get out of it will be low-quality content that won't make your audience more loyal. Sure, the media outlet gets a "click." But there's a chance you won't get that reader again, because readers will start bypassing your outlet. That's a long-term risk that big media hoping to get more with AI need to consider.

The other side is smaller media outlets that fill themselves with AI-generated content. The benefits might seem obvious: low costs and the ability to publish even more per day than the NY Times does. But if you're counting on clicks, Google will come after you sooner or later, and you'll lose a lot. I'm not addressing the creators of content farms like The Tech Edvocate, which generate hundreds of articles daily; understandably, that's their way to "make a quick buck." But if you value your website, you'd better not chase quick content.

So, why is AI overrated?

AI-generated content is always so bland and "generic" that it cannot be considered quality content. Moreover, if you use AI to generate articles, you are engaging in indirect plagiarism, simply reinterpreting the content used to train the model. Beyond the obvious problems of hallucination and inaccurate information, AI is also incapable of generating new information or new ideas. As a result, we end up with a cycle of useless content that doesn't help the media.

Keep in mind; I’m only talking about using AI to create content, not AI as an assistant that can help make summaries or analyze information to use it to write more in-depth articles.
