Should we care whether AI is writing for games websites?


For years, writing has been devalued and denigrated. The advent of AI, Ryan writes, might be our final chance to fight back.



The word ‘versificator’ is mentioned only twice in George Orwell’s dystopian novel, Nineteen Eighty-Four. It’s an incidental detail, really, but a fascinating piece of world-building all the same: the versificator is described as a device created to produce everything from salacious stories in “rubbishy newspapers” to similarly disposable bits of popular music.

“The words of these songs were composed without any human intervention whatever on an instrument known as a versificator,” Orwell wrote. “But the woman sang so tunefully as to turn the dreadful rubbish into an almost pleasant sound.”

In the topsy-turvy 21st century, we have something that functions uncannily like the versificator: machine learning. Trained on the vast troves of data circulating on the web, tools like Midjourney have been used to ‘imagine’ what lies beyond the canvas of the Mona Lisa. ChatGPT has been employed to write songs in the style of Nick Cave, much to the musician’s chagrin.

Artificial intelligence has, of course, also begun to creep into writing and journalism. CNET, a tech site almost as old as the web itself, has published AI-generated stories in recent months, many of which had to be corrected after they were found to be riddled with errors.

More recently, US media company Gamurs Group, which owns such sites as Destructoid, Siliconera and The Escapist, advertised for a new job role: that of AI editor. The successful candidate would be expected to put out “200 to 250 articles per week,” presumably with assistance from an AI-driven bit of tech like ChatGPT.

For reasons unclear, Gamurs swiftly took that job posting down. That it was published in the first place indicates a certain direction of travel for the firm. In March, Gamurs laid off about 40 percent of its 171 staff; mere months later, it came up with a new job position – an editor who can, using AI, single-handedly produce a stream of content that would once have taken several writers to put out.

(Whether an AI editor could actually achieve that figure of 200 to 250 articles per week is debatable; it would require them to generate, check and upload a post every 10 to 12 minutes, nonstop, eight hours a day, five days per week – a tall order even for the most diligent worker.)

The question is: does it matter? Who cares whether the writing on a website – or in a magazine or book – was generated by AI or a machine?

The answer partly depends on what you want from the things you read. AI’s proponents argue that it’s merely a tool, which is true – you could, in theory, ask ChatGPT to gather information on a subject and distil it into bullet points, then use those notes as the basis for your own, livelier piece of writing. Even here, though, there are potential problems: software like ChatGPT won’t fact-check for you, and the information it serves up is only as good as the data it was trained on. Microsoft’s Bing AI produced several factual errors during a demo in February, for example.

Besides, the intentions that larger media companies have for AI are very different from those of an individual writer using ChatGPT as a (flawed) research assistant. Firms like BuzzFeed are using AI to generate travel guides and quizzes. As we’ve already seen, Gamurs appears to have at least tentative plans to plaster the internet with hundreds, if not thousands, of AI-generated (and hastily human-fact-checked) posts about pop culture.

The implications, if other media companies do the same thing, are huge. Thousands of writers and other editorial staff will lose their jobs – something we can already see happening at those aforementioned companies and others besides.

The impact on journalism – already hobbled by years of cost-cutting in newsrooms – could be similarly disastrous. AI can’t go out and interview people, or break stories that could be embarrassing to, say, billionaire owners of the world’s biggest video game publishers. It can’t speak truth to power or hold the guilty to account. All it can do is synthesise information that already exists and repackage it.

There are other downsides to this, too: research has shown that an AI fed on AI-generated content will begin to spit out more errors. In extreme cases, researchers discovered, “learning from data produced by other models causes model collapse”. In other words, like photocopying a photocopy, the quality of results degrades as they’re repeatedly fed through a system like ChatGPT. To put it even more simply: AI needs writing created by humans to function properly.

Lastly, there’s a question of style. As humans, we bring our personalities, our life stories, our fears, our hang-ups to the things we write. AI isn’t capable of writing passionately about a game it loved as a child; it can’t mourn the loss of a game that was begun and then abandoned. It can’t make wry observations about the foibles and weird conventions of the entertainment industry, or track down developers left broken by working exhaustingly long hours for months on end.

For years, the value of writing – and art in general – has been denigrated and devalued. Now generically labelled ‘content’, it’s something piped into our screens by increasingly large and faceless companies, even as wages for writers, journalists and artists are driven to ever-new lows.

The advent of AI is a chance – perhaps our final chance – to push back against all this. To start thinking again about the value of writing of all kinds, to support the outlets, artists and journalists that produce informative, heartfelt and above all human work, and to shun the mechanical and the AI-driven. Or, to quote Ross Anderson, a professor at Cambridge and Edinburgh universities, we need to be vocal about our disdain for the firms that are “about to fill the Internet with blah.”

Because what’s the alternative? Do writers let AI take over and find employment elsewhere (which, if they’re as hapless as I am, probably isn’t advisable)? Or do they instead sit at their own equivalent of the versificator dreamed up by George Orwell, watching dreadful rubbish emerge from a machine and wondering how it can be turned into an almost pleasant sound?

In either scenario, the future we face is truly dystopian.

1 Comment

  • Hammy Havoc says:

    Yes, we should care. If we want to read about an artistic medium, having someone capable of analyzing said medium is a requirement.

    Anything not written by a human being is getting unsubbed as an RSS feed because why would I care to read it? If someone couldn’t be bothered writing it, I can’t be bothered reading it.
