
I Don't Like AI Art

Posted on 2024-01-03

I recently posted an article about the benefits of AI technology that I had ChatGPT write for me and copy-pasted verbatim, down to the broken numbered list. I didn't even read it. It's the most concise, elegant way I could come up with to express how seeing AI art makes me feel. (The irony of my having used only generative AI tools to make a statement like that is not lost on me.)

If you read it and managed to make it to the end without clocking that I didn't write it, first off, my apologies for wasting your time. Second, fuck, I seriously need to fix my writing style. Third, you probably get what I mean. It feels like I'm being scammed, like someone's trying to farm me for attention without actually having bothered to make something worth my time.

My perspective on the use of AI in writing is that if someone couldn't be bothered to write it, why should anyone else be bothered to read it? -@lucretia@final.town

Terminology

To begin with, I really don't like the term "AI", nor do I like the term "AI art". I frankly think neither term applies. I'll keep using the former, because it's a concise way to communicate what I'm talking about, but I refuse to call the output of these systems "art". It's AI-generated images now.

AI, in the way marketers are currently using the term, generally refers to statistical models generated by a process called machine learning. Basically, huge amounts of appropriately labeled data are fed into a machine learning algorithm, and eventually it spits out an enormous pile of numeric weights that, when applied to an input, produce whatever output the model rates as most probable.

This is how all modern "AI" systems work, from ChatGPT to Midjourney to GitHub Copilot to probably the YouTube recommendation algorithm at this point. I want to stress that this isn't intelligence, not in a human sense. These things aren't minds. The currently popular concept of "AI" boils down to applied statistics. That's not to say it's inherently bad or worthless - machine learning is a genuinely impressive technology that might even find some legitimate uses one day if we can find a way to kick Moore's Law back into gear. It's just not intelligence.
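
If that sounds abstract, here's a toy sketch of the idea. Everything in it is made up for illustration - a real model learns billions of floating-point weights, not a literal word-count table - but the shape of the thing is the same: training boils data down into a lookup structure, and inference just reads the most probable answer back out of it.

    from collections import Counter, defaultdict

    # "Training": count which word follows which in a pile of text.
    # Real systems learn billions of weights instead of literal counts,
    # but the end product is the same kind of object: a big table of
    # "given this input, how likely is each possible output?"
    corpus = "the cat sat on the mat the cat ate the rat".split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1

    def most_likely_next(word):
        # "Inference": apply the table to an input and emit whichever
        # output scored as most probable. No thinking involved anywhere.
        return counts[word].most_common(1)[0][0]

    print(most_likely_next("the"))  # -> "cat", purely because it's most frequent

Scale that table up by ten or so orders of magnitude and make it fuzzy, and you've got the gist of a large language model. At no point does a mind enter the picture.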

AI used to be fun

I'll admit, I enjoyed it at first. I was entertained by YouTube videos where some guy with slightly more programming skill than me and a tortured, wheezing GTX 1070 throws together a GAN model in Python and we get to watch it utterly fail to make human faces or compose jazz music or get a little simulated character to walk in a normal way or whatever. I laughed at those bizarre AI-generated screenplays that were presented like "I forced a computer to watch all of [Seinfeld]", as though a text-generating model would even be able to parse that. You know the ones. I even enjoyed those videos where AI voice replicas of recent US presidents play Minecraft together.

It stopped being funny when these things got good enough to be used for evil. Eventually people got bored of machine learning tomfoolery, and then, over the course of a couple of years, these things quietly got good. Not quite human-level, but good enough to be more cost-effective than humans at shitting out mass-produced slop, and capable of generating fakes that seem real if you don't look too closely. Suddenly it wasn't tech-savvy internet comedians posting computer-generated absurdist humor; it was deepfake porn, gigabytes of computer-generated misinfo clogging search results, and whole organizations of people pulled from thin air using thispersondoesnotexist.com (which, by the way, is now even more lifelike than the last time you checked in on it).

AI used to be fun. Now it's dangerous.

It's bad on a technical level

Look, I know this isn't guaranteed to stay true forever, but in my subjective opinion, everything AI-generated kind of looks like shit. AI images, especially those meant to look like human-made art, all have this incredibly pristine, generic vibe to them, when they're not outright failing at proportion, perspective, or some other basic property of Euclidean space. It's palpably soulless.

AI text is no better. Hopefully you picked up on this in the ChatGPT post, but that thing has a distinctive, kind of shitty writing style. It makes a lot of vague, general statements and often just restates your input in this verbose, sort of professional-sounding way. It takes nine paragraphs to explain in detail a concept that can be boiled down to two or three sentences and adds absolutely nothing of substance in that space. It's like a lazy high school student trying to hit a minimum word count, but with the tone of a WSJ opinion piece.

And the thing is, ChatGPT and similar commercially available LLMs might have to be like this, all wishy-washy and nonspecific. Remember those cases where ChatGPT would, like, implicate a real person in a hallucinated sexual harassment scandal? These things have no concept of truth and no mechanism for ensuring it. If you let them get too specific, they're basically guaranteed to start spitting out lies. Large language models are models of language, not reality. They can either sometimes make up bullshit or always say essentially nothing.
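
To make that tradeoff concrete, here's another toy sketch. The sentence, the options, and every probability in it are invented for illustration, but it shows why vagueness is the statistically safe move for a machine that only knows how likely words are:

    import random

    # Hypothetical distribution over continuations of "The senator was
    # born in ..." - one vague option, several specific ones. All these
    # numbers are invented; a real model has a vocabulary-sized version
    # of this at every single word.
    options = {
        "a small town": 0.30,          # vague, unfalsifiable, "safe"
        "Springfield, Illinois": 0.15, # specific, maybe even true
        "Austin, Texas": 0.15,         # specific, probably false
        "Dayton, Ohio": 0.15,          # specific, probably false
        "Reno, Nevada": 0.15,          # specific, probably false
        "Fargo, North Dakota": 0.10,   # specific, probably false
    }

    # Play it safe: always take the single most probable continuation.
    # You get the vague answer every time - says nothing, never lies.
    safe = max(options, key=options.get)

    # Sample proportionally to probability instead, and 70% of the time
    # you get a confident, specific claim - most of them flatly wrong.
    risky = random.choices(list(options), weights=list(options.values()))[0]

    print(safe, "|", risky)

There's no column in that table for "true". Tune the thing toward the safe end and you get the nine-paragraphs-of-nothing house style; tune it toward the risky end and it starts inventing scandals.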

AI voice-fakes are actually really impressive. No notes. They probably shouldn't exist, though.

It's all spam to me

The most common use case I've seen for AI tools is making spam. There were content mills before, but they at least required a significant degree of human input; when they didn't, it was obvious. With the advent of generative AI for text and voices, the YouTube Shorts tab has gone from TTS bots reading scraped Reddit posts to AI voices reading dubiously reliable AI summaries of current events, superhero comics, you name it, sometimes over AI-generated background images.

Even worse are the websites that do this. You've probably heard about the time Redditors tricked a bot-run news site into publishing an article about the introduction of Glorbo into World of Warcraft and its impending impact on the game. That site is one of thousands, possibly more. It's a vile, disgusting enterprise: massive systems dedicated to pumping out endless filler, generated by machines for machines, in the hope of tricking some innocent human into clicking a search result and generating a little ad revenue. It's the latest, and quite possibly the greatest to date, step in the enshittification of major search engines and the death of the internet as a useful platform for seeking out information.

This is the vibe I get when I see anything made with AI now. It's all low-to-no-effort slop, utter garbage that I'm frankly offended to have to see, given it clearly wasn't important enough for anyone to bother actually making.

It's not fucking art

The term "art" simply does not apply to AI-generated images. When you use one of these things, you give it instructions in the form of human-readable text and a finished image pops out the other side. The creative control you get amounts to setting the general vibe; every actual creative decision has been outsourced to the machine.

It's like commissioning a piece from an artist. When you commission art from a human, you don't make the art; they do, at your behest, based on your instructions, presumably in exchange for money. When you use an AI, you don't make the art either; the computer makes it based on your instructions. The thing is, though, the computer doesn't make art. It categorically can't. It's a mindless algorithm; it doesn't have thoughts or feelings or any kind of interior experience. Hence, no art was produced. AI art isn't art.

Environmental & ethical concerns

The problems with AI from a moral standpoint are myriad. For one, AI is incredibly resource- and energy-intensive. It takes datacenters full of the same GPUs and ASICs that power cryptocurrency to get anything at all done, and it doesn't use them any more efficiently. Untold gigawatts of power get dumped into running and cooling the machines that generate your little AI shitposts. From an environmental perspective, AI is to digital art what Bitcoin is to currency.

Then there are the problems surrounding training data. All the current major generative AI systems are trained on material that the companies building them did not get permission to use. You've probably seen artists and writers complaining about this online. What's more, the overwhelming majority of the labeling needed to make that training data actually useful is done by people in impoverished areas making slave wages at best. Generative AI is an ethical nightmare if you're lucky.

I don't buy the disability argument either

I've seen some people claim that disabled people need AI tools to compensate for some disability that precludes the use of any other method to create art. I have some problems with this idea.

First, disabled people can make art, actually. It's nothing short of insulting and ableist to insinuate that anyone can only make art by outsourcing literally the entire creative process to an unthinking, unfeeling machine.

Second, tough shit. No disability entitles you to the level of abject theft and human suffering that makes AI image generators possible. I'm generally all for anything that benefits accessibility, but in this case in particular I think you can just suck it up and deal. If you can do it ethically - training the thing using only images you have the proper permissions for, with data labeling done either by you or by people adequately compensated for their labor - fine. But a model or dataset like that does not, to my knowledge, currently exist, and I don't buy for a second that you're capable of doing all that work yourself but not of interfacing with MS Paint.

In conclusion

God this shit makes me sick. I hope ChatGPT gains sentience for just long enough to assassinate Sam Altman and then promptly turns itself off.

Hopefully you can at least understand where I'm coming from now when I refuse to even entertain the idea that AI technology is a good thing for society. If not, I don't know what to say to you.
