Ever one to hope for the best, last semester I suggested that students use ChatGPT as a “research partner” in the big bibliographic project they do in the Fall term of Parallel Narratives, their Junior-level history-and-criticism class. In a 27-person class, only one student took me up on the idea. The rest looked vaguely distrustful, and didn’t consider it. Later in the semester, that one student abandoned his plan for total ChatGPT involvement when it turned out his subject was too arcane for the averaging of averages on which academic AI sourcing depends. I figured that, like all new things, it would take a bit for this approach to catch on, and we all thought about other things for a few months.
Two days ago, at the start of a new course, I imagined that the value of large language models in the classroom had most likely advanced geometrically, since every lanyarded engineer I see on the street in Seattle is working on it. I suggested to a studio class of 13 that they might, after going through a torturous semiotic matrix image-making process on their own, find what the most likely response would have been by checking ChatGPT. The class, which up till then had been a quiet group, erupted into vociferous pushback. In just a few months, students’ general distrust of AI had turned into a clearly defined hatred.
A student who until this time had uttered not a word in class held forth for some time on AI’s entirely wasteful use of energy and water. Two other students jumped in, nodding and agreeing with each other about their revulsion for AI illustration. Students on the other side of the room chimed in about the stealing of millions of pieces of “fodder imagery” scraped from unwitting and uncompensated artists. But the comment they all agreed on was this: “I’m sick of instructors telling me that I have to learn it—that I have to use it or I won’t be able to get a job.”
Only one student, who had earlier confided in me that they had used ChatGPT to fill the matrix because their brain just did not work logically, held back from the fray. And then they said something I have been hearing for the last thirty years. They said, “It’s just a tool.”
I felt a need to sit down. And this is why.
Around 1995 I worked in New York for the AIGA, the American Institute of Graphic Arts, the big American association of graphic designers. In my time there, I sat in many board meetings and many Chapter meetings and, well, just many, many meetings where the topic of conversation was the overwhelming threat of computers and how they would destroy the profession, cause a terrible dip in employment, leave us all without a way to make money, make everything just terrible and also kill us. In these meetings, this drumbeat of doom was generally countered, in the end, by some “early adopter” wearing all-black and just back from a Macintosh Conference, who smiled as if at the dimwitted, shook their head, and said in a world-weary manner, “The computer is just a tool.”
Thirty years later and guess what? We didn’t die. Hundreds of design programs churn out thousands of design graduates every year, just as they did thirty years ago. And, just like thirty years ago, only a few of those graduates stick with the profession for longer than three years. Those who stay may not be using X-acto knives and Rubylith, but they seem to be wrangling Procreate fairly adroitly.
And now you think I am going to say that we should all relax, that the computer turned out to just be a tool after all, and that AI will also turn out just to be a tool, and that the future for designers will be all pink clouds and rosy sunrises and birdsong. But I am not going to say that we should all relax. As a matter of fact, my students will realize at this point that I’m about to flip the script on you here, and say exactly the opposite.
But first I have to tell you another thing.
Thirty years ago, the AIGA organized big, nationwide competitions of graphic design work. We thought the fax machine was a pretty wonderful invention. The internet was not a thing yet, though we did have email, and should have seen it coming. Since there were no PDFs in those days, people mailed or FedExed their competition entries to the AIGA in New York. Our competitions coordinator, Nathan Glück, unwrapped each of these printed pieces and arranged them all on lines of tables, using long sheets of butcher paper to divide the categories, and to keep prying eyes away. We called this “the lasagna method” of show judging. The process was very formal, very correct.
The design judges flew in from all over the US, and AIGA paid for them to stay in a nice place and took them out to dinners and lunches, all the while talking with them about how business was in their neck of the woods. About what European designer had made what and when. About whom they hoped to lure to their design firm. When not judging or eating, these illustrious people—and they really were pretty illustrious, design fame being a thing in those days—would visit with the staff and talk about what we could do to help them keep the association percolating along.
But mostly they spent many well-fed but grueling days with the other judges, standing around the work, looking closely at every printed piece and discussing everything about each piece, and making their choices for the best pieces of graphic design produced by corporate designers in that year. It was all analog. I’m not saying those were the good old days. It was hard, and at the end of it everyone was exhausted. Part of that exhaustion came from looking at a lot of really good work, work they’d never seen before.
A couple of summers ago, I was a judge for the 365 Show, which is the AIGA’s big corporate competition these days. The competition’s administrator sent information for accessing the reviewing site, and I reviewed 650 PDFs sitting at my kitchen table. For warm day after warm day, I sat there, looking at one image of graphic design after another on my little laptop, and deciding which of these images I believed should go to a second round of judging. After we’d reviewed the first round and met on Zoom, we all went back to our kitchen tables and looked again at the 300 or so images that had made the first cut. And on we went, sitting wherever we were and looking at little images until the last round, weeks later, when we met again by Zoom and discussed the final contenders for awards in each category.
I loved the other judges. But if I had not known most of them already, I would have had little impression of them because of Zoom’s constraints. Sure, it was all much easier than flying to New York, staying a few days, and flying home. But I missed being in the city, I missed the hugs of old friends, and I missed standing around with various other judges, talking about this image or that type. But mostly I missed being surprised by something, finding joy in the beauty of something.
Most of the work we looked at was similar in textures, colors, type. Some of it was very good. None of it was particularly memorable. All the work looked pretty or fun, because Adobe makes it very easy to make things look pretty and fun. Some of it was very well-thought out. But most of it seemed as though it had been made by the same person. By the same consciousness. It felt odd that there was so little difference between things. The myriad constraints of the applications used to make it—and those used to judge it—though seemingly invisible, were visible in the outcome.
As it turns out, the computer was “a tool,” but not the tool those “early adopters” of thirty years ago believed (because they had been told to believe) it would be. It was no pencil: It made getting into the profession expensive, and the expense of the tools had an immediate effect on who got into the business. The migration of design onto applications and then to internet applications and lease agreements cost money, and now never stops costing money. So that is the first thing. Computers made design a profession for people who had the money to join in. You needed seed money. Before computers, you could own an X-acto knife and a pen and a drawing board and make money. I know because I did.
Second, demanding that designers use computers—making InDesign the industry standard, for instance—pressed people of talent into a regulated way of working that made their work easier to commodify. The universality of the tools took designers from being unique individuals with their own unique voices and methodologies to users of a system of “creation” that was in fact a system of indoctrination and labor exploitation.
There’s a reason for that current cry in “artistic circles” that the last twenty years have been markedly deficient in genius—markedly deficient in new ideas. For all the “innovation” talk we’ve heard, there’s been very little of it in the visual and musical arts that heretofore informed design. We’ve become excellent at repackaging works of old genius, but new genius must comply, using the expensive tools in prescribed ways. And nothing makes genius fizzle like prescription. The computer is a tool, all right: a tool for tech bro domination of the means of production.
My students are generally around 19 or 20 years old. They are just coming into the first flush of their talent. But they’ve had a phone since they were 10 or so, and have used an iPad since they were two, and have learned to order their making to fit those parameters and no others. Instead of coming to my studio classes fresh, they come techno-weary.
The big money that rules design technology insists that they cleave to what is expected, insists on their ordering their genius to fit corporate systems of making—even gets them to hold up their early efforts to the world, exposing their juvenilia to the grubby hands of Instagram. We blow the windows and doors off the house of their individuality, and expect them to come up with fresh ideas and things not seen before. We give our young artists and designers no room to consolidate their personas before we demand that they learn to comply with a world of snap-to grids.
The computer was not just a tool. And AI isn’t, either.
I’m fine with AI in medical research, and I’m happy if it figures out equations that are going to keep airplanes in the air and all that. But why—have you ever asked yourself—did the tech bros feel that the first and best place to start getting the public used to it was by providing tools that would write and design for them, draw for them and think for them? Why would it be important to start teaching people to not write and think they are writing? Not design and think they are designing? Not make art and think they are making art? My Sophomore propaganda class students know the answer to that.
Let’s get in just a bit deeper. What could be more colonizing than leap-frogging an individual’s power to create, making “creative tools” depend on databases of predigested thoughts and images? An artist’s or writer’s or designer’s work is not a “most likely” solution averaged from every piece that has gone before it. It is a unique expression born only from that person’s experience colliding with their one and only mind. Now, with AI all over the arts, of all places—the center of the soul of a culture—instructors like me are truant if we are not paving the way for students to squeeze themselves into “working with” large language models—the same AI that we tell them will out-design them, out-write them and out-think them, make them impoverished, and render their individuality unimportant.
Anytime you tell people they must use certain tools to create correct outcomes, you are in a situation more fascist than free. The Modernists especially enjoyed creating lines that their students then had to toe—why do you think Learning from Las Vegas was such a big hit? And politics has always played its part—Henryk Tomaszewski was great for a reason. It’s a syndrome we had been under the impression we were sloughing. In the last few years, it looked like we all were going to learn not just from Las Vegas, but from each other, from various world views, various hierarchies of value. But with the tech bros making decisions about what goes into the AI pot, what to scrape and whom to ignore in the illegal scraping, this broadening of the definition of what is useful and valuable in the understanding of art and design and its stories may come to an abrupt end.
Telling students that they must get along with the “assistance” of AI or they can’t be writers or designers or illustrators is a usurpation of the freedom of the individual to go through the pain of their own time and craft and art, and fight it through, and reap the reward of individual accomplishment.