GenAI: Life hack or self-own?
Learning is a journey, not a destination. When we shortcut the process, we lose ourselves.
Welcome to Art Life Balance, a newsletter about art, life, and some other third thing. If you enjoy this post and would like it to reach others, please ♥, share, or subscribe!
A few weeks ago, my friend — let’s call him Fred — found himself in an awkward situation.
He was discussing figure photography (that is, taking and editing photographs of posable figurines) with someone — let’s call him John — who recently picked up the hobby. In showing Fred his latest photographs, John remarked how great it is that now, rather than editing his images himself, he can assign the task to artificial intelligence. Not one to dismiss the images out of hand on that basis, Fred took a look.
“I didn’t know how to politely tell him the AI-edited photos looked bad,” Fred told me later. “The original photos were so much better.” He went on to explain that the AI edits detracted from the very things that made the original photos interesting. For instance, the AI versions emphasized color, light, and shadow in arbitrary ways. Or they blurred out background elements that lent the images a sense of place.
My friend and I concluded that in outsourcing his process to AI, John was losing out on the opportunity to improve his editing eye by, well, actually doing the editing. In discussing this, we stumbled into a more interesting observation. AI tools may be most harmful to the very people they most promise to help: beginners.
For one thing, people who are just starting to learn a new skill are ill-equipped to use AI to produce good results, because they lack the experiential knowledge to recognize what constitutes “good.” As I’ve written before, creative tasks are iterative. We learn what we’re trying to say by saying it in different ways, defining our individual style by refining it. Letting AI do the work for us makes us no wiser about how particular choices are made or why some “work” better than others in different circumstances. That means even when the context-blind AI tool, like the proverbial blind squirrel, finds a nut, we may not recognize it as food.
More concerning, beginners might be quickest to reach for AI when they hit a snag or find part of a process tedious. This could occur for two related reasons.
The Dunning-Kruger effect: Encyclopedia Britannica defines the Dunning-Kruger effect in psychology as:
a cognitive bias whereby people with limited knowledge or competence in a given intellectual or social domain greatly overestimate their own knowledge or competence in that domain relative to objective criteria or to the performance of their peers or of people in general.
Someone who overestimates their own competence at a new skill may lack the necessary discernment to recognize when using AI tools would limit their creative development or make their work worse. As a result, they may use them too frequently or sloppily.
Imposter syndrome: A person who recognizes their novice status might choose to default to AI’s “taste” because they lack confidence in their own ideas. The thinking might be, “An aggregation of ideas that came before me must be better than my own.” Substacker L.M. Sacasas said it best in a recent essay in which he refers to the process of writing as a “labor of articulation”:
My contention, then, is that when we are confronted with the opportunity to outsource the labor of articulation, we will find that possibility more tempting to the degree that we experience a sense of incompetency and inadequacy, a sense which may have many sources, not least among which is the failure to stock our mind, heart, and imagination.
These reasons are related because they both fundamentally represent a form of unawareness. Either A) technical unawareness: an inability to recognize one’s own level of competence, or B) emotional unawareness: a failure to recognize that increasing our competence requires putting our skills to the test.
In the same essay quoted above — which I recommend reading in its entirety — Sacasas essentially argues that “in order to say anything at all” we need “internal resources to draw on.” The creative process feeds not only on information but on knowledge, and knowledge derives from our ability to remember, recall, and recontextualize what we perceive. In these pivotal tasks, AI is of far less use than our inner reserves of awareness and attention.
I say all of this to make a slightly more nuanced point than “AI bad.” The point is something more like, “AI bad when used irresponsibly” — and “AI really bad when used irresponsibly at scale.” Which is to say, I’m less concerned about one-off applications of AI than I am about the cumulative effect of people repeatedly using it to cut corners and, in doing so, cutting off their own potential to learn, grow, and enjoy the creative journey.
However technically impressive AI-generated text and images might be, the world will be a more boring place if we use that as a reason to abandon the mental habits that underpin creating and communicating about art. Especially if we’re just starting out, we should consider becoming more involved in the nitty-gritty parts of the process — not less.
Because he received some helpful advice from a fellow human™, my friend’s friend may yet embrace the journey and try his own hand at photo editing, shaky as it may be. In stepping outside his comfort zone, he’ll learn that he understands less than he realized — and more than he ever has before.
Thank you for reading Art Life Balance.
If you liked this post, I’d love to know! Please ♥, share, or leave a comment.
Want to support this newsletter?
Consider becoming a paid subscriber. There are no fancy perks, but you will receive my sincere gratitude!