AI is a Pandora's Box
Apr. 2, 2025
Preface
This article is not meant to be a deeply researched or evidence-based analysis. It's just one person trying to make sense of a rapidly changing world. My goal is to share my thoughts, open up a discussion, and reflect on how AI is shaping our lives. Take it for what it is: a journal, not a manifesto.
The myth of Pandora's Box
In the ancient Greek myth of Pandora's Box, Pandora's curiosity led her to open a container left in her care by the gods. Upon doing so, she inadvertently released evils into the world, such as sickness and death. Pandora closed the box before releasing the final entity: hope.
Pandora's Box has come up in several articles and social media posts about AI. The intent is to convey that we've released a great evil into the world that we can't undo. I've been struggling with how to feel about this technology, and this article is my attempt to wrangle all of those thoughts into one place.
I use generative AI regularly
The first point I really want to get across is that I've been an active user of generative AI tools since they entered the zeitgeist. I started playing around with ChatGPT and image generators such as Midjourney as soon as they were available (I was even in the beta). I've used LM Studio and run models locally. I've tried it for writing, art generation, coding, research, random questions, etc.
Beyond aimless exploration, I used these tools to generate temporary scenes or characters for my personal Dungeons & Dragons group, get ideas for my game projects, and bounce coding ideas off chat-based AI. I even spent an entire year with GitHub Copilot tab completion enabled. I tried most AI models that came out. I was absolutely awestruck by the technology, followed by an overwhelming dread of what it would mean for the world.
The reason I mention all of this is that the current state of affairs when discussing AI is completely polarized. You either love it or hate it.
Love
Strong arguments
"It makes things easier"
Programming is hard, art is hard, writing is hard, coming up with ideas is hard—our lives are hard. Many things in life are difficult and time-consuming. We have very short lives as it is and an even shorter window to explore all of the amazing possibilities the world has to offer. Generative AI tools give us some of that time back, letting us focus on the results and on making things, moving past the tasks we don't care to do ourselves. It's unrealistic to expect that everyone has the time or money to dedicate to an idea or project. If you do have that, great, but if you don't, then utilizing a tool to explore an area you are interested in can save you a lot of time, money, or both.
"It's just another tool that will let me be more productive"
Programmers use frameworks to speed up development, artists use tools to enhance their workflows, and writers use techniques and tools to organize and structure their works. We all utilize tools (physical or digital) to improve our productivity and dip our toes into fields that we otherwise have little ability or time to explore. Generative AI is just another example of that. It is a genuinely amazing technology and feat of human achievement. The fact that these models are actually pretty good at doing a lot of things at a surface level is incredible.
"It allows me to do things I am incapable of otherwise"
Something that is rarely discussed, but I see as a compelling argument, is that people who are impaired physically or mentally are able to produce things that would have been outside of their grasp. This allows them to participate in the same joy of creation that many people have. Beyond the joy of creation, it actually allows many of them to participate in society in ways they couldn't before. The advent of generative AI comes hand-in-hand with a whole new dynamic for building accessibility tools for people with disabilities. The last generation of these tools often required developers to do a lot of legwork to support accessibility protocols. That may be a thing of the past!
Weak arguments
"I'll be able to get my job done faster"
Freeing you up to do what? More work. We aren't headed toward a shorter work week or a world where our corporations have gotten more generous. This is just another tool for the machine. I'd consider this a good argument if you're a fixed-rate worker or self-employed, but anyone who is employed by someone else isn't likely to see a return in their free time, just an increase in their expected output.
"I can compete with experts in the field"
People who are experts in something shouldn't be dismissed because AI tools are capable of making convincing copies of their work. Don't be fooled into thinking that simulating their output means you know what you're doing. If anything, this tool in the wrong hands can make a fool feel like they're smart and get themselves into trouble. The reality is that these tools in the right hands will always be better, and I think that's a good thing. AI tools will generate average approximations of what you ask for; the more specific you are, the better the approximation. The best analogy I have for this as a programmer is that programming languages exist for humans to describe to a computer what we want it to do. Computers don't need high-level programming languages—they speak machine code. At a certain point, if your English description requires so much specificity to do something very detail-oriented, you might just be better off writing it yourself. The same applies to art or writing.
"AI copying humans is the same as humans copying each other"
Humans have rights; AI doesn't. We've muddied the waters quite a bit by giving companies rights to do things that are normally only extended to individual citizens, but I'm not making a legal argument here—I'm making a moral one. We know that if a human makes a negligent error or commits a crime, we have someone to hold accountable. When an AI does something like this, the AI companies will just toss their arms up and claim it's operating of its own accord or push the responsibility onto their users.
Hate
Strong arguments
"It's trained on stolen content"
Yes, it absolutely is.
Besides the literal theft of proprietary content that has happened—including Meta torrenting copyrighted books from a corporate laptop—our governments will soon rule on whether AI training is covered by copyright. I'm again not making a legal argument here. I don't really care how they end up ruling, because money talks louder than logic or reasoning. A copyright is a contract—a contract that, had we known what we were giving up long ago, many, if not most, content creators would have opted out of. The reality is the government is likely to land where it usually does: on the side of the wealthy companies that have already stolen the content. These same companies are very happy to litigate against small businesses when their own copyrighted content is stolen. They didn't suddenly become paragons of freedom from copyright.
"It's going to replace many artists and content creators"
Yes, it will. Especially the gig-economy types that charge a small amount for quick jobs like creating an avatar, art for your game or book, etc. The kind of jobs you see on Fiverr are pretty much doomed. I believe there will still be a market for handmade content and art, and I'll explain more below.
"It's bad for education and it's going to make us lazier"
Very likely, yes. Humans are inherently lazy, and AI plays perfectly into that. AI is uniquely bad in this regard because it's automating the process of thinking, unlike many technological predecessors. If you've seen the recent trend in vibe coding and watched any of those streams, it's like they pride themselves on actively trying to do as little as possible. I myself have gotten too lazy to even type out a full paragraph description of what I'm looking for when generating some image, resorting to just one-word or one-sentence prompts. I've also seen engineers treat answers from LLMs as "research," only to watch that research fall apart when it gave a wrong answer or hallucinated something.
On the education side, education is very slow to adapt. I developed very bad reading habits in school because I skated by with SparkNotes and managed to "accelerate" a few homework assignments with services like Cramster or Chegg. I couldn't really do that with writing or more project-based assignments. AI changes that game entirely. In hindsight, I was 100% borrowing against my future self, and I had to actively work to break some bad habits around reading and go back to relearn fundamental concepts I thought I could skip.
Weak arguments
"It just produces slop"
In the wrong hands, a piano can sound terrible, a drawing can suck, and a program can be buggy. AI is a tool like any other, and lazily generating something without attempting to learn, improve, or polish will have the same results. I think we're seeing an above-average amount of sloppy AI content because it's new, it's popular, and it's easy to start. But as I said before, AI tools in the right hands are far more powerful than in the wrong hands. A rookie software developer can easily create gaping security vulnerabilities or get their project into an unmaintainable state. Maybe the tools will get better at catching these mistakes, but in their current state, they don't.
"You have to do it yourself"
Arbitrary gatekeeping. A film photographer can endlessly argue about the virtues of developing film in a darkroom to achieve hand-crafted lighting and shadows, while a digital photographer presses a button to get a similar effect. They're two different art forms, each with their own merits. You decide if you want to do something a specific way, but the market decides who buys what. I'll admit I have a tendency toward this argument; it's my gut reaction to seeing so much AI content. This is definitely just a personal issue that we need to get over. I'll still appreciate human-made content for its own sake, like I appreciate watching a live play over a film in some cases.
"But I like doing it myself"
Then do it. No one is stopping you. I enjoy coding games from scratch without using a game engine. It's definitely not the best approach for delivering something quickly, but it matters to me more than the dozens of dollars I might get in sales. If you like doing something, then keep doing it. Just because someone else is doing something differently doesn't invalidate you. If you've never experienced a flow state in some activity, no matter what it is, I feel bad for you. Don't rob yourself of that. Find something that you enjoy doing just for the sake of it.
Pandora's Box can't be closed
Pandora's Box is a perfect analogy for AI because it's out, it's here, and we can download many of these pre-trained models to our own computers. Even if we legislated it away, it would refuse to die. So like it or not, you have to live with it. That part isn't a choice. You can choose if, when, and how you interact with it, but it will become as pervasive in your life as the internet did, if it isn't already. But let's extend the analogy all the way, because there is still hope.
Hope
With every technological predecessor to date, new opportunities have emerged that weren't obvious in the beginning. I think we're still riding the hype wave and filtering through the initial garbage stage. We did the same with the internet, cloud computing, social media, the internet of things, blockchain, and now we're doing it with AI. There's no reason to believe this one is special or different just because we're in the middle of it. Not every technology is inherently good, but even a force as destructive as the atomic bomb brought breakthroughs in medicine, science, energy generation, and, strangely enough, peace. I can't pretend to know for sure where this ends up, and there will almost certainly be downsides, but I also can't pretend it's all going to be bad.
There will be changes
Work
Your industry or your job might require you to start interacting with AI in some way. If you want to stay in that career, you may have no choice but to adapt. Your personal happiness also matters, so pick and choose your battles.
Life
Many things happen around us without our consent. If it matters enough to you to stand up for it, do that. If not, then you just have to accept it. Change is hard. There's a reason why there's old human wisdom about accepting things you cannot change.
Hobbies
I get asked "why don't you use AI to do X" a lot, and it's an annoying question, because I actually enjoy my hobbies. If I automated my life away, then what's the point in doing anything? I chat with AI to get ideas while programming my game, but I want to write the code myself. I use AI to bounce game ideas off of, but I want to design the levels. I use AI to generate visual scenes for my D&D campaign or specific creature stat blocks, but I want to write the story and weave together the narrative. Don't automate the fun out of your life. Pick and choose what you care about and let others do the same.
Your hobbies are your hobbies. You choose what you do there. Digital piano music has been possible with the use of software for a long time now, but it didn't invalidate learning piano. If you're interested in something, go do it with or without AI.
Do what makes you happy.
Reflection on writing this
I wrote this for myself. I wanted to share it because I believe in putting myself out there, getting feedback, and opening up discussion with friends and colleagues.
This article took me about four hours to write. I probably could have had an AI generate a version of it if all I wanted was content for the sake of content. But if I had done that, I would have robbed myself of the self-reflection, the organization of my chaotic thoughts, and an opportunity to lay out something that has been causing me anxiety in an effort to relieve some of it. No matter what happens with AI, there is value in doing something yourself that goes beyond the end result and is deeply personal. AI cannot rob you of that.