Every December, I feel an erratic kind of ache in my body. I should anticipate this by now, yet here I am, startled by the pressures of closing out another year like a frenzied rabbit hunted by late capitalism. The clickety-clack of my keyboard doesn’t let up as I chew on my own tongue, racing to finish projects I swore I had more runway for, cobbling together year-end reflections I barely remember, stitching a patchwork of new-year goals I already know will get disrupted. Outside my window, Amazon trucks barrel down the street delivering glittering plastic; outside my browser window, apps parade the data they’ve collected on me in saccharine year-in-review reels. Events and errands coat the calendar so thick; time boils to syrup. Sticky, heavy, impossible to move through. All the while, in Seattle, where I live, the sun rises at 7:53AM and sets at 4:19PM, plunging me into darkness before I’ve even thought to step outside. There’s only one thing I want to do. Rest.
This end-of-year crunch goes even harder if you, like me, have a birthday in December. “It’s hard to remember your own birthday, let alone expect other people to remember it too,” a fellow December-birthday friend told me. “But in the end, all I really want is for my friends and family to be real when they communicate with me.”
“Tell me more,” I said.
“My friend texted me the other day, and I could tell she used AI to write the text. It just wasn’t her voice or grammar. I get using AI to generate work emails…but this? I thought we both felt safe saying anything to each other, even if it’s messy. Now, I don’t know anymore.”
Confusion over what’s real and what’s artificial became the new baseline of personhood this year, blurred further when the reality we face is a daily horror show. It’s hard to recall the last time communication with one another felt safe, if it ever did. The internet’s relentless acceleration over the last decade has undeniably reengineered reality and our grip on it; our paranoia is now enmeshed with algorithms that fed us like we were new Tamagotchis. Whether AI was used to write the text almost doesn’t matter; the fact that technology keeps outrunning ethical frameworks, safety regulations, and, hell, even qualitative research is enough to strain interpersonal and societal trust. Yet something about my friend’s story, and the quake in her voice as she told it, wobbles my December dysregulation even more. I suspect it traces back to the last two years: as generative AI and large language models move from hype to habit—now used by over one billion people globally each month—our reliance on AI is taking root while our capacity to trust in people, in reality, even in ourselves, is withering.
2025 was the year we saw AI-generated articles begin to outnumber human-made ones. Automated bots now make up more than half of global internet traffic. A Deezer/Ipsos survey found that 97 percent of people can’t tell the difference between AI-generated and human-made music. Google’s Gemini saw its monthly user base skyrocket from 450 million in July to 650 million in October, and over 30 percent of the entire company’s code is now written by AI. Scientists and researchers are hiding AI prompt instructions in their academic papers to get positive peer reviews. OpenAI revealed that 1.2 million people are demonstrating weekly “heightened levels” of emotional attachment to ChatGPT. People are getting caught cheating on their spouses after the chatbot history on a shared computer exposed variations on how to get away with it.
“Y’all are driving up energy prices in the middle of a climate crisis?” cries poet Saeed Jones, author of Alive at the End of the World, on my Instagram feed. “Y’all are using ChatGPT to write wedding vows?! Let me see someone using a chatbot in front of me. Let me see it, because I am going to slap your phone out of your —”
Soon, all writing might be some form of AI writing. All code might be some form of AI code. Multiple studies report that workers with access to AI tools are completing, on average, “12 percent more tasks, at 25 percent faster work speeds, with 40 percent higher work quality.” But what does quality become when we outsource the skill development required to recognize it, or when standards of craft narrow to the models and patterns of a single tool? One study on AI assistance in legal analysis found that “access to GPT-4 did not consistently improve the quality of law students’ work but significantly increased their speed,” and that “GPT-4’s impact depended heavily on the student’s starting skill level; students at the bottom of the class saw huge performance gains with AI assistance, while students at the top of the class saw performance declines.”
Next year, Purdue University will require all undergraduate students to demonstrate basic AI competency, and I suspect more universities will follow suit. But while AI can kickstart our entry into a specific skill, ultimately, it’s trial and error, steady practice, the support of a community, and learning to trust ourselves—things only time can nurture—that sustain our competency, discernment, and taste. Are we accounting for this when giving AI mandates? When adoption becomes compulsory, we risk erasing the human conditions that nurture and sustain our learning.
Of course, with AI everywhere, so too is the language of taste and thinking.
Earlier this year, people lined up for new merch—including “thinking” caps—from none other than Anthropic. Even the former dean of Harvard Business School—who has long urged organizations to become data-driven—recently remarked that “good taste is more important than ever.” Still, can we trust ourselves to cultivate taste and keep thinking under the relentless pressure to be vaingloriously productive at lightning speed—all the way into the dark days of late December—outsourcing every thought to AI? What happens when our sense of judgment and quality hollows out, and the only trust we have left is in machines, engineered by the wealthy and powerful few? I asked ChatGPT for alternative headlines for this essay. One of the variations included: Who Needs Humans Anyway?
I get it though. I can leverage AI for tedious tasks, like tightening the words in an email so it has a higher chance of being read. This matters immensely in a remote, long-distance, ever-changing, context-switching, frenzied-rabbit timeline; my fellow tech workers are nodding their heads. AI tools, in some ways, give my energy and time back so I can focus on things I want to do, like dreaming up possible futures of what my life could look like, or planning that trip I’ve always wanted to go on, or researching nearby hikes in the Pacific Northwest to get away from screens. Then again, what if I used AI to do these things, too? Google Gemini recommends a hike near Skagit County. But it’s not walkable right now; the area is submerged under record floodwaters.
“Y’all are driving up energy prices in the middle of a climate crisis?” cries poet Saeed Jones.
At what point does AI end and my own consciousness begin? Lately, AI has been flooding my dreams: its interfaces, the progressive reveal of its line-by-line responses. I can feel its omnipresence even when the dreams aren’t about AI at all. I’ve returned to the practice of putting pen to paper, scribbling my thoughts with whatever handwriting I have left. Sometimes I draw the “g” with a short, open tail; other times, it loops. The ink bleeds. No chatbot could ever replicate that splotch.
The practice also teaches me things about myself I wouldn’t notice with AI. It takes me seventeen revisions to write the opening line of a letter, and with each revision, my mind wanders. I need an escape from the pressures of writing. So I doodle. I draw patterns, stars, eyes, and then, suddenly, mid-doodle, I get another idea for an opening line. Visual exploration propels my writing forward. Eventually, I choose what to do next, rather than relying on a machine to suggest it, and each choice gives me the courage to make the next one. It may take a little longer, yet with each decision, I’m learning to trust myself.
I don’t know where I’d be if I didn’t trust myself. Trusting in ourselves, and in our abilities, is how we find resilience when reality bites; we can find the wherewithal to get through challenges by looking to our past, taking note of the difficult situations we’ve overcome, how we overcame them, the people who supported us, and the wisdom we gained along the way. It’s how we protect our agency, without falling under the spell of machines designed to flatter us into staying on their platforms far longer than we intended, or ads poised to manipulate our desires and tastes, which, according to a recent ChatGPT leak, are coming for us in 2026.
And I don’t know where I’d be without trust in other people. This year, doctors, nurses, and hospital staff removed nine non-cancerous tumors encroaching into my spine and pelvic bone; it took months to walk again. While recovering, mentors and colleagues helped me escape a career situation that haunted me to oblivion. New friends pulled me back into community. I feel, in many ways, fucking whole again.
Paranoia grows in the absence of trust. Responsibility wanes. Actions, policies, and systems require a shared sense of reality, and trust in each other, to evaluate consequences effectively and hold each other accountable. Recognizing cause and effect, distinguishing quality from slop: these anchor our values when the rubber meets the road, especially in an economy designed to tempt us away from them. I suspect this will only become more urgent as AI becomes more agentic—performing complex tasks autonomously with minimal human input—further blurring our discernment if we’re not careful.
The other day, a clip of Zohran Mamdani, mayor-elect of New York City, swiped onto my TikTok feed. He said, “You always have to remember that the opinions that you’re hearing, especially from the wealthy and the powerful, they will always be amplified in a way that working people’s are not. And if you remember that context, take that seriously, you also don’t think of it as surround sound in the way that it feels.” Mamdani wasn’t specifically referring to AI here, but his words could just as easily apply to it. Trillion-dollar global investments, nonstop model-upgrade press, slick AI company merch, and Time anointing the “architects of AI”—Sam Altman, Mark Zuckerberg, Elon Musk, and other tech billionaires—as Person of the Year, all signal AI as a mainstream story of inevitability and exclusion: AI is revolutionary, urgent, hot. Get in, or get left. But in this narrative, we lose specificity on what AI can actually do, its limitations, its consequences, and what people truly want—or don’t want—from it.
Mamdani continues, “Part of that means getting out of the office, getting actually into New York City, speaking to people, asking them what they’re thinking about, what they’re worried about.” This is where, I think, trust begins. Going beyond the screens. Beyond the bots. Beyond ourselves. Listening to one another. Taking in real-world context. These inputs sharpen our understanding of AI’s actual impact, and expand our capacity to discern the future we want, along with the trust, accountability, and empathy needed to get there. The thing about experience—a word sharing a root with “expertise”—is that it is lived and witnessed, not generated or prompted. (If I catch any of my fellow tech workers using a chatbot as a synthetic user to evaluate the quality of their own product, as poet Saeed Jones puts it, “let me see it, because I am going to slap your phone out of your —”.) AI’s impact lies in the everyday: hours saved on a tedious task, or higher energy bills, a mechanical text from a friend, someone telling you about their fear of losing grip on what’s real.
“Whether or not she used AI to write the text,” my friend continued, “I still want to get to know her more. Reality is getting so hard to grasp. I don’t want to lose sight of it.” I nodded.
“Did you see that video of a house floating down a river in Whatcom County?” she asked. “I couldn’t tell if it was AI or not.”
If you enjoy this newsletter and want to further support my work as a human writer, consider subscribing, or share Tech Without Losing Your Soul with your friends and colleagues. <3