My Neighbor, the Trekkie, and the Future of Humanity
There’s a quiet panic humming beneath dinner tables, boardrooms, and group texts right now. It sounds something like, “The robots are coming.”
We’ve reached the moment when artificial intelligence has gone from novelty to neighbor. It’s no longer a shiny, conceptual new toy. It’s the coworker who doesn’t need lunch breaks, doesn’t forget deadlines, and frankly, doesn’t care whether its coworkers are in good moods. And many people are uneasy about it.
Not because the technology isn’t impressive, but because somewhere deep down, we’re wondering, “Where do the humans go when the machines get this good?”
Somewhere between my cautious curiosity and full-blown fascination with AI, I met my downstairs neighbor, Chris.
Chris is what I’d call a gentle digital pioneer. This isn’t an official title, of course, but it feels right. He’s a creator who uses AI not like a shortcut, but like a collaborator. The way a musician might pick up a new instrument and say, “Let’s see what you can do, xylophone.”
Also, Chris is a full-fledged Trekkie. Which means he’s spent a significant portion of his life imagining a future where humans and advanced intelligence coexist. He talks about AI the way some people talk about space travel: not as an escape from humanity, but as an expansion of it. Watching him work, what struck me wasn’t the technology he was working with. It was him. He wasn’t trying to remove himself from the process. He was in it, guiding the artificial intelligence engine, refining it, and questioning some of its responses in order to help it learn.
Like a good captain on a starship, he is aware that just because you can explore a new world, it doesn’t mean you don’t bring your ethics with you. What Chris helped me realize is that AI doesn’t replace the creator. It reveals the kind of creator you are. And in the hands of someone thoughtful, it suddenly doesn’t feel like a threat. It feels like limitless possibilities guided by a living, beating heart.
It made me think, “AI isn’t a replacement for humanity. It’s a reflection of it.”
If thoughtful, curious, empathetic people are guiding AI, don’t you get something expansive, collaborative, and maybe even kind?
If not-so-thoughtful or non-empathetic people are guiding it? Well. History gives us plenty of examples of what can happen. Which makes me think the fear around AI is less about losing jobs and more about losing our souls.
And yes, the things we value most, like the unexpected joke, the pause we take as emotions well up, and the way someone reads a room without a word, are the human fingerprints we don’t want to watch disappear. But no matter how advanced an engine becomes, Chris reminded me that it doesn’t live a life. It doesn’t grow up in a loud 70-person family or sit at a holiday table, belly laughing about last Christmas’s choir faux pas while passing the mashed potatoes.
It doesn’t carry memory in its bones. We do.
And speaking of family…
Picture a table filled with people I love deeply, and a conversation that starts out innocent enough before taking a sharp left turn into, well, everything.
My parents, my sister (who proudly wears her Luddite badge and wishes she lived on a farm making soap from goat’s milk), and her best friend, who works in cybersecurity, which, if you’re keeping track, means he is professionally trained to trust absolutely no one and nothing, yet is one of the most loving and level-headed people alive.
What unfolded was less a discussion and more a verbal free-for-all, with everyone holding very strong opinions.
Concerns surfaced. Some were valid; others were delivered with great passion but not fully thought through. Things like AI being used to manipulate narratives, pulling only from pre-approved and biased sources, or quietly shaping what we believe without us even realizing it.
And some of those concerns aren’t crazy. AI learns from data created by humans, which means it inherits both our brilliance and our blind spots. It doesn’t independently “decide” what’s true. It reflects patterns, sources, and the inputs it’s been trained on.
And yes, like any tool shaped by people, it can be used responsibly or not.
But that’s not necessarily a reason to distrust it or disengage.
To me, that’s actually a reason to stay deeply engaged.
AI isn’t a sealed vault of pre-approved answers. It’s more like a conversation partner with a very large and very messy library, one that gets better when people ask better questions, challenge its assumptions, and bring diverse perspectives into the mix. This includes skeptical sisters, cybersecurity besties, and parents who just want to eat breakfast but have become part of a younger-generation philosophical roundtable.
I think it’s simpler than all the scary scenarios our own thoughts can conjure, or that others feed us. We’re moving from a world where tools were passive to one where they’re responsive. And I think these responsive tools feel way too intimate (and therefore unnatural) to some people.
But intimacy isn’t the enemy in our life experiences. Disconnection is.
So here’s the opportunity I see from my vantage point of looking at AI from the outside in. We get to decide what kind of intelligence we’re building. We get to inform it, shape it, challenge it as collaborative participants, not as victims of something we don’t fully understand. People didn’t fully understand the washing machine when it first came on the market. They were weirded out by the agitation motion and didn’t trust that it could get clothes clean, but you can’t tell me we aren’t all thrilled that we’re not banging our dress shirts against river rocks anymore.
To me, the real risk isn’t that AI becomes too powerful. It’s that good people step back and leave the shaping of it to those who do not carry the same care and concern for humanity. We’ve seen how that story plays out in our boardrooms and our government. No thanks.
The future of work isn’t humans or AI. I think it’s a future of humans who know how to stay human while working with it. I definitely think the future of my industry entails creatives who don’t see tools as threats but as instruments, while still insisting on being the ones who make and feel the content and the music.
I recommend you watch someone who speaks about this much more eloquently than I do: Jack Conte of Patreon, in his talk at SXSW 2026. You’ll be glad you did.
So no, I’m not worried about the machines taking over. I’m more interested in who’s holding the stylus and informing this new tech. Technology has historically amplified our life experiences once we’ve grown accustomed to its unknown aspects. If we show up with curiosity, integrity, and a little bit of humor about how wildly imperfect we all are, then what we build next might actually feel more human and not less.
By the way, I’d love to say I walked away from that family conversation feeling like a calm, enlightened bridge between worlds, but I didn’t.
I walked to my car, sat in silence, and had a brief but meaningful identity crisis with my steering wheel. When people you trust raise concerns (even messy, half-formed concerns), it sticks with you and makes you pause.
For a moment, I wondered, “What if we’re building something we can never fully understand?” Not in a dramatic, “shut it all down” kind of way. But in that quiet, human way where I see my responsibility in things and ask, “Am I paying enough attention?”
But then I remembered what I wanted to say at that table with my family, yet said out loud an hour later in my car to my little dog, Vinny, “Be not afraid, my little Boo Boo Chicken. The goal isn’t to go into this new technology with blind trust. It’s about taking the time to interact and influence it for everyone’s highest good.”
It feels like it’s in a similar vein to the piece I wrote several weeks ago about tribalism: only talking or interacting with people of your own ilk and belief systems. Where does that get us? Just more division and misunderstanding. That’s it. Ignoring things doesn’t make them go away. It just means your beautiful spirit doesn’t get to be part of the collective influence.
Somewhere between Chris’s full embrace and my family’s skepticism is where I think the future actually lies. Not in the dichotomy and extremes of blind enthusiasm or fearful rejection, but in humanity staying present enough to shape what we’re interacting with.
Maybe it’s not about deciding whether AI can be trusted. Maybe it’s deciding whether we’ll stay thoughtful enough to be part of what it becomes. And maybe the other question isn’t whether AI will replace us, but whether we’ll remember to bring our full humanity with us as we meet it.
Because think about it. The future doesn’t belong to fast robotic think engines alone. It belongs to those who still know how to feel, listen, and shape what’s being built through something no machine can ever generate on its own: a human life well lived.
Which means…the future is indeed in our hands.