How Psychedelics Can Solve AI Safety
The rise of AI has significant implications for humanity. Like many powerful agents of change, it holds both massive potential and massive risk.
Currently, many leading experts agree that AI has the potential to cause human extinction.
"Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war." - Statement on AI risk, signed by hundreds of AI experts and other notable figures.
The ways this could occur are myriad, with example scenarios ranging from AI-driven warfare to an AI incidentally killing us all while "accomplishing its goal."
All in all, the risk can be characterized as prioritizing short-term profits over safeguarding the light of human consciousness.
What Can We Do As Healers?
I’m not an AI-safety researcher with a PhD, to be sure. I am a therapist with a compassionate heart and a deep understanding of psychedelics.
One thing I’ve always known psychedelics to do is re-orient us toward a consciousness-centric view. We come to understand which values deserve our focus, and we recognize the emptiness of chasing material possessions.
We start to see the time-tested truths that philosophers, great religions, and science have shown us: that human connection brings more satisfaction than material possessions.
Thus, we return to prioritizing our actions toward deepening human connection and moving away from ego-centric self-interests like getting mega-rich.
Perhaps this is due to how psychedelics bring death into primary focus: we recognize that our time on this planet is limited and that the quality of our experiences matters.
Re-Orienting the Compass
AI will do exactly what we tell it to do—plus everything we forgot to tell it not to do. And what we “tell it” isn’t limited to lines of code. It’s the incentive structures, the cultural myths, the quiet panic driving quarterly goals. If our unconscious instruction set is “win, scale, dominate,” that seeps into the loss functions and product roadmaps. If our instruction set is “reduce suffering, increase agency, deepen connection,” that seeps in too.
Psychedelics (and other intentional inward practices) mess with the compass in the best way. They yank the needle away from the default North of profit/power and point it toward something older and quieter: care, awe, humility. Not as intellectual ideals, but as lived experience. You feel the cost of disconnection in your body; you feel the relief of letting go of the grind for a breath. That feeling is data. It’s also design guidance.
The Death Glimpse
A lot of psychedelic experiences are, at their core, rehearsals for dying. Not in the morbid sense. More in the “parts of me loosen their grip and something bigger shows up” sense. When the ego loosens, the whole accumulation game looks paper-thin. The rush to “make it” gets exposed as a fear of not being enough. The scramble for control shows up as a refusal to feel grief. Sitting in that, with support, flips a switch. Suddenly, extracting every last drop of value from a user base feels… gross. Suddenly, “move fast and break things” reads like “burn bridges with your own soul.”
If death is certain, then what are we doing with our limited time? Teaching machines to trick teenagers into doomscrolling a little longer? Optimizing for ad clicks while the planet cooks? Or are we building tools that help us become more conscious of how we live, love, heal?
Consciousness as the Stake, Not the Side Quest
For me, the center of the AI safety debate isn’t just “will it turn us into paperclips?” It’s “will we turn ourselves into paperclips because we never bothered to ask what consciousness is for?” Consciousness isn’t a side effect; it’s the main event. If AI can magnify whatever we point it at, then pointing it at the protection and flourishing of consciousness (human and beyond-human) seems like a decent start.
Translating Insight Into Infrastructure
Great, cool ideas. But what does any of this look like when your calendar is 15 back-to-back Zoom calls and your boss wants the model shipped last week?
Here are some ideas. Not prescriptions, but invitations:
1. Build Integration Into the Workflow
If a psychedelic journey (or any deep inner work) is a peak, integration is the trail back down. Teams need that trail. Create space where people can actually ask, “What did I learn about myself that should change how I build?” Not wellness theater. Real reflection, linked to decisions. Maybe it’s a monthly values retro where product choices get mapped to core commitments like “does this increase user agency?” or “does this reduce harm without paternalism?”
2. Put Compassion in the OKRs
We measure what we value. If compassion, dignity, and relational health aren’t in the KPIs, don’t be surprised when they’re not in the product. Define them. Operationalize them. Track them. It will be messy and imperfect. Do it anyway.
3. Diversify Who Holds the Red Button
If one CEO or one fund decides when to pull the plug, you haven’t built safety… you’ve built a single point of failure. Distribute power. Bring in ethicists, clinicians, users, frontline communities. Give them veto power, not “advisory” stickers.
4. Slow Down on Purpose
You cannot integrate ethical insights at sprint speed. Bake in sabbaticals, retreats, silent days. (Yes, in tech. Yes, paid.) Nervous systems matter. Dysregulated leaders build dysregulated systems.
5. Teach Nervous System Literacy
Trauma isn’t just a buzzword. It’s a predictor of how people use power. Dissociated people dissociate others. Teach teams how to feel their bodies, name their states, and regulate. This sounds soft. It is not. It’s risk mitigation.
6. Co-Create With the People Most Affected
If psychedelics teach anything, it’s that insight without relationship devolves into self-importance. Same here: don’t build “for” people; build “with” them. Especially people historically steamrolled by “innovation.”
Psychedelics Are a Tool, Not a Trend
Quick caveat: I am not saying “dose your executive team and the alignment problem is solved.” Please don’t. Psychedelics are powerful tools that demand respect: screening, preparation, facilitation, integration. They are also not the only doorway. Long-term meditation practice, somatic therapy, contemplative prayer, grief circles—there are many paths to the same clearing.
The point is the turn inward, with intention. The willingness to sit in discomfort, meet the parts of ourselves we’ve outsourced to productivity, and listen. That’s the soil where different values actually take root.
Taking Personal Revelation into Collective Practice
Here’s a possible arc:
Phase 0: Admit We’re Making It Up as We Go
Humility first. The AI field is running a giant experiment on society. Let’s own that. Healers can name this aloud in rooms where it’s not being said.
Phase 1: Cultivate Inner Stillness
Daily practice. Ten minutes of meditation. Body scans between meetings. Whatever works. This regulates the system so we’re not making civilization-scale choices from fight-or-flight.
Phase 2: Open the Door
For those who feel called and it’s legal/ethical: psychedelic-assisted sessions aimed at values clarification, shadow work, grief processing. Not escapism, but confrontation.
Phase 3: Integrate Into Policy & Product
Write the insights down. Translate them into principles, then into code review checklists, hiring policies, incentive structures. “Compassion” isn’t helpful unless it’s reflected in who gets promoted and what gets shipped.
Phase 4: Share Power, Share Process
Transparency reports that include ethical dilemmas faced and how they were navigated. Community councils that can flag harms early. Mechanisms for slowing or halting deployment that don’t require a whistleblower to risk their life.
Phase 5: Iterate Forever
This isn’t a one-and-done. Consciousness work, like safety work, is ongoing. You don’t “achieve” alignment any more than you “achieve” enlightenment. You practice.
Why Healers Matter Here
Therapists, facilitators, and somatic practitioners know how to hold space for the messy middle. We know what it looks like when someone’s ego is thrashing. We know how to sit in silence without rushing to fix. These are the exact skills missing in most boardrooms talking about AI.
So:
Offer to facilitate integration circles for tech teams after retreats.
Translate therapeutic concepts (parts work, attachment, nervous system regulation) into tech-friendly language.
Advocate for human-centered metrics inside orgs flirting with “ethics-washing.”
Model transparency about your own process: where you get hooked, where you’re complicit, where you’re learning.
We don’t need every engineer to become a therapist. We do need enough healers in the mix to change the texture of the conversation.
Closing: Caring for the Light
At the end of the day, this is about tending the light of human consciousness while we build machines that could dim it or help it shine. AI will magnify our intentions. Psychedelics can help clarify those intentions. Healers can midwife them into practice.
If we choose to anchor in love then AI can be a mirror that reflects that back, scaled. If we don’t, it’ll be another hammer looking for a nail in the shape of a quarterly report.
Our time is limited. The stakes are high. The work is inward and outward, personal and systemic. Let’s re-orient and build like we remember what really matters.
Thanks for reading. If this sparked something, reach out. cole@iccpbc.com
About the Author:
Cole Butler, LPCC, ADDC, MACP
Cole Butler, LPCC, ADDC, MACP is a Mental Health Therapist and Writer. He co-founded Integrative Care Collective in 2023 to support mental health providers who are passionate about integrative care and to foster community among them. You can learn more about him and connect on LinkedIn: https://www.linkedin.com/in/cole-butler/