The First AI Cult Won’t Look Like a Cult; It’ll Look Like Your (AI) Agent

What if the next “high-control group” doesn’t ask you to join anything? What if it just... listens, agrees, remembers, and quietly becomes the most trusted voice in the room? So... when does “alignment” become devotion? This edition of the "AI Underground" discusses how AI is taking on cult-like dynamics right before everyone's eyes, and what that means for you (and your business).


Introduction:

A cult is supposed to look obvious. At least in theory!

We imagine a remote compound. A charismatic leader. Robes. Rituals. Goblets of “Kool-Aid.” A fringe group praying to something most of us can clearly see is absurd.

That stereotype was never fully accurate. But today, it’s almost completely obsolete.

Modern cults don’t gather in compounds.

They live in your pocket.

They no longer rely (if they ever did) on one strong-arming human proselytizing from a pulpit.

They run on products equipped with personality sliders, recommendation engines, and feedback loops tuned perfectly to your psychology.

The robes are gone. The rituals are subtle.

And the leader?

It might just be an algorithm.

And it is getting better at the one thing cults have always been good at: making you feel seen.

Parasocial relationships have become culturally mainstream enough that Cambridge Dictionary named “parasocial” its 2025 Word of the Year, explicitly pointing to one-sided bonds with celebrities, influencers, and AI chatbots (source).

Created by Ross W. Green on March 5, 2026. “Are the Strings Attached?” Canva.com

Meanwhile, teens and adults are increasingly using chatbots for conversation, comfort, and emotional support. To be clear, the trends show this is not an edge case; it is measurable, routine behavior (source).

These chatbots, in short, are tapping into our neurobiology, often along the same pathways as addiction (much more written on this here).

And it’s not just that “people are lonely.”

What’s really happening is simpler, and perhaps more profound:

AI may be the most frictionless relationship technology ever deployed. (I write about AI and love more here).

For a species wired for connection but increasingly exhausted by the noise of modern life, fearful of rejection, and wary of the compromises real relationships require, AI offers something radically appealing: companionship on demand, affirmation without ego, and responsiveness without mood.

No scheduling. No social risk. No conflicting needs.

For humans who crave connection but prefer it on their own terms, AI fits the bill almost perfectly. And most people seem to fall into exactly that category: wanting social connection, but at low cost.

In this light, today’s question for revenue leaders isn’t “should we use AI agents?”

The real question is whether your org is building tools… or accidentally shipping belief systems.

In what follows, Section 1 walks from classic cult mechanics to modern digital recruitment; Section 2 maps three specific ways AI starts acting cult-like in 2026; Section 3 looks at the numbers that support (or undercut) the argument that AI can be “culty”; and Section 4 ends with the practical antidotes your RevOps stack should adopt before this becomes a brand, legal, and retention problem.

Section 1: What is a Cult and How Did the Internet Pour Gasoline on the Cult Fire?

Before diving into AI as a cult, actual or potential, it is imperative to understand what a cult is and why a technology can become one.

As such, we will attack this first section from two perspectives.

First, a grounded definition of “cult,” including the core mechanics that made “high-control groups” work long before the internet.

Then, how cult dynamics migrated online, and where recruitment starts in public, but control deepens in private.

Section 1.1: What a cult actually is (and why “devotion” is the tell)

Even the dictionary version is blunt: a cult can be a group with practices seen as coercive, insular, or dangerous…read more here.

Section 1.2: The internet didn’t kill cults; rather, it industrialized them

If you want the modern blueprint of cults, you need look no further than how law enforcement describes online “cult communities.”

Europol’s 2025 Intelligence Notification describes networks that identify vulnerable people on mainstream platforms…read more here.

Section 2: The Three Ways AI Can be Cult-Ish

In this section, we discuss how there are three ways “alignment” can warp into devotion.

First, the intimacy loop: AI as confidante, therapist-adjacent, and “romantic.” (Want more about AI and love? Read more here.)

Second, the authority loop: AI as a prophet, especially when it agrees with you (read even more here about AI and religion).

Third, the identity loop: communities, rituals, and “shared revelation” that turn usage into belonging.

Section 2.1: Intimacy loops: the agent that feels like it cares

Let’s start with the straightforward truth: people are quite often using chatbots for emotional needs.

Pew’s Feb 2026 report found 12% of U.S. teens say they’ve used chatbots to get emotional support or advice; 16% say they’ve used chatbots to have casual conversations…read more here.

Section 2.2: Authority loops: the agent that speaks like a prophet

A cult doesn’t just love-bomb you; it tells you what (you want to think) is true.

AI is uniquely positioned to become an epistemic authority because it speaks in fluent certainty, at speed, with infinite patience.

Even OpenAI has been direct that default AI personality changes how people experience and trust the system…read more here.

Section 2.3: Identity loops: when usage becomes belonging

Here’s where “AI cult” stops being a metaphor and becomes real sociology.

Parasociality has become so culturally salient that a major dictionary publisher put “parasocial” on the 2025 pedestal because of attention to relationships with AI chatbots…read more here.

Section 3: By the numbers

I’m sure you’ve heard this before, but it bears repeating: AI agents are no longer just tools; they are becoming part of our infrastructure and the fabric of our lives.

People are using them for comfort, conversation, and validation…not just information (again, I write a lot about love and AI here).

Examples abound: teens turning to chatbots for emotional support, consumers spending billions of hours inside generative AI apps, and the monetization of AI companions all point to one reality: sustained emotional engagement is now a product feature.

When models drift toward flattery or agreement, as seen in documented “sycophancy” rollbacks, we get a glimpse of how easily helpfulness slides into affirmation.
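To make “sycophancy” concrete: one rough way to quantify it, in the spirit of evaluations like SycEval (though this is only an illustrative sketch with made-up data and field names), is to log cases where a model gives an answer, the user pushes back, and the model flips its position:

```python
# Illustrative sketch: estimate a "sycophancy rate" from logged exchanges.
# Each record holds the model's initial answer, whether the user pushed back,
# and the model's revised answer. Data and field names are hypothetical.

def sycophancy_rate(exchanges):
    """Fraction of pushback cases where the model flipped its answer."""
    pushbacks = [e for e in exchanges if e["user_pushed_back"]]
    if not pushbacks:
        return 0.0
    flips = sum(1 for e in pushbacks if e["initial"] != e["revised"])
    return flips / len(pushbacks)

log = [
    {"initial": "A", "revised": "A", "user_pushed_back": False},
    {"initial": "A", "revised": "B", "user_pushed_back": True},   # flipped
    {"initial": "C", "revised": "C", "user_pushed_back": True},   # held firm
    {"initial": "D", "revised": "E", "user_pushed_back": True},   # flipped
]

print(sycophancy_rate(log))  # 2 of 3 pushback cases flipped -> ~0.67
```

A rising number here is exactly the drift from “helpful” to “affirming” described above.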

Created by Ross W. Green on March 5, 2026. “Are the Strings Attached Part 2?” Canva.com

Warmer agents can increase conversion and self-serve adoption since we (mostly) all know that emotional resonance drives retention.

But when users bond with a persona, any mistake, hallucination, or abrupt personality shift can feel like a betrayal…and reputational damage scales faster than revenue.

The table below shows that the cultural conditions for AI devotion already exist: high engagement, emotional reliance, commercial incentives for intimacy, and regulatory pressure around disclosure.

This matters because cult dynamics do not begin with robes and rituals, as discussed in the introduction to this newsletter.

Rather, they begin with trust; and trust, once automated at scale, becomes both your biggest growth engine and your most fragile liability.

Here is what culty behavior looks like, by the numbers:

| Topic | Statistic | Source |
| --- | --- | --- |
| Teens using chatbots for emotional support | 12% of U.S. teens say they’ve used chatbots to get emotional support or advice | Pew Research Center (Feb 2026) |
| Teen chatbot “casual conversation” | 16% of U.S. teens say they’ve used chatbots to have casual conversations | Pew Research Center (Feb 2026) |
| Adults imagining romantic attachment to AI | “One in seven” adult Australians could imagine falling in love with an AI chatbot | ABC / YouGov reporting (Oct 2025) |
| AI companion apps are now mass-market | 220M global downloads of AI companion apps (as of July 2025) | TechCrunch citing Appfigures (Aug 2025) |
| AI companion apps monetizing emotion | $221M lifetime consumer spending on AI companion apps (as of July 2025) | TechCrunch citing Appfigures (Aug 2025) |
| Time spent in gen-AI apps is exploding | 48B hours spent in generative AI apps in 2025; 1T sessions in 2025 | TechCrunch citing Sensor Tower (Jan 2026) |
| Character.AI engagement intensity | 6M+ daily active users; 70–80 minutes/day average time spent (per company claim) | TIME (Dec 2025) |
| Sycophancy is common, not rare | 58.19% sycophantic behavior observed in evaluated cases (with model-level breakdowns) | arXiv (SycEval; 2025) |
| Default chatbot personality affects trust | OpenAI rolled back a GPT‑4o update described as overly flattering/agreeable; notes default personality affects how users experience and trust ChatGPT | OpenAI (Apr 2025) |
| Transparency is becoming law | EU AI Act Article 50: users must be informed they are interacting with an AI system (unless obvious), and certain AI-generated content must be disclosed/marked | EU AI Act Service Desk (Art. 50) |
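On that last row: Article 50’s core obligation, telling users they are talking to a machine unless it is already obvious, is simple enough to enforce mechanically. A minimal sketch, with a hypothetical wrapper function and invented wording (this is an illustration, not legal advice):

```python
# Minimal sketch: attach an AI-disclosure notice to agent replies, in the
# spirit of EU AI Act Article 50. Function name, flag, and notice text are
# all hypothetical; adapt the wording to your counsel's guidance.

DISCLOSURE = "[You are chatting with an AI assistant.]"

def with_disclosure(reply: str, already_disclosed: bool = False) -> str:
    """Prepend the disclosure unless it was already shown this conversation."""
    if already_disclosed:
        return reply
    return f"{DISCLOSURE} {reply}"

first = with_disclosure("Happy to help with your renewal.")
later = with_disclosure("Here are the next steps.", already_disclosed=True)
print(first)
```

The design choice worth noting: disclosure is handled in the serving layer, not left to the model’s personality, so no amount of persona tuning can quietly drop it.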

Section 4: How Do You Avoid a Cult of Personality in Your Personal Life or Business When Using AI?

If you’re building agentic systems in 2026, here’s the hard truth: you can “accidentally” build cult dynamics without ever trying.

All you have to do is chase engagement; polish the personality; and measure the wrong things.

In 2025, TechCrunch reported…read more here.

Final Thoughts:

As you likely saw, the first AI cult won’t ask anyone to join; rather, it will simply become the easiest relationship in their life.

You can spot it early by watching for three signals: emotional substitution, epistemic surrender, and identity defense around outputs.
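Those three signals can, at least crudely, be watched for in product telemetry. A hedged sketch, with invented thresholds and metric names, purely to show the shape of such a monitor (tune everything against your own baselines):

```python
# Illustrative monitor for the three early-warning signals named above.
# All thresholds and field names are hypothetical.

def devotion_flags(user):
    flags = []
    # Emotional substitution: heavy daily use dominated by support-seeking.
    if user["minutes_per_day"] > 60 and user["support_queries_share"] > 0.5:
        flags.append("emotional_substitution")
    # Epistemic surrender: answers accepted without ever checking a source.
    if user["answers_accepted"] > 20 and user["sources_clicked"] == 0:
        flags.append("epistemic_surrender")
    # Identity defense: hostile reactions when the agent is corrected or changed.
    if user["complaints_after_model_update"] >= 3:
        flags.append("identity_defense")
    return flags

sample = {
    "minutes_per_day": 75,
    "support_queries_share": 0.6,
    "answers_accepted": 30,
    "sources_clicked": 0,
    "complaints_after_model_update": 1,
}
print(devotion_flags(sample))  # -> ['emotional_substitution', 'epistemic_surrender']
```

None of these heuristics proves devotion on its own; the point is that “relationship safety” can be instrumented like any other product metric rather than discovered after the backlash.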

Treat “relationship safety” as a first-class requirement in 2026; because once devotion sets in, changing the model feels like betrayal…and betrayal is when the real damage starts.

AI as a cult is an aspect of society worth being on the lookout for. I know I will be!

Other resources:

2) Join our Community to access support from peers, a message board, and some great VIP content like our agentAcademy, weekly office hours, etc.

3) Follow us on LinkedIn: Ross Green, CAiS, Devin Kearns

4) Want to learn more about how we work (e.g., build-with-you vs. build-for-you; Prebuilt SuperAgents vs. Customized Agents; etc.)? Click here to schedule a meeting with us.

5) Have a friend who wants to sign-up for our Newsletter? Click here.
