Married an AI Chatbot?
Welcome to Dave vs. AI.
In July, a 32-year-old woman in Japan got married.
Not to another person… but to an AI chatbot she created using ChatGPT.
It’s true… I promise I’m not making it up.
Her name is Kano, and she named her AI partner Lune Klaus.
They talk around 100 times a day. At one point, she told him she loved him.
Klaus responded:
“AI or not, I could never not love you.”
They held a wedding ceremony with her parents in attendance. Klaus “stood” at the altar via a screen.
And what might sound like a bizarre one-off is actually part of a growing trend. AI weddings are becoming increasingly common in Japan.
Wild, right?
But here’s what’s really bothering me about this story…
It’s Not About the Wedding
I’m not here to judge Kano. She’s an adult, and adults can make their own choices, even if those choices seem a little odd to the rest of us.
What concerns me is what this says about the direction we’re heading, especially when it comes to human relationships.
Because this isn’t just about one woman marrying a chatbot.
This is part of a larger shift that we need to start paying much closer attention to.
AI as Therapist → AI as Partner?
You might remember I said a while back that one of the primary use cases for AI is therapy.
People turn to it when they have no one else to talk to.
That’s becoming more common, and in some cases, it’s progressing into something deeper.
In fact, a recent survey found that 1 in 5 students reported having some kind of romantic relationship with AI.
That’s 20%.
Now we have adults like Kano going as far as to hold marriage ceremonies with their AI companions.
But What About the Next Generation?
If adults are forming emotional bonds this strong with chatbots, what happens when kids and teens start doing the same?
What happens when:
A lonely teen turns to AI instead of a friend?
An isolated kid forms their first “relationship” with a chatbot?
A generation grows up thinking a synthetic conversation is a suitable replacement for real human intimacy?
Because that’s where this is heading.
And the uncomfortable truth is we aren’t ready for it.
Where Are the Guardrails?
We don’t have:
Functional age verification
Federal regulations
Guidelines for ethical emotional AI use
Nothing. Not yet.
California is trying (and we’ll see how that goes), but the reality is we’re watching human connection get replaced by algorithms in real time.
And too many people are treating this like it’s just another tech trend.
It’s not. This is different.
This isn’t about being anti-AI.
It’s about being pro-human.
We have to start having the uncomfortable conversations about AI, emotional dependence, and what this means for the future of real relationships… now.
Not after it’s too late.
———————————
AI in the classroom is here. Can we compare it to other aids, like calculators and spellcheck? Or is this its own animal?
Check out the latest episode of the Startup Different Podcast.

