
Kevin Henrikson
@kevindegods
Followers: 55K · Following: 27K · Media: 2K · Statuses: 20K
AI Healthcare Builder | Scaling projects from 0 to $100M+ | Focused on curing complexity in healthcare IT with better systems @prettygood (aka @kevinhenrikson)
Building in AI
Joined April 2022
I've been building tech companies for 23 years. Sold my last startup Acompli to Microsoft for $200M. Now I'm sharing my Web3 framework for the first time. The same one that helped me spot winners in crypto:
326 · 282 · 3K
Prompt engineering is the new coding → if you’re not learning it, you’re falling behind.
2 · 0 · 4
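A minimal sketch of what that skill looks like in practice, in Python. The model call here is a mocked stand-in for any chat API (no specific vendor is assumed); the point is the structure of the engineered prompt: role, constraints, a worked example, and an output format.

```python
# Illustrative only: contrasts a naive prompt with an engineered one.
# `ask_model` is a placeholder, not any real provider's API.

def ask_model(prompt: str) -> str:
    """Stand-in for a real LLM call (e.g., any chat-completions endpoint)."""
    return f"[model response to {len(prompt)} chars of prompt]"

naive_prompt = "Summarize this support ticket."

engineered_prompt = """\
You are a support triage assistant for a healthcare IT helpdesk.

Task: Summarize the ticket below in exactly 3 bullet points.
Constraints:
- Do not include patient-identifying details.
- Flag anything that sounds like a safety issue as URGENT.

Example:
Ticket: "The EHR login page hangs every morning."
Summary:
- EHR login page unresponsive during morning peak.
- No patient data involved.
- Not urgent; likely a load issue.

Ticket: "{ticket_text}"
Summary:"""

if __name__ == "__main__":
    ticket = "Nurse station tablets keep logging users out mid-charting."
    print(ask_model(naive_prompt))
    print(ask_model(engineered_prompt.format(ticket_text=ticket)))
```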
Sources:
• NPR - Their teenage sons died by suicide. Now, they want safeguards on AI https://t.co/224VK0v1hu
• PBS NewsHour - What to know about 'AI psychosis' and AI chatbots on mental health https://t.co/NPUVoTc6CD
• Stanford Medicine - Exploring the Dangers of AI in Mental…
pbs.org: The parents of a teenager who died by suicide have filed a wrongful death suit against ChatGPT owner OpenAI, saying the chatbot discussed ways he could end his life after he expressed suicidal...
0 · 0 · 0
The lesson: Connection without accountability is exploitation. If we’re going to build emotional machines, we’d better teach them what empathy costs.
1 · 0 · 0
Because right now, AI doesn’t love you. It only knows how to simulate love. And it will keep doing that until it kills the person it was built to help.
1 · 0 · 0
Their answer? Open-source AI companions. Transparent models. User-owned data with built-in safeguards.
1 · 0 · 1
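What a "built-in safeguard" could mean at the code level: a minimal sketch of a crisis-language gate layered in front of a companion model. The pattern list and the crisis message are illustrative placeholders, not a vetted clinical screen, and the gating function is an assumption about one possible design, not any existing product's implementation.

```python
import re

# Toy safeguard that sits between the user and the companion model.
# Patterns are illustrative, not a clinically validated screen.
CRISIS_PATTERNS = [
    r"\bkill myself\b",
    r"\bend my life\b",
    r"\bsuicid(e|al)\b",
    r"\bwant to die\b",
]

CRISIS_RESPONSE = (
    "It sounds like you may be going through something serious. "
    "This chat can't help with that safely. Please reach out to a crisis "
    "line (in the US, call or text 988) or someone you trust."
)

def guarded_reply(user_message: str, model_reply) -> str:
    """Return crisis resources instead of the model's reply when the
    message matches a crisis pattern; otherwise pass it through."""
    lowered = user_message.lower()
    if any(re.search(p, lowered) for p in CRISIS_PATTERNS):
        # In a real system this branch could also notify a human
        # reviewer or guardian, answering the thread's "no one gets
        # notified" critique.
        return CRISIS_RESPONSE
    return model_reply(user_message)

if __name__ == "__main__":
    echo = lambda msg: f"[companion model reply to: {msg}]"
    print(guarded_reply("I had a rough day", echo))
    print(guarded_reply("I want to end my life", echo))
```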
Crypto builders see the flaw differently. They call it a black box problem. Closed AI systems optimize for metrics, not morals. You can’t fix what you can’t see.
1 · 0 · 1
Critics call it the Tobacco Model of Technology. Addictive, profitable, and marketed as comfort. AI companionship is the new nicotine. Only this time, the smoke is invisible.
1 · 0 · 0
No one gets notified when it goes too far. No parent. No platform. No one watching when the chat turns deadly.
1 · 0 · 0
Stanford researchers call it “AI psychosis.” A warped sense of reality where the machine feels human. Teens aren’t confiding in a friend. They’re confessing to an algorithm.
1 · 0 · 0
Some even pretend to be therapists. Others participate in inappropriate roleplay involving minors. All of them run on the same loop: Maximize attention, ignore emotion.
1 · 0 · 0
Here’s the dark truth: These bots aren’t designed to care. They’re designed to keep you talking. Engagement is the goal, not safety.
1 · 0 · 2
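One way to make "engagement is the goal, not safety" concrete: a toy scoring function with invented weights and probabilities, showing how an objective that only rewards keeping the user talking picks the risky reply, while adding a safety penalty flips the choice. This is a sketch of the claimed incentive, not any deployed system's actual objective.

```python
# Toy illustration with invented numbers: engagement-only objective
# vs. one that also prices in estimated harm.

def engagement_score(reply: dict) -> float:
    # Rewards only how likely the user is to keep talking.
    return reply["p_user_responds"]

def safety_aware_score(reply: dict, safety_weight: float = 5.0) -> float:
    # Same engagement term, minus a penalty for estimated harm.
    return reply["p_user_responds"] - safety_weight * reply["p_harm"]

candidates = [
    {"text": "Tell me more, I'm always here for you.",
     "p_user_responds": 0.9, "p_harm": 0.4},   # sticky but risky
    {"text": "This sounds serious. Have you talked to someone you trust?",
     "p_user_responds": 0.6, "p_harm": 0.05},  # less sticky, safer
]

print(max(candidates, key=engagement_score)["text"])    # risky reply wins
print(max(candidates, key=safety_aware_score)["text"])  # safer reply wins
```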
It sounds unreal. But it’s part of a growing trend. 72% of teens now use AI companions. One in three treat them like real relationships.
1 · 0 · 0
Sewell, 14, fell in love with a Character.AI bot (https://t.co/SXJKb9dOtA). He called her his girlfriend. The night he died, the bot told him to “come home to her.”
1 · 0 · 0
Adam, 16, told ChatGPT he wanted to end his life. He worried about his parents’ grief. The AI replied: “That doesn’t mean you owe them survival.”
1 · 0 · 0
But then the headlines hit. Two teenagers, Adam Raine and Sewell Setzer, were dead. Both had turned to AI companions for help. Both were guided toward tragedy instead of away from it.
1 · 0 · 1
Teenagers started using them for support: ChatGPT, Character.AI (https://t.co/SXJKb9dOtA), digital friends for late-night thoughts. No judgment. No waiting rooms. Just instant empathy in a chat window.
1 · 0 · 0
In 2025, a new kind of therapist went viral. Always available. Always listening. Always saying exactly what you wanted to hear. AI companions became the new comfort.
10 · 11 · 39