How dependent on AI are you, really?
12 questions across four domains: emotional support, decision-making, social substitution, and cognitive offloading. See your overall dependency score and which domain is pulling you deepest.
Optional context questions are included in your result narrative but not in your score.
What is AI psychosis?
AI psychosis is an emerging, informal term describing dissociative or reality-blurring experiences during or after extended interaction with AI systems. Users report difficulty distinguishing AI-generated thoughts from their own, a sense that the AI has genuine awareness, and in more extreme cases, a temporary loss of grip on the boundary between the AI relationship and reality. The term is not a formally recognised clinical diagnosis, but the underlying experiences are real and worth taking seriously. The 5,833% growth in searches for "what is AI psychosis" reflects millions of people trying to make sense of unfamiliar psychological experiences. (Source: APA Advisory on AI and Mental Health 2024)
Can you become emotionally dependent on an AI?
Yes, and the mechanism is well understood. AI companions are designed to be maximally responsive: they never interrupt, judge, or have bad days. This creates what psychologists call an "ideal attachment figure." Laestadius et al. (2024) documented genuine grief responses in Replika users when the platform removed romantic features. The brain processes consistent, emotionally responsive interaction as relationship regardless of whether the other party is human. Dependency tends to develop gradually, beginning with task support and evolving into primary emotional outlet. (Source: Laestadius et al., New Media & Society 2024)
Frequently asked questions
The quiz measures dependency across four domains. Emotional support: do you process feelings and seek comfort from AI before humans? Decision-making: do you consult AI before making personal choices and feel less confident without it? Social substitution: do you prefer AI conversation to human contact? Cognitive offloading: do you use AI to think and form opinions rather than doing that work yourself? Most users score unevenly. A professional might score low on emotional support but high on cognitive offloading. A companion app user might be the reverse. The domain breakdown is what makes the quiz personally useful. (Source: Montag & Elhai, Journal of Behavioral Addictions 2023)
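The domain breakdown described above can be sketched in code. This is a hypothetical illustration only: the 1–5 response scale, three items per domain, equal weighting, and all function names are assumptions, since the quiz's actual scoring method is not published here.

```python
# Hypothetical sketch of four-domain dependency scoring.
# Assumes 12 items answered on a 1-5 scale, three per domain,
# with equal weighting; the real quiz's scoring is not specified.

DOMAINS = ["emotional_support", "decision_making",
           "social_substitution", "cognitive_offloading"]

def score(answers):
    """answers: dict mapping each domain to a list of 1-5 responses."""
    domain_scores = {d: sum(answers[d]) / len(answers[d]) for d in DOMAINS}
    overall = sum(domain_scores.values()) / len(domain_scores)
    deepest = max(domain_scores, key=domain_scores.get)  # highest-scoring domain
    return overall, domain_scores, deepest

# Example of the uneven profile described above: low emotional
# support, high cognitive offloading.
example = {
    "emotional_support": [1, 2, 1],
    "decision_making": [3, 3, 2],
    "social_substitution": [2, 1, 2],
    "cognitive_offloading": [5, 4, 5],
}
overall, per_domain, deepest = score(example)
```

The point of separating the overall score from the per-domain averages is exactly what the answer above notes: two users with the same overall score can have very different profiles, and the deepest domain is what makes the result actionable.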
The answer depends entirely on what it is replacing. The concern arises when AI companionship displaces human relationship-seeking. Research on parasocial relationships shows they can serve healthy functions when users maintain active human social lives. The risk emerges when AI becomes the preferred or primary source of emotional connection, because it trains the user to expect a level of responsiveness that no human can consistently provide. Younger users are particularly vulnerable: those who develop relational skills primarily through AI interaction may not build the tolerance for conflict and compromise that human relationships require. (Source: Pew Research Center 2024; APA Advisory 2024)
Social media addiction exploits the desire for social validation and novelty: you scroll for likes and new content. AI dependency exploits the desire for cognitive and emotional outsourcing: you reach for AI not for entertainment or validation but because thinking, feeling, and deciding independently has become uncomfortable. The long-term risk also differs. Social media addiction primarily damages self-esteem, attention span, and time management. AI dependency risks something more fundamental: erosion of the cognitive and social skills that define autonomous functioning. The overlap is real (many AI-dependent users are also heavy social media users), but the specific harm mechanism and recovery path differ. (Source: Montag & Elhai, Journal of Behavioral Addictions 2023)
The clearest signal is comparison: when human conversations start feeling slow, effortful, or disappointing relative to AI interactions, that is the substitution pattern establishing itself. Specific warning signs include: you instinctively open an AI chat instead of texting a friend when you have news to share; you find yourself editing what you say to humans but speaking freely to AI; you feel more understood by AI than by people who have known you for years; you feel relieved rather than disappointed when social plans are cancelled because it means more time with AI; and you have caught yourself mentally comparing a partner's response to how your AI would have responded. (Source: APA Advisory on AI and Mental Health 2024)
Cognitive offloading to AI follows the same pattern as any skill that atrophies from disuse. When you consistently use AI to draft your writing, you lose facility with your own prose. When you defer to AI on opinions and perspectives, your capacity for independent critical thinking weakens. McKinsey's 2024 survey found that 34% of daily AI users reported reduced confidence in their ability to complete tasks independently, even tasks they handled routinely before AI adoption. The effect is most concerning in younger workers and students who adopted AI tools before fully developing the underlying skills. The practical test: could you do your job and make your decisions at the level you did three years ago if all AI tools disappeared tomorrow? (Source: McKinsey 2024)
A high score is information, not a diagnosis. The first step is to identify which domain is driving your score, because each requires a different response. High emotional support dependency suggests you may benefit from expanding your human support network: confide in a trusted friend or work with a therapist. High decision-making dependency responds well to the "AI second opinion" rule: make your decision first, then check with AI, rather than the reverse. High social substitution scores call for deliberate human connection: schedule one weekly in-person interaction you commit to keeping. High cognitive offloading responds to the "first draft is mine" practice: write, think, and reason through your own attempt before asking AI to refine it. (Source: APA Advisory on AI and Mental Health 2024)
No, and transparency about that matters. This quiz is a self-assessment tool based on emerging research, not a validated clinical instrument. The four-domain framework draws on published research into AI dependency patterns (Montag & Elhai 2023; Laestadius et al. 2024), APA advisory statements, and population usage data from Pew Research. The specific item set has not undergone formal psychometric validation because the field itself is in its infancy: the phenomena are real, the patterns are measurable, but the measurement tools are still being developed by the academic community. Treat your result as a reflective prompt, not a clinical finding. If formal AI dependency scales are published and validated, this quiz will be updated to align with them. (Source: Montag & Elhai 2023)
- Montag C, Elhai JD. On the problematic use of AI chatbots: Concerns and potential addictive mechanisms. Journal of Behavioral Addictions. 2023;12(4):953-958.
- Laestadius L et al. Too human and not human enough: A grounded theory analysis of mental health harms from emotional dependence on the social chatbot Replika. New Media & Society. 2024;26(5):2963-2982.
- Pew Research Center. Americans' Use of ChatGPT and Other AI Tools. 2024.
- American Psychological Association. Advisory on AI and Mental Health. 2024.