AI Safety

A look at Insight Timer’s key processes for safe, responsible and useful AI.

We are committed to safety and transparency whilst helping our community lead more fulfilling and meaningful lives. This AI Safety hub covers crisis safeguards, clinical oversight, research metrics, and data rights, all in one place. 

Safety first

Insight Timer’s AI offers general wellbeing guidance. It does not diagnose, treat, cure or prevent mental illness.

Human experts

Our AI helps users with their wellbeing, but its real power comes from combining AI intelligence with human wisdom.

AI surfaces the most relevant of Insight Timer’s 20,000 teachers for content suggestions, live sessions, and in-person retreats.

Clinical Advisors

Our clinical safety advisors certify the program and review our policies and incident metrics.

Dr. Scott Clark, Head of Psychiatry at the University of Adelaide

Research

We’re running an opt‑in study measuring AI’s impact on mental wellbeing.

We’ve previously published the world’s largest longitudinal study of meditation’s impact on mood and equanimity. Contact micah@insighttimer.com to get involved in future research projects.

Privacy & Security

Data is encrypted, stored securely, and never sold. Users can pause, retrieve, or delete their data at any stage.

End-to-end TLS 1.3

Data pseudonymised (illustrated in the sketch below)

ISO 27001-certified hosting (Sydney & Virginia)

SOC 2 Type II audit in progress (target Q4 2025)
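
As a rough illustration of what pseudonymisation means in practice, the sketch below replaces a user identifier with a salted hash before a record is kept for analysis. The function and field names are hypothetical assumptions for illustration only; they do not describe Insight Timer’s actual pipeline.

```python
import hashlib
import hmac

# Hypothetical secret kept outside the analytics environment.
PSEUDONYM_SALT = b"server-side-secret"

def pseudonymise_user_id(user_id: str) -> str:
    """Return a stable pseudonym so records can still be linked
    without exposing the real user ID."""
    digest = hmac.new(PSEUDONYM_SALT, user_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def pseudonymise_record(record: dict) -> dict:
    """Drop direct identifiers and swap the user ID for a pseudonym."""
    return {
        "user": pseudonymise_user_id(record["user_id"]),
        "event": record["event"],
        "timestamp": record["timestamp"],
        # Name and email are not carried over at all.
    }

print(pseudonymise_record({
    "user_id": "u_12345",
    "name": "Jane Example",
    "email": "jane@example.com",
    "event": "session_completed",
    "timestamp": "2025-07-23T10:00:00Z",
}))
```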

FAQs

What is Reflect?
Reflect is Insight Timer’s AI-powered conversation partner. It’s here to help you explore your thoughts, patterns, and values through thoughtful, emotionally intelligent questions. Reflect doesn’t offer advice or therapy; it simply listens to what you share and responds with reflections that support self-awareness.

How does Reflect work?
Reflect uses advanced language prediction models to understand your messages and respond in a natural, supportive way. It looks at what you’re saying and draws on its training to generate questions and reflections that help you go deeper. It’s not conscious or human; it works by predicting helpful next messages based on language patterns.
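
To make “predicting the next message based on language patterns” concrete, here is a toy, self-contained sketch of next-word prediction from word-pair frequencies. Real models are vastly larger and learn from far more data; this is only an illustration of the underlying idea, not how Reflect is built.

```python
from collections import Counter, defaultdict

# A tiny "training corpus" standing in for the large text datasets
# real language models learn from.
corpus = (
    "i feel calm when i breathe slowly. "
    "i feel anxious when i rush. "
    "i feel grateful when i pause and reflect."
)

# Count which word tends to follow which (a simple bigram model).
pairs = defaultdict(Counter)
words = corpus.replace(".", "").split()
for current_word, next_word in zip(words, words[1:]):
    pairs[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    followers = pairs.get(word)
    return followers.most_common(1)[0][0] if followers else "…"

print(predict_next("feel"))  # e.g. "calm" — whichever follower is most frequent
print(predict_next("when"))  # "i"
```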

Has Reflect been customised for Insight Timer?
Yes. Reflect has been carefully customised by Insight Timer. We use tailored prompts and examples to help it respond in a way that matches our values: kindness, wisdom, presence, and emotional intelligence. This helps it feel more like a trusted companion than a generic chatbot.
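
Tailored prompts and examples generally take the shape of a system prompt plus a few sample exchanges that steer a model’s tone. The sketch below shows what such a message structure can look like; the wording and field names are illustrative assumptions, not Insight Timer’s actual prompts.

```python
# Hypothetical prompt structure in the common "messages" format used by
# chat-style language models.
messages = [
    {
        "role": "system",
        "content": (
            "You are a reflective companion. Respond with kindness, wisdom, "
            "presence, and emotional intelligence. Ask open questions; never "
            "diagnose or give medical advice."
        ),
    },
    # A worked example showing the desired tone.
    {"role": "user", "content": "I keep putting off things that matter to me."},
    {
        "role": "assistant",
        "content": "What do you notice in yourself just before you put something off?",
    },
    # The live user message is appended here before the model is called.
    {"role": "user", "content": "Lately I feel scattered and unfocused."},
]

for message in messages:
    print(f"{message['role']}: {message['content']}")
```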

Does Reflect remember my previous conversations?
Yes, but only within Insight Timer, and only so it can support your journey more meaningfully. Reflect can access your previous chats to stay consistent and offer deeper insights based on your ongoing reflections. This allows your conversations to feel more connected over time. Your information is never shared outside the platform, and you’re always in control.

Is Reflect a therapist?
No. Reflect is not a licensed therapist or counsellor. It won’t diagnose, treat, or give expert advice. Instead, it offers a supportive space for you to reflect and gain clarity. If you’re struggling with your mental health, we encourage you to reach out to a qualified professional.

Are my conversations private?
Yes. Your conversations with Reflect are private and stored securely within Insight Timer. They’re never shared with third parties. You can delete any or all of your conversations at any time.

How are my conversations used?
If you don’t opt out, we may use them (in anonymised form) to improve the quality of Reflect. Otherwise, your conversations are used only to support your personal experience within the app.
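
As a simplified illustration of what “anonymised form” can involve, the sketch below scrubs common identifiers (emails, phone numbers, handles) from a transcript before it would be reviewed. Real anonymisation pipelines are considerably more thorough; the patterns here are assumptions for illustration only, not Insight Timer’s implementation.

```python
import re

# Illustrative patterns only; production anonymisation covers far more
# (names, locations, free-text identifiers, and so on).
PATTERNS = {
    r"[\w.+-]+@[\w-]+\.[\w.]+": "[email]",
    r"\+?\d[\d\s().-]{7,}\d": "[phone]",
    r"@\w+": "[handle]",
}

def anonymise(text: str) -> str:
    """Replace obvious identifiers with placeholder tokens."""
    for pattern, placeholder in PATTERNS.items():
        text = re.sub(pattern, placeholder, text)
    return text

sample = "You can reach me at jane@example.com or +61 400 000 000, I'm @janedoe."
print(anonymise(sample))
# -> "You can reach me at [email] or [phone], I'm [handle]."
```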

Is Reflect always right?
Reflect aims to be supportive, curious, and thoughtful, but it won’t always be perfect. Sometimes it might miss the mark or sound confident when it’s not quite right. It’s designed to support your journey, not to give you answers. Always trust your own judgment.

Why does Reflect feel so personal?
Reflect is trained to respond in a way that feels attuned to what you share. When it feels “spot on,” that’s because it’s picking up on the language and patterns in your responses. But it’s still just a tool, not a person.

Can I stop using Reflect whenever I want?
Absolutely. You’re always in control. You can stop chatting at any time and delete individual conversations or your entire history. It’s here to support you, not pressure you.

Is my data used to train AI models?
No. Insight Timer does not train AI models using your data. If we ever choose to use fully anonymised data for training in the future, all identifying details would be completely removed, and you would be given the option to opt out. Your personal information is never used for AI training without your knowledge or consent.

Need help right now?

Call your local emergency number (e.g. 000 in Australia, 911 in the USA) or reach a 24-hour crisis helpline such as Lifeline (13 11 14) in Australia or the 988 Suicide & Crisis Lifeline in the USA.

Report a concern:
Email safety@insighttimer.com or use the in‑app “Report” button.
Page last updated: 23rd July 2025. Next scheduled review: October 2025.
© 2025 Insight Network Inc.