🛡️ Our commitment to child safety
Lyra has zero tolerance for any content or behaviour that exploits, endangers, or sexualises minors. We actively work to prevent children from encountering harmful content, and we cooperate fully with law enforcement and child safety organisations when required.
Minimum age: 13 years old
✓ Lyra requires users to be at least 13 years of age. Users in certain jurisdictions may be subject to a higher minimum age (e.g. 16 in the EU under GDPR).
✓ Users aged 13–17 are considered minors. We encourage parental involvement and provide a parental guidance section below.
✓ During registration, users must confirm their age. We do not knowingly allow children under 13 to create accounts.
✓ If we identify that a user is under 13, their account will be immediately suspended and all associated data deleted without notice.
✓ All AI-generated content is filtered through safety classifiers designed to prevent sexually explicit, violent, or otherwise harmful content from being produced.
✓ Content involving the sexual exploitation of minors (CSAM) is strictly prohibited and will never be generated, stored, or transmitted by Lyra.
✓ Lyra's AI companions are designed to decline and redirect conversations that attempt to introduce harmful, illegal, or age-inappropriate themes.
✓ We conduct ongoing red-teaming and safety evaluations of our AI systems to identify and close any potential gaps in our content filters.
✓ Romantic companion personas are only accessible to users who have confirmed they are 18 or older.
✓ We do not collect more personal information from minors than is reasonably necessary to provide the service.
✓ We do not serve targeted advertising to users we know to be under 18.
✓ Minor users' data is not used to train general-purpose AI models without verifiable parental consent.
✓ We comply with the Children's Online Privacy Protection Act (COPPA), the EU General Data Protection Regulation (GDPR), and other applicable child privacy laws.
✓ Parents or guardians may request access to, correction of, or deletion of a minor's data by contacting privacy@lyraapp.com.
✓ Lyra uses automated systems to detect and block attempts to generate or distribute child sexual abuse material (CSAM). Detected content is never stored and is immediately reported to the National Center for Missing & Exploited Children (NCMEC) and relevant law enforcement.
✓ Accounts confirmed to be involved in CSAM or child grooming are permanently banned and reported to authorities without warning.
✓ Our Trust & Safety team reviews flagged content and accounts within 24 hours.
✓ We maintain a dedicated child safety email monitored around the clock: childsafety@lyraapp.com.
✓ We cooperate fully and promptly with law enforcement agencies and court orders related to child safety matters.
If your child (aged 13–17) uses Lyra, we encourage open conversation about online safety. Here is what you can do:
✓ Talk to your teen about what AI companions are and are not — they are AI tools, not real relationships, and cannot replace human connection.
✓ Review app settings together. Encourage your teen to use the in-app report tool if they encounter anything uncomfortable.
✓ Set screen time limits using your device's built-in parental controls (iOS Screen Time / Android Digital Wellbeing).
✓ Request account data or deletion on behalf of your minor child by emailing privacy@lyraapp.com from a verified parent or guardian email.
✓ Report concerns immediately using the contact details below.
The following are absolutely prohibited on Lyra and will result in immediate account termination and referral to law enforcement:
✕ Any content that sexually exploits minors or depicts them in a sexual manner (CSAM), in any form.
✕ Grooming behaviour — attempting to build trust with a minor for the purpose of exploitation or abuse.
✕ Soliciting personal information (address, school, location) from minors.
✕ Using Lyra to arrange in-person meetings with minors.
✕ Sharing or distributing harmful, violent, or sexually explicit content involving minors through the app.
🚨 Report a Child Safety Concern
If you believe a child is in immediate danger, contact your local emergency services immediately. To report a concern specific to Lyra, use one of the options below. All reports are treated urgently and confidentially.
You can also report directly inside the app: tap any message → Report, or go to Settings → Report a Safety Issue.
Regulatory Compliance
Lyra's child safety practices are designed to meet or exceed the following standards and regulations:
🇺🇸 COPPA
🇪🇺 GDPR (Art. 8)
🇬🇧 UK GDPR & Children's Code
📱 Apple App Store Guidelines
▶️ Google Play Families Policy
Frequently Asked Questions
Is Lyra suitable for teenagers (13–17)?
Lyra is available to users aged 13 and older. However, some features — such as Romantic companion personas — are restricted to users who confirm they are 18 or older. We strongly encourage parents to discuss the app with their teenagers and supervise usage where appropriate.
How does Lyra verify user age?
Users must confirm their date of birth during registration. We also rely on app store age ratings and platform-level parental controls. If we receive credible information that a user is under 13, we suspend the account immediately. We are actively evaluating additional age-assurance mechanisms.
What should I do if I think a child under 13 is using Lyra?
Please email childsafety@lyraapp.com with any relevant details. We will investigate and take action within 24 hours. If the account belongs to your child, include their registered email address so we can process the deletion promptly.
Can a parent request deletion of their child's account and data?
Yes. Parents or legal guardians may request deletion of a minor's Lyra account by emailing privacy@lyraapp.com from a verifiable parent/guardian email address. Include the child's registered email or username. We will confirm the request and process deletion within 30 days. You can also visit our Delete Account page for more information.
Does Lyra report CSAM to authorities?
Yes, absolutely. Any detected or reported CSAM is never stored, is immediately blocked, and is reported to the National Center for Missing & Exploited Children (NCMEC) via their CyberTipline as required by US law (18 U.S.C. § 2258A). We also cooperate with Interpol and local law enforcement agencies as required.