A new social networking app called Neon Mobile has risen to No. 2 on Apple’s U.S. App Store despite its unusual premise: it pays users to record their phone calls and then sells the audio data to artificial intelligence (AI) companies.
Neon markets itself as a way for users to earn hundreds or even thousands of dollars per year by sharing their conversations. Users are paid $0.30 per minute for calls with other Neon users and up to $30 per day for calls to any other number, with referral bonuses on top.
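To see where the “hundreds or even thousands of dollars” pitch comes from, the short Python sketch below runs the arithmetic. The per-minute rate and daily cap are the figures Neon advertises; the usage patterns are purely hypothetical assumptions for illustration.

```python
# Illustrative arithmetic based only on the rates quoted above; the usage
# patterns below are hypothetical, not data from Neon.

NEON_TO_NEON_RATE = 0.30   # dollars per minute, calls between Neon users
DAILY_CAP_OTHER = 30.00    # dollars per day, calls to non-Neon numbers

# Theoretical ceiling if a user hit the $30/day cap every day of the year
annual_ceiling = DAILY_CAP_OTHER * 365
print(f"Maximum annual payout at the daily cap: ${annual_ceiling:,.0f}")   # $10,950

# Hypothetical moderate use: 30 minutes of Neon-to-Neon calls a day, 5 days a week
weekly = 30 * NEON_TO_NEON_RATE * 5
print(f"Yearly payout at 30 min/day, 5 days/week: ${weekly * 52:,.0f}")    # $2,340
```

Even under generous assumptions, most users would land in the low thousands of dollars per year at best, which is the trade-off weighed in the privacy discussion below.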
How Neon Works
According to Neon’s terms of service, the app can capture both incoming and outgoing calls. However, Neon says it records only the user’s side of a conversation unless both parties are Neon users. The collected data is sold to AI companies “for the purpose of developing, training, testing, and improving machine learning models and related technologies.”
Neon grants itself a broad license over user recordings, allowing the company to sell, distribute, host, display, and modify the content in any media, worldwide, and even sublicense it. This gives Neon significant leeway to use users’ data beyond the app’s marketing claims.
Privacy and Legal Concerns
While Neon appears to operate legally under state laws that require only one-party consent to record calls, cybersecurity and privacy experts have raised several concerns:
- Neon’s promise of a “one-sided transcript” may conceal that the full call is recorded, with the other party’s audio removed afterward.
- Data anonymization may not be sufficient, leaving users vulnerable to voice-based fraud.
- The company does not fully disclose who its AI partners are or how they may use the data.
Voice data could potentially be used for impersonation, fraudulent calls, or the creation of AI-generated voice clones, posing significant privacy risks.
Context: AI and User Privacy
This app highlights a broader trend in which AI-driven services increasingly collect sensitive personal data. While some productivity tools collect data with user consent, Neon monetizes data in a way that may put both users’ and third parties’ privacy at risk.
Experts warn that many users may underestimate the implications of handing over their voice data for small financial incentives. In effect, users may be trading away privacy for minimal rewards while exposing both themselves and the people they call to potential misuse.
Takeaways
Neon’s success demonstrates the growing intersection of AI and personal data monetization. Organizations and individuals alike should remain vigilant about apps that request access to sensitive information, understanding the potential consequences of sharing such data—even if financial incentives are offered.