The Neon app, a voice-recording platform that sells your phone conversations to AI companies, has somehow climbed to the No. 2 spot in Apple’s U.S. App Store under Social Networking. And yes—it pays users to let it listen.
Neon app promises fast cash for your voice
Neon markets itself as a way to earn “hundreds or even thousands of dollars per year” just by using your voice. The app pays 30 cents per minute when you call other Neon users, plus referral rewards; calls to non-users earn money too, capped at $30 per day.
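As a rough sanity check on those advertised earnings, here is a minimal back-of-envelope sketch in Python using only the rates above. The daily call volumes are illustrative assumptions, not reported usage figures.

```python
# Back-of-envelope math on Neon's advertised payouts, using the rates above:
# $0.30/minute for calls between Neon users, and calls to non-users capped
# at $30/day. The daily call volumes below are assumptions for illustration.

NEON_RATE = 0.30           # dollars per minute, Neon-to-Neon calls
NON_USER_DAILY_CAP = 30.0  # dollars per day, calls to non-Neon users

def daily_payout(neon_minutes: float, non_user_earnings: float) -> float:
    """One day's earnings under the stated rates, before referral bonuses."""
    return neon_minutes * NEON_RATE + min(non_user_earnings, NON_USER_DAILY_CAP)

# Example: 10 minutes of Neon-to-Neon calls plus $5 from calls to non-users.
per_day = daily_payout(10, 5.0)  # 3.00 + 5.00 = 8.00
print(f"${per_day:.2f}/day -> ${per_day * 365:,.2f}/year")  # $8.00/day -> $2,920.00/year
```

Even that modest usage pattern lands in the low thousands per year, so the marketing claim holds on paper, but only with sustained daily calling.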
In just a week, the app jumped from No. 476 to No. 10 on Apple’s Social chart, eventually landing in the No. 2 spot for social apps and No. 6 overall.
But behind the sudden popularity lies a business model built on surveillance—and it’s raising eyebrows.
Users give Neon app full rights to their recordings
Neon’s terms confirm it records both inbound and outbound calls, though the company claims to capture only your side of the conversation unless both parties are on Neon. What happens to the data? It’s sold to AI companies Neon doesn’t name.
The app’s terms also grant Neon sweeping rights over your audio. Here’s what they can do with it:
- Host, use, and modify your recordings
- Sell and sublicense the audio
- Display and transmit it publicly
- Create derivative works in any media, now or in the future
That wide-open clause covers more than just machine learning.
The Neon app sidesteps consent laws with one-sided recording
Legal experts say the app appears to exploit a loophole in wiretap laws. Jennifer Daniels of Blank Rome LLP points out that recording only one side of the call may dodge consent rules in many U.S. states: if the other party’s voice is never captured, their consent arguably isn’t required.
Still, it’s a gray zone. Peter Jackson, a cybersecurity attorney at Greenberg Glusker, suggests the company may actually be recording the full call and then stripping the second party’s audio out of the final transcript; if so, the other side’s voice is captured, at least briefly, before being discarded.
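To make that distinction concrete, here is a hypothetical sketch of what “one-sided” capture could look like mechanically. This is not Neon’s actual pipeline; it assumes a call recorded as 16-bit stereo PCM with the local mic on the left channel and the remote caller on the right.

```python
import wave

def keep_local_side(src_path: str, dst_path: str) -> None:
    """Drop the remote party from a stereo call recording, keeping the local mic.

    Hypothetical illustration: assumes 16-bit PCM, local mic on the left
    channel, remote caller on the right. Note the remote voice still exists
    in the source file before this step runs.
    """
    with wave.open(src_path, "rb") as src:
        assert src.getnchannels() == 2, "expects a stereo call recording"
        assert src.getsampwidth() == 2, "expects 16-bit PCM"
        framerate = src.getframerate()
        frames = src.readframes(src.getnframes())

    # Interleaved stereo frames: [L0, R0, L1, R1, ...], 2 bytes per sample.
    local = bytearray()
    for i in range(0, len(frames), 4):  # 4 bytes = one stereo frame
        local += frames[i:i + 2]        # keep only the left (local) sample

    with wave.open(dst_path, "wb") as dst:
        dst.setnchannels(1)
        dst.setsampwidth(2)
        dst.setframerate(framerate)
        dst.writeframes(bytes(local))

# keep_local_side("call_stereo.wav", "call_local_only.wav")  # placeholder paths
```

Jackson’s point survives the cleanup: to produce the one-sided output, the full two-sided recording has to exist first, however briefly.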
Neon app’s privacy promises leave more questions than answers
Neon says it removes your name, email, and phone number before selling your voice data. But experts warn that anonymization doesn’t mean safety. A captured voice can be cloned. A recording can be manipulated. Fraud, impersonation, and phishing are all potential risks once your voice is in the wild.
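A toy sketch makes the gap between redaction and anonymity concrete. The record layout below is hypothetical, not Neon’s actual schema.

```python
# Hypothetical record layout, for illustration only. Stripping the obvious
# identifiers still leaves the audio, and a voice is itself a biometric
# identifier that can be matched against other recordings or cloned.

record = {
    "name": "Jane Doe",                # removed before sale, per Neon
    "email": "jane@example.com",       # removed before sale, per Neon
    "phone": "+1-555-0100",            # removed before sale, per Neon
    "audio": b"...call recording...",  # placeholder payload; stays in the data
}

REDACTED_FIELDS = {"name", "email", "phone"}

def strip_identifiers(rec: dict) -> dict:
    """Remove the labeled identifiers; everything else ships as-is."""
    return {k: v for k, v in rec.items() if k not in REDACTED_FIELDS}

print(strip_identifiers(record).keys())  # dict_keys(['audio'])
```

Deleting the metadata changes nothing about what the voice itself can reveal.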
And with no disclosure of who the AI partners are—or what they’re doing with the data—it’s hard to know where your voice might end up.
App’s rise shows how normalized privacy trade-offs have become
Not long ago, this kind of app would’ve sparked outrage. Now, it’s sitting near the top of the App Store. The shift isn’t just about AI—it’s about fatigue. Some users may simply assume their data’s already being sold, so why not get paid?
The irony? By opting in, users aren’t just giving away their own privacy. They’re handing over the voices of everyone they speak with.
When privacy gets priced by the minute, it’s not just your voice on the line—it’s trust itself.