With the rise of AI chatbots, "AI lover" services have also become very popular. However, a recent Mozilla report points out that many of these services carry privacy risks.
A study by Mozilla examined a number of AI companion apps currently on the market to assess whether they are trustworthy by privacy standards. The study covered 11 mainstream "AI lover" apps, including Talkie Soulful Character AI, Chai, iGirl: AI Girlfriend, and Romantic AI, whose combined downloads exceed 100 million, indicating that such services are already quite widespread.
However, the report found that most of these apps do not explain what data is used for AI training, nor do they disclose their data protection practices or responsibilities in the event of a data breach, remaining deliberately vague. During use, these apps also collect a considerable amount of personal data; for example, CrushOn.AI states explicitly that it may collect users' sexual health information, prescription medication data, and gender-affirming care data. More than half of the apps do not allow users to delete their personal data. Users who become genuinely emotionally invested may reveal even more private information without realizing that the other party is actually a bot controlled by a technology company, falling into a privacy trap.
Source: Mozilla