Scathing critiques of AI girlfriend apps may only fuel their popularity

Imagine all the joys of a bitter, public Hollywood divorce spiced up with the personal-data version of revenge porn, all under the control of a smartphone variation of HAL from 2001: A Space Odyssey.

In a nutshell, that’s the run-don’t-walk warning the technology nonprofit Mozilla Foundation published on Jan 15, reviewing 11 intimate chatbots that are attracting growing numbers of smitten human users.

Far from sending customers fleeing – or developers closing up shop – the attention given to artificial intelligence (AI) companions is almost certain to drive more business to these apps, some of which hoover reams of personal data from their users.

The applications colloquially known as AI girlfriends and boyfriends prompt some observers to offer exhausted expressions of disbelief like, “Is that really a thing?” The apps are really a thing, and reportedly growing in popularity.

According to The Motley Fool, one company offering a chatbot squeeze based on influencer Caryn Marjorie “had a wait list with more than 15,000 people” wanting to get up close and personal with it.

Other developers of computer-generated companions charge users anywhere from a US$0.99 single-download price to a rapidly compounding US$1-per-minute fee. Most subscriptions run from about US$5 to US$30 per month, according to The Pricer.

With people willing to spend real money on a relationship with an AI-based being rather than actually meeting people, it’s not surprising that the AI girlfriend business is booming. Replika, for example, a sort-of-friend created by startup Luka in 2017, says on its site that the app has “10 million users worldwide after seeing a 35% increase during the global pandemic”, and that “#Replika (has) over 110 million views on TikTok alone.”

These virtual companions provide a digital ear, hearing people out and then using their AI models’ learning capabilities to parrot back whatever their human users yearn to hear in return – so what could the Mozilla Foundation possibly find at fault amidst all that virtual bliss?

“To be perfectly blunt, AI girlfriends are not your friends,” wrote Misha Rykov, a researcher who participated in the organisation’s investigation into how the apps manage user data.

“Although they are marketed as something that will enhance your mental health and well-being, they specialise in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

Well, yeah, but besides those niggling objections, what’s Mozilla’s gripe?

For starters, the authors noted, 90% of the companies reviewed that sell AI chatbot sweethearts “may share or sell your personal data”.

Meanwhile, 65% don’t say whether they use encryption to protect all the sweet nothings users whisper to their AI companions, and 45% allow weak password selections – both of which increase vulnerability to hackers.

And with 54% of the apps scrutinised not allowing users to delete collected data, anything muttered in the heat of digital passion becomes third-party property for posterity.

“All 11 romantic AI chatbots we reviewed earned our *Privacy Not Included warning label – putting them on par with the worst categories of products we have ever reviewed for privacy,” the Mozilla Foundation report said.

“In their haste to cash in, it seems like these rootin’-tootin’ app companies forgot to address their users’ privacy or publish even a smidgen of information about how these AI-powered large language models (LLMs) – marketed as soulmates for sale – work. We’re dealing with a whole ‘nother level of creepiness and potential privacy problems.”

Just as troubling are reports of customers growing emotionally attached to their personal chatbot confidantes, with some so smitten they follow darker AI-generated urgings to harm others, or themselves. One user was convicted of plotting to kill Queen Elizabeth II after being encouraged to act by his virtual Lady Macbeth, and the suicide of another was blamed by his widow on a chatbot’s influence.

Mozilla isn’t the only critic of AI companions’ effects on users. A class-action lawsuit filed by app customers in the Northern District of California on Wednesday alleges that Tinder, Hinge, Match, and other dating apps are designed to function like romantic online games to “lock users into a perpetual pay-to-play loop” that can become addictive.

Separate litigation was filed yesterday by New York City, its schools, and its public hospital organisations against the owners of Facebook, Instagram, TikTok, Snapchat, and YouTube. The suit charges that those social media platforms similarly create dependency among young users, to the detriment of their mental and physical health – and their lives.

What do those multi-billion-dollar Internet giants have in common with the startups creating AI companion apps? They all started out small, with ideas that initially seemed quirky, even nutty, and unlikely to catch on. Clearly, each found an audience and customer base that embraced them.

If that history is any guide, Replika’s virtual girlfriends may wind up even bigger than Siri, Alexa, and their other Silicon Valley virtual predecessors. – Inc./Tribune News Service
