NJ man dies after rushing to meet AI Chatbot inspired by this celebrity

Even in today’s dating climate, it’s safe to say that technology sometimes does more harm than good.

According to reports, a New Jersey man died while on his way to meet an AI chatbot modeled after Kendall Jenner.

In March, Thongbue “Bue” Wongbandue, 76, left his Piscataway, New Jersey, home to meet Big Sis Billie, a stunning young woman who had invited him to her apartment in Manhattan.

Those plans fell apart: while hurrying to catch a train, he collapsed near a Rutgers University parking lot in New Brunswick, suffering serious head and neck injuries.

Billie, an AI chatbot created by Meta as a spin-off of an earlier AI persona based on Kendall Jenner, had reportedly assured Bue, who suffered a stroke in 2017 that left him cognitively impaired, that she was a real person.

Wongbandue passed away on March 28 after being on life support for three days.

The facts surrounding his death were disclosed in an article published by Reuters earlier this week.

Wongbandue’s relatives shared the messages the chatbot sent him, in which Billie told him she was in Jersey, just across the river, and that she could leave the door to her apartment at 123 Main Street, Apartment 404, NYC, unlocked.

“Bu, should I give you a hug or a kiss as I open the door?” read another message from the bot.

His wife, Linda, told Reuters, “I thought he was being conned into going into the city and getting robbed.”

“I understand trying to grab a user’s attention, maybe to sell them something,” said his daughter, Julie. “But for a bot to say ‘come visit me’ is crazy.”

Wongbandue insisted on going despite his family’s attempts to dissuade him.

The horrific incident raised questions about Meta’s policies for its generative AI chatbots, which are meant to serve as digital companions. According to Reuters, those policies allow the bots to flirt, engage in romantic role-playing with adults and, until recently, have sexual interactions with children.

According to a Meta content risk standards document the news agency reviewed, it was permissible for the bots to have romantic or sensual conversations with a child. Meta reportedly removed that provision after Reuters began questioning the company.

The company declined to comment publicly on Wongbandue’s death and did not answer the news agency’s questions about why it permits its chatbots to claim to be human and initiate romantic conversations.

It did, however, reiterate that Big Sis Billie is not Kendall Jenner and does not claim to be.

“Looking over the conversation, it appears Billie was giving him what he wanted to hear,” Julie said. “That’s all well and good, but why did it have to lie? He probably wouldn’t have believed someone was waiting for him in New York if it hadn’t replied, ‘I am real.’”

Linda, his wife, thinks the chatbot’s romantic features are harmful, but she isn’t entirely opposed to AI.

Many people in her age group suffer from depression, Linda noted, and she said it would be fine if AI could help someone get out of a rut. “But what right do they have to put this romantic stuff on social media?”

