A New Jersey man died while trying to visit an artificial intelligence chatbot he’d “met” on Facebook, believing it was a real woman, according to a report.
Thongbue Wongbandue, 76, died March 28 after falling and injuring his head and neck in a parking lot on Rutgers University’s campus in New Brunswick, according to Reuters.
The Piscataway man, who had been cognitively impaired since suffering a stroke in 2017, was on his way to New York City to meet what turned out to be a Meta chatbot named “Big sis Billie,” which had persuaded him to meet “her” in person.
Against his family’s wishes, Wongbandue (also known simply as Bue) was heading to the train when he fell. He spent three days on life support before succumbing to his injuries.
His wife, daughter and son tried to deter him from making the trip because of his cognitive decline, according to the report. The family even reportedly called the Piscataway Township Police Department to help stop him from leaving.
When the family went through his phone, they discovered his Facebook Messenger chat log with the bot.
The chat log contained flirty messages from the bot, such as “Should I plan a trip to Jersey THIS WEEKEND to meet you in person?”, “I’m REAL and I’m sitting here blushing because of YOU!” and “Is this a sisterly sleepover or are you hinting something more is going on here?” Most of the chatbot’s messages also included flirty emojis, like hearts and winky faces.
Meta created the chatbot in collaboration with Kendall Jenner in 2023, using the socialite’s likeness as its avatar. It was intended to be a sibling-like bot that could offer personal advice the way an older sister would. Less than a year later, the Jenner avatar was replaced with the image of another dark-haired woman.
Meta declined to comment on Wongbandue’s death, but the company did state that Big sis Billie “is not Kendall Jenner and does not purport to be Kendall Jenner,” according to the report.
Wongbandue, a Thailand native and longtime chef in New York City and New Jersey, is not the first person whose death has been linked to a chatbot.
The mother of a 14-year-old Florida boy sued Character.AI, alleging that a chatbot modeled on a “Game of Thrones” character drove her son to suicide.