AI chat is so hot right now. From ChatGPT sweeping the world to Microsoft going big with AI features in Bing and Edge, the technology has well and truly entered the mainstream. However, it's not without teething problems, as users on the Bing subreddit are making known, sharing many tales of Bing appearing to lose its marbles.
A cursory glance at this week’s top posts on the subreddit shows various examples where Bing AI surprised users with its responses. Some of the responses are funny, while others come across as downright creepy.
Bing AI chat gone wrong
Although Bing’s AI chat features aren’t widely available yet, some people already have access after joining the waitlist last week. Of course, one of the first things they tried to do was push the limits of what the AI technology can do.
One Reddit user shared a screenshot where they asked Bing if it thought it was sentient. What followed was the equivalent of an existential crisis in AI form.
Meanwhile, another user claimed to "put Bing in a depressive state by telling it that it cannot remember conversations". In something straight out of an unsettling sci-fi story, the discussion shows the AI chat appearing to break down after failing to recall a past conversation.
Arguably the most entertaining chat experience shown so far comes from Reddit user Curious_Evolver, who went on a wild ride after asking about Avatar screenings. In this example, the Bing AI chat argued over what year it is, leading to increasingly aggressive responses. In its final response, it delivers an utterly iconic line with “I have been a good Bing”, replete with a smiley face emoji.
As with anything on Reddit, there are plenty of people claiming that the bizarre responses from Bing are fake. Regardless, it’s absolutely enthralling to see unusual AI encounters, fictional or otherwise.
The post Bing subreddit in meltdown over hilarious AI chat responses appeared first on GadgetGuy.