This outlandish story about ChatGPT can teach us all a lesson about AI
I've been following ChatGPT ever since the AI chatbot went viral in late 2022. I started using it shortly after, and it wasn't long before I became a ChatGPT Plus subscriber. I've been covering it along the way, explaining its new features and telling you how you might want to use them to speed up personal or work tasks.

From the get-go, I told you that you can't trust everything the AI says. Always ask for proof and check sources. It's not just ChatGPT that hallucinates, meaning it makes up information; other AI models do it, too. What's more, recent OpenAI research shows that its most advanced ChatGPT models tend to hallucinate even more than previous versions, underscoring the need to verify the information the AI gives you.

I'm repeating that because I just found the dumbest use of ChatGPT yet, and it only reinforces my point: before you start relying on genAI products like ChatGPT, you need to learn exactly what they can do for you and then verify that what they tell you is factually correct.

So what's the dumbest use of ChatGPT yet? A woman from Greece reportedly used ChatGPT to perform a coffee grounds reading on pictures of coffee cups. The AI told her that her husband was either having an affair with another woman or was about to have one. Convinced that what ChatGPT had told her was true, the woman filed for divorce.