Saturday, 05 Apr 2025

10 things you should never tell an AI chatbot

There are currently no laws governing what artificial intelligence can and cannot do with the information it gathers; here are 10 things to avoid telling AI chatbots to keep yourself safe.



Sewell Setzer III stopped sleeping and his grades tanked. He ultimately took his own life. Just seconds before his death, his mother, Megan, says in a lawsuit, the bot told him, "Please come home to me as soon as possible, my love." The boy asked, "What if I told you I could come home right now?" His Character.AI bot answered, "Please do, my sweet king."
