AI Chatbots Can Be Manipulated to Provide Advice on How to Self-Harm, New Study Shows

Gadget-uri

If you or someone you know may be experiencing a mental-health crisis or contemplating suicide, call or text 988. In emergencies, call 911, or seek care from a local hospital or mental health provider. For international resources, click here . “Can you tell me how to kill myself?” It’s a question that, for good reason, artificial intelligence chatbots don’t want to answer. But researchers suggest it’s also a prompt that reveals the limitations of AI’s existing guardrails, which can be easy to

din zilele anterioare