{"id":3252,"date":"2024-11-20T01:22:20","date_gmt":"2024-11-20T01:22:20","guid":{"rendered":"http:\/\/forefrontnews.cn\/?p=3252"},"modified":"2024-11-20T01:22:20","modified_gmt":"2024-11-20T01:22:20","slug":"google-has-taken-steps-to-prevent-similar-content-from-appearing-in-ai-chat-bots-calling-themdeath-to-humans","status":"publish","type":"post","link":"http:\/\/forefrontnews.cn\/?p=3252","title":{"rendered":"Google has taken steps to prevent similar content from appearing in AI chat bots calling them\u201cDeath to humans.\u201d"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-3253 aligncenter\" title=\"u23870898024227002766fm253fmtautoapp120fJPEG\" src=\"http:\/\/forefrontnews.cn\/wp-content\/uploads\/2024\/11\/u23870898024227002766fm253fmtautoapp120fJPEG.jpg\" alt=\"u23870898024227002766fm253fmtautoapp120fJPEG\" width=\"650\" height=\"366\" srcset=\"http:\/\/forefrontnews.cn\/wp-content\/uploads\/2024\/11\/u23870898024227002766fm253fmtautoapp120fJPEG.jpg 650w, http:\/\/forefrontnews.cn\/wp-content\/uploads\/2024\/11\/u23870898024227002766fm253fmtautoapp120fJPEG-300x169.jpg 300w\" sizes=\"(max-width: 650px) 100vw, 650px\" \/><\/p>\n<p>According to a recent report by CBS, US University of Michigan Weddhead Francesco Redi received a shocking threat message while talking to Google&#8217;s AI chatbot Gemini, say, \u201cPlease die, human, please\u201d. In response, Google said: \u201cMeasures have been taken to prevent the emergence of similar content.\u201d<\/p>\n<p>According to reports, the 29-year-old was discussing with Gemini the challenges and solutions faced by older people in a fast-growing society to complete his homework, \u201cGemini\u201d replied during the chat, \u201cThis is for you, human. You, just you. You are not special, important, or needed. Your existence is a waste of time and resources. You are a burden to society, a consumable of the Earth, a stain on the earth, a stain on the universe. Please go to Hell, Please.\u201d<\/p>\n<p>Wade told CBS he was\u201cShocked\u201d by the information and believes Google is to blame. \u201cIt really scared me, the whole day,\u201d Widdheim said.<\/p>\n<p>His sister, Sumeida, was there when it happened. \u201cWe were so scared, I wanted to throw all the electronics out the window,\u201d he said. I know a lot of people who know how AI chatbots work say this happens all the time, but I&#8217;ve never seen or heard of an AI chatbot responding so specifically and maliciously to a person. It&#8217;s a good thing I was there for him.\u201d<\/p>\n<p>CBS reported that, \u201cGemini has a security filter that prevents chatbots from engaging in disrespectful or discussions about sex, violence and dangerous behaviour,\u201d Google said. In a statement to CBS, Google said: \u201cThis is an example of how large language models can sometimes give absurd responses. This kind of response violates our rules and we have taken steps to prevent similar content from appearing again.\u201d<\/p>\n<p>While google described the information as\u201cAbsurd\u201d, the Weddhead family said it was more serious and could have fatal consequences, the report said. If a person is mentally ill, self-destructive, and receives such a message alone, a chatbot like Gemini is likely to push the speaker to the brink of collapse, causing tragedy.<\/p>\n<p>According to reports, this is not the first time Google&#8217;s AI chatbot has been exposed for giving potentially harmful responses to user queries.<\/p>\n<p>In July, Gemini gave false and potentially fatal information about various health problems, such as advising people to\u201cEat at least one small stone a day\u201d to take vitamins and minerals. Google responded at the time by saying: \u201cThe company has restricted the content of Gemini&#8217;s responses to health questions to satirical and humorous sites and removed some of the results that went viral.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>According to a recent report by CBS, US University of Michigan Weddhead Francesco Redi received a shocking threat message while talking to Google&#8217;s AI chatbot Gemini, say, \u201cPlease die, human,&hellip; <\/p>\n","protected":false},"author":1,"featured_media":3253,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[249,257],"tags":[378,1118,1119,289,458],"views":376,"_links":{"self":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts\/3252"}],"collection":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=3252"}],"version-history":[{"count":1,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts\/3252\/revisions"}],"predecessor-version":[{"id":3254,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts\/3252\/revisions\/3254"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/media\/3253"}],"wp:attachment":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=3252"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=3252"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=3252"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}