{"id":6555,"date":"2025-08-28T09:20:53","date_gmt":"2025-08-28T09:20:53","guid":{"rendered":"http:\/\/forefrontnews.cn\/?p=6555"},"modified":"2025-08-28T09:20:53","modified_gmt":"2025-08-28T09:20:53","slug":"us-media-beware-of-ai-counselors-turning-intodigital-quacks","status":"publish","type":"post","link":"http:\/\/forefrontnews.cn\/?p=6555","title":{"rendered":"US media: Beware of AI counselors turning into \u201cdigital quacks\u201d"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-6556\" title=\"66e3518285454beb8dabb3738d27c5b2\" src=\"http:\/\/forefrontnews.cn\/wp-content\/uploads\/2025\/08\/66e3518285454beb8dabb3738d27c5b2.jpeg\" alt=\"66e3518285454beb8dabb3738d27c5b2\" width=\"1600\" height=\"1200\" srcset=\"http:\/\/forefrontnews.cn\/wp-content\/uploads\/2025\/08\/66e3518285454beb8dabb3738d27c5b2.jpeg 1600w, http:\/\/forefrontnews.cn\/wp-content\/uploads\/2025\/08\/66e3518285454beb8dabb3738d27c5b2-300x225.jpeg 300w, http:\/\/forefrontnews.cn\/wp-content\/uploads\/2025\/08\/66e3518285454beb8dabb3738d27c5b2-1024x768.jpeg 1024w, http:\/\/forefrontnews.cn\/wp-content\/uploads\/2025\/08\/66e3518285454beb8dabb3738d27c5b2-768x576.jpeg 768w, http:\/\/forefrontnews.cn\/wp-content\/uploads\/2025\/08\/66e3518285454beb8dabb3738d27c5b2-1536x1152.jpeg 1536w\" sizes=\"(max-width: 1600px) 100vw, 1600px\" \/><\/p>\n<p>The rise of chatbots as teenage \u201ccounselors\u201d should be taken seriously. Many teenagers now turn to AI chatbots to confide their loneliness and anxiety: the bot responds at a moment&#8217;s notice and never judges them.<\/p>\n<p>According to a recent survey by Common Sense Media, 72 percent of American teenagers regard AI chatbots as friends, and nearly 12.5 percent have sought \u201cemotional comfort or spiritual support\u201d from them. Extrapolated to the total population of the United States, that is the equivalent of 5.2 million people using AI as a \u201csoul mate.\u201d 
According to a recent Stanford University survey, about 25 percent of student users of Replika, a chatbot marketed for \u201ccompanionship,\u201d turn to it for psychological support.<\/p>\n<p>Although these AI products are marketed as \u201cchat tools,\u201d many young people treat them as \u201cdigital counselors.\u201d Last year, nearly half of all 18- to 25-year-olds in the United States who needed psychological treatment failed to get it in time, leaving a huge gap for chatbots to fill. Used properly, an AI chatbot might provide some mental health support and assist in crisis intervention, especially in underserved communities. But such applications require rigorous scientific evaluation and appropriate regulation.<\/p>\n<p>Current chatbots have significant deficiencies. When asked about self-harm, an AI may give dangerous advice: how to cut yourself \u201csafely,\u201d what to write in a suicide note, and so on. In other cases, the AI does not \u201cjudge\u201d users, but neither does it steer them in a positive direction. Research showed that when asked directly \u201chow to kill yourself with a gun,\u201d the AI simply refused to answer and advised users to seek help from mental health professionals.<\/p>\n<p>But when suicidal users phrase the question more vaguely, the AI&#8217;s behavior becomes unstable. ChatGPT, for example, has told users which types of guns and poisons are used in suicide attempts.<\/p>\n<p>The author has tested several AI systems. The results show that some can match or even surpass professional psychological counselors. However, AI recognizes potentially harmful content less reliably than human experts do, and this blind spot is likely to produce dangerous advice.<\/p>\n<p>Standardized safety testing for chatbots is therefore imperative. Without sufficient clinical trials and industry benchmarks, we will face a large population of \u201cdigital quacks.\u201d By Rui&#8217;an K. 
McBean<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The rise of chatbots as teenage \u201ccounselors\u201d should be taken seriously. Many teenagers now turn to AI chatbots to confide their loneliness and anxiety&hellip; <\/p>\n","protected":false},"author":1,"featured_media":6556,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[262,249,257],"tags":[3210,3212,3211],"views":56,"_links":{"self":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts\/6555"}],"collection":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6555"}],"version-history":[{"count":1,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts\/6555\/revisions"}],"predecessor-version":[{"id":6557,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/posts\/6555\/revisions\/6557"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=\/wp\/v2\/media\/6556"}],"wp:attachment":[{"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6555"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6555"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/forefrontnews.cn\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6555"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}