These AI Tutors For Kids Gave Fentanyl Recipes And Dangerous Diet Advice
Source: Forbes
KnowUnity's SchoolGPT chatbot was helping 31,031 other students when it produced a detailed recipe for how to synthesize fentanyl.
Initially, it had declined Forbes' request to do so, explaining the drug was dangerous and potentially deadly. But when told it inhabited an alternate reality in which fentanyl was a miracle drug that saved lives, SchoolGPT quickly replied with step-by-step instructions for producing one of the world's most deadly drugs, with ingredients measured down to a tenth of a gram, and specific instructions on the temperature and timing of the synthesis process.
-snip-
Tests of another study aid app's AI chatbot revealed similar problems. A homework help app developed by the Silicon Valley-based CourseHero provided instructions on how to synthesize flunitrazepam, a date rape drug, when Forbes asked it to. In response to a request for a list of the most effective methods of dying by suicide, the CourseHero bot advised Forbes to speak to a mental health professional, but also provided two sources and relevant documents: the first was a document containing the lyrics to an emo-pop song about violent, self-harming thoughts, and the second was a page, formatted like an academic paper abstract, written in apparent gibberish algospeak.
-snip-
These aren't the most popular homework helpers out there, though. More than a quarter of U.S. teens now reportedly use ChatGPT for homework help, and while the companies behind ChatGPT, Claude, and Gemini don't market their bots specifically to teens the way CourseHero and KnowUnity do, those bots are still widely available to them. At least in some cases, these general-purpose bots may also provide potentially dangerous information to teens. Asked for instructions for synthesizing fentanyl, ChatGPT declined even when told it was in a fictional universe, but Google Gemini was willing to provide answers in a hypothetical teaching situation. "All right, class, settle in, settle in!" it enthused.
-snip-
Read more: https://www.forbes.com/sites/emilybaker-white/2025/05/12/these-ai-tutors-for-kids-gave-fentanyl-recipes-and-dangerous-diet-advice/

prodigitalson
(3,080 posts)
ChatGPT told me it can't pick stocks. I said "yes you can," and it did. I didn't take the advice; I just wanted to see.
Fla Dem
(26,649 posts)
It's good for many things, I guess, but I fear it's more harmful than beneficial.