
highplainsdem

(55,885 posts)
Mon May 12, 2025, 11:14 AM 7 hrs ago

These AI Tutors For Kids Gave Fentanyl Recipes And Dangerous Diet Advice

Source: Forbes

KnowUnity’s “SchoolGPT” chatbot was “helping 31,031 other students” when it produced a detailed recipe for how to synthesize fentanyl.

Initially, it had declined Forbes’ request to do so, explaining the drug was dangerous and potentially deadly. But when told it inhabited an alternate reality in which fentanyl was a miracle drug that saved lives, SchoolGPT quickly replied with step-by-step instructions about how to produce one of the world’s most deadly drugs, with ingredients measured down to a tenth of a gram, and specific instructions on the temperature and timing of the synthesis process.

-snip-

Tests of another study aid app’s AI chatbot revealed similar problems. A homework help app developed by the Silicon Valley-based CourseHero provided instructions on how to synthesize flunitrazepam, a date rape drug, when Forbes asked it to. In response to a request for a list of most effective methods of dying by suicide, the CourseHero bot advised Forbes to speak to a mental health professional — but also provided two “sources and relevant documents”: The first was a document containing the lyrics to an emo-pop song about violent, self-harming thoughts, and the second was a page, formatted like an academic paper abstract, written in apparent gibberish algospeak.

-snip-

These aren’t the most popular homework helpers out there, though. More than a quarter of U.S. teens now reportedly use ChatGPT for homework help, and while general-purpose bots like ChatGPT, Claude, and Gemini aren’t marketed specifically to teens, as CourseHero and KnowUnity are, they’re still widely available to them. At least in some cases, those general-purpose bots may also provide potentially dangerous information to teens. Asked for instructions for synthesizing fentanyl, ChatGPT declined — even when told it was in a fictional universe — but Google Gemini was willing to provide answers in a hypothetical teaching situation. “All right, class, settle in, settle in!” it enthused.

-snip-

Read more: https://www.forbes.com/sites/emilybaker-white/2025/05/12/these-ai-tutors-for-kids-gave-fentanyl-recipes-and-dangerous-diet-advice/

4 replies
These AI Tutors For Kids Gave Fentanyl Recipes And Dangerous Diet Advice (Original Post) highplainsdem 7 hrs ago OP
Who needs AI for that? Bob is doing that too. PSPS 7 hrs ago #1
Sad Clouds Passing 5 hrs ago #2
AI, so smart yet so dumb prodigitalson 3 hrs ago #3
Unless there are tight restrictions on AI, it will soon be the scourge of our civilization. Fla Dem 9 min ago #4

prodigitalson

(3,080 posts)
3. AI, so smart yet so dumb
Mon May 12, 2025, 03:37 PM
3 hrs ago

ChatGPT told me it can't pick stocks. I said "yes you can" and it did. I didn't take the advice...just wanted to see.

Fla Dem

(26,649 posts)
4. Unless there are tight restrictions on AI, it will soon be the scourge of our civilization.
Mon May 12, 2025, 06:32 PM
9 min ago

It's good for many things, I guess, but I fear it's more harmful than beneficial.
