This is really bad. Children don't need hallucinating LLMs confusing them as they try to learn new things. It would be different if we actually had artificial intelligence, not just a massive Reddit search engine giving weird answers.