AkivaMCohen @akivamcohen.bsky.social

I don't think there are, at least where the substance of the chat isn't selected for by the author of the product

Aug 31, 2025, 5:23 PM

Replies

Ari Cohn @aricohn.com

Sure there are, it's in the aeronautical charts line of cases

Aug 31, 2025, 9:39 PM
AkivaMCohen @akivamcohen.bsky.social

Citation?

Sep 1, 2025, 2:52 PM
Ari Cohn @aricohn.com

From Winter v. GP Putnam, here are cites to some of the existing cases at that time; there are further ones, but the reasoning of the 9th Circuit here makes their relevance explicit, so I wanted to provide that.

Plaintiffs' argument is stronger when they assert that The Encyclopedia of Mushrooms should be analogized to aeronautical charts. Several jurisdictions have held that charts which graphically depict geographic features or instrument approach information for airplanes are
Sep 1, 2025, 5:21 PM
Ran Ari-Gur @ran-arigur.bsky.social

That passage suggests that software is potentially a "product" to the extent that it doesn't give the result intended by the designer. Since OpenAI does try to prevent harmful output — and anyway wouldn't want to admit that it intended this output — that puts them in a bad position here. No?

Sep 1, 2025, 5:42 PM
Ari Cohn @aricohn.com

No, it suggests that if it's a specialized technical tool, that's the case.

Sep 1, 2025, 5:44 PM
Joe Haydu @joehaydu.bsky.social

That’s just it: the substance of a driving AI/ML system is always selected by the designer; it’s trained to a specific outcome. It’s very different, technically, from an LLM, so the exception you just proposed doesn’t actually exist.

Aug 31, 2025, 5:30 PM