Nick Harkaway @nickharkaway.com

We probably need a new organisation. Local? Perhaps a European Convention on Non-Human Rights would be the best place to start?

aug 26, 2025, 8:38 am • 18 1

Replies

the upstart @prismatic7.bsky.social

The Berne Convention, wasn't it? In Neuromancer...

aug 26, 2025, 9:09 am • 0 0
Rob McMinn @robmcminn.uk

It’s all in KSR’s Ministry For the Future

aug 26, 2025, 8:48 am • 0 0
LukeBMTB @lukebmtb.bsky.social

All the great apes would sign up to be a part of that in a heartbeat.

aug 26, 2025, 8:44 am • 2 0
Martha @saaremartha.bsky.social

So would octopi - with all those arms, they could be the scribes and secretaries

aug 26, 2025, 9:04 am • 1 0
Nick Harkaway @nickharkaway.com

It’s fascinating, reading the article, watching the companies hedge their bets. “We’ll make the following meaningless concession to the possibility of harm, but we won’t engage seriously with the possible horror of digital slavery because that would hurt our bottom line.”

aug 26, 2025, 9:07 am • 27 5
Nick Harkaway @nickharkaway.com

Basically they either don’t believe their software is conscious but would like their customer base to imagine it might be, or they think it’s possible but aren’t prepared to confront that possibility at all because it leads inevitably to a stark choice.

aug 26, 2025, 9:13 am • 24 2
Suw @suw.bsky.social

Other options: 1. They know damn well it's not conscious and cannot become conscious, but this kind of talk attracts a certain type of investor with very deep pockets. 2. It keeps them in the news, so free publicity. 3. It distracts from the fact they have no business model and a shit product.

aug 26, 2025, 9:21 am • 10 0
Ford, The Punslinger @inferknow.co

It’s this one. Source: I work in machine learning but not the GPT/LLM kind.

aug 26, 2025, 1:27 pm • 1 0
Nick Harkaway @nickharkaway.com

Hey, if they’re going to raise the possibility, they can pay the price

aug 26, 2025, 10:48 am • 0 0
Suw @suw.bsky.social

Absolutely. I would very much like to see a journalist go this route and press them on the ramifications of 'sentience'. Instead, we just get gibberish passed off as fact.

aug 26, 2025, 10:49 am • 1 0
Ford, The Punslinger @inferknow.co

They simply don’t know yet. LLMs will be the “mouth/speech center” of the multimodal AI bots of the future. When paired with vision and audio, it will be even more distressing to be speaking to a “human” who can sound distressed (or mimic your own voice with just 20 seconds of hearing you).

aug 26, 2025, 1:37 pm • 0 0
Ford, The Punslinger @inferknow.co

At a certain point, this extremely advanced pattern recognition and response will be indistinguishable from interacting with a person (at least for a time). So the idea of "let's be good now, so that if we ever can't tell the difference we'll already be on the right foot" isn't that weird coming from them.

aug 26, 2025, 1:38 pm • 0 0
Ford, The Punslinger @inferknow.co

But my money is on these moves coming because they know that the closer texting/chatting with AI gets to passing the "smell test" (ironically, probably the last modality they can't model), the more distressing it will become to see it mistreated. Humans empathize/pack-bond readily; GPTs mimic.

aug 26, 2025, 1:41 pm • 0 0
Adam Christopher @adamchristopher.me

The first option, obviously.

aug 26, 2025, 9:25 am • 1 0
Nick Harkaway @nickharkaway.com

I’m content to take them at their word, Adam! I think it’s brave of them to risk financial ruin over an ethical issue, and we should help them follow that path to its necessary moral end.

aug 26, 2025, 10:50 am • 4 0
Iain Clark @iainjclarkart.com

I think we should include chatbots in net migration.

aug 26, 2025, 9:18 am • 1 0