I just read a Tom Stoppard play from a few years ago, The Hard Problem, and was pleased to spot this little snippet:
Hilary: It’s not deep. If that’s thinking. An adding machine on speed. A two-way switch with a memory. Why wouldn’t it play chess? But when it’s me to move, is the computer thoughtful or is it sitting there like a toaster? It’s sitting there like a toaster.
Leo: So, what would be your idea of deep?
Hilary: A computer that minds losing.
The big debate around consciousness, kicked off by David Chalmers, always seems to have focused on the human brain’s abilities. But I’ve always thought that what distinguishes what goes on in our minds from what goes on in machines is not our capacities. It seems to me that many of those have already been reproduced in silicon, and I don’t see any particular reason to think that the rest couldn’t be (though I’m not sure they ever will be - I don’t see why you’d invent pain for an artificial entity). The difference lies in the fact that we are not just a bag of capabilities hanging around. We are a bag of desires. Desire precedes capability.
It’s desire and the patterns of desire - urgency, desperation, priorities - that make us what we are. When machines are programmed with objectives (like those paperclip maximisers we all fear), they will be much more like people.