How should we treat thinking machines and human-like robots? David Gelernter, a professor of computer science at Yale University, says that Jewish thought offers us a way to proceed:
One way to discuss the problem is in the terms developed by Martin Buber, who created an ethics and theology based on relations among I, you, and it. For Buber, I and you can enter sympathetically into each other's lives; our mental worlds flow together. But I and it are permanently separate. When I converse with an it, I do not actually converse at all; I conduct a monologue in which one party is me and the other is also me. This other is my own private, personal conception of someone or something else.

Buber used these terms to describe relations among human beings and between human beings and God. But we can press them into service in a different, simpler context. We can say that an I always has moral duties to a you. But ordinarily, an I has no moral duties to an it.
Does a machine, once it has become intelligent, make the transition from it to you? Or do I sometimes have moral duties to an it as if it were a you? Could I have moral duties to a mere thing that is unconscious, has never been conscious, and never will be?
Read more . . .