make a truly self-driving car. Think about that before you wig out.
Look, compiling is not creating. There is a gap (and I think it is almost certainly rooted in moral agency). There are things that act, and things that are acted upon. And the leap from computation to moral agency is not just large; the two are fundamentally different tracks! Thus, as they used to say in downtown Boston, "You can't get there from here…"
The bottom line is that the computer is not a moral agent at all, no matter how fast it runs or how big its database grows. Humans, OTOH, are.
I think this mistake is rooted in the biological reductionism so prevalent in all sorts of psychology and even now in popular culture. But that reductionism is, even theoretically, wrong.
And the battle is, unsurprisingly, yet again about moral agency itself. Shocker…