Stop me if you've heard this: now that self-driving cars are coming, engineers will face dilemmas. They will have to decide the answers to contentious questions in moral philosophy. For example, should a car go straight and hit the child, or swerve and hit the man? How should the software be programmed to behave? … Continue reading No, self-driving cars don’t require we solve “trolley problem” moral dilemmas
Do we have free will, or fate, or both? (Sam Harris is Wrong, Part 2)
I rewrote this article. Here is the new, completely changed version.