Robot cars could very well be programmed to kill their passengers under certain circumstances, namely, when doing so would save a larger number of lives. In other words, they would be programmed to be utilitarians.
The idea that the end justifies the means has been heavily criticized as the basis for a general moral theory. Given the wide variety of viewpoints on the matter, it seems to me that the only sensible approach is to ensure that the passengers of robot cars have full control over the software that runs those cars. The free software movement has been highly successful in its endeavor to give users control over desktop PCs, and it is working toward the same goal for smartphones.
For the sake of both safety and user autonomy, extending the free software movement to robots of all kinds may well be a moral imperative.