So I hear the government are going to allow driverless car trials from January. Welcome news as far as I'm concerned - it's not often we're anywhere near the front of the tech curve, and I like it.
When I heard the news item, though, one moral dilemma immediately sprang to mind, and I really don't know what the manufacturers would do about it.
Let me start by pointing out that I'm talking about true driverless cars, not the Google one where a driver still needs to be able to take control and is liable in an accident. I mean the kind where you can sit and do work or watch a film while the car does the driving - like being on the train but more comfortable, cleaner and without all the ghastly people.
So what if my car is driving down a street and a child runs out in front? There's no time to stop, and coming the other way is another car - swerving around the child would mean a certain head-on collision. Now, if I'm in the car on my own, I'll take my chances with the collision, but what if my child is in the car? In that situation I wouldn't want to risk the life of my own child to save somebody else's.
There are plenty of other things to take into account too. If the other car is a G-Wiz, I'll plough into it all day - my car probably wouldn't even slow down. But what if it's a Range Rover? That would probably flatten my car.
So how would I tell my driverless car that I want it to save my child's life above all others? And how can a manufacturer tell a computer to make that kind of moral decision?
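For the programmers reading this, the uncomfortable truth is that the decision probably ends up as something like a cost function: score every available manoeuvre and pick the cheapest. Here's a minimal sketch of the idea in Python - the categories, the probabilities and especially the weights are entirely my own invention (no manufacturer has published anything like this, and that's rather the point):

```python
from dataclasses import dataclass

# Hypothetical severity weights - invented purely for illustration.
# The whole moral dilemma is that nobody agrees what these numbers should be.
HARM_WEIGHTS = {
    "own_child": 100.0,   # the "save my child above all others" setting
    "own_adult": 10.0,
    "other_child": 10.0,
    "other_adult": 10.0,
}

@dataclass
class Outcome:
    """Predicted casualties of one manoeuvre: category -> probability of harm."""
    casualties: dict

def cost(outcome: Outcome) -> float:
    """Expected cost: sum of (probability of harm x severity weight)."""
    return sum(HARM_WEIGHTS[who] * p for who, p in outcome.casualties.items())

def choose(manoeuvres: dict) -> str:
    """Pick the manoeuvre with the lowest expected cost."""
    return min(manoeuvres, key=lambda name: cost(manoeuvres[name]))

# My scenario: brake (and hit the child) vs swerve (head-on collision).
# Probabilities are made up; a real car would estimate them from its sensors.
manoeuvres = {
    "brake":  Outcome({"other_child": 0.9}),
    "swerve": Outcome({"own_child": 0.5, "own_adult": 0.5, "other_adult": 0.5}),
}

print(choose(manoeuvres))  # with these weights: "brake" - my child comes first
```

The code itself is trivial. The hard part is that first dictionary: whoever ships the car has to pick those numbers, and I doubt they'll let me edit them.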