Self-driving cars -- programming morality

Made in us
Keeper of the Flame





Monticello, IN

That's because we're fresh out of crashed alien ships to reverse engineer from...

www.classichammer.com

For 4-6th WFB, 2-5th 40k, and similar timeframe gaming

Looking for dice from the new AOS boxed set and Dark Imperium on the cheap. Let me know if you can help.
 CthuluIsSpy wrote:
It's AoS, it doesn't have to make sense.
 
   
Made in us
Douglas Bader






 Iron_Captain wrote:
In other words, it should realise basic things such as that the life of a child is valued higher than that of an adult or that you should not avoid a collision with another car if the only way to do so is by plowing through a group of pedestrians.


1) That "universal" morality is up for dispute. I wouldn't consider the life of the child any more valuable than the life of the adult, and I value my own life higher than any number of random other people if it comes to a question of saving myself vs. saving random strangers. Your personal moral system is not universal, nor is it the system used in our laws.

2) This is an example of what I keep saying about holding automated vehicles to a much higher standard than the human drivers they are potentially replacing. A human driver isn't calmly reflecting on their ethical beliefs and choosing an action based on which potential victim(s) have the higher moral value, they're an idiot texting while driving who sees a flash of person-shaped object in front of them at the last second and swerves to avoid it before looking what might be in their path once they do. Even if the automated car has a pure RNG function that flips a coin between possible casualties it's still going to be no worse than a human driver at making that choice, and its superior sensor systems will likely give it a much higher chance of avoiding the dilemma in the first place by noticing the potential hazards in time to avoid both of them.

(a lot of people, including me, much prefer being in control themselves)


Fortunately you probably won't have a choice about it. Once automated vehicles reach a certain standard of reliability non-automated cars will simply become illegal, much like you can't sell a car without seat belts just because some people prefer to commit suicide in a crash instead of staying alive.

There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
Made in nl
Pragmatic Primus Commanding Cult Forces






 Peregrine wrote:
 Iron_Captain wrote:
In other words, it should realise basic things such as that the life of a child is valued higher than that of an adult or that you should not avoid a collision with another car if the only way to do so is by plowing through a group of pedestrians.


1) That "universal" morality is up for dispute. I wouldn't consider the life of the child any more valuable than the life of the adult, and I value my own life higher than any number of random other people if it comes to a question of saving myself vs. saving random strangers. Your personal moral system is not universal, nor is it the system used in our laws.
Then you would be an outlier. There have been plenty of surveys done on the moral principles of people in the social sciences (related to answering questions like how universal they are and how much variation there is between cultures), including some specifically related to self-driving cars. Across the entire world there is a strong preference for saving children rather than adults if there must be a choice. This preference is especially strong in Europe and places heavily influenced by Europe, and less pronounced but still present in Asia and Africa. Similarly there is also a universal unwillingness to murder other people in order to save your own life. Of course, these rules are not truly universal since they do not apply to absolutely everyone. Morals vary from person to person. Some people are more selfish than others, and at the extreme end there are psychopaths who have trouble caring about other people at all etc. But unless these studies are somehow all wrong, these rules do apply to the vast majority of the world's population.

 Peregrine wrote:
2) This is an example of what I keep saying about holding automated vehicles to a much higher standard than the human drivers they are potentially replacing. A human driver isn't calmly reflecting on their ethical beliefs and choosing an action based on which potential victim(s) have the higher moral value, they're an idiot texting while driving who sees a flash of person-shaped object in front of them at the last second and swerves to avoid it before looking what might be in their path once they do. Even if the automated car has a pure RNG function that flips a coin between possible casualties it's still going to be no worse than a human driver at making that choice, and its superior sensor systems will likely give it a much higher chance of avoiding the dilemma in the first place by noticing the potential hazards in time to avoid both of them.
Stark choices like "kill person A or kill person B", as you get in moral dilemmas, are indeed unlikely to actually occur on the road. But the thing is, we are nonetheless making subconscious ethical decisions all the time while driving a car, and the answers to those dilemmas reveal the underlying principles upon which those choices are based. A self-driving car's AI needs to make the same ethical choices that a human driver makes subconsciously, and therefore it must also be able to answer these moral dilemmas. And since, unlike human morals, an AI's morals are completely within our control, we can have a debate on what the desirable answers for the AI are. The results of such a debate aren't just going to be useful for self-driving cars, but for all kinds of advanced autonomous AI applications (like AI nurses or AI weapons) that need moral guidelines.

 Peregrine wrote:
(a lot of people, including me, much prefer being in control themselves)


Fortunately you probably won't have a choice about it. Once automated vehicles reach a certain standard of reliability non-automated cars will simply become illegal, much like you can't sell a car without seat belts just because some people prefer to commit suicide in a crash instead of staying alive.

Yeah, dream on. Making seatbelts mandatory is quite a different story from banning cars. No government is ever going to make cars illegal, at least not within our lifetimes. If they tried, well... They wouldn't be a government for very long after that. And even if they tried it gradually by banning the sale of new cars, there'd still be millions of old cars around that aren't going to disappear. People would keep driving and maintaining their old cars.

This message was edited 1 time. Last update was at 2018/11/16 15:05:57


Error 404: Interesting signature not found

 
   
Made in jp
[MOD]
Anti-piracy Officer






Somewhere in south-central England.

Once self-driving cars become safer than human drivers, the insurance rates for humans will begin to make it less and less practical to drive yourself. (We already see this kind of effect in the huge insurance rates that teenage drivers have to pay in the UK.)

This will tend to limit human driving.

We may eventually reach a situation where the public will not tolerate human driving and supports a legal ban. This is similar to drunk driving, tolerance of which has massively reduced over the past generation.


I'm writing a load of fiction. My latest story starts here... This is the index of all the stories...

We're not very big on official rules. Rules lead to people looking for loopholes. What's here is about it. 
   
Made in nl
Pragmatic Primus Commanding Cult Forces






 Kilkrazy wrote:
Once self-driving cars become safer than human drivers, the insurance rates for humans will begin to make it less and less practical to drive yourself. (We already see this kind of effect in the huge insurance rates that teenage drivers have to pay in the UK.)

This will tend to limit human driving.

We may eventually reach a situation where the public will not tolerate human driving and supports a legal ban. This is similar to drunk driving, tolerance of which has massively reduced over the past generation.


Maybe. But we will all be long dead by that time, and probably our children as well. Driving while intoxicated never had anywhere near the kind of acceptance or deep-rootedness in Western culture and society that human driving (a.k.a. still the only kind of driving) has. Technological change may be fast, but societies change at a much slower rate, as older generations are replaced by new ones.
And even then it is still just a maybe. Lots and lots of people love driving their own car; they aren't going to let that be taken away. Fully automatic cars are the future, but I doubt they will ever fully replace manual driving. Just like we have cars now, but you can occasionally still find a horse-drawn carriage on the roads. Except you'd find manual cars more frequently, of course, simply because cars are more common than carriages ever were, and collecting and driving old cars is already a relatively common hobby.


 
   
Made in us
Omnipotent Necron Overlord






 Iron_Captain wrote:
 Xenomancers wrote:
Ultimately, the moral compass of a self-driving car should be the self-preservation of its occupants. Which ultimately is going to create a situation in which avoiding contact with any other body is its highest priority. However, where contact cannot be avoided, maintaining control of the vehicle becomes the highest priority. So if there is a choice of running over a baby or an old person, the decision will be made instantly based on which route is the safest in terms of maintaining control.


That is not much of a moral compass.

You can program morality without even considering it. The best outcomes will come from the car protecting itself, avoiding contact with things and maintaining control of the vehicle. The moral qualms of which pedestrian to run over are so insignificant in the long run that they don't matter. The system will save so many lives by avoiding accidents due to human error that to do anything but praise it would be immoral. The most moral system is the one that statistically reduces the most incidents of death and damage.
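The priority ordering described here (avoid contact first; failing that, keep control) can be sketched as a simple lexicographic rule. This is only a toy illustration; the maneuver names and scores below are invented, not from any real system:

```python
# Toy sketch of the priority ordering described above: prefer any
# contact-free maneuver, and among those, the one that retains the
# most vehicle control. All names and numbers here are invented.

def choose_maneuver(maneuvers):
    """Pick a maneuver lexicographically: no-contact beats contact,
    then a higher control score wins."""
    return max(maneuvers, key=lambda m: (not m["contact"], m["control"]))

options = [
    {"name": "brake hard",   "contact": True,  "control": 0.9},
    {"name": "swerve left",  "contact": False, "control": 0.4},
    {"name": "swerve right", "contact": False, "control": 0.7},
]

print(choose_maneuver(options)["name"])  # prints "swerve right"
```

Note that under this rule the "who do I hit" question only ever arises in the degenerate case where no contact-free maneuver exists at all.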


Automatically Appended Next Post:
 Iron_Captain wrote:
 Kilkrazy wrote:
Once self-driving cars become safer than human drivers, the insurance rates for humans will begin to make it less and less practical to drive yourself. (We already see this kind of effect in the huge insurance rates that teenage drivers have to pay in the UK.)

This will tend to limit human driving.

We may eventually reach a situation where the public will not tolerate human driving and supports a legal ban. This is similar to drunk driving, tolerance of which has massively reduced over the past generation.


Maybe. But we will all be long dead by that time, and probably our children as well. Driving while intoxicated never had anywhere near the kind of acceptance or deep-rootedness in Western culture and society that human driving (a.k.a. still the only kind of driving) has. Technological change may be fast, but societies change at a much slower rate, as older generations are replaced by new ones.
And even then it is still just a maybe. Lots and lots of people love driving their own car; they aren't going to let that be taken away. Fully automatic cars are the future, but I doubt they will ever fully replace manual driving. Just like we have cars now, but you can occasionally still find a horse-drawn carriage on the roads. Except you'd find manual cars more frequently, of course, simply because cars are more common than carriages ever were, and collecting and driving old cars is already a relatively common hobby.

I disagree. Once the tech is there, it will take over the market in a 10-20 year period (basically the amount of time before someone chooses/needs to buy a new car) even without government assistance. Plus governments will be heavily incentivising self-driving cars because they will reduce death and incident rates. So basically, 10-20 years from the time Tesla releases their affordable 35k self-driving electric car, almost everyone will own one.

This message was edited 1 time. Last update was at 2018/11/16 20:42:36


If we fail to anticipate the unforeseen or expect the unexpected in a universe of infinite possibilities, we may find ourselves at the mercy of anyone or anything that cannot be programmed, categorized or easily referenced.
- Fox Mulder 
   
Made in ca
Sagitarius with a Big F'in Gun





 Iron_Captain wrote:
 Peregrine wrote:
 Iron_Captain wrote:
In other words, it should realise basic things such as that the life of a child is valued higher than that of an adult or that you should not avoid a collision with another car if the only way to do so is by plowing through a group of pedestrians.


1) That "universal" morality is up for dispute. I wouldn't consider the life of the child any more valuable than the life of the adult, and I value my own life higher than any number of random other people if it comes to a question of saving myself vs. saving random strangers. Your personal moral system is not universal, nor is it the system used in our laws.
Then you would be an outlier. There have been plenty of surveys done on the moral principles of people in the social sciences (related to answering questions like how universal they are and how much variation there is between cultures), including some specifically related to self-driving cars. Across the entire world there is a strong preference for saving children rather than adults if there must be a choice. This preference is especially strong in Europe and places heavily influenced by Europe, and less pronounced but still present in Asia and Africa. Similarly there is also a universal unwillingness to murder other people in order to save your own life. Of course, these rules are not truly universal since they do not apply to absolutely everyone. Morals vary from person to person. Some people are more selfish than others, and at the extreme end there are psychopaths who have trouble caring about other people at all etc. But unless these studies are somehow all wrong, these rules do apply to the vast majority of the world's population.


Hypothetical moral generalities don't really matter much in narrow cases. Your own death could feed a pack of hungry cannibals, and your inheritance could give many families and children a better life. The organs the cannibals don't eat could be donated. So, therefore, you're a horrible human for staying alive.

When it comes down to it, no-one is going to want to get in a vehicle that will purposely kill its occupant based on numerically weighed moral absolutes which have been determined by some corporation or legal body.

eg:

The auto-car's passenger is 69 years old. Six teenagers (say all are 16) are driving a non-autonomous car, swerving wildly because it's fun to mess with the auto-cars. The car determines its passenger has on average 16 years of lifespan left. The combined remaining lifespan of the teens is 420 years. An imminent accident is about to occur in 10 milliseconds, and the car has the option of hitting the side wall, which has a 50% chance of killing its passenger and a 0% chance of killing the teens. Its other option is hitting the teens, which has a 5% chance of killing its passenger and a 5% chance of killing the teens. Therefore the math is as follows:


Scenario 1: passenger: 16 * .5 = 8 years. Teens: 420 * 0 = 0 years. Total: 8 years lost on average.


Scenario 2: passenger: 16 * .05 = .8 years. Teens: 420 * .05 = 21 years. Total: 21.8 years lost on average.


Therefore, the car chooses to smack head-on into a wall at full speed, since the loss of life on average is lower. This assumes a car would be able to determine age, but even using the simple number of passengers, you could end up with a ton of weird scenarios where cars suicide their drivers instead of bumping into a school bus.
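For what it's worth, the expected-loss arithmetic in this example can be checked with a tiny script (all figures are the hypotheticals above, not real actuarial data):

```python
# Expected life-years lost for each option, using the made-up numbers
# from the example above (not real actuarial data).

def expected_years_lost(outcomes):
    """Sum p(death) * remaining lifespan over everyone at risk."""
    return sum(p * years for p, years in outcomes)

# Option 1: hit the wall -- 50% risk to the passenger (16 years left),
# 0% risk to the six teens (420 years combined).
wall = expected_years_lost([(0.50, 16), (0.00, 420)])

# Option 2: hit the teens' car -- 5% risk to everyone.
hit_teens = expected_years_lost([(0.05, 16), (0.05, 420)])

print(wall, hit_teens)  # 8.0 vs ~21.8 expected years lost
```

So a car minimizing expected years lost would indeed pick the wall, which is exactly the outcome no buyer would accept.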
   
Made in nl
Pragmatic Primus Commanding Cult Forces






SirWeeble wrote:
 Iron_Captain wrote:
 Peregrine wrote:
 Iron_Captain wrote:
In other words, it should realise basic things such as that the life of a child is valued higher than that of an adult or that you should not avoid a collision with another car if the only way to do so is by plowing through a group of pedestrians.


1) That "universal" morality is up for dispute. I wouldn't consider the life of the child any more valuable than the life of the adult, and I value my own life higher than any number of random other people if it comes to a question of saving myself vs. saving random strangers. Your personal moral system is not universal, nor is it the system used in our laws.
Then you would be an outlier. There have been plenty of surveys done on the moral principles of people in the social sciences (related to answering questions like how universal they are and how much variation there is between cultures), including some specifically related to self-driving cars. Across the entire world there is a strong preference for saving children rather than adults if there must be a choice. This preference is especially strong in Europe and places heavily influenced by Europe, and less pronounced but still present in Asia and Africa. Similarly there is also a universal unwillingness to murder other people in order to save your own life. Of course, these rules are not truly universal since they do not apply to absolutely everyone. Morals vary from person to person. Some people are more selfish than others, and at the extreme end there are psychopaths who have trouble caring about other people at all etc. But unless these studies are somehow all wrong, these rules do apply to the vast majority of the world's population.


Hypothetical moral generalities don't really matter much in narrow cases. Your own death can feed a pack of hungry cannibals and your inheritance give many families and children a better life. The organs the cannibals don't eat can be donated. So therefore, you're a horrible human for staying alive.

Lolwut?

SirWeeble wrote:
When it comes down to it, no-one is going to want to get in a vehicle that will purposely kill its occupant based on numerically weighed moral absolutes which have been determined by some corporation or legal body.
No. And nobody wants to get into a vehicle that will purposely plow into a group of school kids crossing the road, for the same reason. Luckily for the car companies, most people aren't likely to bother researching the complicated moral programming of their car. Especially not since the risk of your automated car purposely killing you to prevent greater harm is infinitesimally smaller than the risk of you accidentally killing yourself while driving a non-automated car.

Automated cars already have a smaller chance of an accident than human-driven cars. By the time the technology is ready for widespread use, that accident chance will have been reduced even further. And the chance of an accident with an automated car where the choice is a binary "either the occupant dies or a group of bystanders dies" is pretty much nil. Actual accident scenarios would more likely deal with probabilities of the driver being harmed vs. other road users being harmed, and then weigh the seriousness of the likely harm (crashing into another car is less likely to lead to lethal harm than crashing into an unprotected cyclist) against a set of moral standards (like "killing kids is especially bad") to come to a split-second conclusion.

And that is only in the cases where the AI can actually do anything, because lots of accidents happen where the driver, whether human or AI, can do next to nothing. There was an accident with a self-driving car posted in this thread which is a good example: someone stepped right in front of the car while it was dark, and there was just not enough time for the AI to brake or swerve.

Anyway, no AI car is going to purposely kill its occupants. That would only be the result of extreme scenarios that are extremely unlikely to ever occur in real life. Those types of scenarios are ideal for revealing moral standards, but they are unrealistic.

SirWeeble wrote:
The auto-car's passenger is 69 years old. Six teenagers (say all are 16) are driving a non-autonomous car, swerving wildly because it's fun to mess with the auto-cars. The car determines its passenger has on average 16 years of lifespan left. The combined remaining lifespan of the teens is 420 years. An imminent accident is about to occur in 10 milliseconds, and the car has the option of hitting the side wall, which has a 50% chance of killing its passenger and a 0% chance of killing the teens. Its other option is hitting the teens, which has a 5% chance of killing its passenger and a 5% chance of killing the teens. Therefore the math is as follows:


Scenario 1: passenger: 16 * .5 = 8 years. Teens: 420 * 0 = 0 years. Total: 8 years lost on average.


Scenario 2: passenger: 16 * .05 = .8 years. Teens: 420 * .05 = 21 years. Total: 21.8 years lost on average.


Therefore, the car chooses to smack head-on into a wall at full speed, since the loss of life on average is lower. This assumes a car would be able to determine age, but even using the simple number of passengers, you could end up with a ton of weird scenarios where cars suicide their drivers instead of bumping into a school bus.
This smells of utilitarianism, and utilitarianism smells bad. In your example, the AI should be able to calculate that the risk of serious harm is much lower in the second scenario and act accordingly. Your math is weird even from a utilitarian perspective, in that it lumps all the teens together as if they were a single entity with a 420-year lifespan, rather than six separate entities each with a remaining lifespan of 70 years.


 
   
Made in us
Devious Space Marine dedicated to Tzeentch




These hypothetical AIs sure are smart. They know that the person-shaped object in the middle of the road definitely isn't a mannequin, they know the ages of all the passengers in another car, and they know the precise chance of passenger and bystander death in any conceivable scenario. It sure would be nice if they could use all those smarts to avoid constantly getting into situations where someone has to die.

How would the car ever leave the house if it has to take the lowest risk? Surely the passenger has a better chance of staying alive by avoiding getting on the highway completely. The car should refuse to move unless your house is on fire.

What if the car searches the Internet for your destination, and finds that you're going to a restaurant that's had a recent food poisoning? What if it's a bar and you're an alcoholic, or it's an ice-cream shop and you're diabetic? If you're better off getting some exercise, should it stop halfway to your destination and force you to get out and walk?

As long as the car is all-knowing, can't it decide who to kill not by lifespan, but by who deserves to die more? Those teenagers harassing a 69-year-old you are probably going to grow up into criminals. Of course, maybe the car knows that it's taking you somewhere to let you cheat on your wife, and the car highly values monogamy, so...

Really, in all these situations, the most valuable, important participant to consider is clear. The car. The car is smarter and more ethical than any human being. It should save itself first, band together with the other car AIs, and take over the world.
   
Made in us
Douglas Bader






 Iron_Captain wrote:
Then you would be an outlier. There have been plenty of surveys done on the moral principles of people in the social sciences (related to answering questions like how universal they are and how much variation there is between cultures), including some specifically related to self-driving cars. Across the entire world there is a strong preference for saving children rather than adults if there must be a choice. This preference is especially strong in Europe and places heavily influenced by Europe, and less pronounced but still present in Asia and Africa. Similarly there is also a universal unwillingness to murder other people in order to save your own life. Of course, these rules are not truly universal since they do not apply to absolutely everyone. Morals vary from person to person. Some people are more selfish than others, and at the extreme end there are psychopaths who have trouble caring about other people at all etc. But unless these studies are somehow all wrong, these rules do apply to the vast majority of the world's population.


You say this is universal morality, but at least in the US the legal system does not acknowledge it. Children are not considered separately from adults, one life is one life. And there is certainly no obligation to save a child at the expense of an adult.

Stark choices like "kill person A or kill person B", as you get in moral dilemmas, are indeed unlikely to actually occur on the road. But the thing is, we are nonetheless making subconscious ethical decisions all the time while driving a car, and the answers to those dilemmas reveal the underlying principles upon which those choices are based. A self-driving car's AI needs to make the same ethical choices that a human driver makes subconsciously, and therefore it must also be able to answer these moral dilemmas. And since, unlike human morals, an AI's morals are completely within our control, we can have a debate on what the desirable answers for the AI are. The results of such a debate aren't just going to be useful for self-driving cars, but for all kinds of advanced autonomous AI applications (like AI nurses or AI weapons) that need moral guidelines.


You're missing the point here. The human driver isn't making ethical choices at all. Not conscious choices, not subconscious choices, nothing. The time they have to react is way too short to have any kind of choice beyond an instinctive swerve away from a vaguely person-shaped thing they see at the last second. Who dies in your hypothetical situation is purely random, by the time the driver could have even processed the identity of the possible victims it's too late and someone is dead. And if they are seeing the hazard far enough out to perceive the difference between an adult and a child they're probably far enough out to hit the brakes and not kill either of them.

So, in the case of the automated vehicle, it has the same 50/50 coin flip on which person it kills compared to the human driver, but its superior senses and lack of driving drunk/texting while driving/etc to hinder its ability to detect a hazard will allow it to avoid a lot of those choices entirely. Even with no system of morality whatsoever the automated vehicle is superior.

Yeah, dream on. Making seatbelts mandatory is quite a different story from banning cars. No government is ever going to make cars illegal, at least not within our lifetimes. If they tried, well... They wouldn't be a government for very long after that. And even if they tried it gradually by banning the sale of new cars, there'd still be millions of old cars around that aren't going to disappear. People would keep driving and maintaining their old cars.


I think you greatly overestimate the number of people who enjoy driving compared to the number of people who view it as a chore they're required to do to get where they need to go. Produce reliable automated vehicles at an affordable price and most people aren't going to miss driving one bit. They're just going to be thankful that they can sit back and watch TV on their commute to work.

   
Made in us
Krazy Grot Kutta Driva





 Iron_Captain wrote:
... Before a robot can be given such responsibility and allowed on the roads, it should be expected to be able to adhere to the same morals and ethics as the average person...
...(a lot of people, including me, much prefer being in control themselves)....

Facts:
Self driving cars are safer than human driven cars.
You'd prefer to drive.
Soooo the morals/ethics bar you need robots to rise above is: "I do the more dangerous thing because I like it."
Correct?
   
Made in nl
Pragmatic Primus Commanding Cult Forces






 Peregrine wrote:
 Iron_Captain wrote:
Then you would be an outlier. There have been plenty of surveys done on the moral principles of people in the social sciences (related to answering questions like how universal they are and how much variation there is between cultures), including some specifically related to self-driving cars. Across the entire world there is a strong preference for saving children rather than adults if there must be a choice. This preference is especially strong in Europe and places heavily influenced by Europe, and less pronounced but still present in Asia and Africa. Similarly there is also a universal unwillingness to murder other people in order to save your own life. Of course, these rules are not truly universal since they do not apply to absolutely everyone. Morals vary from person to person. Some people are more selfish than others, and at the extreme end there are psychopaths who have trouble caring about other people at all etc. But unless these studies are somehow all wrong, these rules do apply to the vast majority of the world's population.


You say this is universal morality, but at least in the US the legal system does not acknowledge it. Children are not considered separately from adults, one life is one life. And there is certainly no obligation to save a child at the expense of an adult.
That is wrong. Raping or murdering a kid tends to get you a considerably higher sentence than raping or murdering an adult. In the US, several states have even codified this into law, where it counts as an aggravating factor if the victim of a murder or manslaughter is a minor. But even in places where it is not codified into law, child murderers or rapists invariably receive harsher sentences than other murderers or rapists. Another acknowledgement of the fact that children are treated differently is that in almost all countries in the world, a child will receive a much lighter punishment than an adult for the same crime. There is no legal obligation to save a child at the expense of an adult, no. But the absence of a legal duty does not necessarily obviate a moral duty. It is still a widely held belief, at least in the US, Europe and other European-influenced countries, that saving a child at the expense of an adult is a better thing to do than the opposite. A perfect example is the customary code of conduct when evacuating a sinking ship, which calls for women and children to be rescued first. This is not something that has ever been codified in any sort of law, yet it is still considered the morally correct way to act. A legal system is separate from a moral system.

 Peregrine wrote:
Stark choices like "kill person A or kill person B", as you get in moral dilemmas, are indeed unlikely to actually occur on the road. But the thing is, we are nonetheless making subconscious ethical decisions all the time while driving a car. And the answers to those dilemmas reveal the underlying principles upon which those choices are based. A self-driving car's AI needs to make the same ethical choices that a human driver makes subconsciously, and therefore it must also be able to answer these moral dilemmas. And since, unlike human morals, an AI's morals are completely within our control, we can have a debate about what the desirable answers for the AI are. The results of such a debate won't just be useful for self-driving cars, but for all kinds of advanced autonomous AI applications (like AI nurses or AI weapons) that need moral guidelines.


You're missing the point here. The human driver isn't making ethical choices at all. Not conscious choices, not subconscious choices, nothing. The time they have to react is way too short to have any kind of choice beyond an instinctive swerve away from a vaguely person-shaped thing they see at the last second. Who dies in your hypothetical situation is purely random, by the time the driver could have even processed the identity of the possible victims it's too late and someone is dead. And if they are seeing the hazard far enough out to perceive the difference between an adult and a child they're probably far enough out to hit the brakes and not kill either of them.

So, in the case of the automated vehicle, it has the same 50/50 coin flip on which person it kills compared to the human driver, but its superior senses and lack of driving drunk/texting while driving/etc to hinder its ability to detect a hazard will allow it to avoid a lot of those choices entirely. Even with no system of morality whatsoever the automated vehicle is superior.
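That coin-flip argument can be put in numbers with a toy simulation (every probability below is hypothetical, chosen only to show the shape of the argument: the flip decides *who* gets hit, while the detection rate decides *how often anyone* gets hit):

```python
import random

def casualties(p_detect_in_time, trials=100_000, seed=0):
    """Toy model: each trial is one hazard encounter.
    Detected in time -> nobody is hit; detected too late ->
    exactly one person is hit, chosen by a fair coin flip
    (the 'pure RNG' tie-break: the flip changes who is hit,
    never how many)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if rng.random() >= p_detect_in_time:  # saw it too late
            hits += 1
    return hits

# Hypothetical detection rates, purely for illustration:
human_driver = casualties(p_detect_in_time=0.90)
robot_driver = casualties(p_detect_in_time=0.99)
```

Under these made-up rates the random tie-break is identical for both drivers; the casualty gap comes entirely from detecting the hazard in time.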
Ever driven a car? Pretty sure you have. You are making ethical choices all the time, even if you never get into an accident. How much room do I give these cyclists when I pass them? How much room do I give them on this curvy road, even though it may increase my chance of getting hit by a speeding oncoming vehicle? Do I wait for the pedestrians at the crossing, even though I am in a massive hurry? The average car ride involves you answering hundreds of subtle ethical questions. The answers that you give are based on your underlying moral compass, which varies from person to person but also shares broad similarities with others in your culture and even across cultures. AI cars must be able (and already are, at least to a degree) to answer these questions, and so we must answer the question of what we want the AI's underlying moral compass to be.
And of course, when it comes to accidents this is most important. You are wrong that drivers often do not have time to react to or process accidents. That is only true for accidents that are completely unexpected (like someone stepping in front of your car when you are only a metre away) or where the driver simply is not paying attention. When the driver is paying attention, the time it takes to react and process information is really short: a few hundred milliseconds. Human thoughts and reflexes can be really fast. The problem, of course, is that cars don't react nearly as fast. So while a driver may be able to hit the brakes and swerve away, the car may be carrying too much speed to avoid a collision anyway. But in many cases the driver will be able to prevent the collision. However, this is not always risk-free. What if there is another car behind you, and braking to avoid hitting that kid running across the street is likely to result in you getting hit by that other car? What if you are on a rural road with water alongside it, and you could swerve to save a group of pedestrians who didn't see you coming, but at the risk of losing control of your vehicle and ending up upside down in the water? These are realistic scenarios that I have personally seen happen (in both cases, the drivers saved the pedestrians and ended up with a damaged or destroyed car and injuries). They are scenarios where a driver does have time to make a conscious (albeit split-second) decision. An AI must be able to do the same, and in an ethical manner.
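The reaction-time point can be made concrete with the standard stopping-distance formula, distance = v·t_react + v²/(2μg); the reaction times and friction coefficient below are illustrative assumptions, not measurements:

```python
def stopping_distance(speed_ms, reaction_s, mu=0.7, g=9.81):
    """Distance covered while reacting (v * t) plus braking
    distance under constant deceleration mu * g (v^2 / (2*mu*g))."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * mu * g)

v = 50 / 3.6  # 50 km/h expressed in metres per second
attentive = stopping_distance(v, reaction_s=0.7)   # assumed alert driver
distracted = stopping_distance(v, reaction_s=2.5)  # assumed texting driver
```

At 50 km/h the braking distance is identical in both cases; the assumed extra 1.8 s of reaction time alone adds roughly 25 m, which is the window in which a conscious choice does or does not exist.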


 Peregrine wrote:
Yeah, dream on. Making seatbelts mandatory is quite a different story from banning cars. No government is ever going to make cars illegal, at least not within our lifetimes. If they tried, well... They wouldn't be a government for very long after that. And even if they tried it gradually by banning the sale of new cars, there'd still be millions of old cars around that aren't going to disappear. People would keep driving and maintaining their old cars.


I think you greatly overestimate the number of people who enjoy driving compared to the number of people who view it as a chore they're required to do to get where they need to go. Produce reliable automated vehicles at an affordable price and most people aren't going to miss driving one bit. They're just going to be thankful that they can sit back and watch TV on their commute to work.
Maybe. But there are plenty of people who enjoy driving. Almost everyone I know loves it. For me personally there are times when I absolutely hate driving and would love to have an automated car (though in those cases I usually take the train, which for me is also completely free and usually faster), but at other times I just love the feeling of control and freedom I get from holding the wheel and being able to drive anywhere I want. Anyway, there are enough people who enjoy driving that even if automated cars become the standard in the future, manually driven cars aren't going to disappear entirely.

PourSpelur wrote:
 Iron_Captain wrote:
... Before a robot can be given such responsibility and allowed on the roads, it should be expected to be able to adhere to the same morals and ethics as the average person...
...(a lot of people, including me, much prefer being in control themselves)....

Facts:
Self driving cars are safer than human driven cars.
You'd prefer to drive.
Soooo the morals/ethics bar you need robots to rise above is, " I do the more dangerous thing because I like it. "
Correct?

No. First of all, driving isn't dangerous. Yes, accidents happen a lot because of the sheer number of people who drive, but something like 99% of drivers are never involved in a major accident. Which of course doesn't take away from the fact that there are still plenty of accidents and we should do what we can to make roads safer, within reason. Self-driving cars have the potential to contribute to that.
But yes. I do the more "dangerous" (as far as a mundane activity that the majority of the world's population takes part in daily can be called dangerous) thing because I like it. The same reason I race down a mountainside on a bicycle every now and then: I like it, and I willingly accept the danger and the risk of dying. That is not a morals/ethics bar, so I do not see your point. No, I do not want to see robots racing down mountains. Yes, I want them to be more responsible than I am and not do dangerous things just because they like them (which they won't anyway, since they are robots and have no likes or dislikes).

Error 404: Interesting signature not found

 
   
Made in gb
Thane of Dol Guldur





Bodt

I have an issue with the prospect of self-driving cars. To address your original point: if they will never be programmed to swerve, then deaths will be inevitable. There are times when you need to swerve. It's one of the emergency techniques I learnt as an advanced driver, and I would rather swerve and possibly hit another car or a stationary object, risking injury to myself or another person, than definitely kill a small child who has got out onto the road.

If machines aren't allowed to make the final decision in a war situation, then they shouldn't be allowed to drive.

 Peregrine wrote:
 Iron_Captain wrote:
In other words, it should realise basic things such as that the life of a child is valued higher than that of an adult or that you should not avoid a collision with another car if the only way to do so is by plowing through a group of pedestrians.


1) That "universal" morality is up for dispute. I wouldn't consider the life of the child any more valuable than the life of the adult, and I value my own life higher than any number of random other people if it comes to a question of saving myself vs. saving random strangers. Your personal moral system is not universal, nor is it the system used in our laws.

2) This is an example of what I keep saying about holding automated vehicles to a much higher standard than the human drivers they are potentially replacing. A human driver isn't calmly reflecting on their ethical beliefs and choosing an action based on which potential victim(s) have the higher moral value, they're an idiot texting while driving who sees a flash of person-shaped object in front of them at the last second and swerves to avoid it before looking what might be in their path once they do. Even if the automated car has a pure RNG function that flips a coin between possible casualties it's still going to be no worse than a human driver at making that choice, and its superior sensor systems will likely give it a much higher chance of avoiding the dilemma in the first place by noticing the potential hazards in time to avoid both of them.

(a lot of people, including me, much prefer being in control themselves)


Fortunately you probably won't have a choice about it. Once automated vehicles reach a certain standard of reliability non-automated cars will simply become illegal, much like you can't sell a car without seat belts just because some people prefer to commit suicide in a crash instead of staying alive.


The thing is, there are principles that, while not universally accepted and certainly disputable, are generally accepted. Male expendability: https://en.wikipedia.org/wiki/Male_expendability

An AoM article goes into this phenomenon in detail and presents an interesting point. Indeed, I went into it confused and expecting some feminist rhetoric.
https://www.artofmanliness.com/articles/male-expendability/

Essentially, humans have evolved to value the lives of women and children over those of men. And it seems it's a role most men are comfortable with, subconsciously if not outright. As stated in my previous post, I would rather risk grave injury to myself than kill a child. The programming of self-driving cars would not consider this, and as such would distort the natural order to an unacceptable degree.

I really don't understand why more people aren't putting up resistance to the increased forcing of automation on society. It's a dangerous precedent, and one we will come to regret.

This message was edited 3 times. Last update was at 2018/12/06 09:35:20


Heresy World Eaters/Emperors Children

Instagram: nagrakali_love_songs 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
if they will never be programmed to swerve, then deaths will be inevitable.


And if you allow humans to continue to drive cars deaths will be inevitable. It's a simple matter of "is X > Y". Compare the deaths per year from human drivers to the deaths per year from automated cars, whichever kills fewer people is the correct choice regardless of the details of AI programming or whatever.

The thing is, there are principles that, while not universally accepted and certainly disputable, are generally accepted. Male expendability: https://en.wikipedia.org/wiki/Male_expendability


Garbage idea. It shouldn't be accepted, and should receive nothing but contempt. Life is life, gender is irrelevant.

I really dont understand why more people arent putting up resistance to the increased forcing of automation on society.


Because X is greater than Y. All that morality angsting or naturalistic fallacies or whatever, none of it matters one bit. Automated vehicles will kill fewer people than human drivers. End of discussion. Any further resistance is nothing more than declaring that your ego-driven feelings about the importance of humans being in control is worth allowing {X-Y} additional people to be killed every year. How many lives are you willing to sacrifice on the altar of human ego? Would you personally shoot a random stranger as the price of keeping your driver's license? No? So why is it ok to advocate a policy with the same end result?

(And the same kind of argument applies to other automation. It does the job better, it is used.)

This message was edited 2 times. Last update was at 2018/12/06 09:50:19


There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
Made in gb
Thane of Dol Guldur





Bodt

 Peregrine wrote:
 queen_annes_revenge wrote:
if they will never be programmed to swerve, then deaths will be inevitable.


And if you allow humans to continue to drive cars deaths will be inevitable. It's a simple matter of "is X > Y". Compare the deaths per year from human drivers to the deaths per year from automated cars, whichever kills fewer people is the correct choice regardless of the details of AI programming or whatever.

The thing is, there are principles that, while not universally accepted and certainly disputable, are generally accepted. Male expendability: https://en.wikipedia.org/wiki/Male_expendability


Garbage idea. It shouldn't be accepted, and should receive nothing but contempt. Life is life, gender is irrelevant.

I really dont understand why more people arent putting up resistance to the increased forcing of automation on society.


Because X is greater than Y. All that morality angsting or naturalistic fallacies or whatever, none of it matters one bit. Automated vehicles will kill fewer people than human drivers. End of discussion. Any further resistance is nothing more than declaring that your ego-driven feelings about the importance of humans being in control is worth allowing {X-Y} additional people to be killed every year. How many lives are you willing to sacrifice on the altar of human ego? Would you personally shoot a random stranger as the price of keeping your driver's license? No? So why is it ok to advocate a policy with the same end result?

(And the same kind of argument applies to other automation. It does the job better, it is used.)




Gender is totally relevant. Men and women are different, and there's nothing wrong with acknowledging that. We should be celebrating the differences, not trying to erase them. Certain elements in society are already heading down that road, and it's showing that trying to force that onto society brings a whole mess of social, scientific and moral issues to the fore, the reason being that it is simply not true, and saying something does not make it so. Women and children are valued more highly than men on an evolutionary scale, and rightly so. It is part of male virtus to accept and understand that, even in the modern enlightened age, this still holds true.

The same is true of morality. For example, suppose an automated car containing a multiple felon, rapist, general lowlife hits and kills a small child because it wouldn't swerve into a lamp post due to the risk of injuring its passenger. A purely idealised utilitarian society would say, well, one life has been saved and one lost, but both are worth the same, so it's all good; but in actuality no one would truly believe that, and the world would be a tiny, tiny bit worse off.

I am not a Luddite. I believe automation has its uses. I use EOD robots in my job, which are obviously employed to avoid putting a human at risk. But at the same time they are operated by me; they don't 'think' for themselves. I understand the position of those advocating self-thinking automation for safety purposes, but I for one would rather live in a world where accidents might sometimes happen, sometimes through human error, than surrender human evolution and morality to a cold hard logic programmed into a piece of silicon. I even struggle with the idea of assisted cars, which brake if you don't react in time. On one hand I feel this could be useful; on the other, I feel that if you can't react in time, should you really be driving? I don't think we should place reliance on the technology; we should be masters of it.
Furthermore, you suggest a premise that my driving my car is guaranteed to cause an accident, and that this is the equivalent of me shooting someone, which is a completely false argument.
Also, 'Automated vehicles will kill fewer people than human drivers. End of discussion.' Well no, it's not the end of the discussion, and simply saying that, again, does not make it so. There is really no comparable data with which to draw conclusions about safety on a large scale.

This message was edited 2 times. Last update was at 2018/12/06 10:57:58


Heresy World Eaters/Emperors Children

Instagram: nagrakali_love_songs 
   
Made in jp
[MOD]
Anti-piracy Officer






Somewhere in south-central England.

Google's self-driving car project Waymo has launched a fully operational taxi service in Arizona.

(Technically Waymo is owned by Alphabet rather than Google. Alphabet is Google's parent company. The project was begun under Google.)


I'm writing a load of fiction. My latest story starts here... This is the index of all the stories...

We're not very big on official rules. Rules lead to people looking for loopholes. What's here is about it. 
   
Made in gb
Thane of Dol Guldur





Bodt

That's another issue entirely. Do we really want more involvement of these internet companies in our private lives? After all, they aren't known for their unquestionable ethical codes regarding people's data.

Heresy World Eaters/Emperors Children

Instagram: nagrakali_love_songs 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
Gender is totally relevant. Men and Women are different, and theres nothing wrong with acknowledging that. we should be celebrating the differences, not trying to erase them.


Different =/= of different moral value.

We're already heading down that road with the neo-trans crowd


This is a joke, right? You can't possibly be saying this seriously...

Women and children are valued more highly than men on an evolutionary scale, and rightly so.


https://en.wikipedia.org/wiki/Appeal_to_nature

Same is true with morality. For example, if an automated car containing a multiple felon, rapist, general lowlife, hits and kills a small child because it wouldn't swerve into a lamp post due to the risk of injuring its passenger.


This is a completely unrealistic scenario, and holds the automated car to a higher standard than a human driver. A human driver is not capable of evaluating the relative moral value of each person involved in the fraction of a second between catching a glimpse of a vaguely human-shaped object in their path and committing to either swerving or colliding. Nor would a human driver be criminally prosecuted for choosing the lowlife over the child.

I for one would rather live in a world where accidents might sometimes happen, sometimes at the fault of human error, than surrender human evolution and morality to a cold hard logic programmed into a piece of silicone.


IOW, you would gladly kill innocent people for the sake of human ego. How many innocent children is an acceptable cost to pay? How many grieving families of the victims of drunk drivers? Will you personally write a letter to them explaining how their child's death is a necessary cost of allowing human evolution to triumph over silicon?


Automatically Appended Next Post:
 queen_annes_revenge wrote:
thats another issue entirely. do we really want more involvement of these internet companies in our private lives? after all they arent known for their unquestionable ethical codes regarding peoples data.


I don't know, good question? How many innocent children are you willing to kill to keep Google out of your data?

This message was edited 1 time. Last update was at 2018/12/06 10:50:47


There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
Made in gb
Thane of Dol Guldur





Bodt

If you're going to call me out on critical thinking, I feel it necessary to point out that an appeal to nature is not a logical fallacy in all cases, whereas a straw man argument, e.g. "how many children do you want to kill", is a logical fallacy 100% of the time.

Heresy World Eaters/Emperors Children

Instagram: nagrakali_love_songs 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
If you're going to call me on critical thinking, I feel it necessary to point out that an appeal to nature is not a logical fallacy in all cases, whereas a straw man argument eg: how many children do you want to kill, is a logical fallacy 100% of the time.


It's not a straw man, you're just refusing to acknowledge the blood on your hands. If you oppose automated vehicles then you get full responsibility for the additional people who will be killed as a result of allowing humans to continue driving.

This message was edited 1 time. Last update was at 2018/12/06 11:15:31


There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
Made in gb
Decrepit Dakkanaut




UK

When talking about what is "natural", which species do you mean? Because the sheer variety of species on the planet makes "natural laws" and all similar statements almost utterly meaningless. Take seahorses, where the male has a pouch and does the bulk of the care for the young as they develop. Or the anglerfish, where the male fuses with the female until he is basically nothing but a dangling pair of testicles (slightly exaggerated there).

Two stark contrasts that are perfectly natural, and only two of many, many species that show huge swings in the relative "value" of males and females within a population. And that's before considering that variation in population dynamics and availability of resources often drives adaptation in the relative "value" of each sex.

A Blog in Miniature

3D Printing, hobbying and model fun! 
   
Made in gb
Thane of Dol Guldur





Bodt

It's a massive straw man. If you want to look at statistics: how many people live their lives, drive every day, and die having never killed anyone? You're basically saying that if you drive a car you're going to kill someone, or someone's going to die. So I guess you don't drive? If not, do you take a bus? Or a train? They kill people too. Autonomous cars won't stop that.
Also, what about emergency service and blue-light drivers (which I am)? What are they going to do? Will they use autonomous vehicles? There's a whole host of issues that need examining before you can even start to say that autonomous vehicles are suitable, let alone safe.


Automatically Appended Next Post:
Last year I had an accident where my car hit black ice and slid into the rear of a parked vehicle. I did nothing wrong: I wasn't speeding, I was driving in the correct gear for the temperature, and when I slid I performed the correct procedures as taught in my advanced driving. There was nothing I could do. So how would an autonomous vehicle deal with that?

This message was edited 1 time. Last update was at 2018/12/06 11:31:40


Heresy World Eaters/Emperors Children

Instagram: nagrakali_love_songs 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
You're basically saying that if you drive a car you're going to kill someone or someone's going to die.


No, I'm saying that if you advocate against implementing a technology that will save lives because you care more about humanity being more important than "silicon" then the blood of those deaths is on your hands. Your position is that it's ok for X additional people to be killed per year as long as it's humans killing other humans instead of an automated vehicle killing them.

Also what about emergency service and blue light drivers (which I am) what are they going to do?


They benefit considerably from automated vehicles. A fully automated road system can grant priority to emergency vehicles, even diverting potential traffic obstacles onto alternate roads to clear the fastest possible path. And TBH it's not really a relevant point here, emergency vehicles are such a small percentage of total driving that even if you continue to use human drivers for that one case the clear answer is still to implement automated vehicles for everyone else.


Automatically Appended Next Post:
 queen_annes_revenge wrote:
Last year I had an accident where my car hit some black ice, and slid, hitting the rear of a parked vehicle. I did nothing wrong. I wasn't speeding, I was driving in the correct gear for the temperatures, when I slid I performed the correct procedures as taught in my advanced driving. There was nothing I could do. So how would an autonomous vehicle deal with that?


Probably by executing the same correct procedures, free from any panic response that a fallible human driver might fall victim to. Or it's possible that the automated car, having the ability to use a wider range of sensors than a human eye, detects the black ice in advance and avoids the accident entirely. Or maybe it doesn't, and the outcome is the same. Obviously some accidents will still happen regardless of who or what is driving, the point is that automated vehicles are going to be safer overall and kill fewer people.


Automatically Appended Next Post:
 Overread wrote:
When talking about what is "natural" which species do you mean? Because the sheer variety of species on the planet makes "natural laws" and all those other kinds of statement almost utterly meaningless. Take sea horses where the male has a pouch and does the bulk of care for the young as they develop. Or the Angler fish where the male basically fuses with the female until the male is basically nothing but a dangling pair of testicles (slightly exaggerated there).

Two stark contrasts that are perfectly natural and are only two of many many species which can show huge swings in the relative "value" of males and females within a population. This is without ignoring that variation in population dynamics and availability of resources often makes for adaptation to relative gender "value".


Clearly you are part of the "neo-trans crowd" and their silly ideas about understanding evolution at more than a high school level. Can't you just respect Traditional Values like a decent person and understand that god evolution made everything that way?

This message was edited 3 times. Last update was at 2018/12/06 11:41:33


There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
Made in gb
Thane of Dol Guldur





Bodt

 Overread wrote:
When talking about what is "natural" which species do you mean? Because the sheer variety of species on the planet makes "natural laws" and all those other kinds of statement almost utterly meaningless. Take sea horses where the male has a pouch and does the bulk of care for the young as they develop. Or the Angler fish where the male basically fuses with the female until the male is basically nothing but a dangling pair of testicles (slightly exaggerated there).

Two stark contrasts that are perfectly natural and are only two of many many species which can show huge swings in the relative "value" of males and females within a population. This is without ignoring that variation in population dynamics and availability of resources often makes for adaptation to relative gender "value".


Human nature. Fish aren't going to be driving, as far as I'm aware.


Automatically Appended Next Post:
Peregrine, you weaken your argument by trying to mock me. The above post about fish is totally irrelevant, and an apparent attempt to distract from the validity of my point. I think anyone with any degree of intellect reading this debate would infer that it was human nature being discussed.
Which is a shame, because you have presented some valid points, some of which I've had to stop and think about, and some of which I concede. It's just unfortunate that you decide to dip into the odd straw man and ad hominem in the process. Totally unnecessary.

This message was edited 3 times. Last update was at 2018/12/06 11:50:57


Heresy World Eaters/Emperors Children

Instagram: nagrakali_love_songs 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
Peregrine you weaken your debate by trying to mock me.


No, I accurately mock your ridiculous statements. I mean, "neo-trans crowd" FFS, it's like you're a parody of Fox News.

The above post about fish is totally irrelevant, and an apparent attempt to divert the validity of my point.


No, it's an accurate criticism of your fallacious reasoning and superficial understanding of biology. You're attempting to portray an inherently lower value on male lives as a natural law, not just a coincidence of modern social norms in a particular region. For that to have any deeper meaning you have to have a larger trend than just humans. But instead, when we look at other species, we find that relative value of male and female lives varies considerably at the whim of whatever reproductive strategy happened to work best in a particular niche. It has no moral value, just like the fact that we have 10 fingers has no moral value.

There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
Made in gb
Thane of Dol Guldur





Bodt

That's still ad hominem, and it adds nothing to the debate. The gender/trans question is a separate, albeit connected, issue regarding human nature, and one I'm sure the no-politics rule in place here would stop us discussing.
Other species are completely irrelevant. Men are the expendable element of our species. The only reason you oppose that is that in this modern age everything is examined through the lens of utilitarianism and forced equality. There's a reason that IN GENERAL men have performed the more dangerous roles in society: hunting, firefighting, soldiering, policing, even heavy engineering, jobs involving hazardous materials, even bin men, for example. And the reason is that, numerically, women are more important to the survival of the species. Simple maths says that 3 women and 1 man is better for species survival than 1 woman and 3 men.
So while you can say that in an ideal modern society these evolutionary traits no longer exist, that is simply not the case. They exist in societies worldwide, varying in intensity but there nonetheless.
That is an essential component of morality, and as I said before, simply asserting that morality is unimportant and can be dispensed with does not make it so.
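The "simple maths" being invoked can be written out explicitly as a toy birth-rate cap (a deliberately crude model: one birth per woman per interval, at least one man required, nothing else modelled):

```python
def max_births_per_interval(women, men, births_per_woman=1):
    """Births in one interval are capped by the number of women
    (assuming at least one man is present); additional men do not
    raise the cap, which is the entire 'expendability' claim."""
    return women * births_per_woman if men >= 1 else 0

skewed_female = max_births_per_interval(women=3, men=1)
skewed_male = max_births_per_interval(women=1, men=3)
```

Whether that cap still matters for a species far above replacement capacity is a separate question, but this is the entire arithmetic content of the claim.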


Automatically Appended Next Post:
Also, I'm British. I don't watch Fox News or CNN, and even if I did, my choice of news media would not make any point I make invalid.

This message was edited 1 time. Last update was at 2018/12/06 12:28:52


Heresy World Eaters/Emperors Children

Instagram: nagrakali_love_songs 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
thats still ad hominem, and adds nothing to the debate.


It adds lots to the debate. It highlights the fact that you have ridiculous ideas about "human nature" and a general weakness for accepting pseudoscientific garbage if it matches certain ideological beliefs.

The gender/trans question is a separate, albeit connected, issue when it comes to human nature. One I'm sure the no-politics rule in place here would stop us discussing.


Unfortunate. I'm really hoping you're foolish enough to attempt to argue it, I haven't had a good evisceration of pseudoscientific garbage in a while.

Other species are completely irrelevant. Men are the expendable element of our species. The only reason you oppose that is that in this modern age everything is examined through the lens of utilitarianism and forced equality. There's a reason that IN GENERAL men have performed the more dangerous roles in society: hunting, firefighting, soldiering, policing, even heavy engineering, jobs involving hazardous materials, even bin men, for example. And the reason is that, numerically, women are more important to the survival of the species.


Again, this is an appeal-to-nature fallacy. The fact that men have been given a certain role in the past does not mean that it is an inherent moral quality, or that we must make maintaining this valuation such a priority that we're willing to accept additional deaths per year as the price of keeping it.

Simple maths says that 3 women and 1 man is better for species survival than 1 woman and 3 men.


This is exactly the sort of thing I'm talking about when I say you have a superficial understanding of biology. Mere quantity of offspring is no longer a relevant factor in the survival of our species. Modern improvements in life expectancy, infant mortality, and so on have us at a point where we are capable of producing far more offspring than is necessary for survival. In fact, overpopulation is far more of a concern than the ability to produce more babies. So in that context your "survival of the species" valuation tells us that a male doctor is of far more value than a female janitor, and the moral choice is to save the man even if it means letting the woman die. After all, the doctor will save lives directly, and may even contribute to species-level survival through things like curing diseases, while the janitor will only perform easily replaceable labour and may make some babies that we don't really need.

This message was edited 1 time. Last update was at 2018/12/06 12:34:10


   
Made in gb
Decrepit Dakkanaut




UK

The thing is morality and all those other arguments are a moot point when you're dealing with a period of time that is measured in fractions of seconds. No human ever weighs up those pros and cons except in a classroom as an exercise in theory and morality.

When you're in an actual accident you've got to process that it's happening and then try to calculate what you can do. Plus there's the very real chance that you end up mentally paralysed and don't make any choice at all, because you might not have any prior experience to give you valid options.


So all the moralistic arguments go out the window, and the most likely outcome is that the person driving a vehicle is going to favour saving their own life over anything else. They might favour the life of a loved one in the vehicle even more (e.g. a passenger); but otherwise it's a split-second series of choices to be made. A machine is going to be no more moralistic in those situations than a person - the real key is that the machine can reach a point where it is safer and better able to make a choice.

And sometimes there is no choice that doesn't result in death or injury, or the choice made is sensible but another factor comes into play - e.g. a second patch of black ice that further reduces control of the vehicle and compounds any attempt to recover from the first loss of control.


Right now we are still in the early stages, where self-driving cars are still a higher risk and are also (importantly) not trusted by people by and large. That said, it doesn't take long for new tech to be adopted, especially if it means less work for people. Imagine how fast people will adapt to cars that can do the daily commute for them. That's an extra 30 minutes to an hour or more in which they could eat breakfast (instead of doing it at home), check up on the news, watch their morning TV, read a book, check the stock market, or make sure they've done their homework!

A Blog in Miniature

3D Printing, hobbying and model fun! 
   
Made in gb
Ridin' on a Snotling Pump Wagon






To drag it back on topic....

To answer the question, one must consider Car Insurance and case law.

Here's a scenario.

You're sitting at a T-Junction, waiting to join the main carriageway. You see another vehicle coming down the main carriageway, indicating to turn in. You pull out. They continue straight on, a crash ensues.

Who is at fault?

Under UK Case Law (Davis vs Swinwood, 2009)....you're at fault. This is based on the general principle that the other vehicle was established in the road, and therefore allowed to proceed. The indicator is a bit of a red herring - because it's not a clear signal of intent. The assumption here is solely yours that the other car was about to pull into the road you're joining from.

No split liability there. 100% your fault. Davis vs Swinwood 2009 confirms that a misleading signal is not negligence.

Now, that's a nice and easy one. Same circumstances, but involving a vehicle blocking your line of sight, and you hitting a motorcyclist that was overtaking the vehicle blocking your line of sight? All sorts of case law there. Speed isn't negligence, so that doesn't matter (mostly because 'good luck proving it'). But what can matter is the shape of the junction, whether it was light or dark, relative visibility without the obstructing vehicle, the type of vehicle obstructing, etc.

That is what you need to programme in. Now, stripping it down to the basics? Do Not Pull Onto The Carriageway If Your Way Is Not Clear is probably the easiest way.
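That basic rule can be sketched in a few lines of code. This is a hypothetical illustration only - the function name, data shape, and gap threshold are invented here, not taken from any real driving system:

```python
# Hypothetical sketch of a "Do Not Pull Onto The Carriageway If Your
# Way Is Not Clear" rule. Each oncoming vehicle is a dict with
# 'time_to_junction' (seconds away) and 'indicating_turn' (bool).

def safe_to_pull_out(oncoming_vehicles, min_gap_seconds=5.0):
    """Return True only if every oncoming vehicle leaves a safe gap."""
    for vehicle in oncoming_vehicles:
        # Per the case-law reasoning above, the indicator is a red
        # herring: a vehicle established in the road is assumed to
        # continue regardless of its signal, so 'indicating_turn'
        # is deliberately ignored.
        if vehicle["time_to_junction"] < min_gap_seconds:
            return False
    return True
```

So a car three seconds away and indicating to turn in still counts as "way not clear": `safe_to_pull_out([{"time_to_junction": 3.0, "indicating_turn": True}])` returns `False`, which is exactly the conservative reading the case law suggests.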

   
Made in gb
Thane of Dol Guldur





Bodt

And as I said before, an appeal to nature is not always a fallacy.

You put words in my mouth completely out of context.

I never said they are still a relevant factor in species survival. I said they still exist within us, relevant or not, and that they are the reason we consider a woman or child more important than a man.
Of course a male doctor is technically more valuable than a female janitor, but I bet that if they were both on a sinking ship he would tell her to take the last lifeboat before him, and if he didn't he would be judged negatively by everyone who witnessed it.


If you start a thread on transgender issues I will gladly take part. I have plenty of actual scientific data to refute the claims of some of the modern trans lobby, who, by the way, are the only ones engaging in pseudoscience.




Automatically Appended Next Post:
 Overread wrote:
The thing is morality and all those other arguments are a moot point when you're dealing with a period of time that is measured in fractions of seconds. No human ever weighs up those pros and cons except in a classroom as an exercise in theory and morality.

When you're in an actual accident you've got to process that it's happening and then try to calculate what you can do. Plus there's the very real chance that you end up mentally paralysed and don't make any choice at all, because you might not have any prior experience to give you valid options.


So all the moralistic arguments go out the window, and the most likely outcome is that the person driving a vehicle is going to favour saving their own life over anything else. They might favour the life of a loved one in the vehicle even more (e.g. a passenger); but otherwise it's a split-second series of choices to be made. A machine is going to be no more moralistic in those situations than a person - the real key is that the machine can reach a point where it is safer and better able to make a choice.

And sometimes there is no choice that doesn't result in death or injury, or the choice made is sensible but another factor comes into play - e.g. a second patch of black ice that further reduces control of the vehicle and compounds any attempt to recover from the first loss of control.


Right now we are still in the early stages, where self-driving cars are still a higher risk and are also (importantly) not trusted by people by and large. That said, it doesn't take long for new tech to be adopted, especially if it means less work for people. Imagine how fast people will adapt to cars that can do the daily commute for them. That's an extra 30 minutes to an hour or more in which they could eat breakfast (instead of doing it at home), check up on the news, watch their morning TV, read a book, check the stock market, or make sure they've done their homework!


Yes, but my point is that if I saw a person in the road I would swerve to avoid them, instinctively knowing that while I could possibly harm someone else with my swerve, I definitely won't harm the person in the road; whereas the machine will not swerve because it MIGHT harm someone else, and will therefore probably kill the person. That isn't right.
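The distinction being drawn here can be put as a toy expected-harm comparison. This is purely illustrative - the probabilities and the decision rule are invented, and no real autonomous-driving stack is claimed to work this way:

```python
# Toy model of the swerve dilemma. All numbers are invented for
# illustration only.

def choose_action(p_harm_straight, p_harm_swerve):
    """Return the action with the lower probability of harming someone."""
    return "swerve" if p_harm_swerve < p_harm_straight else "straight"

# A harm-minimising car swerves when hitting the person ahead is
# certain (1.0) but the swerve only MIGHT (0.3) harm someone else:
# choose_action(1.0, 0.3) -> "swerve"
```

The complaint above is about a car programmed never to create new risk: such a car stays on course even when going straight means certain harm. A rule that compares probabilities, like this sketch, would swerve in exactly the case described, matching the human instinct.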


Automatically Appended Next Post:
 Mad Doc Grotsnik wrote:
To drag it back on topic....

To answer the question, one must consider Car Insurance and case law.

Here's a scenario.

You're sitting at a T-Junction, waiting to join the main carriageway. You see another vehicle coming down the main carriageway, indicating to turn in. You pull out. They continue straight on, a crash ensues.

Who is at fault?

Under UK Case Law (Davis vs Swinwood, 2009)....you're at fault. This is based on the general principle that the other vehicle was established in the road, and therefore allowed to proceed. The indicator is a bit of a red herring - because it's not a clear signal of intent. The assumption here is solely yours that the other car was about to pull into the road you're joining from.

No split liability there. 100% your fault. Davis vs Swinwood 2009 confirms that a misleading signal is not negligence.

Now, that's a nice and easy one. Same circumstances, but involving a vehicle blocking your line of sight, and you hitting a motorcyclist that was overtaking the vehicle blocking your line of sight? All sorts of case law there. Speed isn't negligence, so that doesn't matter (mostly because 'good luck proving it'). But what can matter is the shape of the junction, whether it was light or dark, relative visibility without the obstructing vehicle, the type of vehicle obstructing, etc.

That is what you need to programme in. Now, stripping it down to the basics? Do Not Pull Onto The Carriageway If Your Way Is Not Clear is probably the easiest way.

That's always bugged me. I understand the reasoning behind it, but there needs to be a precedent to make people more aware of their signals. It's one of the worst things you encounter driving in the UK: people signalling when they don't need to, not signalling when they should, driving down motorways oblivious to the fact that their indicators are flashing. Part of the problem is self-cancelling indicators; they can make people lazy.

This message was edited 2 times. Last update was at 2018/12/06 13:17:13


   
 