Self-driving cars -- programming morality




Made in gb
Fixture of Dakka




UK

 queen_annes_revenge wrote:


Yes, but my point is that if I saw a person in the road, I would swerve to avoid them, instinctively knowing that while I could possibly harm someone else with my swerve, I definitely won't harm that person in the road, whereas the machine will not swerve because it MIGHT harm someone else, and therefore will probably kill the person. That isn't right.


But you're loading that statement.

First up, you're stating that you don't know if you're going to hurt anyone else by swerving to avoid that one person. That makes perfect sense for a human. A machine, in theory, should have 360-degree view and awareness. It should be able to see other cars around it, other people, other elements. It can make a choice based upon far more data, far quicker than the person, and without the emotional baggage and panic that sets in. Ergo it can choose to swerve and save that pedestrian, and the driver, and the two kids walking on the pavement beside the car, because it knows there's nothing on the other side of the road, so it knows the safe direction to swerve in. Whilst a human might only swerve to avoid one, and might well swerve out of the road toward the edge and then hit the two kids.


In theory the machine has a higher chance of saving more lives - provided it is properly programmed and has a working system that can reliably identify people in the environment around it.

Plus as noted earlier, self-driving cars could inter-communicate. So an incident for one car can cause others to react behind and in front of it. So your car is swerving, and at the very same time the car right behind you is hard braking too, whilst the car on the other side of the road is also hard braking. Now the cars have stopped and prevented a pileup.
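As a toy sketch of that 360-degree-awareness argument, something like the following. Everything here (the sector names, the tie-breaking rule) is invented for illustration and bears no relation to any real vehicle's control logic:

```python
# Hypothetical sketch: pick an evasive manoeuvre from 360-degree sensor data.
# Sector names and the cost rule are illustrative assumptions only.

def choose_manoeuvre(occupancy):
    """occupancy maps each sector around the car to a list of detected
    road users, e.g. {"ahead": ["pedestrian"], "left": [], "right": ["cyclist"]}."""
    options = {
        "brake_straight": occupancy.get("ahead", []),
        "swerve_left": occupancy.get("left", []),
        "swerve_right": occupancy.get("right", []),
    }
    # Prefer the option whose path contains the fewest detected road users;
    # ties fall back to braking in lane, the most predictable action.
    return min(options, key=lambda o: (len(options[o]), o != "brake_straight"))

print(choose_manoeuvre({"ahead": ["pedestrian"], "left": [], "right": ["cyclist"]}))
# -> swerve_left: the only clear sector
```

The point the sketch makes is the one in the post: a human picks from what they happen to see; a machine with full occupancy data can pick the genuinely empty direction.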

This message was edited 1 time. Last update was at 2018/12/06 13:20:24


A Blog in Miniature - now featuring reviews of many new Black Library books (latest Novellas) 
   
Made in jp
[MOD]
Anti-piracy Officer






Somewhere in southern England.

 Mad Doc Grotsnik wrote:
To drag it back on topic....

To answer the question, one must consider Car Insurance and case law.

Here's a scenario.

You're sitting at a T-Junction, waiting to join the main carriageway. You see another vehicle coming down the main carriageway, indicating to turn in. You pull out. They continue straight on, a crash ensues.

Who is at fault?

Under UK Case Law (Davis vs Swinwood, 2009)....you're at fault. This is based on the general principle that the other vehicle was established in the road, and therefore allowed to proceed. The indicator is a bit of a red herring - because it's not a clear signal of intent. The assumption here is solely yours that the other car was about to pull into the road you're joining from.

No split liability there. 100% your fault. Davis vs Swinwood 2009 confirms that a misleading signal is not negligence.

Now, that's a nice and easy one. Same circumstances, but involving a vehicle blocking your line of sight, and you hitting a motorcyclist that was overtaking the vehicle blocking your line of sight? All sorts of case law there. Speed isn't negligence, so that doesn't matter (mostly because 'good luck proving it'). But what can matter is the shape of the junction, whether it was light or dark, relative visibility without the obstructing vehicle, the type of vehicle obstructing, etc.

That is what you need to programme in. Now, stripping it down to the basics? Do Not Pull Onto The Carriageway If Your Way Is Not Clear is probably the easiest way.
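Stripped to basics, that rule might look something like this sketch. The Vehicle fields, the five-second gap, and the whole shape of it are assumptions for illustration only, not how any real system is written:

```python
# Illustrative guard for "Do Not Pull Onto The Carriageway If Your Way
# Is Not Clear". The fields and threshold are invented for this sketch.

from dataclasses import dataclass

@dataclass
class Vehicle:
    distance_m: float   # distance from the junction
    speed_mps: float    # current speed
    indicating: bool    # turn signal on (treated as no guarantee of intent)

def safe_to_join(oncoming, min_gap_s=5.0):
    """Join only if every oncoming vehicle is more than min_gap_s away,
    assuming it continues straight on; its indicator is ignored entirely,
    matching the case-law point that a signal is not a clear statement
    of intent."""
    for v in oncoming:
        time_to_junction = v.distance_m / v.speed_mps if v.speed_mps > 0 else float("inf")
        if time_to_junction < min_gap_s:
            return False  # established in the road -> it has priority
    return True

# The indicating car from the scenario: 40 m away at 13 m/s (~3 s gap).
print(safe_to_join([Vehicle(40, 13, indicating=True)]))  # False: wait
```

Note the `indicating` field is deliberately never read: that is exactly the Davis vs Swinwood principle encoded as a design choice.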


Your examples are interesting, and very relevant in my daily experience, when I see many vehicles who don't bother to indicate, or indicate incorrectly.

The self-driving car environment will be different however. Autonomous cars will be programmed to indicate at the correct interval. They will know where they are going by GPS, and not suddenly change their minds because they see a direction sign which had been obscured. They will be aware of and in communication with the nearby vehicles. They won't overtake or pull out in potentially dangerous circumstances.

Of course this is all down to programming and sensor engineering, and that is what has to be got right.

To some degree I think the argument about morality and the trolley problem is a red herring. It's not often that human drivers get themselves into the position of having to choose whether to run over the pregnant woman, the fat schoolchild or the Premier League footballer with a mother with dementia, or whatever. Autonomous cars will face that kind of situation a lot less.

Petition to stop ratification of EU Article 13 on Internet Copyright

We're not very big on official rules. Rules lead to people looking for loopholes. What's here is about it. 
   
Made in gb
Lit By the Flames of Prospero





Bodt

The morality issue I mentioned wasn't about the unthinkable choice. There weren't two definite bad outcomes, but one bad outcome and one possible bad outcome. The OP's statement that an autonomous vehicle wouldn't swerve was what I was referring to: sometimes you have to swerve, and if the machine can't do that due to its programming, that is a mistake.


Automatically Appended Next Post:
 Overread wrote:
 queen_annes_revenge wrote:


Yes, but my point is that if I saw a person in the road, I would swerve to avoid them, instinctively knowing that while I could possibly harm someone else with my swerve, I definitely won't harm that person in the road, whereas the machine will not swerve because it MIGHT harm someone else, and therefore will probably kill the person. That isn't right.


But you're loading that statement.

First up, you're stating that you don't know if you're going to hurt anyone else by swerving to avoid that one person. That makes perfect sense for a human. A machine, in theory, should have 360-degree view and awareness. It should be able to see other cars around it, other people, other elements. It can make a choice based upon far more data, far quicker than the person, and without the emotional baggage and panic that sets in. Ergo it can choose to swerve and save that pedestrian, and the driver, and the two kids walking on the pavement beside the car, because it knows there's nothing on the other side of the road, so it knows the safe direction to swerve in. Whilst a human might only swerve to avoid one, and might well swerve out of the road toward the edge and then hit the two kids.


In theory the machine has a higher chance of saving more lives - provided it is properly programmed and has a working system that can reliably identify people in the environment around it.

Plus as noted earlier, self-driving cars could inter-communicate. So an incident for one car can cause others to react behind and in front of it. So your car is swerving, and at the very same time the car right behind you is hard braking too, whilst the car on the other side of the road is also hard braking. Now the cars have stopped and prevented a pileup.


The OP stated an autonomous vehicle would not swerve due to the risk of rolling.

This message was edited 1 time. Last update was at 2018/12/06 13:37:40


Heresy World Eaters/Night Lords Genestealer cults.

Instagram: nagrakali_love_songs 
   
Made in gb
Princeps of the Emperor's Titan!






The problem remains Self Driving Cars vs Human Drivers.

You cannot possibly programme in the whole gamut of Human Stupidity.

The moron that decides to pull a U-turn just as you pass (had that happen to me). The goon that doesn't understand what a red traffic light means. People driving erratically in general.

What will also help is, in theory, Self Driving Cars will be singularly incapable of breaching a given speed limit, or pootling along well under (20mph in a 40 zone for instance).

What won't help is Kids running out into the middle of the road from between parked cars. Goon cyclists trying to weave in and out of traffic, hopping on and off the pavement as they see fit (this only applies to Goon Cyclists. I tar no group with one brush).

No sensor and no programming can possibly account for those sorts of things. There's just too much going on, and too much could happen.

In terms of GPS? That remains imperfect in itself. It follows roads no longer there. It's not always up to date on One Way systems, which in Cities seem to change on a regular basis to 'calm traffic', and certainly just inordinately increase travel time.

Fed up of Scalpers? But still want your Exclusives? Why not join us?

 
   
Made in gb
Lit By the Flames of Prospero





Bodt

However, there are times when it's necessary to drop speed, or go into a lower gear, or a whole host of other variables that happen on roads, where I feel that a human would actually have quicker instinctive reactions than a machine processing information from all its sensors. Also, what happens when the computer systems go down or malfunction? Going back to my EOD robots: they cost about £1.2 million apiece, and are constantly going t*ts up, needing resets etc. They only contain 3 or 4 processing units. God knows how many a car would need for all its functions.

   
Made in jp
[MOD]
Anti-piracy Officer






Somewhere in southern England.

Self-driving cars won't position themselves jigger-jagger across the whole carriageway, which prevents cyclists from taking a consistent safe path.

Self-driving cars will respect school safety zones.

The GPS problem can be solved by proper updating and push notification to the vehicles.

   
Made in us
Ultramarine Chaplain with Hate to Spare






Overread has the right idea I think.

It's not a question of morality. Morality doesn't really come into play in these situations as it is. Human drivers try to protect themselves before a crash. All we need to do is make an AI system that does a better job than a human.

It follows traffic laws - slows down in uncertain situations or when it's raining - it stops at red lights - it doesn't drink and drive or drive tired - it doesn't get road rage.
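That rule-following behaviour could be sketched as something like the following. The conditions and scaling factors are invented purely for illustration, not any manufacturer's actual policy:

```python
# Minimal sketch of the rule-following behaviour described above;
# every factor and threshold here is a made-up illustration.

def target_speed(limit_mph, raining=False, school_zone=False, visibility_poor=False):
    """Never exceed the limit; slow further in uncertain conditions."""
    speed = limit_mph
    if raining:
        speed *= 0.8            # slow down in the rain
    if visibility_poor:
        speed *= 0.7            # slow when the situation is uncertain
    if school_zone:
        speed = min(speed, 20)  # respect school safety zones
    return round(speed, 1)

print(target_speed(40))                # 40 on a clear day
print(target_speed(40, raining=True))  # 32.0 in the rain
```

The structural point is that every rule is a hard constraint: there is no code path for "running late" or "road rage".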

You've just removed 99% of traffic accidents, making the system 99x more moral than human drivers. Effort put into thinking about the morality of machines is basically wasted thought. Think about something more important, like what people are going to do for money when robots take all our jobs.

If we fail to anticipate the unforeseen or expect the unexpected in a universe of infinite possibilities, we may find ourselves at the mercy of anyone or anything that cannot be programmed, categorized or easily referenced.
- Fox Mulder 
   
Made in us
Charging Dragon Prince





West Lafayette, IN

 Xenomancers wrote:
Think about something more important, like what people are going to do for money when robots take all our jobs.


Smash each other over the head for basic resources, as without paying customers companies wouldn't have any reason to produce goods with all these robots in the first place? It's not that difficult to understand. Material wealth drives everything on our planet, no matter how nobly the socialist professor you had tries to paint the world. Even socialist tenets are based on SOMEONE producing wealth to be shared, distributed, and utilized. Without that, we're headed for Mad Max territory. I'm game, as I will finally get to utilize two decades of military training to its fullest.

www.classichammer.com

For 4-6th WFB, 2-5th 40k, and similar timeframe gaming

Looking for dice from the new AOS boxed set and Dark Imperium on the cheap. Let me know if you can help.
 CthuluIsSpy wrote:
Its AoS, it doesn't have to make sense.
 
   
Made in gb
Fixture of Dakka




UK

Actually I'm pretty sure food production drives everything. So long as people have full bellies things will remain calm.


Also, fun fact: some car companies are bringing people back and kicking out robots. Robotic assembly can make huge savings, but once you move far enough past the designs the machines were made for, the retooling and rebuilding of the whole factory fast eats up any savings made over employing human workers, who can be far more adaptive. People you just give new schematics to, lose a few to early build errors, and then let them get on with it.

A machine, by contrast, you mostly have to hire skilled staff to rebuild from the ground up.



Of course there is a tipping point where machines become advanced enough to be easy to adapt; right now cost and technology are the limit there; one day it will just be the cost and then nothing.

   
Made in gb
Lit By the Flames of Prospero





Bodt

We just have to design a system that can do... Yup, it's that simple. Same thing they tell us every time we get a new e-database to make 'things easier' at our works. And anyone who works around robotics knows how temperamental automated systems can be. I think that nothing like this should be implemented until we at least make a perfect autonomous system, and seeing as we're currently incapable of making a perfect semi-autonomous system, forging ahead with driverless cars is just dangerous.

Heresy World Eaters/Night Lords Genestealer cults.

Instagram: nagrakali_love_songs 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
We just have to design a system that can do... Yup, it's that simple. Same thing they tell us every time we get a new e-database to make 'things easier' at our works. And anyone who works around robotics knows how temperamental automated systems can be. I think that nothing like this should be implemented until we at least make a perfect autonomous system, and seeing as we're currently incapable of making a perfect semi-autonomous system, forging ahead with driverless cars is just dangerous.


That's not how it works. Perfection is not the standard, "better than the human drivers they replace" is.

There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
Made in de
Dakka Veteran




queen_annes_revenge wrote:The morality issue I mentioned wasn't about the unthinkable choice. There weren't two definite bad outcomes, but one bad outcome and one possible bad outcome. The OP's statement that an autonomous vehicle wouldn't swerve was what I was referring to: sometimes you have to swerve, and if the machine can't do that due to its programming, that is a mistake.

Spoiler:

Automatically Appended Next Post:
 Overread wrote:
 queen_annes_revenge wrote:


Yes, but my point is that if I saw a person in the road, I would swerve to avoid them, instinctively knowing that while I could possibly harm someone else with my swerve, I definitely won't harm that person in the road, whereas the machine will not swerve because it MIGHT harm someone else, and therefore will probably kill the person. That isn't right.


But you're loading that statement.

First up, you're stating that you don't know if you're going to hurt anyone else by swerving to avoid that one person. That makes perfect sense for a human. A machine, in theory, should have 360-degree view and awareness. It should be able to see other cars around it, other people, other elements. It can make a choice based upon far more data, far quicker than the person, and without the emotional baggage and panic that sets in. Ergo it can choose to swerve and save that pedestrian, and the driver, and the two kids walking on the pavement beside the car, because it knows there's nothing on the other side of the road, so it knows the safe direction to swerve in. Whilst a human might only swerve to avoid one, and might well swerve out of the road toward the edge and then hit the two kids.


In theory the machine has a higher chance of saving more lives - provided it is properly programmed and has a working system that can reliably identify people in the environment around it.

Plus as noted earlier, self-driving cars could inter-communicate. So an incident for one car can cause others to react behind and in front of it. So your car is swerving, and at the very same time the car right behind you is hard braking too, whilst the car on the other side of the road is also hard braking. Now the cars have stopped and prevented a pileup.


The OP stated an autonomous vehicle would not swerve due to the risk of rolling.
In a situation where a human had to swerve, the autonomous vehicle would probably have access to much more useful data, much earlier, than a human driver, and instead of swerving abruptly it would just slow down and adjust its course more mildly, in advance of any problems a human could perceive. When a computer system of acceptable quality needs to swerve, a human driver would probably already be in an accident, or dead. There are reports that humans are already angry at autonomous vehicles because those "drive like grannies" and are extra cautious. If I had to bet on who's the safer driver, I'd bet on the AI and not the human.
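The "brake early and gently instead of swerving" point can be checked with back-of-envelope physics (constant-deceleration stopping, a = v²/2d); the speeds, distances and reaction times below are illustrative guesses, not measurements:

```python
# Back-of-envelope check: the earlier a hazard is detected, the gentler
# the required braking, so a swerve may never be needed. Numbers are
# illustrative only.

def required_deceleration(speed_mps, detection_distance_m, reaction_time_s):
    """Constant deceleration (m/s^2) needed to stop before the hazard,
    after travelling speed * reaction_time during the reaction delay."""
    braking_distance = detection_distance_m - speed_mps * reaction_time_s
    if braking_distance <= 0:
        return float("inf")  # cannot stop in time: must swerve or crash
    return speed_mps ** 2 / (2 * braking_distance)

v = 13.4  # ~30 mph
print(round(required_deceleration(v, 40, 1.5), 1))  # human: ~1.5 s reaction -> 4.5
print(round(required_deceleration(v, 60, 0.2), 1))  # earlier detection, faster reaction -> 1.6
```

Under these made-up numbers the sensor-equipped car needs roughly a third of the braking force, which is the "drives like a granny" behaviour described above.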

queen_annes_revenge wrote:however there are times when its necessary to drop speed, or go into a lower gear, or a whole host of other variables that happen on roads, where I feel that a human would actually have quicker instinctive reactions than a machine processing information from all its sensors. also, what happens when the computer systems go down or malfunction? going back to my EOD robots, they cost about £1.2million a piece, and are constantly going t*ts, needing resets etc. they only contain 3 or 4 processing units. god knows how many a car would need for all its functions.
By that criterion we should also forbid humans from driving. Some of us drive drunk, some drive while exhausted, some drive over the speed limit, some drive while eating/drinking/texting and being generally distracted by who knows what. Doesn't that count as the human "system going down" or "malfunctioning"?

We kill people with cars all the time. How can we even be allowed to drive?

And your feeling about humans having quicker instinctive reactions is wrong. A few decades ago it might have been true, but we can't compete with specialised signal processing systems these days. Besides, you are also assuming that those quicker instinctive reactions are also the correct reactions, instead of panic-induced errors.
   
Made in us
The Conquerer






Waiting for my shill money from Spiral Arm Studios

 Peregrine wrote:
 queen_annes_revenge wrote:
We just have to design a system that can do... Yup, it's that simple. Same thing they tell us every time we get a new e-database to make 'things easier' at our works. And anyone who works around robotics knows how temperamental automated systems can be. I think that nothing like this should be implemented until we at least make a perfect autonomous system, and seeing as we're currently incapable of making a perfect semi-autonomous system, forging ahead with driverless cars is just dangerous.


That's not how it works. Perfection is not the standard, "better than the human drivers they replace" is.


Maybe in your perfectly logical Peregrine world. But the reality is that autonomous cars will be held to a much higher standard; perfection will be expected and demanded of them. They will fail, of course, and we will abandon them once they fail that test. Just like the Prisoner's Dilemma, we will arrive at a suboptimal result for all parties involved, and life will continue.

This message was edited 1 time. Last update was at 2018/12/07 01:10:59


Self-proclaimed evil Cat-person. Dues Ex Felines

Cato Sicarius, after force feeding Captain Ventris a copy of the Codex Astartes for having the audacity to play Deathwatch, chokes to death on his own D-baggery after finding Calgar assembling his new Eldar army.

MURICA!!! IN SPESS!!! 
   
Made in gb
Lit By the Flames of Prospero





Bodt

Exactly. I've seen a lot of 'computers are safer than humans' in this thread, yet I haven't seen a lot of evidence to support that claim. Computer systems are constantly failing: subject to hacking, malware, malfunction, straight up failing, etc. And if that is the case, why is a human decision the last input required for drone strikes? Surely a computer can make a better decision?

   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
Exactly. I've seen a lot of 'computers are safer than humans' in this thread, yet I haven't seen a lot of evidence to support that claim. Computer systems are constantly failing: subject to hacking, malware, malfunction, straight up failing, etc.


Automated vehicles are already on par with human drivers, and the technology is still new. Computer systems may fail, but it's not like humans are flawless either. We're constantly driving drunk, texting while driving, driving while too tired to focus well, driving too fast because we're running late, driving aggressively because dammit that guy isn't going to cut me off, etc. Automated vehicles don't have to be perfect, they just have to be better than the horrific slaughter of human drivers.

And if that is the case, why is a human decision the last input required for drone strikes? Surely a computer can make a better decision?


Because of moral reasons. Seriously, is it that hard to understand the difference between leaving a human as the last input in a case where the question is "should this person be killed", and not doing so when the requirement is maximum reaction speed to a physics problem of "how do I avoid hitting this potential hazard"?

   
Made in jp
[MOD]
Anti-piracy Officer






Somewhere in southern England.

 queen_annes_revenge wrote:
Exactly. I've seen a lot of 'computers are safer than humans' in this thread, yet I haven't seen a lot of evidence to support that claim. Computer systems are constantly failing: subject to hacking, malware, malfunction, straight up failing, etc. And if that is the case, why is a human decision the last input required for drone strikes? Surely a computer can make a better decision?


A driverless car controlled by a computer will be designed to "fail safe."

If you go on airliners, you are already trusting your life to computer systems which "fail safe".

Sometimes they don't, and a whole plane falls out of the sky. It hasn't stopped millions of people flying everywhere.
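A toy illustration of what "fail safe" means in software terms, with subsystem names invented for the sketch; real vehicle architectures are vastly more involved:

```python
# Toy "fail safe" controller: if any critical subsystem stops reporting
# healthy, degrade to a minimal safe action rather than carrying on.
# Subsystem names are invented for illustration.

def control_step(health, normal_action="follow_route"):
    """health maps subsystem name -> last heartbeat OK (bool)."""
    critical = ("lidar", "braking", "steering")
    if all(health.get(s, False) for s in critical):
        return normal_action
    # Fail safe: pull over and stop instead of continuing blind.
    return "pull_over_and_stop"

print(control_step({"lidar": True, "braking": True, "steering": True}))
print(control_step({"lidar": False, "braking": True, "steering": True}))
```

The design choice is that an unknown or missing subsystem counts as unhealthy: the system must prove it is safe to continue, not prove it is unsafe.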

   
Made in gb
Lit By the Flames of Prospero





Bodt

Autonomous vehicles aren't going to stop humans being stupid.

https://www.digitaltrends.com/cars/self-driving-uber-crash-arizona/

If anything, they will probably induce more negligence like that in people who believe they don't have to take any precautions.

https://www.digitaltrends.com/opinion/self-driving-tesla-death-whos-to-blame/

https://www.digitaltrends.com/cars/tesla-driver-takes-nap-while-car-on-autopilot/

https://www.digitaltrends.com/cars/tesla-s-summon-under-trailer/

I sure as hell wouldn't drive my car under a trailer.

https://www.digitaltrends.com/cars/tesal-model-s-crash-nhtsa-investigation-fatal-crash/

And as a driver I would certainly notice a truck, even if it was white.

This message was edited 3 times. Last update was at 2018/12/07 08:53:24


   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
I sure as hell wouldn't drive my car under a trailer.


Congratulations, you're a great driver. The point you keep ignoring is that the standard is not perfection, it's being better on average than human drivers. A fallible automated system can still be better than humans on average even if it is worse than the best human drivers. For example, yeah, it might hit a trailer because of a flaw, but in exchange it's completely removing drunk driving from the picture. You'll note that the trailer accident happened at 1mph and caused minimal damage and no injuries. I will gladly accept a higher rate of that sort of accident if it means getting drunk drivers and their frequent fatal accidents off the road entirely.

   
Made in gb
Lit By the Flames of Prospero





Bodt

https://www.digitaltrends.com/cars/cops-chased-a-tesla-for-7-miles-while-the-driver-apparently-slept/



Automatically Appended Next Post:
So, should you be allowed to travel in one while drunk?


Automatically Appended Next Post:
https://www.digitaltrends.com/features/will-high-res-radar-make-tomorrows-cars-safer/

You'll forgive me for not putting my wholehearted trust in these things, surely.

This message was edited 2 times. Last update was at 2018/12/07 08:59:46


   
Made in us
Douglas Bader






Sigh. Posting individual accident reports is irrelevant, what matters is the accident rate.

   
Made in gb
Lit By the Flames of Prospero





Bodt

Tesla Autopilot:

'In one video, a car drifts out of its lane, and then swerves decisively into oncoming traffic, rather than away from it. Tesla’s Autosteer function monitors the car in front to orient itself. The Tesla driver, YouTube user RockStarTree, said he believed the car had lost sensor lock on the vehicle it was following, and mistakenly tried to “follow” an oncoming car when it came into sensor range.'

Sounds really safe.

   
Made in us
Douglas Bader






The plural of 'anecdote' is not 'data'.

Spamming links to descriptions of single incidents involving automated cars is not relevant, I could do the same with examples of drunk drivers causing accidents. What matters is injuries and fatalities per passenger-mile, and you're providing absolutely nothing on that.
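To make the "rates, not anecdotes" point concrete, here is the arithmetic in miniature. Every number below is invented purely to show the comparison, not a real statistic:

```python
# Illustrative rate comparison: what "injuries and fatalities per
# passenger-mile" arithmetic looks like. ALL figures are made up
# to demonstrate the calculation, not real accident data.

def fatalities_per_100m_miles(fatalities, miles):
    return fatalities / miles * 100_000_000

human = fatalities_per_100m_miles(fatalities=37_000, miles=3_200_000_000_000)
robot = fatalities_per_100m_miles(fatalities=2, miles=10_000_000_000)

print(round(human, 2), round(robot, 2))  # compare like with like
```

Individual crash reports tell you nothing about either number; only the normalised rates are comparable, which is the post's point.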

   
Made in gb
Lit By the Flames of Prospero





Bodt

Normally I'd agree. However, statistics aren't going to cut it for me in this case, when it's the safety of me and my child that's concerned.

   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
Normally I'd agree. However, statistics aren't going to cut it for me in this case, when it's the safety of me and my child that's concerned.


Uh, what? That's exactly the point of statistics: statistically speaking which world is more dangerous to you and your child, one where flawed automated cars exist or one where drunk drivers exist?

   
Made in gb
[SWAP SHOP MOD]
Pigeons in Flight






In my Austin Ambassador Y Reg

 queen_annes_revenge wrote:
Normally I'd agree. However, statistics aren't going to cut it for me in this case, when it's the safety of me and my child that's concerned.


This is the most fallacious argument I have read in this thread so far. The safety of you and your child is at stake every time you get in your car. Humans are deeply flawed creatures, prone to all sorts of errors and mistakes in a way that an AI program is not - this is not under dispute. There is simply no way that a bunch of haphazard, random organisms traveling at varying speeds and extremely variable levels of skill and experience can ever be safer than a programmable intelligence.

The 'arguments' you are deploying are typical of the irrational fear of AI that certain members of the public hold that is completely contrary to any evidence presented. It is one of the reasons self-driving cars will take a while to become the norm.

This message was edited 2 times. Last update was at 2018/12/07 09:22:00


=====Begin Dakka Geek Code=====
DC:80-S--G+MB+I+Pw40k95+D++A+++/sWD144R+T(S)DM+
======End Dakka Geek Code======

Click here for retro Nintendo reviews

My Project Logs:
30K Death Guard, 30K Imperial Fists

Completed Armies so far (click to view Army Profile):
 
   
Made in gb
Lit By the Flames of Prospero





Bodt

I'd take my own judgement in avoiding drunk/stupid drivers over these computer systems.

   
Made in gb
[SWAP SHOP MOD]
Pigeons in Flight






In my Austin Ambassador Y Reg

 queen_annes_revenge wrote:
I'd take my own judgement in avoiding drunk/stupid drivers over these computer systems.


Then your judgement is poor.

This message was edited 1 time. Last update was at 2018/12/07 09:22:35


 
   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
I'd take my own judgement in avoiding drunk/stupid drivers over these computer systems.


Then you are foolish and ignorant. No amount of "judgement" is going to save you if a drunk driver swerves across the centerline and hits you head-on before you can possibly react. You're reacting emotionally and putting yourself in more danger instead of looking at the relevant statistics and making the best choice based on the evidence.

   
Made in gb
Lit By the Flames of Prospero





Bodt

 filbert wrote:
 queen_annes_revenge wrote:
Normally I'd agree. However, statistics aren't going to cut it for me in this case, when it's the safety of me and my child that's concerned.


This is the most fallacious argument I have read in this thread so far. The safety of you and your child is at stake every time you get in your car. Humans are deeply flawed creatures, prone to all sorts of errors and mistakes in a way that an AI program is not - this is not under dispute. There is simply no way that a bunch of haphazard, random organisms traveling at varying speeds and extremely variable levels of skill and experience can ever be safer than a programmable intelligence.

The 'arguments' you are deploying are typical of the irrational fear of AI that certain members of the public hold that is completely contrary to any evidence presented. It is one of the reasons self-driving cars will take a while to become the norm.


On what grounds? Sure, your point could arguably stand (although on a personal level I will never trust a computer over my own judgement) if the circumstances weren't clouded by the fact that there are still going to be humans involved at all stages of the process, and as the links above have shown, human stupidity clearly still impacts the performance of autonomous systems.

The statistics on driverless cars really aren't conclusive evidence as to their absolute safety as yet. I've yet to find any detailed information on studies, other than 700,000 autonomous miles without incident: no details of types of road travelled, speeds, times of day, etc. If someone can point this out I will happily read it.


Also, those statistics come from the company that produced the car, which poses its own questions concerning bias.
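For what it's worth, there is a standard way to quantify how little "700,000 miles without incident" tells you on its own: the "rule of three" gives an approximate 95% upper confidence bound of 3/n on the rate of an event that has not occurred in n independent trials. A quick sketch (treating each mile as an independent trial, which is itself a simplification):

```python
# Approximate 95% upper bound on an incident rate after n incident-free
# trials: rate < 3 / n (the "rule of three"). Treating miles as
# independent trials is a simplification made purely for illustration.

def rule_of_three_upper_bound(miles_without_incident):
    return 3 / miles_without_incident

upper = rule_of_three_upper_bound(700_000)
print(round(upper * 100_000_000, 1))  # bound per 100 million miles: ~428.6
```

A bound of roughly 430 incidents per 100 million miles is far too loose to compare against per-100-million-mile accident statistics, so the headline figure genuinely proves very little either way without the study details asked for above.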

This message was edited 2 times. Last update was at 2018/12/07 09:35:34


   
Made in us
Douglas Bader






 queen_annes_revenge wrote:
(although on a personal level I will never trust a computer over my own judgement)


This is irrational fear.

The statistics on driverless cars really aren't conclusive evidence as to their absolute safety as yet. I've yet to find any detailed information on studies, other than 700,000 autonomous miles without incident: no details of types of road travelled, speeds, times of day, etc. If someone can point this out I will happily read it.


That's evidence right there, and automated systems are only going to get better. Meanwhile drunk drivers, speeding, etc, all the flaws of human drivers, are not going to improve. It's inevitable that automated systems eventually get better than humans as the bugs are worked out; the only question is when, and whether we're already there.

There is no such thing as a hobby without politics. "Leave politics at the door" is itself a political statement, an endorsement of the status quo and an attempt to silence dissenting voices. 
   
 