Self-driving cars -- programming morality
Made in gb
Decrepit Dakkanaut
UK

In contrast, if two self-driving cars have an incident with each other and it's proven that neither is at fault - because, unlike with people, you can review all the computer data accurately - then the insurance companies could hate it!

A Blog in Miniature

3D Printing, hobbying and model fun! 
   
Made in us
The Conquerer
Waiting for my shill money from Spiral Arm Studios

 Kilkrazy wrote:
I don't believe insurance will be a problem.

Self-driving cars will be safer than human driven vehicles. There will be a lot fewer cars on the road, too. Both factors will reduce accident rates.


It's not that the insurance companies will hate it; it's that the manufacturers of said cars will.

Currently, if someone driving a Ford has an accident all the responsibility lies on the individuals involved in the accident. If self-driving cars become a thing, then the responsibility rests on Ford because they are effectively the ones in control of the vehicle.

Once the big corporations realize that they would be largely responsible for all car collisions involving their product they'll drop self-driving technology like a sack of potatoes.

Even if collisions were significantly reduced, from their perspective they would be increasing their liability by an insane amount.

This message was edited 2 times. Last update was at 2018/10/27 16:53:05


Self-proclaimed evil Cat-person. Dues Ex Felines

Cato Sicarius, after force feeding Captain Ventris a copy of the Codex Astartes for having the audacity to play Deathwatch, chokes to death on his own D-baggery after finding Calgar assembling his new Eldar army.

MURICA!!! IN SPESS!!! 
   
Made in jp
[MOD]
Anti-piracy Officer
Somewhere in south-central England.

So you say, but I think Toyota and Mercedes-Benz, etc have got some pretty good lawyers as well as engineers working for them.

There's no apparent reason to expect that collisions will increase due to automation. All the indicators are that they will decrease.

The idea that we can't blame an individual driver so there won't be insurance is wrong. We don't blame individual drivers now. The whole point of insurance is to pool the risk.

The risks will simply be pooled among manufacturers rather than drivers, because there won't be any drivers.

Commercial airlines and shipping lines function perfectly well without the necessity for every individual crewmember to have personal insurance.

I'm writing a load of fiction. My latest story starts here... This is the index of all the stories...

We're not very big on official rules. Rules lead to people looking for loopholes. What's here is about it. 
   
Made in us
The Conquerer
Waiting for my shill money from Spiral Arm Studios

 Kilkrazy wrote:

The idea that we can't blame an individual driver so there won't be insurance is wrong. We don't blame individual drivers now. The whole point of insurance is to pool the risk.


You are completely misunderstanding the point. The point is that car manufacturers will be opening themselves up to massive liability with self-driving cars, so they will NOT make self-driving cars: even if they only have to deal with a few thousand lawsuits a year, they'll be paying through the nose for each one, insurance or no insurance. Someone is going to die due to an error caused by a self-driving car with some regularity, and each time the company will settle out of court for hundreds of millions. If I were an insurance company, I would never give a company that made self-driving cars liability insurance, because the payout would be massive each time it happened.

And yes, individual drivers DO get blamed now. If you are "at fault" in a crash then you are in fact to blame. I mean, that's fairly freaking obvious, dude.

The difference is that now, if you try to sue a driver for injuring/killing someone, you won't get much, if anything, even if they are ruled completely at fault; but if you sue a huge company over a manufacturing defect, you can get a ton. My godfather was basically reduced to being a vegetable in a car crash where the other driver (a cop) was completely at fault. Yet they never got a penny. However, if the fault was caused by faulty programming in a self-driving car, you could go after the car manufacturer, who has lots of money and will likely settle out of court.

Car manufacturers thus have a massive reason not to make self-driving cars.

It's sort of a prisoner's dilemma. Even if there is one course of action which leads to the best overall result for everyone, because of selfishness we will instead arrive at a suboptimal one.


Automatically Appended Next Post:
 Kilkrazy wrote:


The risks will simply be pooled among manufacturers rather than drivers, because there won't be any drivers.


No it won't. If a Toyota gets in an accident, Ford and Mercedes sure ain't gonna share the risk.


You need to realize that effectively, each car manufacturer is the "driver" of each and every one of their vehicles in this hypothetical scenario. Thus, they are at fault each time one of their cars is involved in an incident.

This message was edited 3 times. Last update was at 2018/10/27 18:45:21


   
Made in jp
[MOD]
Anti-piracy Officer
Somewhere in south-central England.

I don't misunderstand your point. It's just that I think you are wrong.

   
Made in us
Stormblade
SpaceCoast

I think he's wrong in assuming it will definitely be that way, but it could be. Remember, in addition to lawyers they have business analysts. So if for investment X you get an ROI of Y with risk Z, is it worth it? If they think the value is there, then yes, they'll take the chance, because the other possibility is that other companies do and you become like Kodak in the age of digital cameras... Am I sure either way? Nope, don't have the data.
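That back-of-the-envelope check is easy to sketch. All the figures below are pure placeholders (a hypothetical revenue, R&D cost, and settlement rate), not data about any real manufacturer:

```python
def expected_profit_m(revenue_m, dev_cost_m, n_claims, avg_payout_m):
    """Expected profit in $ millions: revenue minus development cost
    minus expected liability (number of claims times average settlement)."""
    return revenue_m - dev_cost_m - n_claims * avg_payout_m

# Hypothetical figures: $10B revenue, $2B R&D, and 20 settlements a year
# at $100M each over a 10-year product life.
print(expected_profit_m(10_000, 2_000, 20 * 10, 100))  # -12000
```

With those made-up numbers the liability swamps the revenue; cut the settlement rate by a factor of twenty and the same product turns a profit, which is exactly the calculation the analysts would be arguing over.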
   
Made in gb
Decrepit Dakkanaut
UK

Another angle is that the safety considerations (and the potential to push through everyone being forced to upgrade their cars, thus hitting emissions targets and providing lots of jobs in production and manufacture) will mean governments push this kind of technology through the system once it's matured enough.

As far as I recall, car insurance is already a loss-maker for many companies; they keep it only because it's required by law to have insurance for driving, and thus there are pressures/requirements/incentives to keep car insurance on a company's books.

   
Made in us
Fixture of Dakka

 Just Tony wrote:
TheWaspinator wrote: We just need to avoid a 0th Law of Robotics scenario where the cars take over society to stop us from killing each other.


It's a real shame that THAT movie will forever be at the forefront of Asimov's legacy, despite the fact that Robot-as-Menace stories were stories he hated, and the tenets of that movie run counter to EVERYTHING he wrote that dealt with robots.


Oh, yes, I despise that movie too. It's quite clear why they waited until after Asimov died to film it; it's a complete travesty of the story "Little Lost Robot".

CHAOS! PANIC! DISORDER!
My job here is done. 
   
Made in us
Powerful Phoenix Lord

I think the real future of AI-related cars will not be AI that navigates our current motorways - however, the alternative is mind-crushingly expensive to consider. I think the future of driverless cars will be based around very controlled circumstances.

Consider a single lane with barricades on each side, with a "launch" point... the driver would drive to that point, then release the car, which would join traffic and exit at the desired exit, etc., where the driver would take control again. This would allow long-distance driverless travel in a controlled environment (more controlled than normal roadways). The cars could communicate fuel status with the motorway itself, and it could eject cars which are nearing empty to avoid stalling on the active passage, etc.

Likewise, small city cars might operate more efficiently running on guided tracks in the ground vs. autonomously navigating a confusing 3D environment. Again, a driver would drive to a location, activate the driverless feature, etc.

The reality is that it takes one child killed by an AI car to tank the entire industry if precautions are not suitable. This is 2018, where we never hold a victim responsible; even if they made a gak decision which caused them harm, it's almost always pinned on whatever struck or injured them. This world wouldn't take well to any accidental injury or death by AI, even if it was directly caused by human negligence. Just my opinion. I do think the future is a more controlled option - running along a normal highway much like an HOV lane, etc.
   
Made in fi
Confessor Of Sins

 Tannhauser42 wrote:
Or, if an accident happened because a sensor failed, is the liability on the owner for not having the sensors checked, the manufacturer of the sensor if it was a defect, or the programmer of the AI if the AI failed to detect the failure of the sensor?


I'm pretty sure that particular problem will work exactly like it does today with normal cars. As long as I have my car checked regularly and nothing is flagged (and it isn't immediately obvious something is broken) I'm not responsible for an accident caused by said failure. My insurance company will still handle it, ofc, but they will do all they can to put it on the manufacturer.

If I see the red light labeled "brake failure" and still decide to drive then yes, that's a different thing, and the car probably has it logged so I'll be found guilty of ignoring it if something happens...

The programmer isn't likely to be involved at all - he's far behind layers of engineers, quality testers and the manufacturer. If they let such a flaw through then everything has failed and the programmer can't be blamed, at least not alone.

So what I think the insurance thing will come down to is pretty much the same as today. You're not driving, but you'll be the "designated operator" of the vehicle. You'll be responsible for checking that no warning lights are on before going. Taking direct control would probably increase your part of the liability in case something goes wrong - maybe a higher deductible on the insurance?
   
Made in nl
Pragmatic Primus Commanding Cult Forces

Spetulhu wrote:
 Tannhauser42 wrote:
Or, if an accident happened because a sensor failed, is the liability on the owner for not having the sensors checked, the manufacturer of the sensor if it was a defect, or the programmer of the AI if the AI failed to detect the failure of the sensor?


I'm pretty sure that particular problem will work exactly like it does today with normal cars. As long as I have my car checked regularly and nothing is flagged (and it isn't immediately obvious something is broken) I'm not responsible for an accident caused by said failure. My insurance company will still handle it, ofc, but they will do all they can to put it on the manufacturer.

If I see the red light labeled "brake failure" and still decide to drive then yes, that's a different thing, and the car probably has it logged so I'll be found guilty of ignoring it if something happens...

The programmer isn't likely to be involved at all - he's far behind layers of engineers, quality testers and the manufacturer. If they let such a flaw through then everything has failed and the programmer can't be blamed, at least not alone.

So what I think the insurance thing will come out to is pretty much the same as today. You're not driving but you'll be the "designated operator" of the vehicle. You'll be responsible for checking that no warning lights are on before going. Taking direct control would probably increase your part of the liability in case something goes wrong - maybe higher self-risk on the insurance?

With a self-driving car, a programmer is not behind layers of engineers. Engineers are responsible for designing the car itself, not the software and AI code. If an accident is caused by faulty software, bugs, glitches, or simple oversights and mistakes in the code, the programmer is directly responsible. Of course, there are laws on corporate liability that mean the corporation as a whole would be held responsible rather than individual employees, but the corporation could pass on the blame to the individual programmer by demoting or firing him. Such complicated code isn't going to be written and tested by one single programmer, so it would likely be an entire team that'd get fired (or at least the person in charge of said team). Anyway, programmers will be much more directly responsible than engineers are or will be, since where an engineer can only really be blamed if an accident is caused by a faulty design (which virtually never happens), an AI programmer is de facto the operator of a self-driving car.

Error 404: Interesting signature not found

 
   
Made in jp
[MOD]
Anti-piracy Officer
Somewhere in south-central England.

Here's how insurance could work for self-driving cars.

The owner would pay for Fire and Theft (if they wanted it.)

The 3rd party liability would be borne by the manufacturer, and passed on to the owner in the sale price. This would cover accidents caused by programming or engineering flaws.

There would be a "per mile" levy for 3rd party and accidental damage, payable by the car's user. This will be necessary because cars will tend to be shared like AirBnB; they will do a lot more miles than current cars, and the more mileage per year, the greater the chance of something happening.

The important point though, is that accident rates will drop when human stupidity (speeding, tailgating and so on) is removed from the roads.
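As a toy sketch of that three-part structure (every rate here is invented for illustration; only the fire-and-theft premium and the per-mile levy fall on the user, since the manufacturer's liability cover is assumed to be baked into the sale price):

```python
def annual_premium_cents(fire_theft_cents, miles_driven, per_mile_levy_cents):
    """User's yearly cost, in cents, under the split model:
    optional fire & theft cover plus a per-mile levy for
    third-party and accidental damage. Integer cents avoid
    floating-point rounding in the totals."""
    return fire_theft_cents + miles_driven * per_mile_levy_cents

# A heavily shared car doing 40,000 miles a year carries more of the
# pooled risk than a lightly used one doing 5,000 (levy: 2 cents/mile).
print(annual_premium_cents(12_000, 40_000, 2) / 100)  # 920.0 dollars
print(annual_premium_cents(12_000, 5_000, 2) / 100)   # 220.0 dollars
```

The point of the per-mile term is exactly the one made above: a shared car that does eight times the mileage pays roughly eight times the third-party component, so the levy tracks exposure rather than ownership.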

   
Made in us
[DCM]
Dankhold Troggoth
Shadeglass Maze

I don't know how much more self-driving cars will be shared than normal cars are, at least in the near future.

That'd be great, but it just isn't practical in many cases for people who need to commute / pick up kids / etc., regardless of whether the car drives itself or not.

This message was edited 1 time. Last update was at 2018/10/28 19:53:50


 
   
Made in jp
[MOD]
Anti-piracy Officer
Somewhere in south-central England.

Yes, I don't imagine 95% of the population will decide not to buy a car, but in the UK, cars spend on average 95% of their time parked.

Allowing for non-public transport commuting, nights and so on, you could still reduce the number of cars by 50% and transport all the people.

What I think will happen is a mixture of AirBnB-style car sharing by some private owners, and companies setting up to rent out short-term shareable cars, like the various bike share and scooter share schemes around the world.

   
Made in de
Longtime Dakkanaut

 Kilkrazy wrote:
Here's how insurance could work for self-driving cars.
(…)
The important point though, is that accident rates will drop when human stupidity (speeding, tailgating and so on) is removed from the roads.
The insurance companies will laugh themselves silly, because you'll still pay the same rates while accidents (and thus payouts) become much rarer. Of course they'll whine if/when competition brings down the rates, but until that happens they won't mind AI-driven cars at all.
   
Made in us
The Conquerer
Waiting for my shill money from Spiral Arm Studios

Mario wrote:
 Kilkrazy wrote:
Here's how insurance could work for self-driving cars.
(…)
The important point though, is that accident rates will drop when human stupidity (speeding, tailgating and so on) is removed from the roads.
The insurance companies will laugh themselves silly because you'll still pay the same rates but accidents (and thus payouts) become much rarer. Of course they'll whine if/when competition brings down the rates but until that happens they won't mind AI driven cars at all.


Sure, in the event that car manufacturers are willing to take on nearly 100% of the liability, which they probably won't once it all shakes out.

   
Made in us
5th God of Chaos! (Yea'rly!)
The Great State of Texas

 Peregrine wrote:
Simple solution: program the car to kill all pedestrians, and all other drivers not using that brand of car. Arm it with appropriate weapons to accomplish this task. Now you no longer have to worry about whether it will kill a pedestrian by accident, you have certainty.


Wait until a terrorist hacks into a car and does that, or maybe all the cars in California...

-"Wait a minute.....who is that Frazz is talking to in the gallery? Hmmm something is going on here.....Oh.... it seems there is some dispute over video taping of some sort......Frazz is really upset now..........wait a minute......whats he go there.......is it? Can it be?....Frazz has just unleashed his hidden weiner dog from his mini bag, while quoting shakespeares "Let slip the dogs the war!!" GG
-"Don't mind Frazzled. He's just Dakka's crazy old dude locked in the attic. He's harmless. Mostly."
-TBone the Magnificent 1999-2014, Long Live the King!
 
   
Made in us
Keeper of the Flame
Monticello, IN

RiTides wrote:I don't know how much more self driving cars will be shared more than normal cars will be, at least in the near future.

That'd be great but just isn't practical in many cases for people who need to commute / pick up kids / etc, regardless of whether the car drives itself or not



Kilkrazy wrote:Yes, I don't imagine 95% of the population will decide not to buy a car, but in the UK, cars spend on average 95% of their time parked.

Allowing for non-public transport commuting, nights and so on, you could still reduce the number of cars by 50% and transport all the people.

What I think will happen is a mixture of AirBnB-style car sharing by some private owners, and companies setting up to rent out short-term shareable cars, like the various bike share and scooter share schemes around the world.


This may work in Europe, but in the US individual ownership is a massive deal. It's much more practical for people to take the bus in the city where I live, but you STILL see almost all people owning cars. It's to the point that the only people in the city who don't own or drive cars are the people who physically or legally CANNOT own or drive cars. Self-driving cars won't rectify that at all, and ESPECIALLY in sparse areas or small towns where it becomes downright wasteful to institute mass transit.

www.classichammer.com

For 4-6th WFB, 2-5th 40k, and similar timeframe gaming

Looking for dice from the new AOS boxed set and Dark Imperium on the cheap. Let me know if you can help.
 CthuluIsSpy wrote:
Its AoS, it doesn't have to make sense.
 
   
Made in us
Decrepit Dakkanaut

 Just Tony wrote:


This may work in Europe, but in the US individual ownership is a massive deal. It's much more practical for people to take the bus in the city where I live, but you STILL see almost all people owning cars. It's to the point that the only people in the city who don't own or drive cars are the people who physically or legally CANNOT own or drive cars. Self-driving cars won't rectify that at all, and ESPECIALLY in sparse areas or small towns where it becomes downright wasteful to institute mass transit.


Yeah, one key element that researchers are working on "fixing" is the US's rural problem. See, as I mentioned above, the people working on this stuff generally agree that 5G is what's needed to get the data transfer speeds necessary for automated driving to work... And while that's great in an urban area, it's not so great for all the gravel roads that exist between, say, Omaha and Lincoln, Nebraska.
   
Made in us
Keeper of the Flame
Monticello, IN

It just means a MASSIVE amount of spending will be necessary to wifi the entirety of the US countryside.

   
Made in gb
Ridin' on a Snotling Pump Wagon

The main upside is that accidents involving self-driving cars are likely to be smaller affairs.

Consider.

When I first started working in insurance, one of the senior colleagues was sorting out a personal injury case. In short, some teenaged bellend was bought a Ferrari for their 18th. Decided to show off. Promptly lost control at high speed, ploughing through his own party.

That is pure human stupidity. Self-drive should have that down pat already.

Then, at least in the U.K., what is the test for liability? Well, it's all down to What Would A Reasonable Person Do In That Position. Examples include not forcing your own path between two lanes of traffic, giving way to the right at roundabouts, and not cutting in front of another car then slamming on your brakes.

Where it’s not clear cut, case law helps distribute the liability. It becomes drawn out because there’s a lot of case law, and the argument is which one best fits.

Self-drive in theory takes care of much of that, the AI being fundamentally unable to take stupid risks. And again, in pure theory, any collision between two self-drive cars should be relatively minor, because they won't be speeding and one assumes they will practice self-braking.

If there’s a bump between a self drive and a human driven, I’m going to go out on a limb and suggest in almost all cases, it’ll be the human driver that’s to blame.

   
Made in us
Omnipotent Necron Overlord

 Grey Templar wrote:
Mario wrote:
 Kilkrazy wrote:
Here's how insurance could work for self-driving cars.
(…)
The important point though, is that accident rates will drop when human stupidity (speeding, tailgating and so on) is removed from the roads.
The insurance companies will laugh themselves silly because you'll still pay the same rates but accidents (and thus payouts) become much rarer. Of course they'll whine if/when competition brings down the rates but until that happens they won't mind AI driven cars at all.


Sure, in the event that car manufacturers are willing to take on nearly 100% of the liability, which they probably won't once it all shakes out.

I'm not sure why you think the car owner won't still be the one paying for the insurance. You are all caught up in the liability, but that isn't actually how we currently do things, and there is no reason to expect that to change - for precisely the reason you are stating. Because it wouldn't work.

It will work almost exactly like it works now, except your insurance rates would be based on how reliable the self-driving car is on the road rather than on your driving record (because you aren't driving). The reality is, self-driving cars will prevent accidents at such a high rate that within a few decades car insurance companies will likely go out of business. People's premiums would have to go down by something like 90% to reflect the absence of risk.

If we fail to anticipate the unforeseen or expect the unexpected in a universe of infinite possibilities, we may find ourselves at the mercy of anyone or anything that cannot be programmed, categorized or easily referenced.
- Fox Mulder 
   
Made in gb
Bryan Ansell
Birmingham, UK

And the insurance companies would likely move to the subscription-service model of car use/ownership, which manufacturers, financiers, and governments are already basing their future forecasts of vehicle use on.

   
Made in de
Longtime Dakkanaut

Grey Templar wrote:
Mario wrote:
 Kilkrazy wrote:
Here's how insurance could work for self-driving cars.
(…)
The important point though, is that accident rates will drop when human stupidity (speeding, tailgating and so on) is removed from the roads.
The insurance companies will laugh themselves silly because you'll still pay the same rates but accidents (and thus payouts) become much rarer. Of course they'll whine if/when competition brings down the rates but until that happens they won't mind AI driven cars at all.


Sure, in the event that car manufacturers are willing to take on nearly 100% of the liability, which they probably won't once it all shakes out.
No, you are misunderstanding. If/when self-driving cars (SD cars from now on) are actually introduced (meaning level 5 automation), then insurance companies (while taking on the liability for incidents caused by the SD cars they insure, equivalent to what they do for you today) will still laugh themselves silly, because SD cars will have a lower incident rate while you (the person sitting in the car reading a book or watching a movie) will for a while still be paying the same rates. And because SD cars will only be sold when their incident rate is much lower than what humans manage on average, insurance companies will overall end up paying out less the more SD cars there are. Despite the fact that humans are quite error-prone and cause about 1.25 million road deaths per year (2010 numbers), we still get insurance. SD cars only need to do better than what humans can manage, and insurance companies will get higher profits. For them it's just about numbers; they might even give you a slightly better deal if you have an SD car (if their calculations show that it'll save them even more money). Each SD car should be one additional, more reliable driver and one less unreliable human behind the wheel.

Of course, if car manufacturers take on some liabilities, the insurance companies wouldn't mind that (more profit, yay!). Their job is to handle risks and make a profit doing so, and the more reliably they can model and predict those risks, the more money they can make. Their actual extinction-level event would be a 100% fully automated driving environment where insurance becomes unnecessary and we get legislation that makes car insurance obsolete instead of mandatory.
   
Made in jp
[MOD]
Anti-piracy Officer
Somewhere in south-central England.

The Guardian is catching up with the debate.

https://www.theguardian.com/commentisfree/2018/nov/14/cars-drivers-ethical-dilemmas-machines


   
Made in us
Omnipotent Necron Overlord

Ultimately, the moral compass of a self-driving car should be self-preservation of its occupants. That ultimately means avoiding contact with any other body is going to be its highest priority. However, where contact cannot be avoided, maintaining control of the vehicle becomes the highest priority. So if there is a choice between running over a baby or an old person, the decision will be made instantly based on which route is the safest in terms of control.
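That two-tier priority is easy to sketch. This is a hypothetical illustration of the rule described above, not any real vehicle's logic; the `contact` and `control` fields are invented for the example:

```python
def choose_action(candidate_paths):
    """Pick a path by the priority order described above:
    1. if any path avoids all contact, choose among those;
    2. otherwise fall back on whichever path keeps the most control.
    Each path is a dict with invented 'contact' (bool) and
    'control' (0..1) fields."""
    contact_free = [p for p in candidate_paths if not p["contact"]]
    if contact_free:
        # Among contact-free paths, still prefer the most controllable one.
        return max(contact_free, key=lambda p: p["control"])
    # No way to avoid contact: maximise retained control instead.
    return max(candidate_paths, key=lambda p: p["control"])

paths = [
    {"name": "brake hard", "contact": True, "control": 0.9},
    {"name": "swerve left", "contact": True, "control": 0.4},
]
print(choose_action(paths)["name"])  # brake hard
```

Note that nothing in this rule looks at who is in the path; the baby-vs-pensioner choice falls out of the control scores alone, which is exactly the objection raised in the replies below.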


   
Made in nl
Pragmatic Primus Commanding Cult Forces

 Xenomancers wrote:
Ultimately - the moral compass of the a self driving car should be self preservation of it's occupants. Which ultimately is going to create a situation in which avoiding contact with any other body is going to be it's highest priority. However were contact can not be avoided - maintaining control of the vehicle becomes highest priority. So if there is a choice of running over a baby or an old person the decision will be made instantly based on which route is the safest based on control.


That is not much of a moral compass.

   
Made in us
Fixture of Dakka

 Iron_Captain wrote:
 Xenomancers wrote:
Ultimately - the moral compass of the a self driving car should be self preservation of it's occupants. Which ultimately is going to create a situation in which avoiding contact with any other body is going to be it's highest priority. However were contact can not be avoided - maintaining control of the vehicle becomes highest priority. So if there is a choice of running over a baby or an old person the decision will be made instantly based on which route is the safest based on control.


That is not much of a moral compass.


It's (effectively) a robot, not a priest.

   
Made in nl
Pragmatic Primus Commanding Cult Forces

 Vulcan wrote:
 Iron_Captain wrote:
 Xenomancers wrote:
Ultimately - the moral compass of the a self driving car should be self preservation of it's occupants. Which ultimately is going to create a situation in which avoiding contact with any other body is going to be it's highest priority. However were contact can not be avoided - maintaining control of the vehicle becomes highest priority. So if there is a choice of running over a baby or an old person the decision will be made instantly based on which route is the safest based on control.


That is not much of a moral compass.


It's (effectively) a robot, not a priest.

And that is not an excuse. We want people who aren't priests to also have a proper moral compass, and that goes for robots as well, if a robot is given the same responsibility as a Human. Before a robot can be given such responsibility and allowed on the roads, it should be expected to be able to adhere to the same morals and ethics as the average person. In other words, it should realise basic things such as that the life of a child is valued higher than that of an adult or that you should not avoid a collision with another car if the only way to do so is by plowing through a group of pedestrians. Morals vary from individual to individual and culture to culture, but these are the kind of morals that are universal, and that robots at the very least should be able to take into account.

And I also predict that this ethics issue, along with the fact that loads of people just don't want a self-driving car (a lot of people, including me, much prefer being in control themselves), is what is going to sink this whole concept. Or at least it means that self-driving cars won't be a widespread thing for decades to come. But who knows. With the speed AI technology is advancing at we might have robots capable of near-Human reasoning and with a proper moral compass and able to drive so perfectly that accidents are entirely eliminated in like 20 years or so. It is going pretty quick if you look at where we were 20 years ago.

This message was edited 1 time. Last update was at 2018/11/16 04:25:15


   
Made in us
The Conquerer
Waiting for my shill money from Spiral Arm Studios

 Iron_Captain wrote:
But who knows. With the speed AI technology is advancing at we might have robots capable of near-Human reasoning and with a proper moral compass and able to drive so perfectly that accidents are entirely eliminated in like 20 years or so. It is going pretty quick if you look at where we were 20 years ago.


Possibly. It is equally possible though that they hit a roadblock. While technological advancement has been rapid over the last hundred years there is no guarantee that will continue indefinitely. In many areas the rate of advancement has slowed considerably, and in a few there is gross stagnation.

This message was edited 1 time. Last update was at 2018/11/16 04:51:42


   
 