Self-driving cars

Just watched a commercial for the new Ford Expedition that has automatic trailer backup. It backs the trailer up for you. Tesla and others have Autopilot, and many more have parallel parking assist.

As fast as technology moves, could our kids be the last generation to truly know how to drive?
 
No thanks. I prefer to do the driving myself.
 
Isn't that the one where it gives you a rotary controller so you don't have to turn the steering wheel?

If so, that's dumb, because it's just a tiny steering wheel.
 
A friend of mine got a Ford F-150 that had the trailer backup crap. It comes with these sensors you have to mount on the trailer for the truck to “ping” off of. He tried it once for shits and giggles. They were never used again.
 
As soon as autonomous cars are the majority, which won't be any time soon, your elected officials will outlaw driving yourself.

No thanks.

 
This morning about 7:40 am, eastbound on I-40 at the 70 Business/Garner exit, we rolled up at 68 mph and had to swerve into the left lane to avoid a car slowly BACKING UP in the right lane, just past where the exit taper ends (you can see it in the pic). The car was not on the shoulder; it was IN THE TRAVEL LANE! We had passed a semi moments before, and when I looked in the mirror I saw the truck swerve into the left lane almost on top of the car. If there had been more traffic, there would have been a hell of a crash and probably fatalities... I'm sure a self-driving car would never stop on an interstate and attempt to back up 70 yards because it missed the exit.

Google Maps
 
Autonomous cars are coming. All of the major auto manufacturers are assuming they will be "the thing" in 20 years and are investing heavily in them. Billions in R&D dollars are going into it now.
However, it is widely recognized that (1) the biggest issue is the human and (2) it's going to have to be implemented in phases, basically by location.
The big problem isn't the safety of the AI vision/processing for the vehicles; it's the variability of other humans. It will really only work safely if it's ALL autonomous vehicles, or very few. Because, as mentioned in this thread, people cannot be trusted.
Bans on human driving can feasibly be implemented in dense urban areas, but not in rural suburbs, and certainly not out in the country, at any time in the foreseeable future. So that's how it's going to start: what's going to happen is you will have zones of autonomous driving vs. "everywhere else".

I don't think our kids will be the last, but maybe our grandkids. On this continent, anyway. But you gotta think this is just another way that Americans are going to become even less able to deal with, or relate to, other countries. Like people going to South America or Africa and realizing they can't drive because all the cars are stick shifts.
 
Also, if you think self-driving cars are scary, consider that autonomous and AI-coupled "driving teams" are a major thrust for the military. It's no secret that the Army's vision of the future is a high percentage of a squad being unmanned vehicles, with an overall smaller crew operating them.
If you think an automated Tesla is scary, imagine an errant 10-ton APC or M-ATV.
 
Wasn't that the point of the DARPA Grand Challenge and Urban Challenge? To get universities and individuals to develop the tech for a chance to win a pittance and some publicity, and then you have to surrender your work to DARPA in its entirety...
 
I don't see any AI being able to back up a trailer successfully, but then again most of the people who claim to be haulers can't back a trailer up either...
 
Here's the thing that scares me about AI. A couple years back, I was a stay-at-home dad and was looking for ways to earn money from home. I came across some apps which allowed me to take surveys and do various tasks to earn gift cards and cash. One of the tasks gave me a picture of a rain-soaked road, and I had to "draw" on the screen where I saw the lane lines. The pictures were often very grainy, and trying to draw them on a small iPhone screen was rather difficult. I didn't always get it right, but the computer took my responses anyway and paid me like 10 cents per picture. In a half hour, I could do a couple hundred pictures and earn some money, even sitting on the crapper. I really didn't care why I was doing it, but it was really easy to do and I made money. I got to thinking, "any idiot can do this." And that's pretty much who it's targeted to: any idiot with a smartphone can download this app and make money from their couch filling in lines on a picture.

Later on, I got an email from the company that was paying for the "research" I was contributing to, thanking me for my time helping them develop the software that will eventually run autonomous vehicles. So the moral of this story is that AI developers are using the cheapest possible research, from the least reliable sources, for what will most likely be the most dangerous leap of technology.
 
And computers can be hacked...
 
People can be hacked too. Just look at what Google and Apple have done to us.
 
Yep, that pretty much sums up how 95% of the data for AI machine learning/vision has been curated.
We do a lot of this kind of stuff here at work.
The good news (maybe?) is that human error/variance is an assumed, built-in part of the model. They don't just take your labels and use those to train the system. They take your labels and those from 10,000 other people and use them as an aggregate. The "learning" is based on extracting the common features out of all of the 1000 images given the same labels by different people at different times, etc. As long as EVERYBODY isn't wrong about the same one, it works...

... and therein lies the big problem as well. You can have cases where the "trainers" are all consistently wrong, so the machine just follows that, learns it wrong, and will never know it's wrong. It just follows the majority. Or, if you have limited data available for a particular case, the error rate can go way up.
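
To put a picture on that aggregation step, here's a minimal sketch in Python (not from this thread; the image IDs, labels, and agreement threshold are all made up for illustration) of simple majority-vote label aggregation: keep a crowdsourced label only when enough independent annotators agree, and drop or re-queue the ambiguous ones.

```python
# Minimal sketch of majority-vote aggregation of crowdsourced labels.
# All names and values here are hypothetical, just to illustrate the idea
# of averaging out individual labeler mistakes described above.
from collections import Counter

# Hypothetical annotations: image_id -> labels from different people.
annotations = {
    "img_001": ["lane", "lane", "no_lane", "lane"],
    "img_002": ["no_lane", "no_lane", "no_lane"],
    "img_003": ["lane", "no_lane"],  # annotators split, no clear consensus
}

MIN_AGREEMENT = 0.6  # require 60% of annotators to agree, else discard


def aggregate(labels):
    """Return the majority label if agreement is high enough, else None."""
    most_common_label, votes = Counter(labels).most_common(1)[0]
    if votes / len(labels) >= MIN_AGREEMENT:
        return most_common_label
    return None  # ambiguous image: drop it or send it back for more labels


training_labels = {img: aggregate(lbls) for img, lbls in annotations.items()}
print(training_labels)
# {'img_001': 'lane', 'img_002': 'no_lane', 'img_003': None}
```

And, as the post above points out, this scheme has no defense against systematic error: if most annotators make the same mistake on the same image, the "majority" label is simply wrong and the model learns it anyway.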
 
Product development cycle...rush and be the first to market, make gobs of money...lose gobs of money because you cut corners or didn't know what you didn't know...so you're the subject of legal action. Other companies learn from the first company's mistakes, put their own tweak on things, and the tech gets a little better. The general public suffers through recalls for a decade and becomes conditioned to accept whatever tech is being rammed down their throat, so when someone actually does it right, they're receptive to the product. And since everything is engineered to fail, by the time Joe Public is ready to buy again, everyone else will have caught up, driving prices down. So I say all that to say: what you see today is just the beginning, it'll get better, and someone will eventually get it right. Ten years ago, backup cameras and in-dash navigation systems were a joke too...now try buying something new without them...and if you're receptive to them, they're actually pretty nice tools.

Oh...and I'd be perfectly fine with fully autonomous vehicles being a requirement, but only for a select group of people...1) people with Ohio license plates 2) people with Virginia license plates 3) people with handicapped tags 4) anyone of federal retirement age. Those are 90% of the ass clowns that cause me roadway frustration. Then just raise gas prices to $5+/gal and that should take care of the idiots.
 
You forgot Florida! How could you forget Florida!?!?!?!
 