Disadvantages of Self-Driving Cars
Joshua Brown, who was 40 years old, still had a long life ahead of him. In May 2016, he was killed when his Tesla, driving on Autopilot, crashed into a semi-truck. The truck cut in front of the car in a split second; the Tesla drove underneath the trailer, then veered off the road through two fences and into a power pole before stopping. The dangers of self-driving cars are greater than many people realize. Self-driving cars should not be on the road: they come with serious dangers and disadvantages, and they may not be as beneficial as they are made out to be.
There are dangerous factors that come with having a computer control a car rather than a human driver.
There are many situations that the computer controlling the car would have to deal with, situations in which even human drivers struggle. According to the Massachusetts Institute of Technology, there are scenarios where the car will have to make a decision that may hurt people. For example, a driverless car might have to choose between hitting a pedestrian and swerving another way, causing a crash that harms other people. Since these cars are programmed with a fixed set of safety rules, no one can be certain that they will make the right decision. Driverless cars are also not going to solve all the problems that come with regular cars, because not everyone will own one. An IEEE Spectrum article titled "The Big Problem With Self-Driving Cars" discusses how these cars may cause more problems in the future. Based on what I read, the cars will most likely introduce new human error, because the people in regular cars would be making different decisions and driving differently around them. Not only could this become dangerous, but it could also create more traffic rather than reducing it. Therefore, driverless cars will not fix many of the problems that regular cars cause.
For a long time, self-driving cars were never a serious thought in the driving industry. Transportation has always meant regular cars, trains, and other familiar ways of getting around. It was crazy enough to have thought about self-driving cars; actually making and producing them is even crazier. The question is: is it safe to have these cars on the road? And what kinds of hazards might these vehicles pose for the people who decide to purchase them? Bob Lutz, quoted on www.cnbc.com, states that "The autonomous car doesn't drink, doesn't do drugs, doesn't text while driving, doesn't get road rage." This shows that in Bob's opinion, the self-driving car could be safer than the human driver.
Now, I am very interested in cars and I love almost every aspect of them, but did you know that each year 1 million people die from car accidents, and that 81% of those accidents are caused by human error? One million people, gone just like that. Fortunately, there is a new technology that could dramatically decrease this number: self-driving cars. A self-driving car is a car that is capable of sensing its environment and navigating without human input. Currently, about 33 companies, including Tesla, BMW, and Google, are working to create self-driving cars that can prevent human errors and change the way people view driving. Self-driving cars have other benefits besides preventing human error, such as less traffic congestion and less fuel consumption. However, with these benefits come some costs, such as cybersecurity problems and ethical dilemmas. So, should we have self-driving cars or not?
Who is to blame when the vehicle gets into a severe car accident? Advances in technology like self-driving cars will be bad because they cause people to be lazy, they take away the responsibility of the driver, and they can malfunction, causing accidents.
Imagine having your life flash before your eyes while you were still wearing diapers. And imagine having a hot hunk of metal crash into you and shatter your sense of everything. When I was just three years old, I was the victim of a very scary car accident. While waiting to make a turn into my nursery school, my mom’s car was rear-ended by a car driving at 50 miles per hour. I remember how incredibly loud the collision was and even how the windows seemed to shiver in their rubber holders. Seeing my mom's head fly back and feeling the car swerve into the opposing traffic, I thought I was going to die. And why did this happen? Because the person driving behind us was texting on her phone and was not focused on the road. All of this, the emotional, physical, and financial damage, and the possibility of losing my mom's or my own life, could have been prevented if the car behind us was a driverless car. Briefly, a driverless car is capable of driving itself via an intricate system of cameras, sensors and computers. I propose that human drivers should be replaced with driverless cars because driverless cars are safer and more efficient.
While there have been surveys to understand how people feel about self-driving vehicles, they surveyed only a little over a thousand people, which is not comparable to the millions of people who actually drive cars (Degroat). Many respondents, more than 70 percent, do believe that autonomous vehicles will reduce accidents, lessen the severity of crashes, and help fuel economy; yet nearly as many are concerned about how the car will perform under unusual or unexpected circumstances compared with how a human could react, and about whether the vehicle would have any system malfunctions (Degroat). Even though car companies are working on the technology to make the cars safe and dependable, it could be easy for someone to hack into a vehicle to steal it, or to take personal information from it, such as where its owner has been and where they plan to go (Degroat). Many also wonder how well the car will do under different climate and driving conditions: will the car's mechanics and equipment work well in a tropical or arctic environment, and how will it behave in New York City as opposed to a rural, rugged environment like a farm? With the sensors and cameras attached to the car, will it be able to tell the difference, and respond differently, among other vehicles, pedestrians, and non-motorized objects on the road?
Driverless cars are already starting to show signs of safer driving. In one test-drive exercise, another vehicle cut in front of the driverless car while it was driving itself at 100 km per hour. The driverless car immediately braked to adjust its speed and maintain a safe distance behind the vehicle that had cut in, and slightly moved the steering wheel to stay centered in the lane. It is said that in a situation such as this, a human might have overreacted or jerked the steering wheel, which could have caused the car to steer into another vehicle or off the road.
Self-driving cars should not be produced because the technical part is not 100% figured out. In source #1, paragraph 23, it says, "Computers develop glitches… could be deadly when it happens at 75 miles per hour on the freeway." This is important because being in an accident on a freeway could lead to deadly injuries, injuries you might have been able to prevent had you been the driver. If the self-driving car is what caused your injury, then not having control could be a fatal technical flaw.
Ethical issues are, among these, the most notable. "Why Self-Driving Cars" (2015) raises a typical ethical dilemma: a driverless car can be programmed either to save its passengers by endangering innocent bystanders or to sacrifice its owner to avoid crashing into a crowd. Knight (2015) cites Chris Gerdes, a professor at Stanford University, who gave another scenario in which an automated car can save a child's life but injure the occupant of the car. The real problem, as indicated by Deng (2015), is that a car cannot reason and arrive at ethical choices itself the way a human does; it must be preprogrammed to respond, which leads to widespread concern. In fact, programmers and designers shoulder the responsibility, since those tough choices and decisions must all be made by them prior to any specific emergency, while the public tends to tolerate such "pre-made errors" less (Knight, 2015; Lin, 2015). In addition to these subjective factors of SDC development, Bonnefon and colleagues identify a paradox in public opinion: people are disposed to favor an automated algorithm designed to minimize casualties, while being cautious about owning a vehicle with such an algorithm, which could possibly endanger themselves ("Why Self-Driving Cars", 2015).
Patrick Lin’s WIRED article, “Here’s a Terrible Idea: Robot Cars with Adjustable Ethics Settings” raises a number of interesting points about the problems inherent in programming ethics into a computer system. A self-driving car has to make decisions. In the infamous trolley problem, inaction itself becomes a decision. There’s no way around decision-making; yet, short of endowing each and every car on the road with some kind of human-like self-awareness and consciousness (thereby defeating the purpose of having self-driving cars in the first place), the cars have no ability to make decisions outside of their programming. So, in a very real sense, the decisions a self-driving car makes in a difficult ethical scenario are the decisions of its creators, just deferred in space and time.
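Lin's point that a self-driving car's choices are really its creators' choices, deferred in space and time, can be made concrete with a toy sketch. Everything below is hypothetical and invented for illustration (the function name, the harm scores, and the policy itself); no real autonomous-driving stack reduces ethics to a one-line comparison:

```python
# Hypothetical illustration only: a crude, hard-coded collision policy.
# All names and numbers here are invented for the sake of the argument.

def choose_maneuver(swerve_harm: float, stay_harm: float) -> str:
    """Pick between two bad outcomes using a pre-programmed rule.

    The comparison below *is* the ethical decision: whoever wrote this
    line settled the outcome at design time, long before any real
    emergency occurs on the road.
    """
    # Policy chosen by the programmer: minimize estimated harm.
    if swerve_harm < stay_harm:
        return "swerve"
    return "stay_in_lane"

# At runtime the "decision" is just arithmetic over a fixed rule.
print(choose_maneuver(swerve_harm=1.0, stay_harm=3.0))  # prints "swerve"
```

Changing the `<` to `<=`, or weighting occupants differently from bystanders, changes who the car protects, which is exactly why Lin argues that these ethics settings belong to the designers rather than the machine.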
On July 12, The New York Times published a report, "Inside the Self-Driving Tesla Fatal Accident," which again caused enormous debate over whether self-driving cars should be legal.
One of the Google self-driving cars experienced an accident on September 23 of 2016. The car drove through a green light but stopped in the middle of the intersection. The car sensed another car going to run a red light and applied its brakes. The car kicked into manual mode, but the passenger’s reaction took too much time. The speeding car rammed into the autonomous car and caused an accident. Both vehicles sustained heavy damage. (Hartmans 2)
Human drivers have instincts that cannot be duplicated by technology, but by the same token, human error is not a part of a self-driving car. In addition, we also need to take into consideration the transition period, when there will be both self-driving cars and human drivers on the road. Humans can notice another driver physically signaling to go ahead at a four-way stop sign, or offering an opening in a merging lane. This is an example of what human interaction is capable of, and something self-driving cars will need to account for in order to share the road safely.
It might be hard to see where the self-driving car could have issues with safety, but an interesting question arises when an accident is unavoidable. The question posed is, "How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?" (ArXiv). This is a very interesting ethical question. I am not sure there is a right answer, and that uncertainty could stall the self-driving car industry. Before self-driving cars are mass-produced, a solution needs to be found to the question of unavoidable accidents. Then again, there may not be a need to address the problem at all. It is said that "driver error is believed to be the main reason behind over 90 percent of all crashes," with drunk driving, distracted driving, failure to remain in one lane, and failure to yield the right of way the main causes (Keating). Self-driving cars could eliminate those problems entirely, and perhaps if all cars on the road were self-driving, there would be no "unavoidable accidents." Safety is the main issue the self-driving car is trying to solve in transportation, and it seems to do a good job at it.
Automotive executives touting self-driving cars as a way to make commuting more productive or relaxing may want to consider another potential marketing pitch: safety (Hirschauge, 2016). The biggest reason these cars will make for a safer world is that accident rates will drop enormously. There is a lot of bad behavior drivers exhibit behind the wheel, and a computer is actually an ideal motorist. Since 81 percent of car crashes are the result of human error, computers would take a lot of danger out of the equation entirely. Also, some major causes of accidents are drivers who become ill while driving; examples include seizures, heart attacks, diabetic reactions, fainting, and high or low blood pressure. Autonomous cars would surely remedy these types of occurrences, making us all safer.
Many feel that driverless cars are the future of the automobile industry. But when someone hears "robot cars hitting the road soon," is that a guarantee that the roads will remain safe? With the rapid growth of technology through the centuries, more specifically computer software, the question arises of whether roads and other drivers will be safe. Currently there is very little public knowledge of how driverless cars will be engineered, which raises concerns. Moreover, driverless cars can be prone to hacking, which can lead to out-of-control situations for the people behind the wheel.