Ethics Scenario Case Study Analysis
The focus of this analysis is Tesla’s Full Self-Driving (FSD) Beta software recall, which
covered nearly 363,000 vehicles. Tesla’s FSD Beta was available for subscription
use on most of its vehicles, specifically to drivers who maintained a high driver safety score,
which is determined by another machine learning model “that monitors their driving habits”
(CNBC 2023). However, it is important to note that Tesla allowed FSD Beta to be used by all
compatible vehicles five months before the recall, not just those with high-scoring
drivers. FSD enables the vehicle to drive itself, although the driver must constantly supervise
the vehicle’s actions and intervene when necessary. This is possible through a series of neural
networks, which allow trained models to react to their surroundings and make decisions in real time
rather than follow the predetermined reactions seen in traditional programming. This theoretically
makes FSD far more capable of driving itself on ordinary roads and in normal traffic if perfectly
implemented, but, being a Beta version, it was not yet finalized or considered
a proper release. Part of the reason for making this beta version available is that it
allows data to be collected from all over the country, helping retrain the model to be more
accurate, as “Tesla has frequently been releasing new software updates to the FSD Beta
program to improve performance” (Lambert 2022). This means that subscribers of FSD Beta
are used to test and train new software, something Tesla does across all
aspects of self-driving, not just FSD. In fact, the National Transportation Safety Board even
suggested that “Tesla is using customers as 'guinea pigs' to test its autonomous driving
technology before it is officially approved” (Keane, 2021).
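To make the contrast between traditional programming and a trained model more concrete, the following sketch is purely hypothetical (the function names, inputs, and threshold are invented for illustration and are not Tesla’s actual code). It shows a hand-written rule producing a fixed reaction, versus a trained model producing a probability that the system then acts on:

    # Hypothetical illustration only -- not Tesla's implementation.
    # Traditional programming: the reaction is decided ahead of time by fixed rules.
    def rule_based_decision(light_color, distance_to_line_m):
        if light_color == "red":
            return "stop"
        if light_color == "yellow" and distance_to_line_m > 20:
            return "stop"
        return "continue"

    # Learned approach: a trained neural network scores the situation, and the
    # decision follows from a probability rather than a predetermined rule.
    def model_based_decision(trained_model, camera_frames, speed_mps):
        p_stop = trained_model.predict(camera_frames, speed_mps)  # value between 0.0 and 1.0
        return "stop" if p_stop > 0.5 else "continue"

The learned version is what allows behavior to improve as the model is retrained on more data, which is the reasoning behind the fleet-wide data collection described above.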
The release of FSD Beta is where ethics further come into question, as it is meant to
give the vehicle the capability to drive itself fully; specifically, to “Accelerate, brake and steer, stay within their
lane, make safe lane changes, parallel park, and slow and stop for traffic signs and lights”
(Forbes 2023). This could raise concerns for the driver and passengers of the vehicle, as they
are being driven by software that is not fully developed and could make mistakes.
Although constant supervision is required, if the driver fails to notice a mistake made by FSD, an
accident may result. This draws the driver and passengers of the vehicle, as well as those
close to them, into the ethical dilemma, since they could be harmed in any accident caused by
FSD.
The same concern applies to using customers to test and train the unfinished software, as
a model that has not yet been tested enough for a final release risks causing accidents.
Using customers as guinea pigs instead of thoroughly testing a model before release can be
seen as a fairly unethical decision. The stakeholders in these ethical dilemmas are the drivers and
passengers, the developers who create the software, and the executives or departments that
made the decision to release the unfinalized model.
Before coming to any ethical judgment, however, the relevant facts of the case must be
laid out. As mentioned before, Tesla’s FSD is a subscription feature that
enables the vehicle to accelerate, steer, brake, make lane changes, park, and recognize and
react to traffic lights and signs, essentially making it capable of driving itself (Forbes 2023). FSD is
in its Beta release, meaning it has yet to be perfected, and so it is continuously updated to work
toward its final release (Lambert 2022). FSD, like many of Tesla’s other Autopilot and self-driving
features, collects driving data from consumer vehicles in order to retrain its neural network models
and make improvements to the software, essentially treating customers as guinea pigs (Keane
2021). Tesla maintains many warnings and conditions on using FSD, insisting the driver must
remain in control at all times and constantly supervise FSD’s actions to intervene when
necessary, as they are “responsible for operation of the vehicle whenever the feature is
engaged” (CNBC 2023). Prior to the recall, FSD was plagued with issues, including poor
decision logic at yellow lights about whether to continue through an intersection or stop,
jarring braking, failing to adjust to new speed zones within an appropriate distance, and
driving straight through lanes designated as turn-only (Not A Tesla App
n.d.). The recall relating to FSD involved roughly 363,000 vehicles (Forbes 2023). These issues are only
the ones resolved by the recall and do not represent all of FSD’s flaws, but
they were the most prominent. Unlike most recalls, Tesla’s recall of FSD did not involve physical work
being redone on the vehicle; it was instead a simple software update, and
so the use of the word “recall” upset Tesla’s CEO among other supporters of the brand
(CNBC 2023). FSD is based upon neural networks rather than structured code, meaning it
makes decisions using probabilities instead of following predetermined instructions (Fung
2023). Accidents in which airbags are deployed occur once every 3.2 million miles while FSD is
engaged, while the U.S. average across autonomous and human-driven cars is once every
600,000 miles (Albanesius 2023). The recall was initiated by the National Highway Traffic Safety
Administration after an investigation it conducted (Forbes 2023).
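Taken at face value, those two figures imply that miles driven with FSD engaged see roughly one fifth the rate of airbag-deployment accidents of the reported U.S. average. A quick back-of-the-envelope check, using only the numbers cited above (this is illustrative arithmetic, not a controlled comparison, since the two figures come from different driving conditions and populations):

    # Figures as cited from Albanesius (2023); the calculation is illustrative only.
    fsd_miles_per_airbag_accident = 3_200_000
    us_avg_miles_per_airbag_accident = 600_000

    # Convert each figure to accidents per million miles driven.
    fsd_rate = 1_000_000 / fsd_miles_per_airbag_accident      # ~0.31 per million miles
    us_rate = 1_000_000 / us_avg_miles_per_airbag_accident    # ~1.67 per million miles

    print(round(us_rate / fsd_rate, 1))  # ~5.3x fewer such accidents per mile with FSD engaged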
There are many different courses of action that the people responsible for the release of
FSD could have taken with this knowledge in mind. One course of
action would have been to not release FSD Beta at all and to continue testing and
improving it internally, thus not relying on consumers to be guinea pigs for the model.
Another option would have been to release the Beta only to high-scoring drivers, and to keep it
restricted to them until FSD became sophisticated enough to drive safely on its own,
keeping it in the hands of better drivers who are more likely to use it responsibly in
its Beta stage. A third option would have been to release FSD with a requirement that the driver
make consistent contact with the steering wheel to ensure constant supervision, thus
lowering the chances of an unexpected accident caused by FSD. If FSD was to be released as
Tesla released it, it would have been more ethical for Tesla to make the issues apparent and begin the recall
process itself, rather than waiting for the NHTSA to bring them to light. Another route, once the
recall had been issued, would have been to remove FSD as a feature until it could be properly trained,
or to adopt one of the aforementioned solutions: restricting it to high-scoring drivers, or
ensuring consistent supervision.
There are many ethical lenses to apply when choosing among these options, but the best
are the Utilitarian and Justice approaches. Since the Utilitarian approach seeks to do the least
harm possible, the option would be to release FSD only to the highest-scoring drivers until
the model is trained thoroughly and performs to a high specification. This would theoretically
result in the fewest accidents while also maintaining a revenue stream from the project thanks to the
high-scoring users, thus causing the least harm to all stakeholders by penalizing
consumers and producers the least. The Justice approach would be to
release FSD under heavy restriction, enforcing constant driver supervision such as contact with
the steering wheel, and disabling FSD if there is a lack of driver supervision. This would give
most drivers the privilege of using FSD while ensuring that they monitor its performance
and can intervene when they need to, protecting themselves. It also allows Tesla to build a
large subscriber base for FSD, ensuring the prosperity of the project and the company as a whole.
This approach is “just” as it ensures mutual benefit to both sides of FSD, with all stakeholders
being treated in an ideal manner.
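As a rough sketch of what “enforcing constant driver supervision” could look like in practice, the logic below escalates from a warning to disengaging the feature when steering-wheel contact lapses for too long. The names and thresholds are hypothetical, invented for illustration; this is not Tesla’s actual driver-monitoring system:

    # Hypothetical supervision check -- names and thresholds are invented for illustration.
    MAX_HANDS_OFF_SECONDS = 10

    def supervision_check(seconds_since_wheel_contact, fsd_engaged):
        # Escalate from a warning to disengagement as hands-off time grows.
        if not fsd_engaged:
            return "inactive"
        if seconds_since_wheel_contact < MAX_HANDS_OFF_SECONDS:
            return "ok"
        if seconds_since_wheel_contact < 2 * MAX_HANDS_OFF_SECONDS:
            return "warn_driver"    # audible and visual alert to the driver
        return "disengage_fsd"      # hand control back to the driver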
The approach I would have taken, if it were up to me, would be to release FSD with the
enforcement of constant driver supervision. This aligns with the Justice approach because, as
mentioned, all stakeholders benefit to a good extent and no side is being shorted,
hence a just approach. I feel this way because, if I were in this position, I would want to maximize
company revenue without harming consumers, and this approach does that best: it gives
access to FSD to those who want it, but requires constant supervision, making the driver and
passengers of the vehicle safer by minimizing the possibility of an accident. It also makes the
most sense for the sake of accomplishing a fully functioning FSD release, since the
neural networks that make up FSD require a very large and diverse set of data to be trained
properly. This matters because neural networks are the best option for making a self-driving
vehicle (Fung, 2023). It makes the most sense for hundreds of thousands of drivers across the
country to contribute to this data, speeding up collection far more than Tesla could
accomplish on its own; the model would therefore theoretically be better trained due to the
vast quantity of consumer data, leading to a perfected FSD much sooner. Although this uses the
customers as guinea pigs, a perfected FSD would arguably be much safer than human drivers,
and accelerating its progress while maintaining driver safety through constant supervision of
FSD seems like the best course of action, as it compromises neither safety nor profitability.
The people at Tesla who made these choices regarding FSD were likely motivated most
by the company’s share price and revenue. Pushing FSD out
sooner than they should have, and making it so accessible and widely usable among Tesla owners,
gives shareholders a very hopeful outlook, as no other automotive company
is anywhere near such a feature. Introducing FSD as a new product also gained Tesla a lot
more revenue, as it was accessible only through a $200 monthly subscription or a $15,000 single
payment. These may have been the biggest motives, judging by the decisions they made. Tesla
has historically made executive decisions and announcements simply for the sake of pumping
up its stock, much like other companies, so this seems like the most probable motive. It
would be ideal to implement a rule, or require training for the officials who make these decisions,
so that they consider the consumer and their safety and weigh those most heavily in their
decisions in an industry where the change or introduction of a new feature could cause fatalities,
such as the automotive industry. This would put consumers’ safety at less risk, and most
likely would not hurt company profits by much, as there are almost always decisions that are
beneficial to both the consumer and the company.
Works Cited
CNBC. (2023, February 16). Tesla recalls 362,758 vehicles, says full self-driving beta software may cause crashes. https://www.cnbc.com/2023/02/16/tesla-recalls-362758-vehicles-says-full-self-driving-beta-software-may-cause-crashes.html
Tesla. (n.d.). Full self-driving subscriptions. https://www.tesla.com/support/full-self-driving-subscriptions#tesla-accordion-v2-1040-can-i-request-full-self-driving-beta-if-i-am-subscribed-to-tesla-full-self-driving-capabilities
Not a Tesla App. (n.d.). Tesla software updates: Version 2022.45.15. https://www.notateslaapp.com/software-updates/version/2022.45.15/release-notes
Lambert, F. (2022, November 24). Tesla makes full self-driving beta available to all owners in North America. Electrek. https://electrek.co/2022/11/24/tesla-full-self-driving-beta-available-all-owners-north-america/
Keane, S. (2021). Federal investigators warn Tesla is using customers as 'guinea pigs' to test its self-driving technology. Daily Mail. https://www.dailymail.co.uk/sciencetech/article-9365277/Federal-investigators-warn-Tesla-usingcustomers-guinea-pigs-test-Self-Driving.html
Forbes. (2023, February 20). Tesla recall hits nearly 363,000 cars with full self-driving software. https://www.forbes.com/sites/qai/2023/02/20/tesla-recall-hits-nearly-363000-cars-with-full-self-driving-software/?sh=2ebbd7c13bef
Albanesius, C. (2023). Tesla full self-driving beta crash stats revealed. InsideEVs. https://insideevs.com/news/655983/tesla-full-self-driving-beta-crash-stats-revealed/
Fung, F. (2023). Tesla FSD so far ahead, experts can't understand it. Torque News. https://www.torquenews.com/14335/tesla-fsd-so-far-ahead-experts-cant-understand-it