Intelligent and Autonomous Military Weapons
By: Trevante' Brown, Anthony Noetzel, Logan Pascucci, Konstantin Tekin
Intelligent Machines
● To be considered an intelligent machine, a machine must be able to interact with and adapt to its environment autonomously while still being programmed for a task
● Examples:
○ Drones
○ Humanoid robots
○ Plagiarism checkers
Autonomous Machines
● Acting autonomously means the machine has the freedom to act independently
● Can perform tasks on its own without any human control
● Examples:
○ Roomba vacuums
○ Self-driving cars
○ Autonomous helicopters
Intelligent machines vs. autonomous machines
● Autonomous machines used as weapons could be deadlier than intelligent machines because they act of their own accord
● A human has to order an intelligent machine to attack, but not an autonomous machine
● An autonomous machine would be able to identify targets on its own
What can we expect from intelligent and
autonomous weapons?
● As artificial intelligence becomes more advanced, intelligent/autonomous weapons are becoming more available and more research is going into their development
● A common use for intelligent/autonomous machines is in the military
● The United States Department of Defense defines autonomous weapons as those that, once activated, can select and engage targets without further intervention by a human operator
The benefits of using intelligent and
autonomous weapons
● Decreases the number of human soldiers needed, which could result in fewer human casualties
● Decreases the possibility of human error
● AI can notice and comprehend much larger datasets than humans are capable of
● Aids decision makers
● Classifies and prioritizes threats
The negatives of using intelligent and
autonomous weapons
● Ethical concerns
● Would be extremely difficult to determine liability for collateral damage
caused by machines
● The possibility of hackers taking control
What is the status of their development now?
Machines being developed:
● Robart III:
○ Carries a gun and a rocket launcher
○ People can use the "follow" command on it
○ When a person pointed their weapon at the sea, Robart did the same
○ Has 5 different sensors to keep track of friendly forces
Machines in development
● MDARS (Mobile Detection and Assessment Response System):
○ 360-degree sensors to detect motion
○ Once it spots human-like motion, it "shouts" a warning
○ If there is no response, it fires pepper balls at the intruder (sketched after this list)
○ Can track up to 6 targets
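The warn-then-respond sequence above lends itself to a simple control loop. Below is a minimal Python sketch of that behavior under stated assumptions: all function and callable names are hypothetical stand-ins, not MDARS's actual software.

```python
MAX_TRACKS = 6  # the slide notes MDARS can track up to six targets at once


def respond_to_motion(detections, warn, intruder_complied, fire_pepper_balls):
    """Warn-then-respond sequence sketched from the bullets above (all names are hypothetical)."""
    for target in list(detections)[:MAX_TRACKS]:  # cap simultaneous tracks at six
        warn(target)                              # "shout" an audible warning first
        if not intruder_complied(target):         # no response within the allowed window
            fire_pepper_balls(target)             # escalate to the non-lethal pepper-ball round


# Stand-in callables show how the sequence plays out:
respond_to_motion(
    detections=["intruder-1"],
    warn=lambda t: print(f"warning shouted at {t}"),
    intruder_complied=lambda t: False,
    fire_pepper_balls=lambda t: print(f"pepper balls fired at {t}"),
)
```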
Machines in development
● SKYLARK C:
○ Supports patrol-boat and small-vessel operations
○ Fully autonomous from launch to recovery
○ Provides persistent surveillance with an EO/IR (electro-optical/infrared) payload
Machines in development
● ROBATTLE:
○ Advanced attack capabilities
○ Combat intelligence, surveillance, target acquisition & reconnaissance (ISTAR)
○ Area excitation and decoy actions
○ Ambushes and attacks
○ Protection of forces and convoys in combat areas
Machines in development
● Robot Combat Vehicle Light (RCV-L):
○ Experimental unmanned ground combat vehicle (UGCV)
○ Developed by QinetiQ
○ Can travel at 72 km/h and carry up to 3.3 tons
○ Remote-controlled combat module includes a 12.7 mm M2 Browning machine gun and the FGM-148 Javelin anti-tank missile system
○ Can identify targets by itself but cannot fire without an operator (a sketch of this gate follows)
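The last bullet describes a human-in-the-loop gate: the vehicle may identify targets, but only an operator can authorize fire. A minimal sketch of that rule is below; the class, threshold, and callable are hypothetical illustrations, not QinetiQ's actual control software.

```python
from dataclasses import dataclass


@dataclass
class Target:
    track_id: str
    classification: str   # e.g. "armored vehicle", "unknown"
    confidence: float     # 0.0 - 1.0 from the onboard perception system


def request_engagement(target: Target, operator_approves) -> bool:
    """The vehicle identifies targets on its own, but firing always requires
    an explicit operator decision (human-in-the-loop), per the bullet above."""
    if target.confidence < 0.9:             # illustrative threshold, not a real spec
        return False                        # low-confidence tracks are not even presented
    return bool(operator_approves(target))  # the weapon releases only on human approval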
What is the status of their development now?
● SEAGULL (USV):
○ Facilitates end-to-end mine-hunting operations (a pipeline sketch follows this list), including:
■ Detection
■ Classification
■ Location
■ Identification and neutralization of sea mines
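The end-to-end chain listed above is essentially a staged pipeline. Here is a hedged Python sketch of that flow; every stage function is a hypothetical placeholder rather than Seagull's real software.

```python
def mine_hunting_pipeline(sonar_contacts, classify, locate, identify, neutralize):
    """Detection -> classification -> location -> identification -> neutralization,
    mirroring the bullets above (all stage functions are hypothetical)."""
    for contact in sonar_contacts:        # detection stage supplies raw sonar contacts
        kind = classify(contact)          # mine-like object vs. clutter
        if kind != "mine-like":
            continue
        position = locate(contact)        # fix the contact's position
        if identify(contact, position):   # confirm it really is a sea mine
            neutralize(contact, position) # hand off to the neutralization payload
```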
What is the status of their development now?
● Researchers from MIT and DARPA are working on a chip called Eyeriss:
○ Based on neural networks
○ Will contain knowledge of air combat, the ability to learn from a dogfight in real time, and adjust for threat situations
○ 10x more efficient as a processing unit
○ Eliminates the need to load data from external memory units (see the data-reuse sketch below)
○ Increases the speed of object recognition, which will help with target selection
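The "eliminates the need to load data from external memory" point comes down to reusing data held in on-chip buffers. The NumPy toy below only illustrates the general idea of reuse (the kernel is read once and applied at every output position); it is not the actual Eyeriss dataflow.

```python
import numpy as np


def conv2d_with_weight_reuse(image, kernel):
    """Toy 2-D convolution: the kernel is 'loaded' once and reused at every
    output position, the kind of data reuse an accelerator exploits to avoid
    repeated trips to external memory (illustrative only)."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out


feature_map = conv2d_with_weight_reuse(np.random.rand(8, 8), np.ones((3, 3)) / 9.0)
```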
What is the status of their development now?
● MIT has developed a system that uses wireless radio-frequency signals to measure a person's heart rate
○ Applies machine learning and an "emotion classifier" to identify a person's mood without physical contact (a toy classifier sketch follows this list)
● NEURODYNAMIC learned how to manoeuvre and determine appropriate shooting opportunities in a dogfight
○ Neural networks try to reproduce the typical activities of the human brain in image perception, language processing, and coordination
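An "emotion classifier" is, at its core, a supervised model over physiological features. The scikit-learn sketch below uses synthetic heart-rate features and made-up labels purely to show the shape of that approach; it is not MIT's actual system or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in features: [mean heart rate, heart-rate variability]
X = rng.normal(loc=[70.0, 50.0], scale=[10.0, 15.0], size=(200, 2))
# Synthetic labels: 0 = "calm", 1 = "stressed" (purely illustrative)
y = (X[:, 0] > 75).astype(int)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([[85.0, 40.0]]))  # classify the mood implied by one new reading
```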
What could intelligent and autonomous machines look like in the future?
● UCLASS:
○ Deep reconnaissance missions
○ Ability to hit both stationary and moving targets
○ Mission execution and landing are completely independent
● Machines that are potentially capable of changing their shape to fit into small spaces, then reverting back to fulfill their task
● Aircraft the size of bugs used to secretly eavesdrop on enemies
What could intelligent and autonomous machines look like in the future?
● U.S. goal: have fleets of manned and unmanned aircraft by 2030
○ Unmanned fleets would be cheaper to procure and to train for
○ Simulations have shown manoeuvres of over 20g
■ The airframe would need to be redesigned
What are the potential dangers?
● One concern is about the machine's algorithms and the lack of human judgement:
○ Can it distinguish between an enemy, an ally, and civilians?
○ Can it make ethical choices on a dynamic battlefield?
○ Can it tell the difference between a real act of surrender and a false one?
● Determining who is responsible for collateral damage
What are the potential dangers?
● Understanding the quality of the machine learning involved
● The disconnect between engineers and human soldiers
● Researchers believe that regular use of LAWS (Lethal Autonomous Weapon Systems) will lead countries to resort to military power more frequently in any dispute
What’s being done to regulate development of
these kinds of robots?
● A major legal block is that the robots don't meet the laws of armed conflict (LOAC):
○ Military necessity:
■ The injury inflicted on an enemy should not be excessive
○ Distinction:
■ The ability to tell the difference between a civilian and a hostile
○ Proportionality:
■ Take civilian property into consideration when targeting a military objective
■ Damage should be minimal
○ Humanity:
■ Lethal force can't be used to cause unnecessary suffering
What’s being done to regulate development of
these kinds of robots?
● Regulations through software:
○ Setting time checks for launching lethal weapons
○ A lethal weapon system operates in a zone limited by its sensors
○ Before firing, the autonomous machine can use this process (a sketch follows this list):
■ Detecting a target
■ Locating and identifying it using the approved List of Targets
■ Analysing it (combatant or not, following IHL)
■ Characterizing it (hostile or non-hostile)
○ Deciding if the target is a priority
○ Recording all decisions to open fire
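A minimal sketch of the software-side checks listed above: identification against an approved target list, IHL analysis, hostility characterization, priority, and logging of every decision to open fire. All names (the approved list, the callables) are hypothetical; this illustrates the process, not any fielded system.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("engagement")

APPROVED_TARGET_TYPES = {"artillery piece", "armored vehicle"}  # hypothetical approved List of Targets


def clear_to_fire(detection, is_combatant, is_hostile, is_priority) -> bool:
    """Walks the detect -> identify -> analyse -> characterize -> prioritize
    chain from the slide, recording every decision to open fire."""
    target_type = detection.get("type")
    if target_type not in APPROVED_TARGET_TYPES:   # identification against the approved list
        log.info("Rejected %s: not on approved target list", detection)
        return False
    if not is_combatant(detection):                # IHL analysis: combatant or not
        log.info("Rejected %s: not a combatant under IHL", detection)
        return False
    if not is_hostile(detection):                  # characterization: hostile or non-hostile
        log.info("Rejected %s: characterized as non-hostile", detection)
        return False
    if not is_priority(detection):                 # priority check
        log.info("Deferred %s: not a priority target", detection)
        return False
    log.info("Decision to open fire recorded for %s", detection)  # record the firing decision
    return True
```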
What’s being done to regulate development of
these kinds of robots?
● Set up security measures (sketched below) such as:
○ The operator can deactivate the machine
○ Backup communication with the machine
○ Any technical failure in the targeting system will shut down the machine
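The measures above map naturally onto a watchdog-style fail-safe check: an operator kill switch, a backup communications channel, and an automatic shutdown on any targeting fault. A hedged sketch follows; every attribute and method name is a hypothetical placeholder.

```python
def safety_watchdog(machine, primary_link, backup_link):
    """One pass of a fail-safe check reflecting the measures above
    (all attributes and methods here are hypothetical)."""
    # 1. The operator can always deactivate the machine.
    if machine.operator_requested_shutdown:
        machine.deactivate()
        return

    # 2. Keep a backup communication channel if the primary link drops.
    if not primary_link.is_alive():
        machine.switch_link(backup_link)

    # 3. Any technical failure in targeting shuts the machine down.
    if machine.targeting_fault_detected:
        machine.deactivate()
```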
What’s being done to regulate development of
these kinds of robots?
● Researchers and major industry figures who signed an Open Letter to ban LAWS:
○ Stuart Russell
○ Stephen Hawking
○ Yann LeCun
○ Steve Wozniak
○ Elon Musk
What’s being done to regulate development of
these kinds of robots?
● Use non-lethal weapons:
○ Optical or laser dazzlers
○ Weapons that use the electromagnetic spectrum to disable enemy equipment
○ Coast Guard cadets tested the concept of arming drones with pepper spray and propeller nets
Sources
Armin Krishnan. Killer Robots : Legality and Ethicality of Autonomous Weapons. Routledge, 2009. EBSCOhost,
search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=292434&site=ehost-live&scope=site.
“Out of the Blue.” The Economist, The Economist Group Limited , 30 July 2011,
www.economist.com/asia/2011/07/30/out-of-the-blue.
Mulrine, Anna. “Unmanned Drone Attacks and Shape-Shifting Robots: War's Remote-Control Future.” The Christian Science Monitor, The Christian Science Monitor, 22 Oct. 2011, www.csmonitor.com/USA/Military/2011/1022/Unmanned-drone-attacks-and-shape-shifting-robots-War-s-remote-control-future.
Leprince-Ringuet, Daphne. “Robot Soldiers Could Soon Make up a Quarter of the Army.” ZDNet, ZDNet, 9 Nov. 2020, www.zdnet.com/article/robotssoldiers-could-soon-make-up-a-quarter-of-the-army/.
Sources
CHATURVEDI, Sudhir Kumar, et al. “Comparative Review Study of Military and Civilian Unmanned Aerial Vehicles (UAVs).” INCAS
Bulletin, vol. 11, no. 3, July 2019, pp. 183–198. EBSCOhost, doi:10.13111/2066-8201.2019.11.3.16.
Sakharkar, Ashwini. “The U.S. Army Received the First RCV-L Robots from QinetiQ for Testing.” InceptiveMind, 17 Nov. 2020,
www.inceptivemind.com/u-s-army-received-first-rcv-l-robots-qinetiq-testing/16216/.
Mayfield, Mandy. “Services Infusing AI into Air, Land, Sea Robots.” Nationaldefensemagazine.org, 26 Oct. 2020,
www.nationaldefensemagazine.org/articles/2020/10/26/services-infusing-ai-into-air-land-sea-robots.
Bekey, George A. “Autonomous Robots.” MIT Press, mitpress.mit.edu/books/autonomous-robots.
Cooke, Dr. Gordon. “Magic Bullets: The Future of Artificial Intelligence in Weapons Systems.” Www.army.mil, 11 June 2019,
www.army.mil/article/223026/magic_bullets_the_future_of_artificial_intelligence_in_weapons_systems.
Sources
Magnuson, Stew. "The US Military Remains Reluctant to Accept Autonomous Robot Soldiers." Robotic Technology, edited by
Louise Gerdes, Greenhaven Press, 2014. Opposing Viewpoints. Gale In Context: Opposing Viewpoints,
https://link.gale.com/apps/doc/EJ3010899217/OVIC?u=nysl_li_nyinstc&sid=OVIC&xid=903971ac. Accessed 2 Dec. 2020.
Originally published as "War Machines: For Now, Lethal Robots Not Likely to Run on Auto-Pilot," National Defense, vol. 92, no.
652, Mar. 2008.
Quaranta, Paolo. “Unmanned Capabilities for 2020 and Beyond.” Military Technology, vol. 41, no. 5, May 2017, pp. 47–52. EBSCOhost,
search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=123666178&site=ehost-live&scope=site.
G. de Boisboissel, "Uses of Lethal Autonomous Weapon Systems," International Conference on Military Technologies (ICMT) 2015,
Brno, 2015, pp. 1-6, doi: 10.1109/MILTECHS.2015.7153656.
Sources
D. Danet, "Do not ban “Killer Robots” !," 2017 International Conference on Military Technologies (ICMT), Brno, 2017, pp. 716-720, doi:
10.1109/MILTECHS.2017.7988850.
COMBE II, PETER C. “Autonomous Doctrine: Operationalizing the Law of Armed Conflict in the Employment of Lethal Autonomous
Weapons Systems.” St. Mary’s Law Journal, vol. 51, no. 1, Dec. 2019, pp. 35–34. EBSCOhost,
search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=141502242&site=ehost-live&scope=site.