AUTONOMOUS WEAPONS (LAWS)
Ritesh Hattarki, Roland Sellman, Tunger Hong, Siddhartha Vaka

INTRODUCTION (RITESH)
• Lethal autonomous weapon systems (LAWS) are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon to engage and destroy it without manual human control of the system.
• LAWS must be able to fulfill:
  • Distinction: telling legitimate military targets apart from protected persons and objects, such as civilians and those hors de combat
  • Proportionality: ensuring that expected incidental harm to civilians and protected objects is not excessive relative to the anticipated military advantage
  • Predictability and reliability: making sure the weapon will perform as expected

SIDE 1 (FOR) (ROLAND)
• AI-enabled weapon system autonomy has great potential to mitigate the risk of human error by serving as an additional oversight tool in targeting operations.
• Their close-combat capabilities reduce the need for high explosives as the means of delivering lethal effects. Compared with conventional munitions, autonomous systems enable more accurate, surgical attacks with significantly reduced collateral damage.
• They prevent casualties by removing people from dangerous areas.
• Examples include the Phalanx Close-In Weapon System (CIWS), active protection systems (APS), and Iron Dome.

SIDE 2 (AGAINST) (TUNGER)
• Difficulty of assigning legal responsibility
  • For example, who is responsible if a LAWS engages the wrong target: the programmer, the commander, or the soldier who set it up?
• Raises concerns about predictability and reliability
  • Can cause targeting accidents (e.g., the accidental shoot-down of a U.S. Navy A-6 by a ship's CIWS during a 1996 exercise)
• Autonomous systems are incapable of making complex ethical choices on the battlefield.
• The weapon payloads carried on autonomous systems enable them to cause lethal effects, so any error can have irreversible consequences.

PERSONAL OPINIONS (EVERYBODY)
• Ritesh Hattarki: I support the use of autonomous weapons in wars and other lawful operations. They help humans by performing tasks that are often too dangerous for people, and their lower risk of error can prevent casualties.
• Tunger Hong:
• Siddhartha Vaka:
• Roland Sellman:

QUESTIONS???

QUESTIONS TO CONSIDER (CLASS DISCUSSION)
• Could fully autonomous weapons comply with the requirements of international humanitarian law to protect civilians in armed conflict?
• Why are fully autonomous weapons a pressing issue?
• If fully autonomous weapons comply with the requirements of international humanitarian law, why should they be prohibited?

SOURCES
• https://sgp.fas.org/crs/natsec/IF11150.pdf
• https://www.icrc.org/en/download/file/65762/autonomous_weapon_systems_under_international_humanitarian_law.pdf
• https://www.cfr.org/blog/stop-stop-killer-robot-debate-why-we-need-artificial-intelligence-future-battlefields
• https://www.hrw.org/news/2013/10/21/qa-fully-autonomous-weapons