
Gender & Racial Bias in AI: Presentation

Tam Nga Man (4176845)
Gender and racial biases in AI
Gender Shades project by MIT
● 93.6% of faces misgendered by Microsoft were those of darker subjects.
● 95.9% of the faces misgendered by Face++ were those of female subjects.
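These headline numbers come from disaggregated evaluation: error rates computed per demographic subgroup rather than overall. Below is a minimal Python sketch of the idea, using made-up records rather than the actual Gender Shades benchmark data:

```python
# A minimal sketch of disaggregated evaluation in the style of Gender Shades:
# misgendering rates are computed per subgroup, not just overall.
# The records below are hypothetical, not the actual benchmark data.
from collections import defaultdict

predictions = [
    # (true_gender, predicted_gender, skin_type) -- hypothetical records
    ("female", "male",   "darker"),
    ("female", "female", "lighter"),
    ("male",   "male",   "darker"),
    ("female", "male",   "darker"),
    ("male",   "male",   "lighter"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for true, pred, skin in predictions:
    group = (true, skin)
    totals[group] += 1
    if true != pred:
        errors[group] += 1

# A model can look accurate overall while failing badly on one subgroup.
for group, n in sorted(totals.items()):
    print(f"{group}: misgendered {errors[group]}/{n} "
          f"({100 * errors[group] / n:.1f}%)")
```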
Dangers of biased AI
❖ Widespread, mysterious, and destructive algorithms (Weapons of Math Destruction, O'Neil, 2016)
❖ It's not just that these algorithms harm us; it's that they harm us in ways we do not know, do not know we do not know, and do not currently have the means to prove and challenge.
  ➢ Someone wrongfully accused of a crime based on AI misidentification of the perpetrator from security video footage
  ➢ AI screening out female candidates for an engineering position because the company historically hired only men for that position
  ➢ Crime-fighting AI judging a Black person more likely to reoffend than a white person
  ➢ Recommendation algorithms not showing certain job postings to female users
  ➢ … etc.
Big data ≠ good data (Kazimzade, 2019)
● When and how was the data collected?
● Collected by whom?
● Collected for what purposes?
● Does it contain any historical or ideological mappings?
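One way to keep these questions answerable is to record the answers alongside the dataset itself, in the spirit of a datasheet. A minimal sketch, assuming an illustrative (non-standard) schema and a hypothetical face dataset:

```python
# A sketch of recording dataset provenance alongside the data itself, so
# Kazimzade's questions have documented answers. Field names are
# illustrative, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class DatasetProvenance:
    name: str
    collected_when: str                 # when was the data collected?
    collection_method: str              # how was it collected?
    collected_by: str                   # collected by whom?
    collected_for: str                  # collected for what purposes?
    known_historical_biases: list[str] = field(default_factory=list)

# Hypothetical example record for a face dataset.
faces = DatasetProvenance(
    name="face_benchmark_v1",
    collected_when="2014-2016",
    collection_method="scraped from public photo-sharing sites",
    collected_by="third-party vendor",
    collected_for="celebrity recognition, later reused for gender "
                  "classification",
    known_historical_biases=["overrepresents lighter-skinned male subjects"],
)
print(faces)
```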
How about we only collect relevant data? (Dobrin, 2020)
● e.g. How about we don't use irrelevant data (like gender) as a factor in deciding how large a credit limit you get on your new credit card?
  ○ The Apple Card example
    ■ A woman was given a credit limit ten times lower than her husband's, even though they share all the same assets and all the same accounts
    ■ "While they explicitly removed gender (as a factor), there were other factors associated with gender that the AI algorithm identified in order to classify individuals based on a perceived risk."
  ○ The "bank deciding if mortgages should be given" example
    ■ They ensured that the algorithm didn't have any gender, racial, religious, or ethnic bias, but…
● Seemingly unrelated pieces of data might be closely related
  ○ "Because the model was tainted by a historical bias in the data, they were making a bunch of bad decisions really fast."
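The failure mode in both examples is the same: a proxy feature leaks the protected attribute even after that attribute is removed. A minimal sketch of the effect on synthetic data, where the proxy feature and its correlation with gender are assumptions for illustration only:

```python
# A sketch of proxy leakage on synthetic data: drop the gender column,
# then recover gender from a single correlated feature with a bare threshold.
import random

random.seed(0)

rows = []
for _ in range(1000):
    gender = random.choice(["f", "m"])
    # Hypothetical proxy feature that historically correlates with gender
    # (e.g. share of spending in certain merchant categories).
    proxy = random.gauss(0.7 if gender == "f" else 0.3, 0.1)
    rows.append((gender, proxy))

# A "gender-blind" model would keep only `proxy` as input -- yet a simple
# threshold on the proxy alone recovers gender almost perfectly.
recovered = sum((proxy > 0.5) == (gender == "f") for gender, proxy in rows)
print(f"gender recovered from proxy alone: {recovered / len(rows):.0%}")
```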
Keeping algorithmic bias out of AI (Sharma, 2019)
1. Be aware of our own biases and the bias in the machines around us
2. Make sure the teams developing these systems are full-spectrum teams of diverse individuals
3. Give the AI diverse data to learn from
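Point 3 can be checked before any training happens. A minimal sketch that counts training examples per demographic group and flags severe imbalance; the group labels, counts, and the 5× threshold are illustrative assumptions, not values from Sharma's talk:

```python
# A sketch of a pre-training check: does the training set actually cover
# the full spectrum of subjects? Labels and counts are hypothetical.
from collections import Counter

training_labels = (
    ["lighter male"] * 800 + ["lighter female"] * 450 +
    ["darker male"] * 120 + ["darker female"] * 30
)

counts = Counter(training_labels)
total = sum(counts.values())
for group, n in counts.most_common():
    print(f"{group:>15}: {n:5d} ({n / total:.1%})")

# A simple red flag: the largest group dwarfs the smallest.
if max(counts.values()) / min(counts.values()) > 5:
    print("warning: severe group imbalance; collect more diverse data")
```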
Inclusive coding (Buolamwini, 2017)
● Who codes?
● How do we code?
  ○ Algorithmic auditing to uncover problematic patterns (see the sketch after this list)
    ■ a process through which an automated decision system (ADS) or algorithmic product, tool, or platform (also referred to here under the umbrella term 'AI system') is evaluated
● Why do we code?
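As one concrete step such an audit might take, the sketch below compares approval rates across groups and flags a disparity. The decisions are hypothetical, and the 0.80 cutoff borrows the common "four-fifths rule" rather than anything prescribed by the sources above:

```python
# A sketch of one algorithmic-auditing step: compare positive outcome
# rates across groups and flag disparities. Decisions are hypothetical;
# the 0.80 threshold mirrors the common "four-fifths rule".
decisions = [
    # (group, approved) -- hypothetical outputs of the ADS under audit
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", False), ("group_b", True), ("group_b", False), ("group_b", False),
]

rates = {}
for group in {g for g, _ in decisions}:
    outcomes = [approved for g, approved in decisions if g == group]
    rates[group] = sum(outcomes) / len(outcomes)

lo, hi = min(rates.values()), max(rates.values())
print(rates)
if lo / hi < 0.8:
    print(f"disparate impact ratio {lo / hi:.2f} < 0.80: problematic pattern")
```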
Algorithmic Justice League
Policy recommendations
1. Require the owners and operators of AI systems to engage in independent algorithmic audits against clearly defined standards
2. Notify individuals when they are subject to algorithmic decision-making systems
3. Mandate disclosure of key components of audit findings for peer review
4. Consider real-world harm in the audit process, including through standardized harm incident reporting and response mechanisms
5. Directly involve the stakeholders most likely to be harmed by AI systems in the algorithmic audit process
6. Formalize evaluation and, potentially, accreditation of algorithmic auditors
References
"Results," Gender Shades, MIT Media Lab, MIT Civic Media, accessed 17 Apr 2024, http://gendershades.org/overview.html.
Cathy O'Neil, Weapons of Math Destruction (United States: Crown Books, 2016).
Gunay Kazimzade, "Racial and Gender Biases in AI," filmed at TEDxHUBerlin, uploaded 14 Mar 2019, https://www.youtube.com/watch?v=cg_5t_bV4XE&ab_channel=TEDxTalks.
Seth Dobrin, "Tackling AI Bias is a human problem," filmed at TEDxUniversityatBuffalo, uploaded 2 Sept 2020, https://www.youtube.com/watch?v=rWU83MK7t9c&ab_channel=TEDxTalks.
Kriti Sharma, "How to keep human bias out of AI," uploaded 13 Apr 2019, https://www.youtube.com/watch?v=BRRNeBKwvNM&ab_channel=TED.
Joy Buolamwini, "How I'm fighting bias in algorithms," uploaded 30 Mar 2017, https://www.youtube.com/watch?v=UG_X_7g63rY&ab_channel=TED.
Sasha Costanza-Chock, Emma Harvey, Inioluwa Deborah Raji, Martha Czernuszenko, and Joy Buolamwini, "Who Audits the Auditors? Recommendations from a field scan of the algorithmic auditing ecosystem," Algorithmic Justice League, 4 Oct 2023, https://www.ajl.org/auditors.