Algorithms, sets of rules and instructions baked into computer programs, are now commonly used to make economic, health, and policy decisions. For example, health insurance companies use algorithms to autonomously decide the level of cancer treatment someone might receive, based on massive datasets and actuarial tables. Banks increasingly use algorithms to determine who will be approved for a mortgage based on demographic information.
But people have to write those algorithms, and more often than not they infuse them with cultural bias.
This seven-week, 1.5-credit module will teach you how to break down how algorithms make decisions and to uncover discrimination inherent in the code. You'll also learn how to use the Freedom of Information Act to obtain the algorithmic packages used by federal, state, and local agencies.