THE PROBLEM
Black and Brown students face rampant inequality on a daily basis. They are suspended more often, placed on lower academic tracks, and taught content to which they often cannot culturally relate, compared to white students, for whom the education system is designed. These experiences quickly label students in ways that exclude them from the system, leading them down a well-researched school-to-prison pipeline. Discrimination is reflected in a range of “objective” outcome data that illustrate a race gap in attendance and discipline records, grades, and test scores.
Edtech companies use such data to train algorithms that promise to personalize learning, identify at-risk students, and save teachers time. Without examining the biases that influence this data, companies using artificial intelligence and machine learning can amplify existing bias, and embed their own assumptions, in the products schools use.
WHY SHOULD I CARE?
Black and Brown students make up over half of students in American K-12 public schools today, yet edtech companies rarely design products with their unique needs in mind. Unfortunately, algorithms don’t work well on students for whom they were not designed. Technologies in other sectors have run into major racial discrimination challenges, with reports of racial bias in facial recognition for surveillance, in risk assessment for the criminal justice system, and in smart recruitment systems for corporate hiring. Similarly, technologies designed to predict dropouts, identify behavioral issues, and personalize learning may perpetuate the same outcomes as in the past; products designed without examining the underlying bias will further encode the racist history of our systems.
Simply avoiding the collection of sensitive data like race is not enough. Many factors, such as income, zip code, familiarity with technology, accent, or language proficiency, can be highly correlated with race and easily encoded into algorithms through the collection of usage data. Companies that neither collect data on a student’s race, language, or income nor incorporate school-level demographic data cannot evaluate whether their products work well for all groups of students. Although legislation and school policies do not yet hold companies accountable for discrimination through technology, edtech companies should get ahead of forthcoming requirements by adopting new practices to deeply understand the social context in which their products are deployed and to account for systemic racial bias before implementing their products in schools, and certainly before employing machine learning on the data they’ve collected.
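To make the proxy problem concrete, the sketch below shows one way a team might test whether a single non-race feature acts as a stand-in for race: train a simple classifier to predict race from that feature alone, and compare it to a majority-class baseline. The column names (“zip_code”, “race”), the toy data, and the choice of models are illustrative assumptions, not a prescribed method.

```python
# A minimal sketch of a proxy check, assuming hypothetical columns
# "zip_code" and "race". If one non-race feature predicts race well above
# a majority-class baseline, models trained on that feature can encode
# race even when race itself is never collected.
import pandas as pd
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

def proxy_score(df: pd.DataFrame, feature_col: str, race_col: str) -> float:
    """How much better than chance does this one feature predict race?"""
    X, y = df[[feature_col]], df[race_col]
    model = make_pipeline(OneHotEncoder(handle_unknown="ignore"),
                          LogisticRegression(max_iter=1000))
    feature_acc = cross_val_score(model, X, y, cv=3).mean()
    baseline = DummyClassifier(strategy="most_frequent")
    baseline_acc = cross_val_score(baseline, X, y, cv=3).mean()
    return feature_acc - baseline_acc  # a large gap marks a strong proxy

# Toy data in which zip code almost perfectly separates the two groups.
df = pd.DataFrame({
    "zip_code": ["10001"] * 6 + ["60601"] * 6,
    "race":     ["A"] * 5 + ["B"] * 6 + ["A"],
})
print(f"proxy score for zip_code: {proxy_score(df, 'zip_code', 'race'):.2f}")
```

A score near zero suggests the feature carries little information about race on its own; a large positive score is a signal to treat that feature, and any model trained on it, with far more scrutiny.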
WHAT SHOULD I DO ABOUT IT?
This won’t be easy, but it will be worth it. Edtech companies have made big promises to change education: to personalize learning, to identify students at risk of dropping out, and to automate administrative tasks so teachers can focus on their students. Just as edtech companies commit to student data privacy and accessibility, it’s time to commit to equity by adopting new design and development practices that account for racial bias. The edtech industry needs leaders who commit to changing education for all students, especially the ones the system isn’t designed for and, equally important, the ones who make up a majority of users in American public schools today.
The AI in Education Toolkit for Racial Equity provides a set of practices that edtech companies can follow to address racial bias at each stage of development, from ideation to implementation. The Toolkit will help edtech companies design products with Black and Brown students, their teachers, and their families to uncover blind spots, catch potential risks, and ensure that communities, who are also the customers, buy into the value of their products, a buy-in necessary for a company’s success. The Toolkit also describes tangible exercises, including code that companies can run to detect and mitigate racial bias; a sketch of one such check appears below. Companies should build time into each sprint to run these analyses and incorporate critical user feedback, so that developers never have to trade off ethics against deadlines. Additionally, the Toolkit will help companies structure more inclusive UX design, testing, and implementation phases to incorporate feedback from Black and Brown users.
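The Toolkit’s own code is not reproduced here, but the flavor of such an exercise can be shown with a short, hypothetical example: a demographic parity check that compares how often a model flags students in each group as at risk. The column names, toy data, and the 0.2 alert threshold are illustrative assumptions, not the Toolkit’s actual implementation.

```python
# A sketch of one common bias-detection check (demographic parity):
# compare the rate at which each group is flagged "at risk". Column
# names, toy data, and the threshold below are assumptions for
# illustration only.
import pandas as pd

def selection_rates(df: pd.DataFrame, group_col: str, pred_col: str) -> pd.Series:
    """Share of each group that the model flags (predicted positive)."""
    return df.groupby(group_col)[pred_col].mean()

def parity_gap(rates: pd.Series) -> float:
    """Largest difference in flagging rates between any two groups."""
    return float(rates.max() - rates.min())

# Toy predictions from a hypothetical at-risk model.
preds = pd.DataFrame({
    "race":            ["A"] * 5 + ["B"] * 5,
    "flagged_at_risk": [1, 0, 0, 0, 0, 1, 1, 1, 0, 1],
})
rates = selection_rates(preds, "race", "flagged_at_risk")
print(rates)                      # A: 0.2, B: 0.8
gap = parity_gap(rates)
print(f"parity gap = {gap:.2f}")  # 0.60
if gap > 0.2:  # threshold chosen arbitrarily for illustration
    print("Flagging rates differ sharply across groups; review before shipping.")
```

Run within each sprint, a check like this makes disparities visible in minutes, before a model ever reaches a classroom, which is exactly the kind of routine analysis the Toolkit asks teams to budget time for.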
Links to further research:
1. Data Snapshot: School Discipline
2. The Other Segregation
3. Culturally Responsive Teaching
4. Race, Disability and the School-to-Prison Pipeline
5. Status and Trends in the Education of Racial and Ethnic Groups 2018
6. Chronic Absenteeism in the Nation's Schools
7. Disproportionality in student discipline: Connecting policy to research
8. Inequality at school
9. The Black-White Test Score Gap: Why It Persists and What Can Be Done
10. Racial/Ethnic Enrollment in Public Schools
11. Population validity for Educational Data Mining models: A case study in affect detection
12. Face Recognition Vendor Test (FRVT)
13. Machine Bias
14. Complaint and Request for Investigation, Injunction, and Other Relief