A powerful investigative look at data-based discrimination—and how technology affects civil and human rights and economic equity
The State of Indiana denies one million applications for healthcare, food stamps, and cash benefits in three years—because a new computer system interprets any mistake as “failure to cooperate.” In Los Angeles, an algorithm calculates the comparative vulnerability of tens of thousands of homeless people in order to prioritize them for an inadequate pool of housing resources. In Pittsburgh, a child welfare agency uses a statistical model to try to predict which children might be future victims of abuse or neglect.
Since the dawn of the digital age, decision-making in finance, employment, politics, health and human services has undergone revolutionary change. Today, automated systems—rather than humans—control which neighborhoods get policed, which families attain needed resources, and who is investigated for fraud. While we all live under this new regime of data, the most invasive and punitive systems are aimed at the poor.
In Automating Inequality, Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lies dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile.
The U.S. has always used its most cutting-edge science and technology to contain, investigate, discipline and punish the destitute. Like the county poorhouse and scientific charity before them, digital tracking and automated decision-making hide poverty from the middle-class public and give the nation the ethical distance it needs to make inhumane choices: which families get food and which starve, who has housing and who remains homeless, and which families are broken up by the state. In the process, they weaken democracy and betray our most cherished national values.
This deeply researched and passionate book could not be more timely.
A Vital Examination of How Technology Can Reinforce and Scale Inequality
5 stars
Virginia Eubanks lays out a powerful, deeply researched case for reconsidering the role of technology and algorithms in vital human services. By considering in depth a number of specific cases in which different technologies have created Kafkaesque systems that ossify and amplify existing societal biases, Eubanks challenges technologists, researchers, and governments to rethink how and what they build. This book also contains devastating historical analyses of decision making relating to services for the poor, showing that today's efforts are not an innovation or an aberration. Highly recommended.
In this cleverly constructed series of case studies, Virginia Eubanks takes a critical eye to the automation of social welfare systems in three separate contexts in the United States. Through rich, qualitative interviews, she continuously advances her key argument: that by automating our social welfare systems - housing, welfare, social supports - we are manifesting the poorhouse - and its affordances - for the age of big data.
Her work is mature ethnography: she forms close, trusted bonds with actors from all parts of the welfare systems she investigates, providing a nuanced, multi-faceted exploration of how the rationalisation and automation of welfare systems embodies and perpetuates a fundamentally flawed axiology. In a conclusion that Donella Meadows would be proud of, she entreats us to upend the system through solidarity, collective action, and the recognition that poverty - and its automation - is a choice. We should choose better.
As I was nearing the third case study of Virginia Eubanks's Automating Inequality, I tweeted: "Studying class and income inequality has given me power where I have previously felt powerless. Reading Automating Inequality takes this feeling to new heights." Prior to reading this book, I knew that I wanted to leverage data analytics for social good, but I didn't know what that looked like. Eubanks not only shined a spotlight on the disturbing algorithms that drive inequality; she created a clear call to action for data nerds like me with a desire to do good in the world to dismantle these algorithms of oppression. I was most struck by the final case study focusing on child protection services and poverty. I don't have the book in front of me, but there was a line about how child abuse and poverty look so much alike, and I was immediately reminded of my own experience as a child facing near starvation while my mother's abusive ex-husband tried to frame the situation as abuse to take my brother and myself away from my mom. If North Carolina had employed those same models, would I have been able to stay with my mom? At the conclusion of this book, and in tandem with a Reply All podcast about l33t hackers, I felt compelled to overhaul my own data practices to protect my privacy. I don't know what someone like me can do to dismantle these oppressive systems, but I'm going to find out.