I'm on a kick of reading anything and everything at the intersection of data, tech, and social justice. Technically Wrong popped up on a master list of books in this realm, so I knew I was in for a treat. I was not expecting to be blown away by both the examples in the book and the author's capabilities as a writer. I devoured this book in one sitting late one night. If you're looking for a broad overview of what tech gets wrong, start here.
Review of 'Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech' on 'Goodreads'
4 stars
Very good! If I hadn't already read Design for Real Life, this would have gotten full marks. She has recycled some of the anecdotes, but that's easy to forgive.
Review of 'Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech' on 'Goodreads'
5 stars
Technology is now the energy field that surrounds us and penetrates us; it binds our planet together. But the tech industry is failing all of us in myriad ways. This book gives a great summary of the problems of technology and how they came to be.
Recommended: for everyone who reads this on a screen and not in print.
Review of 'Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech' on 'Goodreads'
5 stars
A must-read for anyone who designs digital experiences and doesn't want to be an inadvertent dude-bro.
Against a backdrop of increasingly ubiquitous technology, with every online interaction forcing us to expose parts of ourselves, Sara Wachter-Boettcher weaves a challenging narrative with ease. With ease, but not easily. Many of the topics covered are confronting, holding a lens to our internalised "blind spots, biases and outright ethical blunders".
As Wachter-Boettcher is at pains to highlight, none of this is intentional - it is the result of a lack of critical evaluation, thought and reflection on the consequences of seemingly minor technical design and development decisions. Over time, these compound to create systemic barriers to technology use and employment - feelings of dissonance for ethnic and gender minorities, increased frustration for those whose characteristics don't fit the personas the product was designed for, the invisibility of role models of diverse races and genders - and reinforcement that technology is the domain of rich, white, young men.
The examples that frame the narrative are disarming in their simplicity. The high school graduand whose Latino/Caucasian hyphenated surname doesn't fit into the form field. The person of mixed racial heritage who can't pick just one box to check on a form. The gender non-conforming person who doesn't fit into the binary polarisation of 'Male' or 'Female'. Beware: these are not edge cases! The most powerful take-away for me personally from this text is that in design practice, edge cases are not the minority. They exist to make us recognise the diversity of the user base we design for.
Think "stress cases" not "edge cases". If your design doesn't cater for stress cases, it's not a good design.
While we may have technical coding standards and best practices that help our technical outputs be of high quality, as an industry and as a professional discipline we have a long way to go in doing the same for user experience outputs. There are a finite number of ways to write a syntactically correct PHP function. Give me 100 form designers, and I will give you 100 different forms that provide 100 different user experiences. And at least some of the users behind those 100 experiences will be left without "delight" - a nebulous buzzword for rating the success (or otherwise) of digital experiences.
Wachter-Boettcher takes precise aim at another seemingly innocuous technical detail - application defaults - exposing how they are used, benignly at best and malignantly at worst, to manipulate users into freely handing over their personal data. It is designing not for delight, but for deception.
"Default settings can be helpful or deceptive, thoughtful or frustrating. But they're never neutral."
Here the clarion call for action is not aimed at technology developers themselves, but at users, urging us to be more careful, more critical, and more vocal about how applications interact with us.
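As a toy illustration of why defaults are never neutral (my own sketch, with an assumed, purely illustrative figure for how many users keep whatever is pre-selected), consider a pre-checked marketing checkbox:

```python
from dataclasses import dataclass
import random

@dataclass
class SignupForm:
    # Whoever writes this default decides the outcome for most users,
    # because most users never touch the setting.
    subscribe_to_marketing: bool = True   # opt-out (pre-checked) default
    # subscribe_to_marketing: bool = False  # opt-in alternative

# Assumption for illustration only: ~95% of users accept the
# pre-selected option. Under that assumption, the default alone
# decides the result for nearly everyone.
users = [SignupForm() if random.random() < 0.95
         else SignupForm(subscribe_to_marketing=False)
         for _ in range(10_000)]
subscribed = sum(u.subscribe_to_marketing for u in users)
print(f"{subscribed / len(users):.0%} of users 'chose' to subscribe")
```

Flip the default and the "choice" nearly everyone makes flips with it - the users didn't change, only the line of code did.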
Artificial intelligence and big data do not escape scrutiny. Wachter-Boettcher illustrates how algorithms can be inequitable - targeting or ignoring whole cohorts of people, depending on the (unquestioned) assumptions built into machine learning models. Big data is retrospective, but not necessarily predictive. Just because a dataset showed a pattern in the past does not mean that the pattern will hold true in the future. Yet governments, corporations and other large institutions are basing major policies and practice areas on algorithms that remain opaque. And while responsibility for decision-making might be delegated to machines, accountability for how those decisions are made cannot be.
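A toy sketch (mine, not the author's, with invented data) of how a "predictive" model trained on biased historical decisions simply learns to repeat them:

```python
from collections import Counter

# Hypothetical past hiring records: the outcome correlates with a proxy
# attribute (here, a made-up "attended_school_X") rather than ability.
history = [
    {"attended_school_X": True,  "hired": True},
    {"attended_school_X": True,  "hired": True},
    {"attended_school_X": False, "hired": False},
    {"attended_school_X": False, "hired": False},
]

# The simplest possible "model": predict the majority past outcome
# for each group.
by_group = {}
for record in history:
    counts = by_group.setdefault(record["attended_school_X"], Counter())
    counts[record["hired"]] += 1

def predict(attended_school_X: bool) -> bool:
    return by_group[attended_school_X].most_common(1)[0][0]

# The pattern held in the past, so the model reproduces it - whether or
# not it says anything about future performance.
print(predict(True), predict(False))  # True False
```

A real system buries the same logic under far more sophistication, which is exactly what makes its opacity dangerous.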
The parting thought of this book is that good intentions aren't enough. The implications and cascading consequences of seemingly minor design and development decisions need to be thought through, critically evaluated, and handled with grace, dignity and maturity. That will be delightful!