Weapons of Math Destruction

259 pages

English language

Published April 28, 2016

ISBN: 9780553418811
Goodreads: 28186015

4 stars (12 reviews)

Weapons of Math Destruction is a 2016 American book about the societal impact of algorithms, written by Cathy O'Neil. It explores how some big data algorithms are increasingly used in ways that reinforce preexisting inequality. It was longlisted for the 2016 National Book Award for Nonfiction but did not make the shortlist. The book has been widely reviewed and won the Euler Book Prize.

6 editions

Suit up for Combat!

5 stars

This was an exceptional book. It's not heavy on statistics, but it lays out the rationale for what counts as a WMD (Weapon of Math Destruction). The term may be new, but we have been subject to the exploitation of WMDs far longer than we think. It's not a new phenomenon, but it is one we should be aware of.

Take a read and learn about them, so that we can all do better at combating them and use math not only to describe the world but to make it a better place to live in.

Review of 'Weapons of Math Destruction' on 'Storygraph'

5 stars

Excellent discussion of how the use of algorithms is affecting our education system, how likely we are to be hired, and how much we pay for insurance and mortgages. These models have become black boxes that nobody knows exactly how they work, yet they are considered reliable. What few people realize is that these algorithms reinforce discrimination and have biases built into them. So, instead of a fair, objective system for evaluating whatever is at hand (loan approvals, credit scores, job candidates, schoolteachers' performance, etc.), we have opaque models being applied everywhere that cannot be disputed or even understood. It's scary to think that our future life decisions will rely on algorithms.

Review of 'Weapons of Math Destruction' on 'Goodreads'

4 stars

I, too, was a mathematician once, but I lost my faith. Ms. O'Neil still seems to have much of hers, for an Occupier. I kept thinking she was naive to believe this stuff is fixable, but I may just be naively paranoid.

That algorithms can be biased was not a surprise to me. Logic itself can be biased because it depends on language, which, like many WMDs, is a black box. It is full of proxies. Take the term "criminal," which (like "terrorist") brands the one so called as an evildoer. And it's measurable by determining whether one has been convicted of anything. You can read *The New Jim Crow: Mass Incarceration in the Age of Colorblindness* by Michelle Alexander and discover that going to jail is part of systemic racism, but when you hear the word "criminal" or "convict", you …

morachimo rated it 3 stars
hughrawlinson rated it 5 stars
balex rated it 3 stars
opendoorgonorth rated it 3 stars
JoeGermuska rated it 4 stars
ajkerrigan rated it 4 stars
alexcurtin rated it 4 stars
stinkingpig rated it 4 stars
realn2s rated it 5 stars