Most people don't have any association in their minds with what they do and with ethics. They think they somehow moved past the questions of morality or values or ethics, and that's something that I've never imagined to be true.
Cathy O'Neil -
That's what we do when we work in Silicon Valley tech startups: We think about who's going to benefit from this. That's almost the only thing we think about.
Cathy O'Neil -
I wanted to prevent people from giving them too much power. I see that as a pattern. I wanted that to come to an end as soon as possible.
Cathy O'Neil -
The NSA buys data from private companies, so the private companies are the source of all this stuff.
Cathy O'Neil -
The public trusts big data way too much.
Cathy O'Neil -
The training one receives when one becomes a technician, like a data scientist - we get trained in mathematics or computer science or statistics - is entirely separated from a discussion of ethics.
Cathy O'Neil -
Especially from my experience as a quant in a hedge fund - I naively went in there thinking that I would be making the market more efficient and then was like, oh my God, I'm part of this terrible system that is blowing up the world's economy, and I don't want to be a part of that.
Cathy O'Neil -
When people are not given an option by some secret scoring system, it's very hard to complain, so they often don't even know that they've been victimized.
Cathy O'Neil -
The most important goal I had in mind was to convince people to stop blindly trusting algorithms and assuming that they are inherently fair and objective.
Cathy O'Neil -
The disconnect I was experiencing was that people hated Wall Street, but they loved tech.
Cathy O'Neil -
My fantasy is that there is a new regulatory body that is in charge of algorithmic auditing.
Cathy O'Neil -
Evidence of harm is hard to come by.
Cathy O'Neil -
For whatever reason, I have never separated the technical from the ethical.
Cathy O'Neil -
Occupy provided me a lens through which to see systemic discrimination.
Cathy O'Neil -
There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume because it's shiny new technology with a mathematical aura that it's perfect and it doesn't require further vetting. Of course, we never have that assumption with other kinds of technology.
Cathy O'Neil -
Every system using data separates humanity into winners and losers.
Cathy O'Neil -
You'll never be able to really measure anything, right? Including teachers.
Cathy O'Neil -
Google is so big you have no idea what a given person does.
Cathy O'Neil -
I think what's happened is that the general public has become much more aware of the destructive power of Wall Street.
Cathy O'Neil