-
The disconnect I was experiencing was that people hated Wall Street, but they loved tech.
-
People felt like they were friends with Google, and they believed in the "Do No Evil" thing that Google said. They trusted Google more than they trusted the government, and I never understood that.
-
We don't let a car company just throw out a car and start driving it around without checking that the wheels are fastened on. We know that would result in death; but for some reason we have no hesitation at throwing out some algorithms untested and unmonitored even when they're making very important life-and-death decisions.
-
The public trusts big data way too much.
-
That's what we do when we work in Silicon Valley tech startups: We think about who's going to benefit from this. That's almost the only thing we think about.
-
When I think about whether I want to take a job, I don't just think about whether it's technically interesting, although I do consider that. I also consider the question of whether it's good for the world.
-
The training one receives when one becomes a technician, like a data scientist - training in mathematics or computer science or statistics - is entirely separated from any discussion of ethics.
-
Most people don't make any association in their minds between what they do and ethics. They think they have somehow moved past questions of morality or values or ethics, and that's something that I've never imagined to be true.
-
I wanted to prevent people from giving algorithms too much power. I see that as a pattern. I wanted that to come to an end as soon as possible.
-
When people are denied an option by some secret scoring system, it's very hard to complain, so they often don't even know that they've been victimized.
-
The most important goal I had in mind was to convince people to stop blindly trusting algorithms and assuming that they are inherently fair and objective.
-
I think what's happened is that the general public has become much more aware of the destructive power of Wall Street.
-
Google is so big you have no idea what a given person does.
-
Evidence of harm is hard to come by.
-
You'll never be able to really measure anything, right? Including teachers.
-
Occupy provided me a lens through which to see systemic discrimination.
-
Especially from my experience as a quant in a hedge fund - I naively went in there thinking that I would be making the market more efficient, and then realized, oh my God, I'm part of this terrible system that is blowing up the world's economy, and I don't want to be a part of that.
-
For whatever reason, I have never separated the technical from the ethical.
-
My fantasy is that there is a new regulatory body that is in charge of algorithmic auditing.