Sexism, racism, and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many 'intelligent' systems that shape how we are categorized and advertised to.
Kate Crawford
While massive datasets may feel very abstract, they are intricately linked to physical place and human culture. And places, like people, have their own individual character and grain.
Kate Crawford
Error-prone or biased artificial-intelligence systems have the potential to taint our social ecosystem in ways that are initially hard to detect, harmful in the long term, and expensive - or even impossible - to reverse.
Kate Crawford
We should always be suspicious when machine-learning systems are described as free from bias if they have been trained on human-generated data. Our biases are built into that training data.
Kate Crawford
If you're not thinking about the way systemic bias can be propagated through the criminal justice system or predictive policing, then it's very likely that, if you're designing a system based on historical data, you're going to be perpetuating those biases.
Kate Crawford
Histories of discrimination can live on in digital platforms, and if they go unquestioned, they become part of the logic of everyday algorithmic systems.
Kate Crawford
It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science.
Kate Crawford
Algorithms learn by being fed certain images, often chosen by engineers, and the system builds a model of the world based on those images. If a system is trained on photos of people who are overwhelmingly white, it will have a harder time recognizing nonwhite faces.
Kate Crawford
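To make the mechanism in the quote above concrete, here is a minimal, hypothetical sketch (not Crawford's work, and not facial-recognition code): a linear classifier trained on data where one synthetic group supplies 95% of the examples learns the majority group's pattern and does far worse on the underrepresented group. The groups, labelling rules, and proportions are assumptions invented for illustration.

```python
# Illustrative sketch: skewed training data produces unequal accuracy.
# Groups and labelling rules are synthetic assumptions, not real demographic
# data or any production recognition system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, w):
    """2-D features; each group's true labelling rule points along direction w."""
    X = rng.normal(size=(n, 2))
    y = (X @ w > 0).astype(int)
    return X, y

w_majority = np.array([1.0, 1.0])    # pattern of the over-represented group
w_minority = np.array([1.0, -1.0])   # different pattern for the under-represented group

X_maj, y_maj = make_group(1900, w_majority)   # 95% of the training set
X_min, y_min = make_group(100, w_minority)    # 5% of the training set
model = LogisticRegression().fit(np.vstack([X_maj, X_min]),
                                 np.concatenate([y_maj, y_min]))

# Balanced test sets expose the gap the skewed training data created.
for name, w in [("majority group", w_majority), ("minority group", w_minority)]:
    X_test, y_test = make_group(2000, w)
    print(f"{name} accuracy: {model.score(X_test, y_test):.2f}")
```

On a typical run the majority-group accuracy is close to perfect while the minority-group accuracy sits near chance: the "harder time recognizing" effect in miniature.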
Facebook is not the world.
Kate Crawford
There's been the emergence of a philosophy that big data is all you need. We would suggest that, actually, numbers don't speak for themselves.
Kate Crawford
While many big-data providers do their best to de-identify individuals from human-subject data sets, the risk of re-identification is very real.
Kate Crawford
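As a small, entirely fabricated illustration of the re-identification risk described above: removing names does little when a "de-identified" record can be joined against a public dataset on quasi-identifiers such as ZIP code, date of birth, and sex. Every name, record, and value below is invented for the sketch.

```python
# Hypothetical linkage sketch: no real data, names, or records are used.
# A unique match on quasi-identifiers re-attaches an identity to an
# "anonymous" record.
deidentified_health_records = [
    {"zip": "02139", "dob": "1954-07-31", "sex": "F", "diagnosis": "condition_x"},
    {"zip": "02144", "dob": "1971-03-02", "sex": "M", "diagnosis": "condition_y"},
]

public_voter_roll = [
    {"name": "Jane Example", "zip": "02139", "dob": "1954-07-31", "sex": "F"},
    {"name": "John Sample", "zip": "02144", "dob": "1980-06-20", "sex": "M"},
]

quasi_identifiers = ("zip", "dob", "sex")
for record in deidentified_health_records:
    matches = [person for person in public_voter_roll
               if all(person[k] == record[k] for k in quasi_identifiers)]
    if len(matches) == 1:  # one unique match is enough to re-identify
        print(matches[0]["name"], "is linked to", record["diagnosis"])
```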
Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.
Kate Crawford
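One way to see why "correlation is as good as causation" fails even at scale is that unrelated trending series correlate all the time. The sketch below, with arbitrary series lengths and thresholds chosen purely for illustration, measures how often two completely independent random walks show a sizeable sample correlation.

```python
# Illustrative sketch: independent random walks frequently look correlated,
# so a strong correlation in a large observational dataset is not, by itself,
# evidence of any causal link.
import numpy as np

rng = np.random.default_rng(1)
strong = 0
trials = 1000
for _ in range(trials):
    a = np.cumsum(rng.normal(size=5000))  # one independent trend-like series
    b = np.cumsum(rng.normal(size=5000))  # another, generated separately
    r = np.corrcoef(a, b)[0, 1]
    if abs(r) > 0.5:
        strong += 1

print(f"independent pairs with |r| > 0.5: {strong / trials:.0%}")
```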
Self-tracking using a wearable device can be fascinating.
Kate Crawford
As AI becomes the new infrastructure, flowing invisibly through our daily lives like the water in our faucets, we must understand its short- and long-term effects and know that it is safe for all to use.
Kate Crawford
We urgently need more due process with the algorithmic systems influencing our lives. If you are given a score that jeopardizes your ability to get a job, housing, or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision.
Kate Crawford
If we start to use social media data sets to take the pulse of a nation or understand a crisis - or actually use them to deploy resources - we are getting a skewed picture of what is happening.
Kate Crawford
We need to be vigilant about how we design and train these machine-learning systems, or we will see ingrained forms of bias built into the artificial intelligence of the future.
Kate Crawford
People think 'big data' avoids the problem of discrimination because you are dealing with big data sets, but, in fact, big data is being used for more and more precise forms of discrimination - a form of data redlining.
Kate Crawford
Many of us now expect our online activities to be recorded and analyzed, but we assume the physical spaces we inhabit are different. The data broker industry doesn't see it that way. To them, even the act of walking down the street is a legitimate data set to be captured, catalogued, and exploited.
Kate Crawford
Vivametrica isn't the only company vying for control of the fitness data space. There is considerable power in becoming the default standard-setter for health metrics. Any company that becomes the go-to data analysis group for brands like Fitbit and Jawbone stands to make a lot of money.
Kate Crawford
Rather than assuming Terms of Service are equivalent to informed consent, platforms should offer opt-in settings where users can choose to join experimental panels. If they don't opt in, they aren't forced to participate.
Kate Crawford
Numbers can't speak for themselves, and data sets - no matter their scale - are still objects of human design.
Kate Crawford
As we move into an era in which personal devices are seen as proxies for public needs, we run the risk that already-existing inequities will be further entrenched. Thus, with every big data set, we need to ask which people are excluded. Which places are less visible? What happens if you live in the shadow of big data sets?
Kate Crawford
The amount of money and industrial energy that has been put into accelerating AI code has meant that there hasn't been as much energy put into thinking about social, economic, ethical frameworks for these systems. We think there's a very urgent need for this to happen faster.
Kate Crawford