Publication | Closed Access
Algorithms that "Don't See Color"
Citations: 14 | References: 25 | Year: 2022 | Venue: Unknown
Keywords: Engineering, Discrimination, Color Correction, Law, Unique Opportunity, Journalism, Computational Social Science, Social Media, Color Reproduction, Bias, Algorithmic Bias, Data Privacy, Disparate Impact, Computer Science, Bias Detection, Computational Science, Social Computing, Algorithmic Fairness, Demographic Features, Arts, Colorization, Sensitive Features
Researchers and journalists have repeatedly shown that algorithms commonly used in domains such as credit, employment, healthcare, or criminal justice can have discriminatory effects. Some organizations have tried to mitigate these effects by simply removing sensitive features from an algorithm's inputs. In this paper, we explore the limits of this approach using a unique opportunity. In 2019, Facebook agreed to settle a lawsuit by removing certain sensitive features from inputs of an algorithm that identifies users similar to those provided by an advertiser for ad targeting, making both the modified and unmodified versions of the algorithm available to advertisers. We develop methodologies to measure biases along the lines of gender, age, and race in the audiences created by this modified algorithm, relative to the unmodified one. Our results provide experimental proof that merely removing demographic features from a real-world algorithmic system's inputs can fail to prevent biased outputs. As a result, organizations using algorithms to help mediate access to important life opportunities should consider other approaches to mitigating discriminatory effects.
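The paper's actual measurement methodologies are not reproduced here, but the core comparison it describes — each demographic group's share of the modified algorithm's audience relative to its share of the unmodified algorithm's audience — can be sketched as follows. The function names and toy data are illustrative assumptions, not taken from the paper:

```python
from collections import Counter

def demographic_shares(audience):
    """Fraction of each demographic group in an audience.

    `audience` is a list of group labels (e.g. inferred gender, age
    bracket, or race) — a hypothetical stand-in for the audience
    measurements the paper collects.
    """
    counts = Counter(audience)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def bias_relative_to_baseline(modified, unmodified):
    """Ratio of each group's share in the modified-algorithm audience
    to its share in the unmodified one. A ratio of 1.0 for every group
    would mean removing the sensitive features changed nothing about
    the demographic composition of the output."""
    mod = demographic_shares(modified)
    base = demographic_shares(unmodified)
    return {group: mod.get(group, 0.0) / share
            for group, share in base.items()}

# Toy illustration: the modified audience still skews toward group "A",
# so the ratios stay close to 1.0 rather than moving toward parity.
unmodified = ["A"] * 70 + ["B"] * 30
modified = ["A"] * 68 + ["B"] * 32
print(bias_relative_to_baseline(modified, unmodified))
```

Ratios near 1.0 across groups would be consistent with the paper's finding that removing demographic inputs alone fails to prevent biased outputs.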