Using Algorithms to Understand the Biases in Your Organization
They can provide transparency and improve decision-making.
August 09, 2019

Summary. Algorithms have taken a lot of heat recently for producing biased decisions. Should we be outraged by bias reflected in algorithmic output? Yes. But the way organizations respond to their algorithms determines whether they make strides in debiasing their decisions or perpetuate their biased decision-making. Organizations should use algorithms for the magnifying glasses they are: algorithms aggregate individual data points to unearth patterns that people have difficulty detecting. When algorithms surface biases, companies should seize this “failure” as an opportunity to learn when and how bias occurs. This way, they’re better equipped to debias their current practices and improve their overall decision-making.

Algorithms have taken a lot of heat recently for producing biased decisions. People are outraged over a recruiting algorithm Amazon developed that overlooked female job applicants, and over predictive policing and predictive sentencing tools that disproportionately penalize people of color. Importantly, race and gender were not included as inputs into any of these algorithms.
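To make the “magnifying glass” idea concrete, here is a minimal sketch in Python, using entirely made-up records and hypothetical group labels, of how aggregating individual screening decisions can reveal a disparity that no single reviewer would notice in isolation:

    # Illustrative sketch only: hypothetical data showing how aggregating
    # individual screening decisions can surface a pattern that is
    # invisible at the level of any one decision.
    from collections import Counter

    # Each record is (applicant_group, advanced_to_interview)
    decisions = [
        ("A", True), ("A", True), ("A", False), ("A", True),
        ("B", False), ("B", True), ("B", False), ("B", False),
    ]

    totals = Counter(group for group, _ in decisions)
    advanced = Counter(group for group, ok in decisions if ok)

    for group in sorted(totals):
        rate = advanced[group] / totals[group]
        print(f"Group {group}: {advanced[group]}/{totals[group]} advanced ({rate:.0%})")

Run on real decision logs, a simple tally like this is often the first step in checking whether outcomes differ by group, even when group membership was never an explicit input to the decision.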