Algoright: Auditing algorithms for bias

Building fairness, transparency, and accountability into algorithms

Priscilla W Guo
Coding it Forward

--

In 2020, news headlines read “Wrongfully Accused by Algorithm” and “UK ditches exam results generated by biased algorithm after student protests”. In the United States and around the world, we are awakening to the systemic injustices deeply encoded into our everyday lives. Algorithms are now ubiquitous in decision-making contexts, and there is growing awareness of their ability to produce biased outcomes.

In 2018, Amazon scrapped a hiring algorithm from its recruitment process after it displayed significant gender bias against female applicants. A 2019 article in Science found evidence of significant racial bias in healthcare algorithms, which were less likely to refer Black patients than equally sick white patients for additional care. From credit scoring to criminal justice, we have embedded algorithms into the most important decisions in people’s lives.

The Need for Fairness in Algorithms

In the last decade, regulatory mandates requiring explainability in algorithms have emerged in the US and the European Union, and explainability is on track to become a global standard. Consumers, too, are demanding fairness in algorithmic decision-making, with a rising number of protests against unfair algorithms. Yet companies today lack the tools to build fairness, accountability, and transparency into their algorithms to meet these regulatory and consumer demands.

Historically, explainability has been hindered by “black box” algorithms, since researchers could examine neither the model nor the data that produced it. However, FTC regulatory guidance has indicated that third-party organizations should review algorithms for accuracy and non-discrimination. In short, algorithmic audits need to be independent and repeated.

Why I’m Building Algoright

I’ve seen and personally experienced the impacts of algorithmic bias. It’s what led me to dedicate years in academia to researching fairness in machine learning.

Until 2017, I had never seen the inside of a prison. But I needed to see the profound gravity of judicial decision-making and, more importantly, the potential consequences of a handful of lines of code. Within criminal justice reform, I worked on exposing the ubiquity of risk assessment algorithms in the United States and articulating legal redress for the biased sentencing recommendations those algorithms were producing for communities of color. I could not and did not accept the status quo. I wanted to fight for those who didn’t have a voice: the ones imprisoned by algorithms on the basis of potential future crimes and immutable characteristics like race.

I know who I’m building for and why fairness and transparency in algorithms are necessary. Throughout my academic work, I’ve seen a lack of industry applications of research techniques, and this is a timely opportunity to translate fairness research into actionable business practice.

About Algoright

Algoright is a SaaS product that uses proprietary bias testing to help companies ensure that their algorithms are fair and non-discriminatory. Our product provides insights and visualizations on demographic attributes, such as race and gender, in an easy-to-understand format. Our audit helps businesses comply with growing anti-discrimination and AI regulations.
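Algoright’s testing methodology is proprietary, but as an illustration of what bias testing can look like, a common starting point in fairness auditing is the disparate impact ratio: the rate of favorable outcomes for one demographic group divided by the rate for another. A minimal sketch in Python (the sample data and the 0.8 threshold from the EEOC “four-fifths rule” are illustrative assumptions, not Algoright’s method):

```python
def selection_rate(outcomes):
    """Fraction of favorable (1) outcomes in a group."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of selection rates between two demographic groups.

    Values below ~0.8 are often flagged under the EEOC
    "four-fifths rule" as possible evidence of adverse impact.
    """
    return selection_rate(group_a) / selection_rate(group_b)

# Hypothetical admissions decisions (1 = admitted, 0 = rejected)
group_a = [1, 0, 1, 0, 0, 0, 1, 0, 0, 0]  # selection rate 0.3
group_b = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]  # selection rate 0.5

ratio = disparate_impact_ratio(group_a, group_b)
print(round(ratio, 2))  # 0.6 -- below 0.8, so this process would be flagged for review
```

A real audit layers many such metrics (demographic parity, equalized odds, calibration) over far larger datasets, but the core idea is the same: compare outcome rates across groups and flag gaps.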

A snapshot of our demo for universities, built with open-source data.

We plan to be the S&P for fairness by providing a comparable fairness score for algorithms. Our user research indicates that businesses want to signal fairness to consumers as a competitive advantage. Much like Gartner’s vendor ratings, Algoright aims to lead the way on fairness ratings of businesses. We have an MVP for universities’ outreach and admissions teams to visualize and interpret demographic fairness in their processes. In the coming months, we will adapt and scale the product for healthcare, insurance, government, and credit-scoring companies.

Ultimately, I hope I can help companies build fairness, transparency, and accountability back into their algorithms.

Algoright was founded by Priscilla Guo. To learn more, please visit our website, Twitter, Facebook, or LinkedIn.

--