PyData Amsterdam 2024

Algorithmic bias is everywhere (especially at Breeze) - what can we do about it?
09-19, 13:40–14:30 (Europe/Amsterdam), Van Gogh

In this talk, I will detail how we discovered that our recommender system at Breeze may be exhibiting discriminatory behavior, and what we have been doing since to address the issue. I will dive into our visit to the Netherlands Institute for Human Rights and how we have been trying to gain insight into the problem by gathering expert feedback and performing an audit without violating privacy legislation.


The talk is aimed at anyone involved in automated decision making, especially recommender systems. No prior knowledge is needed: I will cover the basics of recommender systems required to understand how they may strengthen existing biases, and I will also explain how sensitive attributes are treated under the GDPR (AVG in Dutch) and what exceptions exist.
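One way a recommender can strengthen existing biases is through a feedback loop: profiles that already receive more engagement get recommended more, which earns them more engagement still. The sketch below is a deliberately simplified, hypothetical illustration of this rich-get-richer dynamic (it is not Breeze's actual system; the group names, scores, and `boost` parameter are invented for the example):

```python
# Hypothetical illustration of a recommender feedback loop:
# a small initial disparity in engagement between two groups
# widens over time when exposure is allocated by past engagement.

def run_feedback_loop(scores, rounds=5, boost=0.1):
    """Each round, a group's share of exposure is proportional to its
    current engagement score, and extra exposure raises that score
    further -- a rich-get-richer dynamic."""
    scores = dict(scores)
    for _ in range(rounds):
        total = sum(scores.values())
        for group in scores:
            exposure = scores[group] / total      # simplified exposure model
            scores[group] += boost * exposure     # more exposure -> more likes
    return scores

start = {"group_a": 0.52, "group_b": 0.48}        # tiny initial disparity
end = run_feedback_loop(start)

gap_before = start["group_a"] - start["group_b"]
gap_after = end["group_a"] - end["group_b"]
# The gap between the groups grows across rounds, even though neither
# group is intrinsically "better" -- the system amplifies its own history.
```

Real systems are far more complex, but the same mechanism can operate whenever recommendations are trained on outcomes the recommender itself influenced.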

The takeaways for the audience are a broader perspective on algorithmic bias and fairness in industry at a scale-up, the background knowledge needed to act on it, and concrete starting points for addressing algorithmic bias within their own organisation.

Thomas Crul is a matchmaking researcher and data analyst at the dating app Breeze. Breeze aims to take ‘online dating offline’ by bypassing the messaging stage and directly arranging real-life dates for its users after a match. Crul specializes in developing the recommendation algorithms responsible for connecting potential partners, and is interested in investigating and addressing biases within these systems.