From https://medium.com/pinterest-engineering/powering-inclusive-search-recommendations-with-our-new-visual-skin-tone-model-1d3ba6eeffc7
The topic of this week's MIT-Harvard CINCS / Hamilton Institute Seminar was "Inclusive Search and Recommendations," presented by Dr. Nadia Fawaz of Pinterest.
Nadia Fawaz is a research scientist and tech lead at Pinterest. She gave an interesting talk on inclusive search and recommendations, with concrete examples of Pinterest's efforts along these lines in building their ML-based search and recommendation systems.
Why is this problem important? It has become clear that, alongside the benefits ML systems bring to a wide range of domains, some critical problems have been observed: for example, learned language models can be biased (e.g., "Man is to Doctor as Woman is to Nurse"). This is mainly because the training data collected and used to train an ML system is itself biased. Without accounting for these biases, the ML feedback loop will amplify the bias rather than eliminate it. Nadia Fawaz noted during her talk that bias mainly arises around demographic attributes such as age, skin tone, and gender, and the talk focused on Pinterest's efforts to build inclusive search and recommendations, using skin tone as an example.
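To make the embedding-bias example concrete, here is a minimal sketch of the classic analogy probe using pretrained word2vec vectors loaded through gensim's downloader. The model name is just one commonly used pretrained set, and the exact ranking is illustrative, since results depend on the training corpus.

```python
import gensim.downloader as api

# Pretrained Google News vectors (a large download on first run).
vectors = api.load("word2vec-google-news-300")

# The classic analogy probe: doctor - man + woman ≈ ?
# Biased embeddings tend to rank "nurse" near the top of this list.
for word, score in vectors.most_similar(
        positive=["woman", "doctor"], negative=["man"], topn=5):
    print(f"{word}: {score:.3f}")
```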
Example of Pinterest's inclusive AI solution for skin tone [Medium article]: motivated by a top request from Pinners, who want to feel represented in the product, Pinterest built the first version of skin tone ranges, an inclusive search feature, in 2018. It aims to surface more inclusive inspiration in search results and to let Pinners choose skin tone ranges to filter recommendations and search results. Some important aspects of building an inclusive service include:
- training data balanced across a wide range of groups (e.g., a wide range of skin tones)
- error analysis broken down per group, so that the system does not perform well only on some groups while performing poorly on others (see the per-group sketch after this list)
- improving fairness and reducing potential bias in other ML models (e.g., incorporating fairness into objective functions, as sketched below)
- and, as one might expect, achieving the goal of inclusive ML services requires multidisciplinary effort and a lot of labeling work from domain experts.
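To illustrate the per-group error analysis point, here is a minimal sketch assuming a table of model predictions with ground-truth labels and a group attribute; the column names and toy data are made up for illustration.

```python
import pandas as pd

# Toy predictions: "group" stands in for a demographic attribute
# such as a skin tone range.
df = pd.DataFrame({
    "group": ["A", "A", "B", "B", "B", "C"],
    "label": [1, 0, 1, 1, 0, 1],
    "pred":  [1, 0, 0, 1, 1, 1],
})

# Break accuracy down per group instead of reporting one aggregate
# number that can hide poor performance on minority groups.
per_group = (
    df.assign(correct=df["label"] == df["pred"])
      .groupby("group")["correct"]
      .agg(accuracy="mean", support="size")
)
print(per_group)
```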
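And as a rough illustration of folding fairness into an objective function, the sketch below adds a demographic-parity-style penalty to a standard binary cross-entropy loss. This is a generic pattern, not Pinterest's actual formulation; the function, the penalty, and the weight lam are all hypothetical.

```python
import torch
import torch.nn.functional as F

def fairness_regularized_loss(logits, labels, group_mask, lam=0.1):
    """Binary cross-entropy plus a penalty on the gap in mean predicted
    positive rate between the two groups split by group_mask (assumes
    both groups are present in the batch)."""
    base = F.binary_cross_entropy_with_logits(logits, labels)
    probs = torch.sigmoid(logits)
    gap = (probs[group_mask].mean() - probs[~group_mask].mean()).abs()
    return base + lam * gap

# Toy batch: 4 examples, the first two in one group, the rest in the other.
logits = torch.tensor([1.2, -0.3, 0.8, -1.0])
labels = torch.tensor([1.0, 0.0, 1.0, 0.0])
group_mask = torch.tensor([True, True, False, False])
print(fairness_regularized_loss(logits, labels, group_mask))
```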