I’m going to start this blog post on fairness by not talking about fairness.
Stick with me.
Security in software applications has been a topic of research, analysis, and innovation since the formation of Computer Science as a field. It has produced significant, complex solutions that have been incrementally improved over time, bringing us to the state of the art today. Yet two-factor authentication, one of the most impactful innovations in IT security and currently a cornerstone of personal data protection [1], is a solution focused on human-computer interaction and part of the trend of “Secure by Design”.
This trend of HCI solutions arriving late to the party but making significant improvements is not unique to security. As Harry Hochheiser and Jonathan Lazar put it in their paper on the impact of HCI on society, “Both privacy […] and voting systems […] attracted the notice of others in the computing community long before HCI practitioners became involved.” [2]
I feel that we are approaching a similar turning point for Fairness in HCI. Fairness has long been an area of work in sociology and, more recently, a topic in machine learning and statistics. It is a word used to cover a vast field including normative ethics, accuracy, and disparate treatment across demographic groups. It can be defined as the measures used to ensure no person or group of people is disadvantaged by a process (where the definitions of “person”, “group of people”, “disadvantaged”, and “process” can all be debated).
Fairness, and what “fair” means, is becoming part of the HCI conversation as more applications start to use advanced statistical models or tailored interfaces. It is worth noting that this is not an entirely new topic for HCI research: the HCI principle of Accessibility is a form of fairness by a different name. Fairness definitions like “Equality of Capability of Function” [3] look to ensure that the capabilities of those involved in a process have been taken into account; this is comparable to accessibility definitions like those that focus on an “equivalent user experience for people with disabilities” [4].
While HCI researchers work on figuring out what “Fair by Design” means, there is some low-hanging fruit that the HCI community can consider now:
- Usage statistics can be tailored to become an intersectional measure and to capture differences in behaviour between demographics. You need to know who is finding your designs easier to use (a sketch of what this could look like follows this list).
- Be aware that there is no such thing as a neutral design. You already have a tailored system; the user flow is to use or not to use.
- The measures of fairness in computer science are narrow and focused on metrics. The measures of fairness in sociology are broad and focused on conceptual design. HCI cares about both.
- You can’t opt out of the fairness conversation by saying that you are “not taking demographic data into account”. This is itself a form of fairness called “Fairness through unawareness” [5] and has its own criticisms (a sketch after this list illustrates one of them).
- Your personas are not equal. Some make up more of your audience. Some need your application more.
- Fairness is an accessibility concept. When measuring how accessible an application is, a measure of fairness can be a reasonable benchmark to use.
- Be very intentional about whom you include in participatory design and co-design groups. Go beyond WEIRD (Western, Educated, Industrialized, Rich, and Democratic) participants and take note when non-WEIRD participants have divergent outcomes from WEIRD participants.
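To make the usage-statistics point concrete, here is a minimal sketch of what disaggregating a metric you probably already collect might look like. The event fields, the demographic groupings, and the completion-rate metric are all assumptions for illustration; the point is simply to report the measure per intersectional group, and the gap between the best- and worst-served groups, rather than a single aggregate.

```python
from collections import defaultdict

# Hypothetical usage events: an outcome we already track (task completion)
# plus self-reported demographic fields. All values are invented for illustration.
events = [
    {"age_band": "18-34", "assistive_tech": False, "completed": True},
    {"age_band": "18-34", "assistive_tech": True,  "completed": False},
    {"age_band": "65+",   "assistive_tech": False, "completed": True},
    {"age_band": "65+",   "assistive_tech": True,  "completed": False},
]

def completion_rates(events, keys=("age_band", "assistive_tech")):
    """Task-completion rate per intersectional group defined by `keys`."""
    totals, successes = defaultdict(int), defaultdict(int)
    for e in events:
        group = tuple(e[k] for k in keys)
        totals[group] += 1
        successes[group] += int(e["completed"])
    return {g: successes[g] / totals[g] for g in totals}

rates = completion_rates(events)
for group, rate in sorted(rates.items()):
    print(group, f"{rate:.0%}")

# A simple disparity measure: the gap between the best- and worst-served groups.
# A single aggregate completion rate would hide this entirely.
print("disparity:", max(rates.values()) - min(rates.values()))
```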
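And to illustrate why “fairness through unawareness” attracts criticism: the scoring rule in the sketch below never reads the protected attribute, yet outcomes still differ by group because a seemingly neutral feature correlates with it. Again, the data and feature names are invented for illustration.

```python
# Toy "fairness through unawareness" example: the rule below never reads
# `group`, but `postcode` acts as a proxy for it, so approval rates still
# differ between groups. All values are invented for illustration.
applicants = [
    {"group": "A", "postcode": "north"},
    {"group": "A", "postcode": "north"},
    {"group": "B", "postcode": "south"},
    {"group": "B", "postcode": "north"},
]

def demographic_blind_score(applicant):
    # Uses only the "neutral" postcode feature, never the group.
    return 1.0 if applicant["postcode"] == "north" else 0.2

for a in applicants:
    a["approved"] = demographic_blind_score(a) >= 0.5

for g in ("A", "B"):
    members = [a for a in applicants if a["group"] == g]
    rate = sum(a["approved"] for a in members) / len(members)
    print(f"group {g}: approval rate {rate:.0%}")
```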
References –
[1] Wang, D., & Wang, P. (2016). Two birds with one stone: Two-factor authentication with security beyond conventional bound. IEEE Transactions on Dependable and Secure Computing, 15(4), 708-722.
[2] Hochheiser, H., & Lazar, J. (2007). HCI and Societal Issues: A Framework for Engagement. International Journal of Human-Computer Interaction, 23(3), 339–374. https://doi.org/10.1080/10447310701702717
[3] Nussbaum, M., & Sen, A. (Eds.). (1993). The quality of life. Clarendon Press.
[4] W3C Web Accessibility Initiative. Accessibility, Usability, and Inclusion. https://www.w3.org/WAI/fundamentals/accessibility-usability-inclusion/
[5] Chen, J., Kallus, N., Mao, X., Svacha, G., & Udell, M. (2019). Fairness under unawareness: Assessing disparity when protected class is unobserved. FAT* 2019 – Proceedings of the 2019 Conference on Fairness, Accountability, and Transparency, 339–348. https://doi.org/10.1145/3287560.3287594