
OVERVIEW


In healthcare especially, the potential negative effects of AI on society, whether amplifying human biases and inequality or falling into the pitfalls of over-automation, cannot be ignored, since they directly influence how people make decisions about their health. As a result, such topics are increasingly discussed in academia as well as in public. We argue that there is a critical, time-sensitive need to examine human involvement in the work of incorporating AI into healthcare settings, and to recognize the hidden human labor that underlies the production and maintenance of many AI-driven and automated healthcare systems.

Critically engaging people throughout the formative study, design, development, use, and evaluation of AI-based systems is vital to ensuring that such systems are practical and beneficial. This includes careful consideration of stakeholders’ needs, beliefs, values, expectations, and preferences. In addition, recognizing human work in relation to AI and automation is critical to tackling the sociotechnical challenges associated with AI systems in healthcare. Literature in CSCW and HCI has long shown that designing systems for complex sociotechnical contexts, such as health, must account for highly situated activities, relations among diverse human and non-human actors, and social worlds [1,3,8,16]. Star and Strauss’s (1999) sensitizing concept of “invisible work” encompasses those activities (often forms of emotional labor, but also undervalued activities and marginalized perspectives) that are not supported by organizational processes or technological systems [15]. Even in AI systems, people often directly, but invisibly, contribute to making these systems work, which Gray and Suri (2019) term “Ghost Work” [9].

In this workshop, we will explore the stubbornly social aspects of healthcare work in the age of automation. This includes 1) understanding the involvement and labor of users, stakeholders, and communities, accounting for the tensions and negotiations that are often invisible but critically important when healthcare work is augmented by AI, and 2) identifying emerging sociotechnical and organizational phenomena related to human trust in technology, given the expected changes to healthcare work.

New healthcare technologies have the potential to alleviate pressing needs, but they can also create new and unexpected forms of labor (which may go unsupported) and at times even reinforce health disparities [17]. It is thus essential that we bring together diverse perspectives to speak to consequences and considerations that, while highly visible in some social worlds, may remain invisible in others, such as healthcare.

Additionally, this workshop will extend conversations around known issues in AI and automated systems, including biased algorithms, lack of transparency, and gaps in trust and accountability, to inform design processes and evaluations for these emerging technologies in healthcare settings. We aim to ground discussions of human collaborative work and trust in issues of equity, labor replacement, and transparency, and to stimulate discussion of the challenges and opportunities in designing AI technologies for healthcare.

References:
1. Mark S. Ackerman. 2000. The Intellectual Challenge of CSCW: The Gap Between Social Requirements and Technical Feasibility. Human–Computer Interaction 15, 2-3: 179–203.
2. Tuka Al Hanai, Mohammad M. Ghassemi, and James R. Glass. 2018. Detecting Depression with Audio/Text Sequence Modeling of Interviews. In Interspeech 2018, 1716–1720.
3. Jeffrey P. Bigham, Richard E. Ladner, and Yevgen Borodin. 2011. The Design of Human-powered Access Technology. In The Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’11), 3–10.
4. Carrie J. Cai, Emily Reif, Narayan Hegde, Jason Hipp, Been Kim, Daniel Smilkov, Martin Wattenberg, et al. 2019. Human-Centered Tools for Coping with Imperfect Algorithms During Medical Decision-Making. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), Paper 4.
5. Stevie Chancellor, Michael L. Birnbaum, Eric D. Caine, Vincent M. B. Silenzio, and Munmun De Choudhury. 2019. A Taxonomy of Ethical Tensions in Inferring Mental Health States from Social Media. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT* '19). ACM, New York, NY, USA, 79-88. DOI: https://doi.org/10.1145/3287560.3287587
6. Cognoa – Autism diagnosis app. www.cognoa.com
7. Empatica – Smarter epilepsy monitoring. www.empatica.com/
8. Geraldine Fitzpatrick and Gunnar Ellingsen. 2013. A Review of 25 Years of CSCW Research in Healthcare: Contributions, Challenges and Future Agendas. Computer Supported Cooperative Work (CSCW) 22, 4-6: 609–665.
9. Mary L. Gray and Siddharth Suri. 2019. Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Eamon Dolan Books.
10. Marina Jirotka, Rob Procter, Mark Hartswood, Roger Slack, Andrew Simpson, Catelijne Coopmans, Chris Hinds, and Alex Voss. 2005. Collaboration and Trust in Healthcare Innovation: The eDiaMoND Case Study. Computer Supported Cooperative Work (CSCW) 14, 4: 369–398.
11. Ellen W. McGinnis, Steven P. Anderau, Jessica Hruschak, Reed D. Gurchiek, Nestor L. Lopez-Duran, Kate Fitzgerald, Katherine L. Rosenblum, Maria Muzik, and Ryan McGinnis. 2019. Giving Voice to Vulnerable Children: Machine Learning Analysis of Speech Detects Anxiety and Depression in Early Childhood. IEEE Journal of Biomedical and Health Informatics. DOI: 10.1109/JBHI.2019.2913590
12. Jesper Molin, Paweł W. Woźniak, Claes Lundström, Darren Treanor, and Morten Fjeld. 2016. Understanding Design for Automated Image Analysis in Digital Pathology. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction (NordiCHI ’16), Article 58.
13. MindMotion Go – Patient rehabilitation. www.mindmotionweb.com/mindmotion-go/
14. Naja L. Holten Møller. 2018. The Future of Clerical Work Is Precarious. Interactions 25, 4: 75–77.
15. Susan Leigh Star and Anselm Strauss. 1999. Layers of Silence, Arenas of Voice: The Ecology of Visible and Invisible Work. Computer Supported Cooperative Work (CSCW) 8, 1-2: 9–30.
16. Lucy Suchman. 2007. Human-Machine Reconfigurations: Plans and Situated Actions. Cambridge University Press.
17. Tiffany C. Veinot, Hannah Mitchell, and Jessica S. Ancker. 2018. Good intentions are not enough: how informatics interventions can worsen inequality. Journal of the American Medical Informatics Association: JAMIA 25, 8: 1080–1088.
18. Meredith Whittaker, Kate Crawford, Roel Dobbe, Genevieve Fried, Elizabeth Kaziunas, Varoon Mathur, Sarah Myers West, Rashida Richardson, Jason Schultz, and Oscar Schwartz. 2018. AI Now 2018 Report. AI Now Institute. Retrieved from https://ainowinstitute.org/AI_Now_2018_Report.html
19. Woebot – Therapy chatbot. www.woebot.io/
