At the Sociotechnical Equity and Agency Lab, we take a qualitative, participatory approach that focuses on lived experience and perception to examine and address friction, harm, and inequity within sociotechnical systems.
We engage in qualitative and critical research to provide deep context and detailed design and policy guidance that accounts for the entire relationship between people and social technology.
We instigate projects that aim to empower at-risk users and communities to better understand, adapt to, and resist computational systems.
We bring this approach to bear across multiple interconnected research areas, including:
- Social Media
- Folk Theorization and User Understanding
- Human/AI Collaboration
- Human-Centered Machine Learning/AI
- Queer/Trans HCI
- Identity and Self-Presentation
- Privacy and Visibility
- Content Moderation
- Online Harassment and Abuse
- Mental Health and Trauma
- Algorithmic Literacy
We are currently building our lab at Northeastern University! While we establish new projects, please refer to our director’s website for past research, especially our research on Folk Theorization, Platform Spirit, and User Adaptation, and our work on Queer and Trans Human-Computer Interaction.
SEALab Research Pillars
We Are About Qualitative Context
This lab’s core purpose is to leverage a variety of qualitative, design, and critical methods to examine the entire relationship between people and systems. We frequently engage in community-based and participatory work. We also develop novel qualitative methods to better explore complex problems, and partner with collaborators for mixed methods work.
We Take a Transfeminist Approach
Our approach is guided by transfeminist principles, in conversation with other feminist philosophies. As such, we are deeply concerned with self-determination and promoting agency through technology, and take a pro-inclusion, anti-separatist stance.
We Prioritize At-Risk Communities
Working with at-risk and marginalized communities allows us to see the most urgent points of friction and harm in the human-system relationship. This lets us close these specific gaps in understanding while building knowledge of how to address more general problems.
We Do Member Research
Whenever possible, we work as member-researchers, studying and collaborating with communities our researchers are part of and doing work on problems we experience ourselves.
We Are Trauma-Informed
We recognize that the harms imposed by social technology frequently induce or exacerbate trauma, and much of our work directly addresses these traumatic effects. As such, we take trauma-informed computing principles into account in all our research, especially in our methods.
Latest Publications
- Whose Knowledge is Valued? Epistemic Injustice in CSCW Applications
Leah Hope Ajmani, Jasmine C. Foriest, Jordan Taylor, Kyle Pittman, Sarah Gilbert, and Michael Ann DeVito. 2024. Whose Knowledge is Valued? Epistemic Injustice in CSCW Applications. Proc. ACM Hum.-Comput. Interact. 8, CSCW2, Article 523 (November 2024), 28 pages. https://doi.org/10.1145/3687062…
- Safety and Community Context: Exploring a Transfeminist Approach to Sapphic Relationship Platforms
Michael Ann DeVito, Jessica L. Feuston, Erika Melder, Christen Malloy, Cade Ponder, and Jed R. Brubaker. 2024. Safety and Community Context: Exploring a Transfeminist Approach to Sapphic Relationship Platforms. Proc. ACM Hum.-Comput. Interact. 8, CSCW1, Article 203 (April 2024), 35 pages. https://doi.org/10.1145/3653694…
- Content Moderation Folk Theories and Perceptions of Platform Spirit among Marginalized Social Media Users
Samuel Mayworm, Michael Ann DeVito, Daniel Delmonaco, Hibby Thach, and Oliver L. Haimson. 2024. Content Moderation Folk Theories and Perceptions of Platform Spirit among Marginalized Social Media Users. ACM Trans. Soc. Comput. 7, 1, Article 1 (March 2024), 27 pages. https://doi.org/10.1145/3632741…
- “I See Me Here”: Mental Health Content, Community, and Algorithmic Curation on TikTok
Ashlee Milton, Leah Ajmani, Michael Ann DeVito, and Stevie Chancellor. 2023. “I See Me Here”: Mental Health Content, Community, and Algorithmic Curation on TikTok. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 17 pages. https://doi.org/10.1145/3544548.358148…
- How Transfeminine TikTok Creators Navigate the Algorithmic Trap of Visibility Via Folk Theorization
Michael Ann DeVito. 2022. How Transfeminine TikTok Creators Navigate the Algorithmic Trap of Visibility Via Folk Theorization. Proc. ACM Hum.-Comput. Interact. 6, CSCW2, Article 380 (November 2022), 31 pages. https://doi.org/10.1145/3555105…
Latest News
- Dr. DeVito is part of the organizing committee for two workshops at CSCW 2023: Trauma-Informed Design: A Collaborative Approach to Building Safer Online Spaces (10/14) and Epistemic Injustice in Online Communities (10/15)
- The lab has adopted the name “Sociotechnical Equity and Agency Lab”
- Our first PhD Student Researcher, Erika Melder, has joined the lab.
- Khoury College news featured Dr. DeVito in an article about how her research points to a need to reject the harmful, queer/transphobic Kids Online Safety Act
- Dr. DeVito has established her new social computing-focused lab at Northeastern University