At the Sociotechnical Equity and Agency Lab, we take a qualitative, participatory approach that focuses on lived experience and perception to examine and address friction, harm, and inequity within sociotechnical systems.

We engage in qualitative and critical research to provide deep context and detailed design and policy guidance that accounts for the entire relationship between people and social technology.
We instigate projects that aim to empower at-risk users and communities to better understand, adapt to, and resist computational systems.
We bring this approach to bear across multiple interconnected research areas, including:
- Social Media
- Folk Theorization and User Understanding
- Human/AI Collaboration
- Human Centered Machine Learning/AI
- Queer/Trans HCI
- Identity and Self-Presentation
- Privacy and Visibility
- Content Moderation
- Online Harassment and Abuse
- Mental Health and Trauma
- Algorithmic Literacy
We are currently building our lab at Northeastern University! While we get established and launch new projects, please refer to our director’s website for past research, especially our research on Folk Theorization, Platform Spirit, and User Adaptation and our work on Queer and Trans Human-Computer Interaction.
SEALab Research Pillars
We Are About Qualitative Context
This lab’s core purpose is to leverage a variety of qualitative, design, and critical methods to examine the entire relationship between people and systems. We frequently engage in community-based and participatory work. We also develop novel qualitative methods to better explore complex problems, and partner with collaborators for mixed methods work.
We Take a Transfeminist Approach
Our approach is guided by transfeminist principles, in conversation with other feminist philosophies. As such, we are deeply concerned with self-determination and promoting agency through technology, and take a pro-inclusion, anti-separatist stance.
We Prioritize At-Risk Communities
Working with at-risk and marginalized communities allows us to see the most urgent points of friction and harm in the human-system relationship, close these specific gaps in understanding, and build knowledge of how to address more general problems.
We Do Member Research
Whenever possible, we work as member-researchers, studying and collaborating with communities our researchers are part of and working on problems we experience ourselves.
We Are Trauma-Informed
We recognize that the harms imposed by social technology frequently induce or exacerbate trauma, and we often engage in work that directly addresses these traumatic effects. As such, we take trauma-informed computing principles into account in all our research, especially our methods.
Latest Publications
- Moving Towards Epistemic Autonomy: A Paradigm Shift for Centering Participant Knowledge
  Leah Hope Ajmani, Talia Bhatt, and Michael Ann DeVito. 2025. Moving Towards Epistemic Autonomy: A Paradigm Shift for Centering Participant Knowledge. CHI Conference on Human Factors in Computing Systems (CHI ’25). https://doi.org/10.1145/3706598.3714252
- Transphobia is in the Eye of the Prompter: Trans-Centered Perspectives on Large Language Models
  Morgan Scheuerman, Katy Weathington, Adrian Petterson, Dylan Thomas Doyle, Dipto Das, Michael Ann DeVito, and Jed R. Brubaker. 2025. Transphobia is in the Eye of the Prompter: Trans-Centered Perspectives on Large Language Models. ACM Trans. Comput.-Hum. Interact. Just Accepted (June 2025). https://doi.org/10.1145/3743676
- “A Blocklist is a Boundary”: Tensions between Community Protection and Mutual Aid on Federated Social Networks
  Erika Melder, Ada Lerner, and Michael Ann DeVito. 2025. “A Blocklist is a Boundary”: Tensions between Community Protection and Mutual Aid on Federated Social Networks. Proc. ACM Hum.-Comput. Interact. 9, 2, Article CSCW021 (May 2025), 30 pages. https://doi.org/10.1145/3710919
- Why Can’t Black Women Just Be?: Black Femme Content Creators Navigating Algorithmic Monoliths
  Gianna Williams, Natalie Chen, Michael Ann DeVito, and Alexandra To. 2025. Why Can’t Black Women Just Be?: Black Femme Content Creators Navigating Algorithmic Monoliths. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI ’25). Association for Computing Machinery, New York, NY, USA, Article 108, 1–14. https://doi.org/10.1145/3706598.3713842
- Whose Knowledge is Valued? Epistemic Injustice in CSCW Applications
  Leah Hope Ajmani, Jasmine C. Foriest, Jordan Taylor, Kyle Pittman, Sarah Gilbert, and Michael Ann DeVito. 2024. Whose Knowledge is Valued? Epistemic Injustice in CSCW Applications. Proc. ACM Hum.-Comput. Interact. 8, CSCW2, Article 523 (November 2024), 28 pages. https://doi.org/10.1145/3687062
Latest News
- Dr. DeVito is part of the organizing committee for two workshops at CSCW 2023: Trauma-Informed Design: A Collaborative Approach to Building Safer Online Spaces (10/14) and Epistemic Injustice in Online Communities (10/15)
- The lab has adopted the name “Sociotechnical Equity and Agency Lab”
- Our first PhD Student Researcher, Erika Melder, has joined the lab.
- Khoury College news featured Dr. DeVito in an article about how her research points to a need to reject the harmful, queer/transphobic Kids Online Safety Act
- Dr. DeVito has established her new social computing-focused lab at Northeastern University