49. Human Groups as Technology
Designing Fairness, Justice, and Equality into Human Structures
A Reflection on Janet Vertesi’s Workshop on Organizational Sociology
At American Cyborg, we think a lot about the organization of information, how technology takes part in this process, and how specific technologies -- especially algorithmic decision-making -- can be made more equitable, just, and fair.
Recently we had the good fortune of taking part in a weeklong training workshop on organizational theory with Janet Vertesi. Vertesi is an American Cyborg favorite, a scholar of Science and Technology Studies and author of Seeing Like a Rover, a sociological look at NASA, the Jet Propulsion Laboratory, and how the scientists there work with -- and think with -- the robots they’ve built. A hallmark of Vertesi’s work is its close attention to technology, but not the technology you might expect: she looks especially closely at the technology of human organizations. How are humans arranged, how are their interactions and powers structured, and how does this shape the production of knowledge, of objects, and of futures?
Vertesi offered a stark illustration of how the technology of human arrangements can affect fairness, a pattern she called “structural powerlessness.” She observed, over and over again, “empty” efforts to promote women (for the appearance of equality): placing women in leadership positions that carry no actual decision-making power. Women in these positions tend to lash out, become territorial over their limited powers, and become difficult to work with -- behaviors which are then cited by the surrounding men as “evidence” of why they don’t promote women more often.
Vertesi, however, observed this pattern again. And again. And again. When so many people in the same structural position behave the same way, it becomes clear that the arrangement of humans and their power structures shapes how people behave within those structures.
Much of the tech industry, along with policymakers, is rightfully concerned about biases built into the design of algorithms. We humbly suggest that algorithms cannot be disentangled from the groups that produce them, and that attention must also be paid to the biases built into the design of human groups. We’re getting started here at American Cyborg by designing our organizational structures to increase the clarity and fairness of each team member’s labor and contribution.