Our first week was off to an exciting start with three class activities to learn more about this week’s topics: Tech Determinism and Discriminatory Design.
#1: Pop culture and media narratives around technological determinism and solutionism
#2: “Value-neutral” technology, discriminatory design and the implications for various communities and the environment
What is tech solutionism? In our society, technology is often presented as the fix for our problems. However, in Byrum and Benjamin’s article “Disrupting the Gospel of Tech Solutionism to Build Tech Justice,” the authors question the injustices that follow the introduction of emerging technologies such as AI and automated systems: “What if new technologies sold to solve social problems are simply ill-suited to the complexity of the environments where they are introduced? What if, sometimes, the answer isn’t technology?”
What is discriminatory design, and what are its implications? In “Design Values: Hardcoding Liberation?” Ruha Benjamin defines discriminatory design as “the normalization of racial hierarchies within the underlying design of sociotechnical systems.” One famous example is Robert Moses’s parkway overpasses, built with the upper-middle class in mind: their low clearance prevented buses from passing underneath. Whereas affluent, White, car-owning individuals could travel with ease, non-White bus riders did not have that freedom of mobility.
In Race After Technology, Benjamin provokes us to rethink the connotation behind innovation: “Innovation provided a way to celebrate the accomplishments of a high-tech age without expecting too much from them in the way of moral and social improvement. For this reason, it is important to question ‘innovation’ as a straightforward social good and to look again at what is hidden by an idealistic vision of technology. How is technology already raced?”
In-Class Activity: I Love Algorithms
Our first class activity was **“I Love Algorithms: A Machineless Machine Learning Creation Kit,”** a card-deck activity in which students considered the products, implications, and systemic disruptions that come from combining different algorithms (e.g., classification, clustering, reinforcement learning, dimensionality reduction, regression, and association) with different datasets (e.g., from Spotify, Wikipedia, and Disney+). Students realized that all technology is designed, and that these designs exist within systems that carry implications and inherent biases.
[insert example we discussed and implications]
In-Class Activity: Ready Steady
In our second class activity, students participated in a case study called “Ready Steady.” In this hypothetical scenario, students have demanded that Stanford provide them with more real-world opportunities in the form of job placements, fellowships, externships, and mentorships.
To meet this demand, Stanford Career Education has deployed STEADY!, an algorithmic decision-making system that determines student readiness. Collecting data from Stanford Student ID Cards, it predicts which students are most ready to receive real-world opportunities. STEADY! ranks students’ “readiness” from 1 to 5 based on (1) academic effort, (2) student responsibility, and (3) community participation. It also uses profile data from successful Stanford alumni of the last 25 years.
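STEADY! is a fictional system, but its mechanics can be sketched to make the bias concrete. In the toy Python below, every field name, weight, and the alumni “profile boost” is an invented assumption for illustration, not part of the case study’s actual specification:

```python
# Hypothetical sketch of a STEADY!-style readiness ranker.
# All signals, weights, and the alumni profile are invented for illustration.

def readiness_score(student, alumni_profile, weights=(0.4, 0.3, 0.3)):
    """Rank a student's 'readiness' on a 1-5 scale from three signals in [0, 1]."""
    w_acad, w_resp, w_comm = weights
    raw = (w_acad * student["academic_effort"]
           + w_resp * student["responsibility"]
           + w_comm * student["community_participation"])
    # Boost students who resemble the historical alumni profile --
    # this is exactly where past patterns (e.g., mostly STEM majors,
    # legacy students) leak into the present ranking.
    if student["major"] in alumni_profile["common_majors"]:
        raw += 0.1
    # Map the raw score onto the 1-5 readiness bands.
    return max(1, min(5, round(raw * 4) + 1))

# Two students with identical effort, different majors:
alumni = {"common_majors": {"Computer Science", "Engineering"}}
stem_student = {"academic_effort": 0.8, "responsibility": 0.8,
                "community_participation": 0.8, "major": "Computer Science"}
humanities_student = dict(stem_student, major="History")
```

Running `readiness_score` on both students yields different ranks despite identical inputs for all three stated criteria, which mirrors the class discussion: the alumni-derived rule, not the student’s own effort, drives the gap.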
In reflecting on this system and reverse engineering it, students identified who benefited from it, what data types were collected, what rules were applied to the data, and what data was not collected but ought to have been. In the class discussion, students explored how alumni data may shape the system’s definition of “success”: historically, Stanford has enrolled a majority of STEM majors who go on to work in tech, along with many legacy students from wealthy families. It would be unfair to hold the current student population to criteria based on data that does not represent them.