By Arya Dixit, Research Assistant + Systems Design Engineering Student, Critical Media Lab
As an engineering student, I am continuously immersed in the world of technology, learning about its capabilities and how to apply them to solve problems. However, after watching Wendy Chun’s Critical Tech Talk, I found myself questioning the limitations of this approach.
Chun, a University of Waterloo alumna who studied Systems Design Engineering and English, delivered a thought-provoking talk on how big data and machine learning methods encode discrimination and bias, and on ways we might overcome this to create more democratic uses of big data. She was joined by two respondents who shared their reflections on Chun’s book, Discriminating Data. The first was Brianna I. Wiens (she/her), a Postdoctoral Researcher in Communication Arts at the University of Waterloo (now Assistant Professor in English Language and Literature) and co-director of the Qcollaborative, an intersectional feminist design lab. The second was Queenie Wu (she/her), a fourth-year Systems Design Engineering student at the University of Waterloo.
Chun argues that our reliance on software as a metaphor for understanding the world has led us to prioritize efficiency, optimization, and predictability over nuance and complexity. As someone who is used to thinking in terms of algorithms and data analysis, this was a sobering reminder of the potential pitfalls of relying too heavily on technology. One of the key insights that Chun offers is the idea that technology can perpetuate inequality and reinforce existing power structures. This is particularly relevant in the field of engineering, where the design of algorithms and other technological tools can reflect the biases and assumptions of the people who create them. It is important to be aware of, and teach about, these biases and to work towards designing technology that is inclusive and accessible to all.
Another takeaway from Chun’s talk is that technology shapes our understanding of ourselves and the world around us. This is something I have experienced firsthand: my work with technology has often led me to frame problems and solutions in a very particular way. Chun’s talk was a reminder that there is more to the human experience than algorithms and data analysis alone can capture. This is especially relevant in engineering, where technological solutions may be designed without regard for their broader social and cultural context. For instance, a system designed to optimize traffic flow may ignore its impact on local communities, such as increased noise or air pollution. In my own work, I have also come across situations in which technology can perpetuate inequality. Algorithms used for resume screening may be programmed to prioritize certain keywords or criteria that disadvantage applicants from marginalized communities. Similarly, facial recognition technology may be less accurate for individuals with darker skin tones, leading to bias and discrimination in law enforcement and other contexts. It is necessary to consider the broader implications of our work and to engage with stakeholders to understand their perspectives and needs.
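To make the resume-screening concern concrete, here is a minimal, hypothetical sketch of how a naive keyword filter can encode bias. The keywords and resumes are invented for illustration; real screening systems are more complex, but the underlying failure mode is the same: candidates who describe equivalent experience in a different vocabulary are silently filtered out.

```python
# Hypothetical keyword-based resume screener (illustration only).
# The required keywords are an invented example of "expected" vocabulary.
REQUIRED_KEYWORDS = {"python", "agile", "stakeholder"}

def passes_screen(resume_text: str, threshold: int = 2) -> bool:
    """Pass a resume if it contains at least `threshold` required keywords."""
    words = set(resume_text.lower().split())
    return len(REQUIRED_KEYWORDS & words) >= threshold

# Two candidates describing comparable experience in different words:
resume_a = "Led agile sprints in Python with stakeholder reviews"
resume_b = "Coordinated community volunteers and built tools in Python"

print(passes_screen(resume_a))  # True: matches the expected vocabulary
print(passes_screen(resume_b))  # False: similar skills, different wording
```

The filter never inspects skills, only word choice, so whoever picked the keyword list has effectively decided whose experience "counts."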
Listening to this Critical Tech Talk served as a valuable reminder that technology is not a panacea for all of our problems. It is important to approach technology with a critical eye and to be mindful of its limitations. At the same time, we must be willing to embrace the nuance and complexity of the human experience, and recognize that no algorithm or data set can capture it fully. One way to do this is to incorporate a broader range of perspectives into our work. This might involve collaborating with experts from fields such as sociology, psychology, or anthropology to gain a deeper understanding of how technology affects people’s lives. It might also involve engaging with communities and stakeholders to understand their needs and concerns, and to involve them in the design process. Another important step is to ensure that our work is guided by ethical principles and values: considering the impact of our work on marginalized communities, and reflecting on the potential unintended consequences of our technological solutions. This kind of ethical reflection should begin early in undergraduate education, or even earlier. We have a responsibility to consider the broader social and ethical implications of our work, and to use technology as a tool for positive change.
In summary, Wendy Chun’s Critical Tech Talk offers valuable insights for engineering students who are working with technology. By reminding us of the limitations of software as a metaphor for understanding the world, and by highlighting the potential for technology to perpetuate inequality and reinforce existing power structures, Chun challenges us to think more critically about our work. As we continue to develop new technological solutions, it is essential that we remain mindful of these issues and work to design technology that is inclusive, ethical, and reflects the complexity of the human experience.