Risam, Roopika. “What Passes for Human?: Undermining the Universal Subject in Digital Humanities Praxis.” Bodies of Information: Intersectional Feminism and the Digital Humanities, edited by Elizabeth Losh and Jacqueline Wernimont, University of Minnesota Press, Minneapolis; London, 2018, pp. 39–56. JSTOR, www.jstor.org/stable/10.5749/j.ctv9hj9r9.6. Accessed 6 Feb. 2020.
Risam’s chapter looks at various digital humanities and robotics projects from the past decade to highlight the insidious ways they present a “universal subject” that is still very much premised on racist, colonial conceptions of the “human.” Examining Microsoft’s AI chatbot “Tay,” Hanson Robotics’ Sophia, and the LaMem project, alongside a few other DH projects, Risam underscores critical elisions in methodology and claims to universality. Most importantly, Risam follows Safiya Noble in arguing for the critical importance of the manual labor conditions that underpin DH and robotics work. Looking at Amazon’s Mechanical Turk platform, Risam argues that the lack of information about labor pools, and about who is doing basic input coding, is a significant problem. She links this argument to the broader lack of information available about how various widely used algorithms function and claims this opacity is a primary site for decolonizing intervention in the digital humanities.
Sauers, Jenna. “Life After Fiction: The Future of Lil Miquela.” Cultured Mag, 23 April 2019, https://www.culturedmag.com/lil-miquela/. Accessed 6 Feb. 2020.
In an accessible piece for Cultured magazine, Jenna Sauers offers explanatory background on Miquela Sousa (Lil Miquela is the CGI character’s stage name) and poses questions about Miquela’s creator, a mysterious “digital media startup” called Brud. Sauers focuses on the confusion the CGI influencer causes her audience. Why are people willing to suspend their disbelief about her being a physical reality when, logically, most people know robots have not reached this level of technical advancement? And what does this aura of mystery do for Brud? While generally pessimistic about what Miquela means for the commodification of social justice language, Sauers’s most interesting argument is that Miquela can be read as a “useful” primer for the ethical questions about AI that we are sure to encounter very soon.
Schram, Brian. “Accidental Orientations: Rethinking Queerness in Archival Times.” Surveillance & Society, vol. 17, no. 5, 2019, pp. 602–617, ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/8688. Accessed 6 Feb. 2020.
Schram interrogates theoretical conceptions of “Queerness” in the age of Big Data and surveillance. Building on previous Big Data studies, particularly those focused on Facebook, Schram illustrates how surveillance culture minutely categorizes subjects and nearly eliminates “the margins and in-between spaces” that scholars of the 1990s theorized as liberatory. However, Schram also focuses on how algorithms’ inherent “performative capacities” can lead users into different queer online terrain, and he questions the ontological implications of one’s “data double” being assembled as queer without one’s knowledge.