
What does it mean to be seen by machines?

by Kin (Cultura Plasmic INC)

POSTED: 28 July 2022

Artist Kin (Cultura Plasmic INC) considers the effects of machine vision and emotion-tracking software on the establishment of prejudicial norms and the reinforcement of ableist ideals

Ahead of our free online event Social Marginalisation and Machine Learning: Defying the Labels of the Machinic Gaze, we commissioned artist Kin (Cultura Plasmic INC) to consider the role of AI in her creative work, and the relationships between machine learning and issues of social exclusion, discrimination and human rights more broadly.

Does a smile always mean we’re happy? For an emotion-tracking AI, it’s generally assumed so. Machine vision speaks a claustrophobic language of sameness. I have an acute awareness, informed at least in part by my experiences with depression, of the ways in which complex emotions are often simplistically reduced to outward appearances. When we’re talking about assumptions of what something should or shouldn’t look like, there are significant implications in terms of disability politics, the policing of ableist ideals and discrimination against non-normative bodies. Machine vision technologies (such as emotion recognition and predictive policing¹) rely on establishing and reinforcing normative boundaries, and the effect is the shrinking of space for difference.

As an artist, much of my previous work has focused on surveillance, but it was in 2020 that I really homed in on the politics of machine vision technologies. I started developing an audiovisual work, Crystalline Unclear, and began to think about what it means to be seen by an AI trained on a dataset. At the same time, I was reading about crystallography and the process of using a beam of light (for example, an X-ray) to see the inner structure of a substance. I’ve heard it compared to producing a fingerprint of a material. In my work I focused on a piece of quartz, a crystal with a deep history in the development of electronics. Crystals are unique in their highly ordered, homogeneous and predictable inner structures, so if the beam of light reveals a structure that conforms to such a pattern, it’s easy to classify the crystal.


Kin (Cultura Plasmic INC), Crystalline Unclear [still], 2021


Kin (Cultura Plasmic INC), Crystalline Unclear [still], 2021

In my mind a synergy grew between these two areas – machine vision and crystal as a material – based on the importance of conforming to expectations. There is an emphasis on predictability, of staying within set boundaries, and the reward… is being seen.

Last year an anonymous software engineer revealed that emotion-detection technologies were integrated into policing systems in the Xinjiang region in China. This region is somewhat infamous for its high levels of surveillance used against the Muslim-majority Uyghur population. Positioning it as almost an extension of flawed lie detector technology, the engineer stated that emotion-tracking was used for ‘pre-judgement without any credible evidence’. In other words, it overrides anything you have to say for yourself; it is criminalising in the absence of evidence and any questioning of its accuracy is an afterthought.

I’m writing this in the week following reports about Intel’s plans to integrate emotion-detection into the Zoom video communications platform. This time, in a bid to re-position such software as distinct from surveillance technologies, its proponents claim the purpose is to support teachers in better understanding their students. The criticisms, however, are numerous: the removal of student autonomy; the erosion of trust between teachers and young people; the socio-economic and environmental costs of having to keep a webcam switched on at all times; students not wanting others to judge their housing conditions; the flaws inherent in quantifying emotion; and the penalising of students who don’t express the ‘right’ facial responses.


Of course, machine vision isn’t limited to facial analysis. Let’s consider AI that focuses on posture or behaviour. What does suspicious behaviour look like? It is easy to see how bias and prejudice can saturate these perceptions.

I remember reading a text by artist Memo Akten about how machines see and how previous experience is fundamental to perception. Reflecting on a neural network in AI, Akten writes ‘It can only see through the filter of what it already knows. Just like us. Because we too, see things not as they are, but as we are’. When an AI identifies anger, it is likely because the face in front of it resembles the images it has previously seen that someone labelled as displaying anger. That is not to be confused with the actual emotional reality of the person it is analysing. There is a whole body of research around how gender and race influence emotion recognition and how the same expressions can be read differently, depending on the person who is being analysed. These are the kinds of social factors that creep into the labelling of datasets.

When a company is designing an AI that claims to detect threats or suspicious behaviour, a dataset defines what that behaviour looks like. But how suspicious behaviour appears is not universal. These kinds of judgements, intrinsic to data labelling and categorisation, are fraught with prejudice, personal values and subject-position, all of which are hidden behind the supposed objectivity and accuracy of machine vision.


Kin (Cultura Plasmic INC), from Crystalline Unclear [detail], 2021

I began my research for Crystalline Unclear by thinking about machine vision and what it means to be seen by machines, but maybe the issue is more about what it means to be recognised by machines, and the promotion of conformity that comes with it. Having said that, perhaps even that doesn’t go far enough to capture the scale of the issue, because it doesn’t draw attention to the actions and consequences that follow from recognition (and its inaccuracies). When recognition is used to compound or justify a police response at a protest, or the decisions of a driverless car heading for a crash, it goes beyond recognition and becomes about the chain of events that follows.

In many cases, it is arguably safer not to be seen or recognised by AI, a line of thinking also pursued by Francis Hunger & Flupke in their design of Adversarial.io, a webapp that subtly adds noise to images to prevent their machine-readability. When I experimented with it, a clear headshot of a person was, rather pleasingly, misidentified by the AI as a drumstick as a result of the added noise. To the human eye, however, the adjusted image and the original are difficult to tell apart, demonstrating just how fragile and open to manipulation machine vision can be.
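For readers curious about the mechanics, the sketch below illustrates the general principle behind this kind of adversarial noise using the widely documented fast gradient sign method. It is not Adversarial.io’s actual implementation; the choice of model, the file name and the perturbation strength are all illustrative assumptions.

```python
# A minimal sketch of adversarial noise (fast gradient sign method).
# Assumptions: PyTorch + torchvision installed, a local file "headshot.jpg",
# and a stock ImageNet classifier standing in for "the AI".
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
image = preprocess(Image.open("headshot.jpg")).unsqueeze(0)
image.requires_grad_(True)

# Nudge every pixel slightly in the direction that increases the model's
# loss for its own current prediction.
output = model(image)
label = output.argmax(dim=1)
loss = torch.nn.functional.cross_entropy(output, label)
loss.backward()

epsilon = 0.02  # perturbation strength: small enough to be hard to see
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

print("original prediction:", label.item())
print("perturbed prediction:", model(adversarial).argmax(dim=1).item())
```

The point of the sketch is simply that a change far below the threshold of human notice can be enough to push a classifier into a different, confidently wrong answer.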

So, what does it mean to be seen by machines? Perhaps it is not so different from being seen by a human, with the added caveat that biases and prejudices are more easily concealed and glossed over. What I hope Crystalline Unclear can contribute to this conversation is an emphasis on the role that conformity, predictability and the setting of norms play in machine vision.


______
¹ The use of predictive analytics in law enforcement to identify potential criminal activity.

About the Author


Kin (Cultura Plasmic INC)

Kin is a multi-pseudonymous artist and essayist from Newcastle upon Tyne. She works with sound, video, sensors and installation to explore the social, environmental and psychological aspects of digital technology, with a particular focus on the politics of surveillance-communications networks. Over the last few years she has developed a creative language that uses light and visibility to form critiques of surveillance, as well as raising awareness about the social inequalities perpetuated by data-gathering practices and predictive technologies. Kin’s work often emerges from philosophical reflections on technology and the dynamics of power and control.

Kin is a member of the Transforming Leadership project, on which Autograph is proud to be a key partner. See more of Kin's work on her website.




Project Partner


Part of Shape's Transforming Leadership programme, on which Autograph is proud to be a key partner.



Banner image: Kin (Cultura Plasmic INC), Crystalline Unclear [still], 2021. Courtesy of the artist.

Images on page: 1 - 3) Kin (Cultura Plasmic INC), Crystalline Unclear [still], 2021. Courtesy of the artist. 4) Kin (Cultura Plasmic INC).
