Other(ing) Sensing. Practices, Politics and Ethics of Sensitive Media
Research Group »SENSING: The Knowledge of Sensitive Media«
Conference report »Other(ing) Sensing. Practices, Politics and Ethics of Sensitive Media«, 17–18 June 2021, ZeM – Brandenburg Centre for Media Studies / Research Group »SENSING: The Knowledge of Sensitive Media«
»Other(ing) Sensing. Practices, Politics and Ethics of Sensitive Media« (17–18 June 2021) was the second conference organized by the Research Group »SENSING: The Knowledge of Sensitive Media« (ZeM – Brandenburg Centre for Media Studies) after its inauguration in 2018. The notions of »other«, »othering« and »otherwise« were the »pressing themes for all of us« despite the group’s divergent research projects, as Vanessa Oberin described in her introductory words on behalf of the research group.1 On the one hand, sensing technologies bear the promise of making the other readable, tangible or traceable; on the other hand, their historical and contemporary implementations are deeply intertwined with the violence of othering subjectivities. This tension was taken up as a starting point to critically investigate the pathways between sensitive media and conceptions of alterity. At the same time, the notion of ›sensing otherwise‹ pointed towards a particular ethics of research that intervenes in the epistemology and scope of application of sensing technologies to let them become something other than means of oppression and domination, while taking into account sensing practices that already exist but have mainly been neglected. The title of the conference proved to be an umbrella for this two-sided maneuver of engaging critically with questions of power and simultaneously intervening in the constitution of powerful relations within the realms of sensitive media. While it would certainly be wrong to portray the conference’s contributions as a coherent whole, it is noticeable that critical and inventive approaches were brought into a productive proximity, which in turn brought the methods and orientations of research practices into focus.
In the first talk of the conference, »Facing Recognition«, Wendy Hui Kyong Chun demonstrated the discriminatory2 logics and politics of machine learning in the context of facial recognition technologies, which are used to identify and classify individuals within various spheres of everyday life. »Discriminating Data«, as Chun outlined the contours of her recent research, »reveals how algorithms currently encode legacies of segregation, eugenics and multiculturalism […] and how we might work together across disciplines to create different protocols and programs that acknowledge the rich diversity of relations and experiences« – an approach that resonates with the twofold endeavor of the conference.
Chun made a strong case for why facial recognition systems are to be understood as engines of discrimination, misrecognition and segregation rather than as tools for authentication, disclosure or the prediction of future events. As recognition technologies are further implemented into chains of decision-making and the organization of social relations, their failures and limits, which manifest especially when it comes to the detection of race and gender, have severe consequences. But, as Chun elaborated, this misaligned constellation, which enhances discrimination and fosters unaccountability, cannot be challenged by working towards making pattern recognition systems more ›accurate‹. According to Chun, what is needed instead is a fundamental analysis of what these technologies actually do when people assume that they more or less work. The question, then, is under which assumptions and conditions recognition technologies are believed to work ›correctly‹ – that is, under which circumstances and for whom machine learning procedures count as truthful and obligatory. Chun argued that in order to analyze the methodological and aspirational stakes of machine learning, »we have to look inside and outside the machine«. She exemplified her approach by investigating one specific, controversially discussed machine learning study from 2017 by Yilun Wang and Michal Kosinski of Stanford University, in which the authors claim that »deep neural networks can detect sexual orientation from faces«3 by referring to a highly contested biological theory and making use of biometric methods in the legacy of physiognomy and eugenic thinking.
Chun walked the audience through the entire set-up of that study and each of its steps, revealing numerous troubling assumptions, methodological misappropriations and errant conclusions.4 What made Chun’s contribution insightful for further research beyond this specific case study is that she managed to raise awareness of the mathematical and technical details and, at the same time, to strengthen the – so to speak – ›non-digital‹ methods of the humanities by illustrating their significance for the analysis of computational cultures.
As the network structure switched from a conference tool to a live audio stream for the course of the listening session, the participants were encouraged to engage in a different mode of sensing, to ignore the direct vis-à-vis, namely their camera-equipped screen, and to make time for a contemplative mode of correspondence that values sonic encounters and something that might be called site-based imaginaries. In the first of two listening sessions Budhaditya Chattopadhyay shared excerpts from his publication »The Nomadic Listener«: a blend of live reading and unedited field recordings that together form, as Chattopadhyay put it, »psychographic explorations« of contemporary cities. It is an invitation to listen to Chattopadhyay’s and the recording’s listening to the cities. Enmeshed in particular scenes of everyday life and without any hunting for ›iconic‹ markers, the apparatus unflinchingly traced the flux of sonic events, while Chattopadhyay’s poetic depictions attested to a wandering and storytelling mind. At times this mind was affected by stubborn ears that refused to attend to their environment or by a persistent tune from the past; at times it was carried away by reflections on the (dis)embodiment of passing voices or perceptions beyond the sonic scope; at times it was captivated by the ambient rhythms of a room or the tonality of an electrified city under surveillance. Here, the intermingling modalities of recording were not means for capturing patterns and enclosing meaning, but rather a technique for spending time with the flux of events and paying tribute to the unpredictable dynamics of sense making.
The panel »Other(ed) Sensibilities: Revisiting the Sensory Politics of Racialization« might be regarded as a continuation of the topics presented in Chun’s opening talk. It brought together contributions that examined the entangled history of racialization, aesthetics and media-technological apparatuses (Erica Fretwell), that analyzed the history of the connection between sensory regimes, eugenics and the modern intersection between race and sex differences (Kyla Schuller), and that turned the conjunction of sensibility and racialization inherent in knowledge production from an instrument of deprivation and objectification into a tool for critique, intervention and subjectification (Sachi Sekimoto and Christopher Brown). All three presentations dismissed any idea of sensation, feeling or perception as being the ›pure‹ material that withdraws from processes of meaning making, and thereby encouraged a thorough re-thinking of the role of aesthetics within the history of Western philosophy and politics.
Erica Fretwell began her paper by pointing out the choice of words of W.E.B. Du Bois in his introduction to »The Souls of Black Folk« (1903), where he describes the experience of the racialization of black bodies in terms of a »double consciousness«: »It is a peculiar sensation, this double-consciousness, this sense of always looking at one’s self through the eyes of others, of measuring one’s soul by the tape of a world that looks on in amused contempt and pity.«5 After interpreting Du Bois’ mention of a tape measuring the soul as »an oblique reference to psychophysics«, Fretwell elaborated how the experiments in the field of perception and affect, as well as the associated theories of aesthetics and media practices, participated in the racialization of sensitivity from the second half of the 19th century onwards. Fretwell developed the concept of »sensitivity training« to describe how notions of perceptual sensitivity became the material matrix for racial objectification and a theoretical ground for racist biopolitics. With the help of this concept, she traced and made explicit how the historically evolved connection between aesthetics and the racialization of subjectivities extended into seemingly divergent areas: ranging from the notion of perceptual sensitivity as propelled by British and North American eugenics following the experiments of German psychophysics, through reformatory pedagogy and liberal politics, to popular media practices within the realms of arts and medicine. For the latter, Fretwell expanded on two particular image-making genres, namely spirit photography and its role during the American Civil War in the 1860s, and contemporary prenatal ultrasound images. Both were discussed as different examples of a specific modality of sensitivity training that Fretwell called »not-seeing«: a training situated between what is invisible to ›bare‹ perception on the one hand and ignorance on the other.
While spirit photography, which claimed to capture the souls of deceased relatives, functioned not only as a mediator of loss and grief but also as a tool for racializing these feelings, the prenatal, white pictures of the unborn represent not only innocent life but also a seemingly racially unmarked body.
As Fretwell unfolded her notion of sensitivity training, she referred, among others, to the following panelist Kyla Schuller, whose work on »impressibility« offered further insights into which dispositifs nourished and naturalized the culture of sensitivity training. As Schuller explained, during the 19th century, theories of sensation, evolution and race condensed into the concept of »impressibility«, which became a framework for enacting and justifying a racialized, gendered and sexualized hierarchical ranking amongst human subjects. Thereby Schuller not only exhibited a historical case study of how the modern conceptions of race, sex and gender intersected concretely, but also disclosed how notions of ›progress‹ as well as environmental dependency have been crucial factors in the formation of discriminatory agendas. As analyzed by Schuller in the context of the so-called American School of Evolution,6 the conception of impressibility was consistent with the then dominant Lamarckian understanding of evolution7 and rendered race as a »differential capacity to move forward through time« (Schuller): On the one hand, the assumed (in)capacity of a body for sensory impression – the (in)ability to be changed by experience, to grow and progress – was considered a marker of racial difference that moved along the line of what was believed to be (un)civilized; on the other hand, the (un)impressible body was supposed to be the result of a transformable heredity determined by repetitious behaviors and sensations, that is to say, by a progressive adaptation to environmental affordances. The framework of impressibility brought forward a model of the modern white body that actually owed its purported superiority to its environmental dependency and therefore to its vulnerability.
As Schuller further explained, the American School of Evolution argued that, in order to handle the burden of impressibility efficiently, the white body turned to specialization by dividing its capacities into a complementary and compatible binary of male and female. Through her historical analysis, Schuller thus provided evidence for how the very idea of binary, cis-gendered sexes was »fully articulated« (Schuller) only during the 19th century and was genuinely intertwined with the emergence of racialized subjects. This sensory regime of the modern body elevated the role of the environment and at the same time turned precarity into the condition for progress. Highlighting the relation between modern knowledge of sensation and biopolitics, Schuller further demonstrated the different regulatory measures that were propelled in the wake of the concept of impressibility. These ranged from segregation in the name of safeguarding progress to patronizing programs of education, where orphaned or indigenous children were brought into residential schools in order to enable the cultivation of allegedly better sensibilities. Schuller closed her talk by emphasizing that »regulating experience and sensation and environments can also be part of long eugenic legacies«.
In Sachi Sekimoto’s and Christopher Brown’s approach to theorizing the sensory dimension of race and racism, sensing was modified into a critical tool for a situated analysis of the sensuous materiality of racialization and its discriminatory effects. In addition to drawing upon their examination of race as feeling by expanding upon several examples from their research, Sekimoto and Brown initiated exercises that asked the audience to engage with their own bodily feelings and reflect upon the profound ways in which one’s sensory experience is embedded in a racist regime. In this manner Sekimoto and Brown addressed the audience as subjects of race without assuming or promoting ›common feelings‹, and at the same time performatively illustrated the intervening potential of their methodology and their core argument: Racialization is generative; it shapes sensuous texture and rhythm and produces »multisensorial feelings«, »physical memories« and »viscous sensations«, as Brown and Sekimoto put it, that »somatically, kinesthetically and tactilely« attach to the body. By particularly focusing on the subject that experiences and makes sense of the inscribed racialization, race is rendered »feel-able« (Brown), which allows for an analysis of power relations within racist societies through the framework of sensation. This method opens up a knowledge production that is oriented towards subjectification, intersubjectivity and the »assemblages of embodied senses« (Sekimoto), rather than objectification.
As Sekimoto and Brown pointed out in the abstract that accompanied the announcement of their talk: »When it comes to race, the body is not merely an object on which racial differences are inscribed, but it is simultaneously the subject that feels such inscription.« It is this shift of perspective towards the subjects of feeling and perception that Sekimoto and Brown performatively and informatively initiated in their presentation; a shift from othering to making sense otherwise.
The panel »Sensing Technology Narratives – Imaginaries of Intervening and Accessing« put emphasis on how narratives – in terms of tropes, concepts, fictions or lines of argumentation – are entangled with the development and deployment of sensing technologies and therefore have meaningful and material effects on the experiences and lives of people and on the shaping of socio-technological relations. Exploring the crossroads between technical imaginaries and imaginary techniques, the three contributions presented different approaches to investigating technological narratives: ranging from an attention to the claims of epistemological ruptures (Orit Halpern), through an unpacking of recurring narratives (Melissa Littlefield), to the elaboration of counter-narratives (Ashley Shew).
Orit Halpern addressed the relationship between neo-liberalism, neural nets and reactionary politics by unfolding a historical trajectory of the formation of an epistemology of shock, which evolved from the late 1940s within the realms of psychology, economics and computing technology, to the contemporary situation, where »market volatility and reactionary politics appear to have become norms« (Halpern). Arguing that »shock therapies and neural networks are not merely metaphors of neo-liberal policies, but a form of knowledge and practice«, Halpern showed the historically intimate relationships between shock as a means of manipulation, the proliferation of digital environments and an economy driven by the financial market. According to Halpern, the introduction of the concept of networked neuroplasticity was fundamental for an epistemological shift that normalized shock in terms of sensory stress, market disruption and information overload. At the core of networked neuroplasticity lies a theory of mind that renders cognition as a dynamic and collaborative system that is »scalable, handleable and re-programmable« and that gave rise to the idea that people’s cognition can be ›re-wired‹ from a distance by »environmental manipulation of data« (Halpern). Though Halpern mentioned that the contexts for this way of thinking are to be found in the ongoing automation of cognitive work, the industrialization of electronic media, and the instrumentalization of sensory deprivation as a means for therapy as well as for violence, she focused on the ›neuro-nexus‹ that thickened between a liberal theory of cognition and finance economy – a nexus where computational media act as a kind of relay in a self-reinforcing feedback loop between these spheres.
As Halpern explained, in analogy to the neural mind, the neoliberal economy was modeled as a dynamically re-organizing and open network with circuits of information so complex that it becomes unmanageable for individuals as well as for central institutions. The figure of the market became the answer to this problem. Since the market was believed to be the only entity able to properly process all the distributed economic information, it was assumed to reasonably coordinate economic processes. So, as Halpern elaborated, the notion of a plastic and networked mind, which was introduced by the psychologist Donald Hebb in 1949 and picked up by the economist Friedrich Hayek in the 1950s, was key to the creation of »markets that can be datafied and self-organizing, though never represented in their completion« (Halpern). Further, Halpern considered algorithmic trading and derivative instruments as conceptualized by the economist Fischer Black8 to have accelerated the normalization of shock by tightening the ›neuro-nexus‹ and intensifying digital infrastructures’ instrumental role in the ideology of ›networked intelligence‹.
While Orit Halpern presented a genealogy of ›networked intelligence‹, stressing the epistemological and normative ruptures it has incorporated, Melissa Littlefield questioned the ideas associated with the very assumption that the human brain is »accessible and understandable, plastic and malleable« (Littlefield) by demonstrating the extent to which brain waves are to be understood as an invention rather than a discovery. In her talk, Littlefield focused on one specific brain imaging technology, namely electroencephalography (EEG),9 which gave rise to brain waves as an entity and which is used as a means to target the human brain not only for medical reasons, but also in order to control, optimize or manipulate persons. Littlefield argued that conceptions of EEG technology are formed by fiction in terms of imaginaries that drive science practice as well as by depictions in science fiction. After showing the persistence of brain wave ideology in the context of the contemporary commodification of EEG into wearable consumer goods (which are either promoted as fun accessories or used seriously as monitoring devices for controlling employees), Littlefield unraveled the cultural history of EEG and brain waves by paying attention to the two notions of fiction. Firstly, Littlefield engaged with pre-EEG narratives and showed that before EEG was introduced by the psychiatrist Hans Berger as a scientific concept in 1929, the »idea that mental processes can be transferred almost directly to visibly written patterns« (Littlefield) was already present in the popular and scientific imagination. The notion of brain waves was introduced by the architect James Thomas Knowles in 1869 and was mainly shaped by a belief in connection and communication which circulated widely in magazines and novels, artistic and scientific speculations, before it became of central concern within the design endeavors of the actual technological device.
Secondly, Littlefield turned towards the post-EEG representations of brain waves in science fiction from the 1950s and 1960s, with a closer analysis of the imagined potentials and risks of brain-to-brain and brain-to-machine connections through an examination of episodes of »Flash Gordon« and »The X-Men«. Littlefield concluded her presentation by stressing that a »combined analysis of science and science fiction« makes it possible to »uncover science fictional foundations for historical and contemporary technological design and ideologies« and to »recognize that brain waves are as much a product of our imagination as they are perceived output from an oscilloscope«.
Ashley Shew began her talk by building a bridge to the first panel of the conference. Taking a cue from Brown and Sekimoto, she first shared her thoughts about her own embodiment as a disabled person – and the way she did so already touched upon counter-narrative modes of talking and thinking about the relation between technology and disability. In offering a personal account of her embodiment, Shew emphasized how her bodily perceptions were reconfigured by becoming an amputee, and highlighted the ambivalence with which she experienced these changes, sharing them with the audience in a variety of ways and tones.10 This led Shew to the main approaches of the research group »Disability, Experience, and Technological Imagination«, in which she and her colleagues study »disabled people’s accounts, ways of knowing and expertise« (Shew) in order to counter dominant narratives about disability and to change the perspective on how disability and technology are approached and whose experiences count in different areas of public participation. As Shew pointed out, the historically troubling relationship between disability and technology, closely tied to eugenic beliefs, is still very much persistent and present. Parallel to the most dominant medical and popular tropes, in which, according to Shew, disabled people are stereotyped either as persons awaiting redemption or as sources of inspiration, »technologies for disability usually come in two flavors: either you overcome your disability or you aren’t worth living« (Shew). Shew coined the term »technoableism« to describe the phenomenon of undermining the agency of people with disabilities by viewing them merely as a challenge around which to design. Technoableism is a perspective that turns disabilities into individual problems of bodily perfection.
Shew made clear that what is really needed is a perspective that addresses disability as a systemic issue and recognizes that infrastructures are entangled with embodied experiences and knowledges. It is a perspective that engages with disabled people as active participants in the community and values their agency, creativity and choices. This ethic guides the research group’s work and is methodologically realized through the analysis of various materials published by disabled persons as well as the conversations they have with each other, privileging disabled persons’ ways of knowing and expertise for building counter-narratives. Shew gave examples of counter-narratives in the context of space travel and presented a collection of arguments from various disabled persons, in which they show how their particular embodiments are especially suited for the demands of space travel, foregrounding the already present ableness of their bodies and knowledges. Highlighting the ambiguous dimensions of any technology and noting that »no technology is good for all the time«, Shew further encouraged the audience to think about (dis)abilities in terms of temporal assemblages of bodies and technologies.
For the second listening session, »Parasites«, by Janna Holmstedt, the format of the conference once again switched into a radio-like setting. In her acoustic essay, Holmstedt took the audience to the sea, where she tuned into the life and death of bladder wracks, and further turned the sea-setting into a para-site to think-with. In a playful exploration of Michel Serres’ notion of the parasite, Holmstedt elaborated on the figure of the para-site as an expression for the »condition of existing alongside« and a mode of being that centers hospitality. »In its most radical sense,« as Holmstedt stated in the listening piece, »hospitality involves giving oneself over to a stranger. I am a parasite. We are para-sites.« For Holmstedt the figure of the para-site allows for a concept of the more-than-human that accounts for the unavoidable distance in the process of making sense, emphasizing that »language and storytelling are always of the site.« As Holmstedt further explained in the piece: »I embrace the parasite as a way to resist illusionary oneness, acknowledge power relations, yet remain open to the possibility of being-with, being relation.« Since Holmstedt illuminated in her work the extent to which hospitality is the premise for the operationality of information networks, the piece could also be understood as a commentary on the medial conditions of the online conference and the attempt to grasp it as a gathering taking para-place, where »my data might be your noise.«
The speakers of the third panel »More-than-human sensing« shared a concern for relationality, difference and engagement. Since the panelists have backgrounds in different disciplines and have collaborated with particular entities and environments differently – ranging from caterpillars or marble captured in literary observations (Ally Bisshop), through neurons and hormones sustained within the realms of in vitro laboratory life (Deboleena Roy), to a mapping of the becoming of Darling 58, a transgenically modified American chestnut tree (Elaine Gan) – each of them told a different story. Nevertheless, each foregrounded in their own way the necessity and desire to seek methods that re-configure how research and knowledge are practiced; techniques that re-work how the humanities, sciences, activism and the arts are done; approaches that take place in between these disciplines in order to produce shifts towards less violent ways of relating. So, throughout the panel, the adjective »more-than-human« unfolded into something that might be called a ›considerate pragmatics‹ – it became an attribute for practices that, despite their discrepancies, share an acknowledgment of the ongoing re-shaping of relationships and cherish the generative modes of becoming while taking into consideration refusal, disentanglings and endings.
Commencing with the premise that »more-than-human sensing is a relational practice of speculative techniques for sensing with something otherwise«, Ally Bisshop’s presentation resembled a meditation in movement that embarked on propositions for »more-than-human sensing«. A cemetery in Brisbane, Australia, worked as a kind of anchor throughout Bisshop’s wandering mode of making more-than-human sense. Or rather, her field notes from the cemetery, which paid attention to different modes of sensing and interaction – for example between a caterpillar and a marble stone, or a spider and a leaf – served as a concrete and symbolic place of departure and return for probing and conceptualizing an ethos of the »more-than-human«. Bisshop framed the notion of the more-than-human as a countermovement to modern Western notions of human exceptionalism with their propagation of dehumanization and indifference. While stating that the process of sensing is to be understood as a »practice of making relations across thresholds of differences«, so that experience and sense-making are »always-already-more-and-less-than-human«, Bisshop advocated for openness, curiosity, play and becoming strange in order to move beyond »mechanistic schemas« of relating. Nevertheless, or perhaps rather for that very reason, Bisshop emphasized the necessity to stay alert to refusal, to care for frictions and irritations, to be attentive to languages, gestures and signals that indicate rejection in the process of making sense. In her proposition for an ethos of the more-than-human, the notion of refusal is as integral as those of curiosity and collaboration. Demonstrating an awareness that playful relationships require trust and are risky undertakings, Bisshop formulated the obligation to »be alert to the more-than-human signaling ›no‹«.
Under the title »Molecular Feminisms, Stolonic Strategies, and Microphysiologies of Desire«, Deboleena Roy offered an account of the genesis of her research practice, which is characterized by a mediating and transdisciplinary approach navigating between molecular biology and philosophy, scientific research and social justice activism, post-humanism and post-colonial theory. By experimentally examining the »Melatonin and Gonadal Steroid-mediated Regulation of the Reproductive Axis«, Roy was able to challenge a dominant understanding of biological processes and functions. Against the common assumption that modeled the communication between brain and ovaries in a hierarchical fashion of ›top-down command‹, Roy proved with her laboratory research that the communication is organized in terms of reciprocal feedback and that the hormones melatonin, estrogen and androgen do impact the neuronal regulation of reproductive physiology. This knowledge, as Roy further explained, was the result of a collaborative practice in two ways: it was motivated by political activism in the field of reproductive justice, and it was co-produced by the in vitro cells themselves, to whose rhythms of living Roy had to temporarily attune and partially synchronize. Roy thereby described the practice of research as an intimate relationship that asks the researcher to be touched and moved by their ›object‹ of study: to produce knowledge means to learn from the ›object‹ of study, not merely about it. Furthermore, Roy emphasized that responsiveness is not an exclusively human feature. With reference to the notion of a »physiology of response« by the biophysicist J.C. Bose (who demonstrated the responsiveness of plants by measuring their electrical activity) as well as to her grandmother (who taught Roy to be attentive to the liveliness of grass), Roy advocated for an ontological approach that she metaphorically coined »stolonic strategies«.
Like grass, which throws out runners (called stolons in botanical language) in order to make connections and produce further roots at its nodes, Roy uses stolonic strategies in the realms of social movements in order to make connections in a horizontal manner and to contribute to the »multiplication of difference« (Roy) by bringing together people and knowledge from molecular biology, philosophy and political activism.
Difference and collaborative knowledge-making practices are also crucial to Elaine Gan’s research, which she described as multimodal. In her talk, Gan gave insights into current and still ongoing fields of study, while focusing on questions of methods, commitments and orientations that shape how the research is done. Stating that one of her primary research interests is »how relations come to matter«, Gan referred to Helen Verran’s notion of »doing difference differently« in order to explain her take on difference, which is guided by the principle to »move away from assuming that we know what difference looks like« and to ask instead »what might it mean to learn, to inhabit difference, to become sensitive and sensible to the incommensurate in epistemology.« Together with graduate students, Gan examines the case of the American chestnut tree by engaging in different modes of conversation with the several agencies and actants that build this specific case of a more-than-human assemblage. Methodologically, archival research, interviews with different groups of people,11 readings, media recordings and field walks are all part of an apparatus that participates in what it examines, namely the onto-epistemological becoming of Darling 58, a transgenically modified American chestnut tree. This transgenic version of the American chestnut tree – the former »queen of the forest« became functionally extinct due to a fungus that was carried by Japanese and Chinese chestnut trees imported in the 19th century and to which the American chestnut was not immune – is being planted in the state of New York without commercial restriction, since the geneticists who designed the tree have not patented it. The aim of the project is that the modified tree may cross-pollinate with other trees and find its own companions.
After accompanying and investigating the contested procedures of planting Darling 58, Gan argued that the findings of their research so far challenge the common narratives in the debates on transgenes. Countering ethical framings that are characterized by a two-sided fight, Gan emphasized that »in this particular story, there is no evil corporation and no all-good, all-knowing savior of wild or native forests, of indigenous lands.«
While the conference began with the problematization of the historically shaped and technologically amplified relation between difference and discrimination, it ended with the critical and creative potential of becoming different. Or rather, the question of navigating between both notions of difference was present throughout the whole conference. Other themes that recurred throughout the conference were time and timing, environment, (dis)entanglement and the relation between laboratory experiments, commodification, and education. Starting with a prominent example of contemporary sensitive media (machine learning applications) and bringing together researchers who interrogate the relations between the sensitive and media in different ways, the conference not only broadened the scope of what sensitive media might be, but also foregrounded the task of further investigating and historically contextualizing the epistemologies of sensation.
- 1The concept of the conference was developed by the doctoral candidates Anja Breljak, Kate Donovan, Vanessa Oberin, Nicole Schimkus, Christian Schwinghammer, Alice Soiné and Daniel Stoecker, hosted together with Marie-Luise Angerer, Bernd Bösel and Jan Distelmeyer from the research group, and supported by Anna Jehle, Hannah Schmedes, Rebekka Eick and Anna Zaglyadnova from the team at ZeM (Brandenburg Centre for Media Studies). The conference took place online.
- 2Within computer science, »discrimination« is used as a technical term in the context of »pattern recognition« and describes the operation of attributing identity markers to data in order to gain (filter, distinguish, separate) information. »But«, as Clemens Apprich points out, »far from being a neutral process, the delineation and application of patterns is in itself a highly political issue, even if hidden behind a technical terminology.« (»Introduction«, in: Pattern Discrimination, meson press 2018, p. x)
- 3»Deep Neural Networks Are More Accurate Than Humans at Detecting Sexual Orientation From Facial Images«, in: Journal of Personality and Social Psychology, Vol. 114, No. 2, February 2018, pp. 246–257. Chun explained her choice by underlining that this article is »not exceptional – neither in its content, nor in its technique«.
- 4Just to name a few: the authors extracted portrait images from US dating sites and used them, along with the information provided there regarding gender, race and sexual orientation, as their input data set and ground truth, without considering that different ways of ›faking‹ are integral to self-representation (ranging from filters and Photoshop to lies and plastic surgery). Or, as Chun put it: »Deep fakes are not the post- but the pre-condition of machine learning applications.« Moreover, the authors not only turned questions of facial style into a biological fact and thereby ignored that, as Chun explicated, reading style is a complex socio-cultural activity; they also justified both a racial filtering of the data set (making use only of Caucasian white faces) and their claim of universal validity simply by referring to a refuted biological theory.
- 5W. E. Burghardt Du Bois: The Souls of Black Folk. Essays and Sketches, Chicago: A. C. McClurg & Co. 1903, p. 3. Du Bois’ »double-consciousness« resonates with what the panelists Sekimoto and Brown captured under the notion of »skin consciousness«. Though the latter concept explicitly focuses on the connection between the unavoidable intersubjective reciprocity of tactility and the politicization of skin, it meets Du Bois’ concept in terms of the doubling of self-perception: »Living in a racialized body«, as Sekimoto explained during the presentation, »means moving through the social world with a double somatic awareness of being a subject who is aware of their own objectiveness.«
- 6According to Schuller, the so-called American School of Evolution consisted, among others, of race scientists, phrenologists, anatomists, zoologists and biologists.
- 7Meaning that evolution was conceptualized in the tradition of Jean-Baptiste Lamarck, where the progress of a species was, as Schuller explained, modeled on the idea that traits develop through the ongoing repetition of behaviors and sensations, rather than being understood as a result of random mutation and natural selection as put forward by Charles Darwin. Following the historian Peter J. Bowler, who demonstrated that it was not until the 1940s that the Lamarckian lens of heredity was replaced by the Darwinian lens of mutation, Schuller undertook a re-examination of the 19th-century discourse on evolution and bodies.
- 8As Halpern explained, Fischer Black framed noise as the driving factor of financial assets and turned instability (rather than the relation between supply and demand) into a source of value.
- 9For example, Shew addressed the moments of confusion brought about by her reconfigured embodiment by finding metaphors that convey the felt fear as well as by sharing photographs that bring out its humorous aspects.
- 10EEG (electroencephalography) is a technology that allows one to gain knowledge about a brain’s functioning by measuring the electrical potentials generated in different parts of the brain and translating these data into visual information such as graphic curves or colored mappings.
- 11For example, with indigenous environmental experts, governmental agencies and geneticists. According to Gan, these groups of people »are thinking together about the ethics and politics of transgenic and public forests and public parks«.
This open-access publication is made available under the Creative Commons license CC BY-SA 4.0 DE.