Marina Otero Verzier, head of Research & Development at Het Nieuwe Instituut, wrote the following introduction to the publication 'Loving Support'.
The cubicle of the office and the library have been, to a large extent, replaced by the long communal table equipped with a Wi-Fi connection. A flexible office, a loud library. Activities that were believed to require physical isolation and concentration are now entangled with mingling, sharing and networking, or what has come to be called co-working, the ultimate neoliberal institution. Boosted by private and public sources of funding that privilege crossovers, global networks, and interdisciplinary and international teams, contemporary research practices are carried out collectively. We are encouraged to “put ourselves out there.” We work in Dropbox and Etherpad, have discussions on WhatsApp, and hold meetings on Skype and Google Hangouts. And yet, we spend most of our time detached, whether physically removed from the bodily presence of others – separated by a screen while connected – or surrounded, but hopefully not bothered, by them.
The idea of research has long been associated with imaginaries of withdrawal. Questioning existing forms of knowledge and advancing new ones was connected to the virtues of reflection and concentration, with limited influence from the exterior world, with constraining the field of action or certain parameters. Its role models were the anchorite, the mystic, the philosopher, the scientist. Its spatial manifestations ranged from caves to testing rooms, from Ph.D. carrels to research residencies in cabins in the woods.
Research was the art of negotiating the state of being removed from society while remaining professionally relevant to society at large.
In recent decades, that imaginary has turned outwards. The world was claimed to be the laboratory. The city, too. With no distinction between inside and outside, where circulating facts are not necessarily confirmed or denied, research in this boundless test room becomes, as an intelligence officer for the US Department of Defense once claimed, a sort of ‘red teaming’: the practice of analyzing tactical alternatives*. Research demands the ability to navigate the excess of information and the multiplication of media, to distinguish signal from noise. In that process, value is no longer linked solely to what a person knows or owns, but to her capacity to thresh out and share.
Inspired by previous experiences and these imaginaries of collective research, Het Nieuwe Instituut and Volume invited a group of people to retreat for four days into the woods – a purposely low-tech environment – and explore the possibilities and consequences of employing artificial intelligence in the practice of research. We combined co-working with withdrawal and embraced research as re-search. Re- not as a repetition towards excellence, but as an opportunity to open up departures from conventional modes of thinking, by destabilizing routines as much as constructing new ones. We cooked together, ate together, partied together, watched and listened together. We walked with each other, talked with each other. We shared tables, playlists and bathrooms. We didn’t try to reach common ground, so we didn’t. We had agreements, and heated discussions. Some left early. Some came late. Some just dropped in and joined for the fun.
The retreat was meant to serve as a catalyst for new perspectives on the relations between machine learning and research, including questions of authorship, copyright and originality, as well as the transformed condition of labor under general processes of automation. However, reflecting on artificial intelligence and its relationship to research – as currently valued according to ‘red teaming’ security frameworks – triggered a rumination on otherness, whether the other within oneself (an unutterable desire to be different), or in terms of the construction of social identities and relations.
Bearing an uncanny resemblance to the activities of military strategists operating in foreign territories, the retreat's participants ended up questioning how algorithmic entities can assist in negotiating this topological terrain of otherness.
A bot that brings a counterargument to your arguments; a critical voice that doesn’t fear your reaction; a worker that refuses, though whether out of willful disobedience or technical malfunction is impossible to know; an uninvited outsider able to challenge existing power relations such as gender imbalance and the geographical origin of sources and languages; a way to destabilize the relation between funding and research; someone to implement all of your decisions; something that would free you of the burden of having to physically interact with, please, compensate or credit others; a bot that can be turned off.
*In an interview with Gary E. Weir (Chief Historian, Office of Corporate Communications, National Geospatial Intelligence Agency) conducted by the author at the NGA headquarters in Springfield, Virginia, on December 12, 2012.