
Lecture Series Winter 2025/26: Digital Complexity: Beyond Human Understanding

We are happy to announce that the lecture series of the winter term 2025/26 will revolve around the topic of Digital Complexity: Beyond Human Understanding.

Current developments in simulation and artificial intelligence have shown that the complexity of digital tools now exceeds the limits of human understanding. We can no longer fully comprehend or explain the results that AI delivers, and even AI deceptions and hallucinations are now almost impossible to detect. This raises anew the question of the relationship between humans and their technology. Are technologies, as instruments, useful extensions of human capabilities, as the classical philosophy of technology understood them, or are we now extensions of technology? Will AI dominate us in the near future?

The lecture series addresses these fundamental questions as well as ethical issues of digital transformation. It also takes a look at the development of digitality as a modern paradigm. Even though digital computers first appeared in the 1940s, there is a longer-term history of the development of digital tools and methods deeply rooted in our self-understanding as humans. Knowledge of this history makes it easier to understand current developments.

But what exactly do these current developments mean for science and society? The different lectures aim to tackle various aspects of the digital transformation of science and society from the perspective of “digital complexity.” Questions about explainable AI, about the well-being of people in a digital world, and about the social and political impact of digital social media will be explored, as well as the provocative question of who will be doing research in the future: humans or AI?

Various speakers, including the media theorist Anna Tuschling and the sociologist Dirk Baecker, will be guests at the KHK c:o/re to shed light on “Digital Complexity: Beyond Human Understanding” from different disciplinary perspectives.

Please find an overview of the dates and speakers in the program.

The lectures will take place from October 22, 2025, to February 11, 2026, every second Wednesday from 5 to 6:30 pm, in person and online.

The lecture series also includes three keynotes, held in the context of the 8th HaPoC Conference “History and Philosophy of Computing,” hosted by c:o/re in December 2025. The conference will examine the topic of “digital complexity” in greater detail.

If you would like to attend, please write a short email to events@khk.rwth-aachen.de.

The Computer in Motion

A tall red sculpture in front of the main building of the University of Otago in Dunedin, New Zealand.

ARIANNA BORRELLI

The symposium “Computer in Motion” critically questioned the notion of the computer as a “universal machine” by demonstrating how the computer adapted to, and was appropriated in, different milieus, cultures, and communities, and across borders. By historicising the movement of computer-related knowledge and artifacts, the presentations help us recover multiple sites of agency behind the idea of an unstoppable digital transformation driven by capitalist innovation. The event was organized by Barbara Hof (University of Lausanne), Ksenia Tatarchenko (Johns Hopkins University), and Arianna Borrelli (c:o/re, RWTH Aachen) on behalf of the Division for History and Philosophy of Computing (HaPoC), in the context of the 27th International Congress of History of Science and Technology, held at the University of Otago in Dunedin, New Zealand, and online from June 29 to July 5, 2025.

The symposium offered exceptionally broad coverage of periods and cultures, showcased here as an example of the manifold thematic and methodological facets of the history of computing.

Main building of the University of Otago in Dunedin, New Zealand (photo credits: Elisabetta Mori).

Decimal place-value notations prior to the 10th century: a material computer

Karine Chemla (School of Mathematics, University of Edinburgh, and researcher emerita CNRS)

This presentation argues that decimal place-value notations were introduced as material tools of computation and that, until around the 10th century, they were used only as a material notation to compute: they were never shown in illustrations in mathematical writings, let alone used to express numbers. Furthermore, the presentation argues that the same remarks hold true whether we consider the earliest extant evidence for the use of such a numeration system in Chinese, Sanskrit, or Arabic sources. In all these geographical areas, decimal place-value numeration systems were first used as a material notation. These remarks suggest that, as a tool of computation, decimal place-value numeration systems circulated in the context of a material practice, despite changes in the graphics for the digits and in the materiality of the computation.

Smuggling Vaxes: or how my computer equipment was detained at the border

Camille Paloque-Bergès (Laboratoire HT2S, Conservatoire National des Arts et Métiers)

Between 1980 and 1984, the global computer market was heavily influenced by rising DEFCON levels and industrial espionage, alongside the imposition of restrictions on US computer equipment exports (Leslie, 2018). One notable example was the VAX mini-computer from DEC, which became subject to the COCOM doctrine, restricting its distribution due to its strategic importance. Popular in research communities, the VAX supported the Unix operating system and played a pivotal role in the development of Arpanet and the UUCP networks, both precursors to the modern Internet. Despite restrictions, the VAX was widely imported through workarounds or cloning techniques. This paradox of open-source R&D efforts occurring within a politically closed environment (Russell, 2013; Edwards, 1996) is illustrated by the infamous “Kremvax” joke on Usenet, which falsely claimed the USSR had joined the Internet. The study of the VAX’s role in both Eastern and Western Europe highlights the tension between technological openness and Cold War-era containment policies. These technical and administrative maneuvers, though trivial to the broader public, were crucial for the diffusion and cultural adoption of early data networks at the level of the system administrator working in a computer center eager to become a network node.

A Thermal History of Computing

Ranjodh Singh Dhaliwal (University of Basel) ranjodhdhaliwal.com

If you open a computer today, the biggest chunk of real estate, curiously, is not taken by processors, memories, or circuit boards but by increasingly complex heat sinks. Starting from this observation that all technology today needs extensive heat management systems, this piece theorizes the historical and conceptual dimensions of heat as it relates to computing. Using case studies from the history of computation – including the air conditioning of early mainframe computers running weather simulations (such as ENIAC and the IAS machine in the 1960s) and early Apple machines that refused to run for long (because Steve Jobs, it is said, hated fans) – and from the history of information – the outsized role of thermodynamics in theorizing information, for example – I argue that computation, in both its hardware and software modalities, must be understood not as a process that produces heat as a byproduct but instead as an emergent phenomenon from the heat production unleashed by industrial capitalism.

Put another way, this talk narrates the story of computation through its thermal history. By tracing the roots of architectural ventilation, air conditioning of mainframes and computer rooms in the 20th century, and thermodynamics’ conceptual role in the history of information and software, it outlines how and why fans became, by volume, the biggest part of our computational infrastructures. What epistemological work is done by the centrality of heat in these stories of computation, for example, and how might we reckon with the ubiquitization of thermal technologies of computing in this age of global climate crises?

Karine Chemla (photo credits: Elisabetta Mori)

Supercomputing between science, politics and market

Arianna Borrelli (Käte-Hamburger-Kolleg “Cultures of Research” RWTH Aachen)

Since the 1950s, the term “supercomputer” has been used informally to indicate machines felt to have particularly high speed or large data-handling capability. Yet it was only in the 1980s that systematic talk of supercomputers and supercomputing became widespread, when a growing number of supercomputing centers were established in industrialized countries to provide computing power mainly, but not only, for fundamental and applied research. Funding for creating these institutes came from the state. Although arguably at first these machines could be of use only in a few computationally intensive fields like aerodynamics or the construction of nuclear power plants, sources suggest that there were also scientists from other areas, especially physicists, who promoted the initiative because they regarded increasing computing power as essential for advancing their own research. Some of them had also already established contacts with computer manufacturers. In my paper I will discuss and broadly contextualize some of these statements, which in the 1990s developed into a widespread rhetoric of a “computer revolution” in the sciences.

Neurons on Paper: Writing as Intelligence before Deep Learning

David Dunning (Smithsonian National Museum of American History)

In their watershed 1943 paper “A Logical Calculus of the Ideas Immanent in Nervous Activity,” Warren McCulloch and Walter Pitts proposed an artificial neural network based on an abstract model of the neuron. They represented their networks in a symbolism drawn from mathematical logic. They also developed a novel diagrammatic system, which became known as “McCulloch–Pitts neuron notation,” depicting neurons as arrowheads. These inscriptive systems allowed McCulloch and Pitts to imagine artificial neural networks and treat them as mathematical objects. In this manner, they argued, “for any logical expression satisfying certain conditions, one can find a net behaving in the fashion it describes.” Abstract neural networks were born as paper tools, constituting a system for writing logical propositions.
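To give a concrete sense of what such a “net” computes, here is a minimal sketch of a McCulloch–Pitts threshold unit in Python. It illustrates the standard textbook reading of the model, not code from the 1943 paper, which worked entirely with logical symbolism and diagrams on paper.

```python
# Minimal sketch of a McCulloch-Pitts threshold unit (illustrative only;
# McCulloch and Pitts worked purely on paper, in logical notation and diagrams).

def mp_neuron(inputs, threshold, inhibitory=()):
    """Fire (return 1) if no inhibitory input is active and the number of
    active excitatory inputs is at least the threshold."""
    if any(inputs[i] for i in inhibitory):
        return 0
    excitatory = sum(x for i, x in enumerate(inputs) if i not in inhibitory)
    return 1 if excitatory >= threshold else 0

# A logical expression corresponds to a net: "A AND B" is a unit with threshold 2,
# "A OR B" a unit with threshold 1.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", mp_neuron((a, b), threshold=2),
              "OR:", mp_neuron((a, b), threshold=1))
```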

Attending to the written materiality of early neural network techniques affords a new historical perspective on the notoriously opaque technology driving contemporary AI. I situate McCulloch and Pitts in a material history of logic, understood as a set of practices for representing idealized reason with marks on paper. This tradition was shot through with anxiety about the imperfection of human-crafted symbolic systems, often arising from constraints as mundane as “typographical necessity.” Like the authors they admired, McCulloch and Pitts had to compromise on their notation, forgoing preferred conventions in favor of more easily typeset alternatives. Neural networks’ origin as inscriptive tools offers a window on a moment before the closure of a potent black box, one that is now shaping our uncertain future through ever more powerful, ever more capitalized deep learning systems.

Knowledge Transfer in the Early European Computer Industry

Elisabetta Mori (Universitat Pompeu Fabra, Barcelona)

Collaboration with an academic mathematical laboratory or research institute is a recurring pattern in the genesis of early computer manufacturers: it typically involved financial support and exchanges of patents, ideas, and employees.


In my presentation I show how knowledge transfer between academic laboratories and private corporations followed different strategies and was shaped by the contingent policies and contexts in which they unfolded. The presentation focuses on three different case studies: the partnership between the Cambridge Mathematical Laboratory and Lyons, begun in 1947; the example of the Mathematisch Centrum in Amsterdam and its 1956 spin-off NV Electrologica; and the case of the Matematikmaskinnämnden and Facit, the Swedish manufacturer of mechanical calculators, which entered the computer business in 1956.

The three case studies are representative of three distinct patterns. First, knowledge transfer through a sponsorship agreement: funding and supporting the construction of the EDSAC computer enabled the Lyons catering company (a leader in business methods) to appropriate its design to manufacture its LEO Computers. Second, knowledge transfer through a spin-off: Electrologica (the Netherlands’ first computer manufacturer) was established by computer scientists of the Mathematisch Centrum as a spin-off to commercialize the computers designed by the institute. Third, knowledge transfer through the recruitment of technical staff from a center of excellence: Facit entered the computer business by hiring most of its technicians and researchers from Matematikmaskinnämnden (the research organization of the Swedish government). Taken together, the three case studies cast light on how R&D diffused in the embryonic computer industry in post-war Europe.

Elisabetta Mori (photo credits: Mano Manoharan)

A commission and its nationalist technicians: expertise and activities in the Brazilian IT field in the 1970s

Marcelo Vianna (Federal Institute of Education, Science and Technology of Rio Grande do Sul)

The history of Brazilian IT in the 1970s was shaped by the work of a group of specialists who occupied spaces in university and technocratic circles to propagate ideas of technological autonomy from the Global North. In this sense, there is a consensus that an elite within this group, acting in the Commission for the Coordination of Electronic Processing Activities (CAPRE), managed to establish a national Informatics policy, giving rise to an indigenous computer industry at the end of the decade. However, there is still much to be explored about the dynamics surrounding CAPRE’s different activities and the profile of its “ordinary” technicians, considering the breadth of responsibilities that the small body assumed in structuring the Brazilian IT field. Our proposal is to map them by combining prosopography with an identification of the concepts, cultures, and practices that guided their actions, such as the ideas of “rationalization” and “technological nationalism” and the establishment of a technopolitical network with the technical-scientific community of the period, including the first political class associations in the field of Computer Science. The paper will discuss the composition of the group, its expertise and trajectories, and the main actions of the technicians aimed at informing CAPRE’s decision-makers. In this sense, the considerable degree of cohesion between the technicians and their leaders ensured that an autonomous path was established for Informatics in the country, even though they were exposed to the authoritarian context of the period, which led to CAPRE itself being abolished in 1979.

People of the Machine: Seduction and Suspicion in U.S. Cold War Political Computing

Joy Rohde (University of Michigan)

The computational social scientific projects of the Cold War United States are known for their technocratic and militarized aspirations to political command and control. Between the 1960s and the 1980s, Defense officials built systems that sought to replace cognitively limited humans with intelligent machines that claimed to predict political futures. Less familiar are projects that sought to challenge militarized logics of command and control. This paper shares the story of CASCON (Computer-Aided System for Handling Information on Local Conflicts), a State Department-funded information management system that mobilized the qualitative, experiential knowledge and political acumen of diplomats to challenge U.S. Cold War logics, like arms trafficking and unilateral interventionism. The system’s target users—analysts in the Arms Control and Disarmament Agency and the State Department tasked with monitoring conflicts in the global South—were notoriously skeptical of the Pentagon’s militarism and computational solutionism. Yet users ultimately rejected the system because it did not tell them what to do! Despite their protestations, they had internalized the command and control logics of policy computing.

CASCON was an early effort to design around the contradictions produced by coexisting fears of human cognitive and information processing limits, on the one hand, and of ceding human agency and expertise to machines on the other. I conclude by arguing that CASCON reflects the simultaneous seduction and fear of the quest to depoliticize politics through technology—an ambivalence that marks contemporary computing systems and discourse as well.

AI in Nomadic Motion: A Historical Sociology of the Interplay between AI Winters and AI Effects

Vassilis Galanos (University of Stirling)

Two of the most puzzling concepts in the history of artificial intelligence (AI), namely the AI winter and the AI effect, are mutually exclusive if considered in tandem. AI winters refer to the phenomenon of loss of trust in AI systems due to the underdelivery of promises, leading to further stagnation in research funding and commercial absorption. The AI effect suggests that AI’s successful applications have historically separated themselves from the AI field through the establishment of new or specialised scientific or commercial nomenclature and research cultures. How do AI scientists rebrand AI after general disillusionment with their field, and how do broader computer science experts brand their research as “AI” during periods of AI hype? How does AI continue to develop, during periods of “winter”, in different regions’ more pleasant climates? How do periods of AI summer contribute to future periods of internet hype during their dormancy? These questions are addressed drawing on empirical research into the historical sociology of AI: a 2023 secondary analysis of technological spillovers and unexpected findings for internet and HCI research during periods of intense AI hype (and vice versa, AI advancements based on periods of internet/network technology hype), as well as a 2024 oral history project on AI at the University of Edinburgh and the proceedings of the EurAI Workshop on the History of AI in Europe, during which several lesser-known connections were revealed. To theorise, I extend Pickering/Deleuze and Guattari’s notion of nomadic science, previously applied to the history of mathematics and cybernetics.

Janet Toland (photo credits: Elisabetta Mori)

Vector and Raster Graphics: Two Pivotal Representation Technologies in the Early Days of Molecular Graphics

Alexandre Hocquet and Frédéric Wieber (Archives Poincaré, Université de Lorraine), Alin Olteanu (Shanghai University), Phillip Roth (Käte-Hamburger-Kolleg “Cultures of Research” RWTH Aachen)

https://poincare.univ-lorraine.fr/fr/membre-titulaire/alexandre-hocquet

Our talk investigates two early computer technologies for graphically representing molecules – the vector and the raster display – and traces their technical, material, and epistemic specificity for computational chemistry through the nascent field of molecular graphics in the 1970s and 1980s. The main thesis is that the two technologies, beyond marking an evolution of computer graphics from vector to raster displays, represent two modes of representing molecules, each with its own affordances and limitations for chemical research. Drawing on studies in the media archaeology of computer graphics and in the history of science, as well as on primary sources, we argue that these two modes of representing molecules on the screen need to be explained through the underlying technical objects that structure them, in conjunction with the specific traditions molecular modeling stems from, the epistemic issues at stake in the involved scientific communities, the techno-scientific promises bundled with them, and the economic and industrial landscape in which they are embedded.

Erring Humans, Learning Machines: Translation and (Mis)Communication in Soviet Cybernetics and AI

Ksenia Tatarchenko (Johns Hopkins University)

This paper centers on translation in Soviet cybernetics and AI. Focusing on cultural practices of translation and popularization as reflected in widely read scientific and fictional texts, I interrogate practices of interpretation in relation to the professional virtue of scientific veracity as well as its didactic function in the Soviet cybernetic imaginary throughout the long Thaw. The publication of the works of Norbert Wiener, Alan Turing, and John von Neumann in Russian was not simply aimed at enabling direct access to the words and thoughts of major bourgeois thinkers concerned with automation and digital technologies: translating and popularizing cybernetics in the post-Stalinist context was about establishing new norms for public disagreement. No longer limited to the opposition of true and false positions, the debates around questions such as “Can a machine think?” that raged across a wide spectrum of Soviet media from the late 1950s to the 1980s were framed by an open-ended binary of what is meaningful or, on the contrary, meaningless. In his classic 1992 book The Human Motor: Energy, Fatigue, and the Origins of Modernity, Anson Rabinbach demonstrates how the utopian obsession with energy and fatigue shaped social thought in modern Europe. In a similar vein, this project explores how human error takes on a new meaning when the ontology of information central to Western cybernetics is adapted to a Soviet version of digital modernity.

Tech Disruptors, Then and Now

Mar Hicks (University of Virginia)

This paper explores the connected histories of whistleblowers and activists who worked in computing from the 1960s through the present day, showing how their concerns were animated by similar issues, including labor rights, antiracism, the fight against gender discrimination, and computing’s role in the military-industrial complex. It looks at people who tried to fight the (computer’s) power from within the computing industry, in order to write an alternative history of computing.

Atosha McCaw (photo credits: Elisabetta Mori)

Nosebleed Techno, Sound Jams and Midi Files: the Creative Revolution of Australian Musicians in the 1990s through AMIGA Music Production.

Atosha McCaw (Swinburne University of Technology, Melbourne)

This paper looks at the innovative use of the AMIGA computer by Australian musicians in the 1990s, highlighting its role as a cost-effective tool for music production, experimentation, and collaboration. By examining how these artists harnessed the power of this technology to share files and rapidly materialize creative concepts, we uncover a fascinating chapter in the evolution of electronic music in Australia.

Computers and Datasets as Sites of Political Contestation in an Age of Rights Revolution: Rival Visions of Top-Down/Bottom-Up Political Action Through Data Processing in the 1960s and 1970s United States

Andrew Meade McGee (Smithsonian National Air and Space Museum)

As both object and concept, the electronic digital computer featured prominently in discussions of societal change within the United States during the 1960s and 1970s. In an era of “rights revolution,” discourse on transformative technology paralleled anxiety about American society in upheaval. Ever in motion, shifting popular conceptualizations of the capabilities of computing drew comparisons to the revolutionary language of youth protest and the aspirations of advocacy groups seeking full political, economic, and social enfranchisement. The computer itself – as concept, as promise, as installed machine – became a contested “site of technopolitics” where political actors appropriated the language of systems analysis and extrapolated consequences of data processing for American social change. Computers might accelerate, or impede, social change.

This paper examines three paradigms of the computer as “a machine for change” that emerge from this period: 1) One group of political observers focused on data centralization, warning of “closed worlds” of institutional computing that might subject diverse populations to autocratic controls or stifle social mobility; 2) In contrast, a network of social activists and radicals (many affiliated with West Coast counterculture and Black Power movements) resisted top-down paradigms of data centralization and insisted community groups could seize levers of change by embracing their own forms of computing. 3) Finally, a third group of well-meaning liberals embraced the potential of systems analysis as a socially-transformative feedback loop – utilizing the very act of data processing itself to bridge state institutions and local people, sidestepping ideological, generational, or identity-based conflict.

Computing a Nation: Science-Technology Knowledge Networks, Experts, and the Shaping of the Korean Peninsula (1960-1980)

Ji Youn Hyun (University of Pennsylvania)

This paper presents a history of the ‘Systems Development Network’ (SDN), the first internet network in Asia, established in 1982 and developed in South Korea during the authoritarian presidency of Park Chung-Hee (1962-1979). I examine scientists and engineers who were repatriated under Park’s Economic Reform and National Reconstruction Plan to reverse South Korea’s ‘brain drain’, re-employed in government-sponsored research institutions, and leveraged to modernize state industrial manufacturing.

Led by computer scientist Kilnam Chon, often lauded as ‘the father of East Asia’s internet’, a transnationally trained group of experts at the Korea Institute of Electronics Technology (KIET) developed the nation’s internet infrastructure, despite repeated government pushback and insistence on establishing a domestic computer manufacturing industry. Drawing on the Presidential Archive and the National Archives of Korea, I describe how the SDN materialized through a lineage of reverse-engineering discarded U.S. military-base computer parts from the Cheonggyecheon black market, prototyping international terminal and gateway connections, and “extending the instructional manual” of multiple microprocessors.

The reconfiguration of computer instruction sets is one of many cases of unorthodox, imaginative, and off-center methods practiced in Korea to measure up to and compete with Western computing. Although repatriated scientists were given specific research objectives and goals, their projects fundamentally materialized through a series of experimental and heuristic processes. This paper will illuminate South Korea’s computing history, which until now has not been the subject of any historical study, and also allow a broader reflection on the transformation of East Asia during the Cold War, highlighting political change through the development of computing.

Daphne Zhen Ling Boey (photo credits: Janet Toland)

Collecting Data, Sharing Data, Modeling Data: From Adam and Eve to the World Wide Web within Twenty Years

Barbara Hof (University of Lausanne)

Much like physicists using simulations to model particle interactions, scientists in many fields, including the digital humanities, are today applying computational techniques to their analysis and research and to the study of large data sets. This paper is about the emergence of computer networks as the historical backbone of modern data sharing systems and the importance of data modeling in scientific research. By exploring the history of computer data production and use in physics from 1990 back to 1970, when the Adam & Eve scanning machines began to replace human scanners in data collection at CERN, this paper is as much about retelling the story of the invention of the Web at CERN as it is about some of the technical, social and political roots of today’s digital divide. Using archival material, it argues that the Web, developed and first used at physics research facilities in Western Europe and the United States, was the result of the growing infrastructure of physics research laboratories and the need for international access to and exchange of computer data. Revealing this development also brings to light early mechanisms of exclusion. They must be seen against the backdrop of the Cold War, more specifically the fear that valuable and expensive research data at CERN could be stolen by the Soviets, which influenced both the development and the restriction of data sharing.

Differing views of data in Aotearoa: the census and Māori data

Daphne Zhen Ling Boey and Janet Toland (Victoria University of Wellington | Te Herenga Waka)

This presentation explores differing concepts of “data” with respect to the Indigenous Māori people of Aotearoa and colonial settlers. A historical lens is used to tease out long-term power imbalances that still play out in the data landscape today. Though much data has been collected about Māori by successive governments of New Zealand, little benefit has come to Māori themselves.

This research investigates how colonisation impacted Māori, and the ongoing implications for data. The privileging of Western approaches to harnessing the power of data as opposed to indigenous ways stems from colonisation – a system that results in “a continuation of the processes and underlying belief systems of extraction, exploitation, accumulation and dispossession that have been visited on Indigenous populations.”

We examine the census, an important tool that provides an official count of the population together with detailed socioeconomic information at the community level, and highlight areas where there is a fundamental disconnect between the Crown and Māori. Does Statistics New Zealand, as a Crown agency, have the right to determine Māori ethnicity, potentially undermining the rights of Māori to self-identify? How do differing ways of being and meaning impact how we collect census data? How does Aotearoa commit to its Treaty obligations to Māori in the management and optimisation of census data? We also delve into Māori Data Sovereignty, and its aim to address these issues by ensuring that Māori have control over the collection, storage and use of their own data as both enabler of self-determination and decolonisation.

History of computing from the perspective of nomadic history. The case of the hiding machine

Liesbeth De Mol (CNRS, UMR 8163 Savoirs, Textes, Langage, Université de Lille)

Computing as a topic is one that has moved historically and methodologically through a variety of disciplines and fields. What does this entail for its history? The aim of this talk is to provoke a discussion on the future of the history of computing. In particular, I use a notion of so-called nomadic history. This is, in essence, the idea of identifying and overcoming one’s own disciplinary and epistemological obstacles by moving across a variety of, and sometimes conflicting, methods and fields. I apply the method to the case of the history of the computer-as-a-machine, which is presented as a history of hide-and-seek. I argue that the dominant historical narrative, in which the machine got steadily hidden away behind layers of abstraction, needs to be countered both historically and epistemologically. The talk is based on a collaboratively written chapter for the forthcoming book “What is a computer program?”.

Luke Stark (photo credits: Elisabetta Mori)

Modeling in history: using LLMs to automatically produce diagrammatic models synthesizing Piketty’s historiographical thesis on economic inequalities

Axel Matthey (University of Lausanne)

This research integrates theoretical digital history with economic history. Employing Large Language Models (LLMs), we aim to automatically produce historiographical diagrams for analysis. Our experience with the manual production of such diagrams suggests that LLMs might be useful for supporting their automatic generation, with the aim of facilitating the visualization and understanding of complex historical narratives and of causal relationships between historical variables. Our initial exploration involved using Google’s LLM (Gemini 1.5 Pro) and OpenAI’s GPT-4o to convert a concise historical article by Piketty, “A Historical Approach to Property, Inequality and Debt: Reflections on Capital in the 21st Century”, into a simplified causal diagram. LLMs have demonstrated remarkable capabilities in various domains, including understanding and generating code, translating between languages, and even creating a variety of creative text formats. We show that LLMs can be trained to analyze historical texts, identify causal relationships between concepts, and automatically generate corresponding diagrammatic models. This could significantly enhance our ability to visualize and comprehend complex historical narratives, making implicit connections explicit and facilitating further exploration and analysis. Historiographical theories explore the nature of historical inquiry, focusing on how historians represent and interpret the past: in this research, the use of diagrams is considered as a means to enhance the communication, visualization, and understanding of these complex theories.
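As a rough illustration of the kind of pipeline described above (and not the authors’ actual setup), the Python sketch below asks an LLM to extract causal claims from a short passage and return them as a Graphviz DOT graph. The prompt wording, the example passage, and the post-processing are assumptions made for illustration.

```python
# Illustrative sketch: ask an LLM to turn a short historical passage into a causal
# diagram in Graphviz DOT format. Prompt and passage are invented for illustration;
# this is not the pipeline used in the research described above.
from openai import OpenAI  # pip install openai; expects OPENAI_API_KEY to be set

client = OpenAI()

passage = (
    "When the rate of return on capital exceeds the rate of economic growth, "
    "wealth concentrates and inherited fortunes grow faster than earned incomes."
)

prompt = (
    "Read the passage below and list the causal relationships it asserts as a "
    "Graphviz DOT digraph, one edge per causal claim, for example: "
    '"return on capital > growth" -> "wealth concentration". '
    "Return only the DOT code.\n\n" + passage
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

dot_source = response.choices[0].message.content
print(dot_source)  # render with Graphviz, e.g. `dot -Tpng -o diagram.png diagram.dot`
```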

Computational Illegalism

Luke Stark (Western University Canada)

In his analysis of the concept in his lectures on the development of the “punitive society,” Michel Foucault describes the eighteenth century as a period of “systematic illegalism,” including both lower-class or popular illegalism and “the illegalism of the privileged, who evade the law through status, tolerance, and exception” (Foucault 2015, 142). In this paper, I argue that illegalism has new utility as an analytic concept in the history of computing. Illegalism is a characteristic of both the business models and rhetorical positioning of many contemporary digital media firms. Indeed, such “computational illegalism” is so rife that commentators often seem to accept it as a necessary aspect of Silicon Valley innovation.

In this presentation, I describe illegalism as theorized by Foucault and others and develop a theory of platform illegalism grounded in the history of technical and business models for networked computing since the 1970s. This presentation is part of a larger project in which I document the prevalence of illegalism on the part of digital platforms in various arenas, focusing in particular on platform labor and generative AI; examine the range of responses to such illegalism from consumers, activists, and governments; and formulate recommendations regarding ways to account for platform illegalism in scholarly and activist responses as part of governance mechanisms for digitally mediated societies.

The datafied “enemy,” computational work, and Japanese American incarceration during World War II

Clare Kim (University of Illinois Chicago)

Following the events of Pearl Harbor in December 1941, a series of U.S. presidential proclamations and executive orders authorized the legal designation and treatment of people of Japanese ancestry as “enemy aliens.” The designation of the US West Coast as military zones under Executive Order 9066 enabled the removal and subsequent incarceration of more than 120,000 Japanese Americans in internment camps. The problem of identifying, incarcerating, and managing Japanese enemy alien populations necessitated the treatment of these military zones and spaces as information environments, where the classification of Japanese and Japanese American residents as enemy alien, citizen, or an alternative subject position could be adjudicated. This paper explores how conflict in the Pacific theater of World War II contoured the entanglements between computational work and Asians and Asian Americans residing in the U.S., recounting the setup of statistical laboratories established to track and manage Japanese American incarceration. It reveals how datafication practices were collapsed and equated with bodies that were racialized as enemy alien and yellow peril, which paradoxically effaced other subject positions that Japanese Americans came to occupy at the time: in particular, the invisible labor they furnished to statistical work as technical experts themselves.

(photo credits: Barbara Hof)

Fellow Publication: Integrative Contemporary Art and Science Practices Building Catalytic Structures

KHK c:o/re Fellow Hannah Star Rogers contributes to Integrative Contemporary Art and Science Practices: Building Catalytic Structures, a newly released volume, edited by J.D. Talasek and Barbara Stauffer, and published by Routledge.

Contemporary Art and Science Practices: Building Catalytic Structures (2025) considers how interdisciplinary art-science efforts have shifted from outsider experiments to increasingly institutionalized initiatives. It examines the motivations, challenges, and transformative potential of this integration across public engagement, education, and cultural discourse. This groundbreaking collection brings together leading thinkers and practitioners to examine the evolving relationship between contemporary art and scientific inquiry. In addition to Rogers, the text features contributions from other leading voices in art and science, including William L. Fox, Ellen Levy, Mel Chin, Brandon and Aurore Ballengée, and Jill Scott. This volume is a vital resource for researchers, educators, curators, artists, scientists, and policy makers navigating the complex intersections of knowledge, creativity, and collaboration.

Rogers’ chapter, “Art, Science and Technology Studies: Charting Collaborative Practice,” offers a compelling analysis of the power dynamics, collaborative models, and institutional conditions shaping art-science partnerships today. Rogers’ work contributes a critical theoretical framework from Art, Science, and Technology Studies (ASTS), a subfield of Science and Technology Studies (STS), to advocate for more symmetrical, equitable modes of interdisciplinary collaboration.

Rogers argues for understanding both art and science as socially and culturally situated systems of knowledge. Drawing on examples ranging from historical botanical illustration to contemporary biotech art and artist residencies, she categorizes four prevalent models of collaboration, each with distinct power structures, intentions, and outcomes. She critiques the persistent instrumentalization of art – particularly in science communication – where artistic practice is often reduced to a tool for enhancing scientific messages. Her chapter provides a roadmap for critically evaluating and fostering more generative, balanced partnerships between artists and scientists.

About the Editors:
J.D. Talasek is a curator, researcher, and writer known for integrating the arts into scientific contexts through his leadership at the National Academy of Sciences and as editor-in-chief of Leonardo Journal.

Barbara Stauffer, a ceramic artist and former program director at the Smithsonian’s National Museum of Natural History, has led numerous interdisciplinary initiatives focused on public engagement and education.

The Artwork Is the Network

A man stands behind a speaker's desk next to a screen displaying an old computer.

ARIANNA BORRELLI

The workshop “After Networks: Reframing Scale, Reimagining Connections”, organized by c:o/re Fellow Nathalia Lavigne, took as its starting point the increasing critiques of digital platforms as monopolizing and shaping networking according to economic interests, and thereby leading to a crisis of social interactions.

A key question at the meeting was whether and how artistic activities can help (re)imagine connections beyond digital social media, and artist Eduardo Kac was invited to present and reflect on his work from this perspective. Given the critical stance of the workshop towards new technologies, Kac could at first appear a strange choice, since his artworks, while extremely diverse in nature, all made use of what were at the time cutting-edge technologies, from early computer networks to space travel. Can we use technology to reach beyond Big-Tech-dominated networks? Let us seek the answer in Kac’s works as he presented them at the c:o/re event.

Eduardo Kac created his first artworks in Brazil in the early 1980s by manipulating the pixels on a computer screen, and had to work hard to have the results accepted as art pieces for an exhibit. Later, he artistically explored one of the first computer networks: the French Minitel. In the 1980s, the French government had decided to kick-start one of the first nationwide digital information networks. Minitels were not personal computers but videotex terminals with screen and keyboard: they could be loaned for free from post offices and plugged into the telephone network, enabling users to send or request information, access bulletin boards, book tickets, buy products – or view four works by Kac.

At the event, the artist showed us on a large screen an example of what users would have seen on their Minitel viewer. In the work “Reabracadabra” (1985), colored lines slowly drew themselves from the top down on the screen and eventually became recognizable as the letter A, surrounded by small letters forming the word abracadabra. Even though Kac had shown us a picture of the finished image beforehand, seeing it slowly emerge from the dark screen with a simple but fluid motion was somehow surprising, as the effect was quite different from today’s digital imaging. Like all information received through the Minitel, the artwork could not be stored locally and disappeared when the screen was cleared. In other words, the art existed in the connection, and only as long as the connection itself was there. Indeed, the original artworks disappeared for good when the French government finally switched off the Minitel network, but Kac had already worked to recover and reconstruct them, and so they could be displayed on original Minitel terminals at the exhibition “Electric Dreams: Art and Technology Before the Internet” (Tate Modern, London, 28/11/24-1/6/25). Thus, the work also explores the limits of archiving digital artworks, and lets us wonder how far a recreated network can support the “same” artwork.

Eduardo Kac during his keynote at the interdisciplinary workshop “After Networks: Reframing Scale, Reimagining Connections” in Aachen.

During the 1990s, the internet became a global phenomenon, but in the meantime Kac had become active in another technological field: biotechnology. Unlike the Minitel artworks, Kac’s creations in this field are quite well known, especially the GFP Bunny (2000), a genetically engineered rabbit which glows in the dark. Its presentation gave rise to broad and intense media reactions, which surprised its author and prompted him to embed them in new artworks. Kac pointed out that the pop-culture reaction to his work gave him the opportunity to open a communication channel through which he would send implicit messages to companies, television shows, and other agents quoting his work. This communication channel was a way to create networks via implicit messages, where the media is the globality of media and the artwork becomes the medium enabling communication. Kac also presented another example of art involving non-human life forms: “Essay Concerning Human Understanding” (1994), in which a bird and a plant are enabled to communicate in a bio-technological environment and so generate art for each other. Here, technology and human actors become a network for the creation and consumption of art on the part of non-human creatures.

Eduardo Kac provided a glimpse into his different projects using many pictures.

The final works Kac discussed at the workshop turned to yet another cutting-edge technology: space travel. In cooperation with NASA since the early Noughties, Kac has placed artworks in space, and one of them, a cubic, laser-engraved glass sculpture named “Adsum”, lies today in the Mare Crisium, a basin on the side of the Moon always visible from Earth. Yet these are “only” earthly artworks placed in space: the next creation Kac showed us in his presentation was an artwork produced in space to be consumed in space. “Inner Telescope” is a technologically minimal creation made out of two standard sheets of paper using only bare hands and a pair of scissors. The hands were not those of the artist, though, but of French astronaut Thomas Pesquet who, following Kac’s instructions, produced the artwork during his stay on the International Space Station (ISS) in 2017. Looking like an M pierced by a tube, the work on Earth would only clumsily and formlessly slump onto a surface, but under zero gravity it floats lightly against the backdrop of the earthly blue marble: the first native outer-space artwork. Who is the artist here: Kac, the astronaut, the zero-gravity environment – or maybe NASA? Clearly, this question makes little sense, as the work highlights what was already implicit in the previous ones, namely the number of factors and actors which combine to produce a work of art, blurring the distinction between creators and consumers and letting them all appear as nodes in a live artistic network. Kac’s creative impulse takes the role of an enabler, setting up a bio-physical-technological network and artwork.

Let us now go back to the initial question: Can we use technology to reach beyond Big-Tech-dominated networks? Kac’s works show that this may be possible by highlighting how artworks, however technologically based, are never made out of technology, but of the situated entities communicating through it, be they humans on Earth or in space, animals or plants, or paper floating in space. In a similar way, we might go beyond today’s social networks not by rejecting them, but by becoming aware that their digital technology does not constitute a new, magical network for us to live in, but is only an additional factor enabling life forms in the universe to live out their inner potential for connection. We are the network, if we so imagine ourselves.

Artistic Research Part 3: “The process can give depth to the final work, while the work can visualize the questions raised during the process”

An art installation consisting of various monitors on which fluorescent colors in blue and black can be seen

At the KHK c:o/re, the practice of artistic research has always been part of our research interests. For this reason, we invite fellows working closely with the arts in each fellow cohort. In the past four years, we have realized various projects in collaboration with art scholars and practitioners and with different cooperation partners, ranging across artistic positions in the form of performance, installation, discourse, and sound. Many of these events were part of the center’s transfer activities to make the research topics and interests visible, relatable, and tangible. In this sense, art can be seen as a translation of scientific topics. But the potential of the interaction between science and art doesn’t end there. Art is more than a tool for science communication; it is a research culture. Therefore, we ask the following questions: What kind of knowledge is generated in artistic production? How are these types of knowledge lived in artistic research? And, in combination with one of the central research fields of the center: To what extent are artistic approaches methodologies for what we understand as expanded science and technology studies?

To get closer to answering these questions, we talked to some of our fellows who are working closely with the arts and researching the connection between science and art. We want to find out about the epistemic value of artistic research, the methodologies and institutional boundaries of artistic research in an academic environment, and how they implement artistic research in their research areas.

In this edition of the interview series, we spoke to KHK c:o/re alumni fellow Masahiko Hara.


Masahiko Hara

Masahiko Hara is an engineer and Professor Emeritus of Tokyo Tech, Japan. His research interests are in the areas of Nanomaterials, Nanotechnology, Self-Assembly, Spatio-Temporal Fluctuation and Noise, Ambiguity in Natural Intelligence, Bio-Computing, Chemical Evolution, Origins of Life, and Science and Art Installation.

KHK c:o/re: What do you think is the epistemological value of artistic research?

Masahiko Hara: Artistic research expands the diversity of “knowing” by exploring sensory and embodied experiences, as well as aspects of “tacit knowledge” that are difficult to verbalize. It offers an alternative approach to phenomena that cannot be fully grasped by the theories and data-driven frameworks centered on “explicit knowledge”, which are prioritized in contemporary science and technology.

Historically, philosophy and physics (here referring to the natural sciences) were two sides of the same coin. However, as they developed separately in the 20th century, there arose a need for a new metaphysics — a kind of metaphysical translation that could bridge the gap. I believe this is where the value of artistic research lies:

  • Contribution to diverse forms of knowledge
  • Reframing how questions are asked
  • Critically reflecting on how we know
  • Emphasizing experience and relationality

What specific methodologies are used in artistic research? Can you give an example?

Certainly. I believe the science-art installation experiments we are conducting represent a cutting-edge methodology in artistic research. Other examples include performances, participatory projects, and experimental creations using bio-materials.

I have conducted “scientist-in-residence” projects that explore experimental creation at the intersection of science and art — such as computation using slime mold amoebas and experiments on crowd psychology and social group dynamics. Each of these projects is still in a “prototype” phase, but I feel that this very process of trial and error itself constitutes a new methodology.

Two people presenting an art installation with a monitor and pink light tubes hanging on stairwell.
Yasmin Vega and Masahiko Hara introduce their science-art installation experiment “Melodic Pigments: Exploring New Synesthesia”

How do product-oriented art forms such as exhibitions or installations differ from process-oriented approaches to art? Is there a hierarchy? How do these approaches influence each other?

Product-oriented approaches focus on a “finished form” to be delivered to the audience, while process-oriented approaches value the creative process and trial and error itself (I would categorize installations not as product-oriented art, but rather as process-oriented). There is no hierarchy between the two; rather, they are complementary. The process can give depth to the final work, while the work can visualize the questions raised during the process. Interestingly, from my viewpoint, both science and art today, as established in the 20th century, have product-oriented tendencies. In both cases, it’s about delivering something complete — whether a published paper or a finished artwork — to the audience.

What is also notable is that in Asian ways of thinking, there tends to be a greater appreciation for the effort and process leading up to a goal, rather than just for “winning” at the Olympics or World Cup, for example. Unfinished processes, or those not yet reaching a goal, can themselves generate new value, especially in the form of installation experiments in both science and art. In this sense, when we talk about “mutual influence”, I believe that the idea of “incomplete completeness” in both product-oriented and process-oriented approaches could be coupled and give rise to new forms of emergence.

Is there a specific aesthetic that characterizes artistic research? Like trends or movements?

I think the aesthetics of artistic research lie in its attitudes, such as reexamining how questions are framed and embracing uncertainty. As for trends, I believe artistic research challenges the very foundation of aesthetics itself: it prompts us to ask what beauty is, whether universal beauty exists, and so on. In both science and art, within the larger environment of the universe we inhabit, the pursuit of true beauty and exploration of its methodologies is becoming increasingly relevant.

Composition of the art installation “Unfelt Threshold” by Aoi Suwa and Masahiko Hara; photo by Aoi Suwa

What are the problems and challenges of artistic research in an academic environment?

Some problems and challenges include the mismatch between evaluation criteria in science and art, the difficulty of “making outcomes visible,” and the gap between academic and artistic modes of expression. The open-ended, tacit nature of artistic processes often conflicts with the demand for codified, explicit knowledge in academic evaluation systems.

However, I believe that this very sense of “discrepancy” is one of the most important issues. It is precisely because this friction exists that artistic research, especially of a metaphysical nature, becomes meaningful.

What does “experimenting” mean in the case of artistic research, perhaps in contrast to the usual scientific methods?

In science, experiments emphasize reproducibility and control. In contrast, in artistic research, an experiment is an “open-ended attempt” that unfolds through unexpected discoveries, chance, and relationships with observers. Failure, deviation, and ambiguity are also essential components. Both fields involve emergence, but to exaggerate slightly, scientific experiments aim to discover phenomena and possibilities that already exist in the universe, whereas experiments in artistic research may invent phenomena and possibilities that have never existed before. They may offer answers that cannot be generated by machine learning and big data.

Do art and science have different forms of knowledge production?

Yes, unfortunately, based on the developments of the 20th century, the answer is currently yes. Science has sought universal knowledge through analysis and systematization, while art has produced individual, experiential forms of knowledge. The former values reproducibility, while the latter considers identical outcomes by different people to be banal. This again mirrors the divide between philosophy and physics. That said, while the two differ, they are fundamentally complementary forms of knowledge.

One interesting point is that artworks sometimes grasp truths that science has not yet addressed. Artists often aren’t aware they are engaging with scientifically significant perspectives. Conversely, scientists often don’t believe that artists are doing such things. Our goal, through our science-art installation experiments, is to repair and bridge this gap or missing link, reconnecting philosophy and physics into a healthy and cyclical relationship.

Artistic Research Part 2: “[A]rt and science are not distinct domains, but are intertwined practices”

A glass model of a green and purple anemone.

At the KHK c:o/re, the practice of artistic research has always been part of our research interests. For this reason, we invite fellows working closely with the arts in each fellow cohort. In the past four years, we have realized various projects in collaboration with art scholars and practitioners and with different cooperation partners, ranging across artistic positions in the form of performance, installation, discourse, and sound. Many of these events were part of the center’s transfer activities to make the research topics and interests visible, relatable, and tangible. In this sense, art can be seen as a translation of scientific topics. But the potential of the interaction between science and art doesn’t end there. Art is more than a tool for science communication; it is a research culture. Therefore, we ask the following questions: What kind of knowledge is generated in artistic production? How are these types of knowledge lived in artistic research? And, in combination with one of the central research fields of the center: To what extent are artistic approaches methodologies for what we understand as expanded science and technology studies?

To get closer to answering these questions, we talked to some of our fellows who are working closely with the arts and researching the connection between science and art. We want to find out about the epistemic value of artistic research, the methodologies and institutional boundaries of artistic research in an academic environment, and how they implement artistic research in their research areas.

In this edition of the interview series, we spoke to KHK c:o/re fellow Hannah Star Rogers.


Hannah Star Rogers

Hannah Star Rogers is a scholar, curator, and theorist of art-science. She does research on the knowledge categories of art and science using interdisciplinary Art, Science, and Technology Studies (ASTS) methods.

KHK c:o/re: What do you think is the epistemological value of artistic research?

Hannah Star Rogers: Epistemology asks: What counts as knowledge? How is knowledge produced? How do we justify what we believe to be true? Who gets to decide what is valid knowledge? These are fundamental concerns of STS, and they have been the driving force behind my interest in considering the relative power of art and science, in order to understand how these groups have persisted in knowledge production. It should be said that I have in mind the large tent of STS knowledge production, which can include things like aesthetic knowledges. Artistic research holds significant epistemological value by contributing to knowledge production in ways that are often overlooked. In my book Art, Science, and the Politics of Knowledge (2022), I try to offer a perspective on the epistemological value of artistic research. Drawing from Science and Technology Studies (STS), I argue that art and science are not distinct domains but are intertwined practices that both produce knowledge through shared methodologies such as visualization, experimentation, and inquiry.

Book cover showing a person climbing onto a metal table under a white sheet.
Book cover, 2022. Photo credit: Kira O’Reilly and Jennifer Willet. Refolding (Laboratory Architectures). School of Biosciences at the University of Birmingham, 2010. Photos by Hugo Glendinning.

I have a particular interest in liminal objects, like the Blaschka glass marine models or Berenice Abbott’s illustrative science photographs, because they belong to both art and science networks at different times and places. Another phenomenon I’ve been interested in, for what it might tell us about art and science as knowledge-making communities, is intentionally hybrid art-science practice, like bioart. Its status is different, but it can also help us think about STS concerns like expertise, boundary-making, and disciplinary zoning. It’s hardly news that context changes meaning, but these liminal objects are a chance to think about how people construct those meanings by invoking materials and rhetorics.

A glass figure of an anemone.
Anemonia sulcata, Cornell Collection of Blaschka Invertebrate Models, Model No. 35. Photograph by the Corning Museum of Glass. © Corning Museum of Glass.

These liminal objects challenge traditional dichotomies between art and science, suggesting that these categories are socially constructed labels that order our understanding of knowledge. Building on the work of Latour and Woolgar, combined with Howard Becker, we can observe that both art and science function as networks that produce knowledge, often overlapping in their practices and outcomes. By examining the intersections of art and science and studying the works of other ASTS scholars, I observe the complex and collaborative nature of knowledge-making in art and in science. This leads me to a position of advocacy which is beyond the scope of ASTS and intersects more with my role as an art-science curator: I want to advocate for a more inclusive and expanded understanding of what knowledge is and how it is produced, validated, and experienced.

What specific methodologies are used in artistic research? Can you give an example?

Art methods are many, but most projects involve the discovery of new processes and methods. It is easy to remark that scientists set out their methods first, but in fact, especially in the case of groundbreaking research, they often must discover the method by which to produce, reproduce, and capture data about a phenomenon. Method-making is a central part of the efforts of both artists and scientists.

Put another way, the work of artistic researchers covers many methods we are familiar with in STS, including historical and anthropological research, interviews with community members and experts, ethnographic observations, and philosophical reflections. At the same time, and I speak here about art-science, there are art processes which we tend to use less often: direct work with materials, a sensibility for offering the public an experience of the work (which often shapes choices from the beginning of artists’ processes), the duplication and hacking of standard protocols from within the sciences, and an openness to straying from our original methods. The final “product” may be an installation or performance or poetry, but often what is most revealing are the processes and decisions that shaped it. This recursive attention to method is itself a form of inquiry, and one that carries epistemological weight.

How do product-oriented art forms such as exhibitions or installations differ from process-oriented approaches to art? Is there a hierarchy? How do these approaches influence each other?

    In my experience, behind the most interesting art-science projects are even more fascinating methods and processes. Showing methods and processes is a major interest of nearly all the artist interviewees (working in bioart, digital art, eco-arts, and participatory/community arts) I have spoken to as part of my Art, Science, and Technology Studies (ASTS) research. It is worth noting that I work particularly with actors in art-science or art-science-technology, but I believe we would find this to be a wider pattern in other areas. A component of many contemporary artmakers’ work is to figure out how to convey their actions or the actions of others (be they communities, plants, microbes, scientists, or otherwise) through their work. Art-science curators often take up this same concern. We ask: how can we design an encounter that invites the public into the process? This can be complex, but it’s central to how we try to help audiences encounter the richness of artistic research. I’ve tried to explore some curators’ approaches to these issues in my forthcoming edited volume, What Curators Know, from Rowman & Littlefield, due out later this year. I would also argue for the need to create conditions that support open-ended artistic inquiry, akin to basic scientific research. Too often, artists are pressured to produce legible outcomes or results. But like scientists, artists should also be given space to ask difficult, speculative questions without immediate expectations of closure or utility. I have written a bit about basic artistic research (BAR) for the journal Leonardo because I believe much more needs to be done to offer artists the conditions under which they might work under the bluest skies possible, that is, with open research possibilities like those that have traditionally been supported in basic scientific research.

    Artistic Research Part 1: “[I]n an academic environment, […] practice-led research in the arts is still not fully recognized as eligible”

    A mural depicting various items of clothing in black and white hangs on a large brick building.

    At the KHK c:o/re, the practice of artistic research has always been part of our research interests. For this reason, we invite fellows working closely with the arts in each fellow cohort. In the past four years, we have realized various projects in collaboration with art scholars, practitioners, and different cooperation partners, encompassing artistic positions in the form of performance, installation, discourse, and sound. Many of these events were part of the center’s transfer activities to make the research topics and interests visible, relatable, and tangible. In this sense, art can be seen as a translation of scientific topics. But the potential of the interaction between science and art doesn’t end there. Art is more than a tool for science communication; it is a research culture. Therefore, we ask the following questions: What kind of knowledge is generated in artistic production? How are these types of knowledge lived in artistic research? And, in combination with one of the central research fields of the center: To what extent are artistic approaches methodologies for what we understand as expanded science and technology studies?

    To get closer to answering these questions, we talked to some of our fellows who are working closely with the arts and researching the connection between science and art. We want to find out about the epistemic value of artistic research, the methodologies and institutional boundaries of artistic research in an academic environment, and how they implement artistic research in their research areas.

    In this first edition of the interview series, we spoke to KHK c:o/re alumna fellow Nathalia Lavigne.


    Nathalia Lavigne

    Nathalia Lavigne [she/her] works as an art researcher, writer, and curator. Her research interests involve topics such as social documentation and the circulation of images on social networks, cultural criticism, museum and media studies, and art and technology.

    KHK c:o/re: What do you think is the epistemological value of artistic research?

    Nathalia Lavigne: Artistic research and research-based art have become huge topics in the last two decades, and some art historians have even suggested that there is an overabundance of these terms in contemporary art exhibitions in recent years. However, several epistemological values come out of this approach. In general, they arise from combining knowledge from different fields, making that knowledge more visible outside of academia (in cultural spaces), and eventually contributing back to the respective fields. This interconnectivity of knowledge can be valuable to academia by bringing new perspectives on objectivity and methodologies and providing more space for speculation and a more enjoyable way to absorb research.

    What specific methodologies are used in artistic research? Can you give an example?

    I can talk about some methodologies developed by artists I have worked with, either as a curator or when writing about their work. One example is the project (De)composite Collections, developed by Giselle Beiguelman, Bruno Moreschi, and Bernardo Fontes for the ZKM’s intelligent.museum residency in 2020. They analyzed the collections of two Brazilian museums through AI reading systems. Using these datasets, which were algorithmically processed with GANs (generative adversarial networks), they questioned what other art histories might emerge from AI’s readings of the images and how these systems could contribute to understanding the gaze as a historical construct. Part of this methodology involved the development of a dataset organized by recurrent themes in Brazilian modernism, such as indigenous people, people of color, white people, and tropical nature. This work was also part of a project developed by students and faculty members at the University of São Paulo (USP), so it originated in an academic environment.

     Is there a specific aesthetic that characterizes artistic research? Like trends or movements?

    I would say that there are some characteristics that we can notice in the way these projects are formalized depending on the period. One that has been quite evident in recent years is the so-called “forensic aesthetics.” Popularized by artist and researcher Eyal Weizman, the term refers to a methodology used in art to explore the memory of places and objects as forms of testimony. This aesthetic has influenced artists working on topics such as repressed memory and collective amnesia in different contexts. One artist I have collaborated with as a curator is Rafael Pagatini, whose work addresses the memory of the Brazilian Civil-Military Dictatorship (1964–1985) in the present. In his process, he applies methodologies from both history and law, which affects how these memories are addressed in (or omitted from) institutional archives.

    Black and white photos of various objects are mounted as a mural on a house wall.
    The mural “este capítulo não foi concluído” (“This chapter is not yet closed”) by Rafael Pagatini; photo by Nathalia Lavigne
    A black-and-white print of a shoe on one side of a labelled sheet of paper.
    Closeup of the mural “este capítulo não foi concluído” (“This chapter is not yet closed”) by Rafael Pagatini; photo by Nathalia Lavigne

    But there are also artists who collaborated with scientists long before this kind of artistic production was labeled “artistic research.” For example, since the 1990s, Eduardo Kac has worked with bioengineers, geneticists, and, more recently, astronauts and space agencies to create his space art projects. In his case, the methodologies and the process of materialization vary greatly depending on each project.

    What are the problems and challenges of artistic research in an academic environment?

    In general, the challenges in an academic environment are that practice-led research in the arts is still not fully recognized as eligible for funding or career assessment procedures. But this has been changing, especially since the Vienna Declaration on Artistic Research co-written by different European associations in 2020. However, in countries from the Global South — or even in the US, where public funding for research is limited — this reality is very different. As I’ve heard from artists engaged in artistic research in Brazil, for example, the challenges are not so much in an academic environment but rather in the art system in general. As the Brazilian art system still largely revolves around the market due to the fragility of public institutions, research-based art finds it difficult to fit into a more commercial logic that prioritizes art objects that are less process-oriented.

    Event Announcement: Talk by Professor Caspar Hirschi

    On Wednesday, June 25, 2025, at 6:00 pm, Professor Caspar Hirschi from the University of St. Gallen will give a talk on “A child of the knowledge economy? On the history of the history of knowledge” (“Ein Kind der Wissensökonomie? Zur Geschichte der Wissensgeschichte”) in the KHK c:o/re lecture hall. The talk will be held in German.

    Everyone is cordially invited to attend!

    For further information and registration, please contact Sandra Dresia: dresia@histinst.rwth-aachen.de

    Expanding Cultures of Research and Governance in the Innovation Era

    A spiral staircase photographed through a pane of glass.

    NINA FRAHM

    My short-term fellowship kicked off with a KHK c:o/re workshop exploring the 2025 thematic field ‘expanded science and technology studies (STS)’. As Stefan Böschen explained at the beginning of the session, a key question guiding work in this field is how to study and make sense of increasingly hybrid forms of knowledge production in contemporary research and technological development. Expanding well beyond traditional cultures of science and engineering, research today reflects an imperative to integrate heterogeneous actors, diverse epistemic backgrounds and material practices, and a plurality of economic and political interests. On the one hand, this hybridization of research is driven by expectations for research to become more innovative – to produce new knowledge and technological tools at ever-greater speed and scale. On the other hand, it is a response to a crisis of scientific authority and narratives of technoscientific progress – producing corollary demands to democratize research through greater inclusion of society, stakeholders, and the wider public. During the workshop, we discussed different avenues through which STS can study, meaningfully engage in, and, perhaps, even contribute to a critique of this trend and its dynamics. Might expanded forms of scientific and technological production also require an expansion of analytical perspectives, methodological tools, types of collaboration, and vocabularies of critique in STS?


    Nina Frahm

    Nina Frahm is a postdoc at the Department for Digital Design and Information Studies, Aarhus University. Grounded in Science and Technology Studies (STS), she critically examines policies and governance approaches for innovation across countries, institutions, and technoscientific domains.

    During my lecture at KHK c:o/re a few hours later, I argued that STS research into changing cultures of technoscience should indeed expand to studying equally important transformations in the governance of science and technology – in fact, changes in one rarely occur without changes in the other. My research over the last couple of years has closely followed recent shifts in the ways public policies frame the governance of technoscience, and in particular, which frameworks and instruments have been put into practice to achieve a greater inclusion of society in innovation processes. In the past, science and technology policy emphasized a ‘social contract with science’ and a hands-off, hidden role of the state in the production and governance of technological innovation [1]. Today, however, we witness governments and public institutions openly embracing innovation imperatives and policies to support the development of innovative technologies beyond science alone. Yet policy promises to achieve social progress and wellbeing through investments in innovation also face the challenge of legitimizing public investments in high-risk, highly uncertain research and development. A key task for the ‘entrepreneurial state’ [2] is hence to produce visions of innovation as a res publica – a thing that can be produced and governed by society and according to its rules.

    A woman stands behind a lectern and gives a lecture.
    Nina Frahm during her lecture.

    A new spirit of technoscience

    The public turn to innovation in the 21st century is characterized by a ‘new spirit of technoscience’ [3] in which 20th century governance paradigms are turned upside down. Rather than following linear models of innovation, ‘techno-fix’ logics, and ideals for the self-governance of research and development, policies advance frames of technoscientific governance in which the public is given a key role to control innovation pathways and to fix potential problems for society upstream [4]. The new spirit advances a variety of frameworks to integrate the public in the development and governance of innovation, such as Bioethics, ‘Responsible Research and Innovation’, Open Science, or ‘Mission-oriented Innovation’. Different tools are mobilized by policy to put such frameworks into practice, ranging from ethics committees and expert advisory boards to public engagement exercises, citizen deliberations, or co-creation processes, to name but a few.

    While all these tools are geared toward ‘opening up’ technoscientific development and governance, each of them follows a particular idea of who the public is and why it should be included, how it can participate and be represented in technoscientific development and governance, through which means, and for what ends. As my research on the governance of emerging neurotechnologies [5] and AI [6] has shown, differences in governance frameworks and tools can be traced to culturally situated ideals of democracy that vary greatly across contexts. For instance, US approaches to innovation governance are marked by liberal-technocratic ideals of democracy as deliberations around new technologies are often delegated to experts from science, the law, or philosophy. Here, responsibility for good governance of innovation pathways is located within the individual researcher, engineer, and end-user. The EU, in turn, has experimented with more direct and deliberative forms of democracy in which the public participates directly in settling norms and principles and in which governance responsibility is collectivized along the entire innovation process, including public institutions, scientists, actors in R&D, entrepreneurs, as well as citizens. Whereas both approaches follow long-held scripts, or fictions [7], of democratic procedures and practices, they also considerably re-order the relationship between publics and technoscience, particularly when it comes to the distribution of authority to reason on emerging technologies and to take decisions for their governance.

    A woman stands behind a lectern and gives a lecture.
    During her lecture, Nina Frahm discusses the interplay of technoscience and democracy.

    Repertoires for expansion

    Expansions in the role of the state and public institutions in the production of innovation are, hence, closely related to expanding the governance of technoscience to new types of publics, forms of expertise, and practices rooted in situated imaginaries of democratic sovereignty and self-rule. To study this dynamic relationship between changing forms of technoscientific production and governance, STS offers the rich analytical language of ‘interactional co-production’ which has been tried and tested in numerous case studies that illustrate the complexity and diversity of accommodations between science, technology, and society [8]. This analytical approach allows us to symmetrically trace how changes in the ways knowledge is produced and technology developed – in changing epistemic and material order – simultaneously reflect changes in democratic order regarding the power to reason on and govern science and technology in the name of society.

    Such analysis encourages us to direct our critical eye beyond the discourses and practices of scientists and engineers to those places and settings which tend to be overlooked in public debates and appraisal of innovation, such as ethics advisory bodies or citizen panels. Next to interrogating their role in the co-production of socio-technical imaginaries, STS can expand its analysis to conceptualizing their importance in re-producing imaginaries of democracy and in re-configuring them for the innovation era. As Jan-Peter Voß has argued, “a lot of more work is required to create robust links between empirical studies of how public engagement is conceptualized and done in various ways and the basic presuppositions and tenets of political theories describing specific ways of how ‘society’ or ‘the people’ as a whole become articulated and how the public speaks” [9]. In a time when diagnoses of a ‘crisis of democracy’ are permeating the headlines and a shared sense of democratic values seems to be waning, such work is ever more important. But it can also feel uncomfortable, as it contributes to further pluralizing, rather than stabilizing, taken-for-granted understandings of democracy and democratic practice. Doing this work requires institutional spaces that are open for expansion – spaces like KHK c:o/re, where interdisciplinarity, curiosity, and intellectual courage are cultivated and cherished. In the workshop, during my lecture, and in different encounters with fellows, I have experienced how ‘expanded STS’ is not just a scholarly ambition but a mode of doing research and of thinking together that is already very much alive. And although every expansion has a limit, I look forward to further stretching it with colleagues at KHK c:o/re and beyond.


    References

    1 Pfotenhauer, S. M., and Juhl, J. (2017). Innovation and the Political State: Beyond the Myth of Technologies and Markets. In Critical Studies of Innovation: Alternative Approaches to the Pro-Innovation Bias, edited by Benoît Godin and Dominique Vinck, 68–94. Cheltenham: Edward Elgar. https://doi.org/10.4337/9781785367229.00012; Block, F. (2008). Swimming Against the Current: The Rise of a Hidden Developmental State in the United States. Politics & Society, 36(2), 169–206. https://doi.org/10.1177/003232920831873.

    2 Mazzucato, M. (2018). The entrepreneurial state. Penguin Books.

    3 Doezema, T. and Frahm, N. (2023). The New Spirit of Technoscience: Reformulating STS Critique and Engagement. Journal of Responsible Innovation, Vol. 10(1). doi: 10.1080/23299460.2023.2281112

    4 Frahm, N., Doezema, T., & Pfotenhauer, S. (2021). Fixing Technology with Society: The Coproduction of Democratic Deficits and Responsible Innovation at the OECD and the European Commission. Science, Technology, & Human Values, 47(1), 174–216. https://doi.org/10.1177/0162243921999100.

    5 Frahm, N. (2022) Soft Constitutions: Co-producing Neuro-Innovation and Society in the US, EU, and OECD. PhD Dissertation, Technical University Munich.   

    6 Frahm, N. and Schiølin, K. (2023). Toward an ‘Ever Closer Union’: The Making of AI-Ethics in the EU. STS Encounters, Vol. 15(2).

    7 Ezrahi, Y. (2012) Imagined Democracies: Necessary Political Fictions. Cambridge: Cambridge University Press.

    8 Jasanoff, S. (2005). Designs on Nature: Science and Democracy in Europe and the United States. Princeton & Oxford: Princeton University Press; Laurent, B. (2022). European Objects: The Troubled Dreams of Harmonization. Cambridge, MA: The MIT Press; Parthasarathy, S. (2017). Patent Politics: Life Forms, Markets, and the Public Interest in the United States and Europe. Chicago: University of Chicago Press. https://doi.org/10.7208/9780226437996.

    9 Voß, J.-P. (2019). Re-making the Modern Constitution: The Case for an Observatory on Public Engagement Practices. In: Simon, D., Kuhlmann, S., Stamm, I., Canzler, W. (eds.), Handbook of Science and Public Policy. Cheltenham: Edward Elgar.

    A Field Trip to the Wandering Mines: Strange Ecologies and the Green Work of Environmental Mitigation

    Four people posing outdoors with construction sites in the background.

    MATTHEW N. EISLER

    In the Rhenish mining-industrial complex, the past, present, and future of geological and human time intersect. Over 30 million years, intertwined processes of evolutionary biology and geo-biochemistry produced thick seams of brown coal in the region now known as the Cologne Bay (Kölner Bucht); these seams began to be intensively mined in open pits from the late eighteenth century. As elsewhere, coal-based industrial enterprises enabled asymmetrical social development and came with environmental costs that human beings sought to mitigate with increasingly elaborate infrastructures of waste management. The resulting hybrid ecosystems exist in a state of fragile balance that requires constant effort to maintain, as a mid-May field trip to the Garzweiler and Hambach mines illustrated.


    Matthew N. Eisler

    c:o/re Fellow 01/25 – 12/25

    Matthew N. Eisler is a lecturer in the Department of Humanities at the University of Strathclyde. He researches how ideology and policy inform practices of energy and materials conversion and shape social relations and environments.

    I am a historian of clean and green technology and have become interested in the historical sociology of the “green work” of environmental mitigation, a project I am developing as a KHK fellow. I was curious to hear the perspectives of my companions on this question. The field trip was led by the retired hydrogeologist H. Georg Meiners, joined by Lars M. Blank, head of RWTH Aachen University’s Institute of Applied Microbiology, and Victor de Lorenzo, RWTH Kármán Fellow and research professor at the Spanish National Research Council (CSIC), where he heads the Laboratory of Environmental Synthetic Biology at the National Center for Biotechnology. Georg spent much of his career investigating how the lignite mines affect local water quantity and quality, while Lars and Victor research microorganisms capable of metabolizing industrial waste. They were eager to get out of the laboratory and lecture hall and into the field to see just what the microbial world is up against.

    Group photo from the field trip to the Garzweiler and Hambach mines; from left to right: H. Georg Meiners, Lars M. Blank, Victor de Lorenzo, and Matthew N. Eisler

    What we found was a vast “organic machine,” a term coined by the historian Richard White to connote large-scale industrial infrastructure in its ecological context. If the history of environmental mitigation can be characterized by a single phenomenon, it is ‘displacement:’ solve one problem and another pops up elsewhere unexpectedly. In the Rhenish mining complex (Rheinisches Braunkohlerevier), we witness a cascading series of displacements. A key set of problems issues from the high sulphur content of lignite. In the 1970s and 1980s, sulphur dioxide and nitrogen oxide emissions from Germany’s lignite-burning industry caused acid rain as far away as Sweden. Fitting power plants with scrubber technology fixed that problem but did nothing to mitigate emissions of climate-changing carbon dioxide. Moreover, sulphuric mine tailings can acidify soil and ground water. To neutralize this form of pollution, miners mix lime into the spoil.

    Managing such problems is complicated by the vast scale of open cast mining. Garzweiler is an artificial canyon, and canyons can shape airflow and create their own weather systems. Georg had warned us to bundle up because it would be windy, and at Garzweiler, that wind is harvested as a resource. Energy planners have ringed the mine-canyon with dozens of giant wind turbines, exploiting an unintended ecological consequence of this industrial enterprise.

    These winds cause numerous accidents on nearby motorways and also stir up particulates, a less-discussed problem of open cast mining, observes Lars. At Garzweiler’s eastern rim, where overburden is slowly processed into massive lime-laced mesas by giant earthmovers, a bouquet of hoses sprays thousands of liters of water a minute in an effort to tamp down dust. But it is impossible to water all of the 40 square kilometers of the mine’s operating area. Scrubby vegetation growing atop the reclamation mesas fixes some of the particulate matter in place, and I wonder aloud if this brush belt will grow into a green lung. Georg responds that this proto-forest is only an interim measure that will disappear according to the master mitigation plan of the terraformers of the Cologne Bay. From the 2030s on, the mine-canyons will be decommissioned and filled with water diverted from the Rhine, creating deep lakes in a project that could take until the end of the century and has many unknowns.

    A less visible but equally problematic effect of the mine is on the realm of subterranean water. Garzweiler is surrounded by a vast network of wells, pipes, and pumps working constantly to lower the water table to enable coal excavation. This is a delicate operation. If the pumped water is not properly reinfiltrated back into the ground, says Georg, local forests, streams, lakes, and the underground springs for which the Aachen area is famous could be damaged or even destroyed.

    The Cologne Bay is also a rich and important agricultural region that has become contested terrain in an unequal clash between industrialism and eco-activism. For a century, the mining enterprises followed the coal seams, and their historical progress, as depicted on topographic maps, resembles channels carved by giant coal-hungry worms munching their way through the landscape. These “wandering mines” have destroyed a number of farm communities in their path.

    We visit the village of Keyenberg, a flashpoint in high-profile regional demonstrations against mine operator RWE in 2021 that became a monument to the uneven pace of environmental progress. Keyenberg is a ghost village. From the mid-2010s, it was slated to be engulfed by the Garzweiler mine and began to depopulate, but when energy planners decided to phase out lignite, the abandoned village was left intact. Today, the place has the uncanny feel of a Chernobyl-like exclusion zone. A sign on one house reads “zu verschenken,” or in English, “to give away.” Such houses can be obtained for free, at the price of fixing them up and living near a windy and dusty mine-canyon. Georg says there is talk of housing war refugees here, and amidst the boarded-up buildings there are signs of life. One person, perched on a scaffold, repairs a house that boasts a well-tended hydrangea garden, activities suggestive of yet another form of green work heralding Keyenberg’s possible revival, or reincarnation.

    The ongoing management of some of the wandering mine’s wastes, the conversion of its windy microclimate into clean energy, and the gradual reclamation of a portion of its disrupted hinterlands pose the conundrum of how to interpret the co-construction of such awesome desolations and their ingenious eco-infrastructures of life support. Contemporary environmental discourse imposes a dualistic moral-ethical framework of good (green) and bad (non-green) behaviors against which we are supposed to judge ourselves and others, declare a position as optimist or pessimist, and offer normative visions in a calculus that, as some argue, has centered generalized global processes over diverse local experiences of environments and environmental despoliation.

    The case of the Rhenish coal belt calls attention to a particular set of conflicts, contradictions, and puzzles, and to what the environmentalist Val Plumwood called “shadow spaces,” occluded from our subjective ecological visions, all of which invite further investigation. Human history would seem to vitiate the prospect of the circumspect, do-no-harm environmental activism advocated by the philosopher Arne Naess. In the imagined perspective of geological time, human beings might be perceived as acting in a near-simultaneous spasm of furious environment-altering activity.

    It seems to me that the path to the kind of wise interventions Naess had in mind starts in gaining awareness of the perversities and paradoxes of myriad local projects of environmental mitigation. Understanding how human beings build organic machines like Garzweiler and its environs, learn how these strange ecologies operate as amalgams of human and natural agency, and live with the consequences might be a modest but necessary prelude to deciding the next set of mitigating moves.

    I thank Georg Meiners for hosting and guiding this event and reviewing a draft of this essay, and Victor de Lorenzo and Lars Blank for enriching the experience with their insights and companionship.