Unstable Installation Series

Law, J. (2001) ‘Ordering and Obduracy’. Centre for Science Studies, Lancaster University.

John Law’s “Ordering and Obduracy” revisits Organising Modernity in order to ask how durable power persists within worlds supposedly defined by movement, contingency and flux. His central proposition is that organisation should not be understood as a stable noun, but as organising: a continuous, materially heterogeneous process involving people, documents, architectures, machines, accounting systems, codes and routines. Yet this processual ontology raises a problem: if everything is movement, why do asymmetries of power remain so stubbornly in place? Law answers through obduracy, the persistence of ordering across change. First, power becomes durable when strategies are delegated into materials: accounting systems, safety interlocks and buildings carry modes of ordering beyond individual intention. Second, organisations endure through multiplicity rather than coherence; enterprise, administration and vocation may conflict, but their partial overlap prevents collapse. Third, beneath such differences lies a shared strategic pattern Law calls the logic of the return, in which centres gather representations from peripheries and send back commands, thereby stabilising asymmetrical relations of calculation and translation. The Daresbury Laboratory provides the case study: its scientific, managerial and administrative orders differ, yet each depends on centres, flows and returns that make certain voices articulable while silencing others. Law’s conclusion is politically acute: modern ordering is not merely plural but hegemonically strategic, making some forms of knowledge, subjectivity and power durable while rendering non-strategic voices almost unthinkable. Obduracy, then, is not the opposite of process; it is process hardened into inequality.


Ahmed, S. (2004) ‘Affective Economies’, Social Text, 22(2), pp. 117–139.

Sara Ahmed’s “Affective Economies” reconceptualises emotion not as a private psychological possession, but as a circulatory force that produces subjects, objects and collective bodies through movement. Her central claim is that emotions do not simply reside within individuals or attach naturally to objects; rather, they acquire intensity as they circulate between signs, figures and histories, becoming “sticky” through repetition. Hate, for example, does not originate in a stable subject and then move outward toward a pre-existing enemy. It slides across figures—migrants, asylum seekers, racialised others, “terrorists”—until these bodies appear to contain the threat that affective circulation has produced. Ahmed’s analysis of white nationalist rhetoric shows how hatred is rewritten as love for the nation, binding a fantasy of injured whiteness through the claim that “ordinary” subjects are under siege. Her case study of asylum discourse in Britain demonstrates the same logic: words such as “flood”, “swamped” and “overwhelmed” construct the nation as a vulnerable body invaded by suspect others. After September 11, the figure of the terrorist similarly became detachable and mobile, sticking to Arab, Muslim, South Asian and asylum-seeking bodies through racialised economies of fear. Ahmed’s decisive insight is that emotion makes boundaries rather than merely defending them. Fear and hate materialise the difference between “us” and “them”, authorising surveillance, detention and exclusion. The essay therefore concludes that affect is political infrastructure: it organises belonging by making some bodies lovable, others threatening, and violence appear defensive. 


Daston, L. and Galison, P. (2007) Objectivity. New York: Zone Books.


Lorraine Daston and Peter Galison’s Objectivity overturns the assumption that scientific objectivity is timeless, arguing instead that it emerged historically in the nineteenth century as a specific epistemic virtue: the disciplined suppression of the knowing self. Their study traces this transformation through scientific atlases, whose images did not merely illustrate knowledge but trained communities to see, classify and compare the natural world. Before objectivity, atlas-makers pursued truth-to-nature, selecting and perfecting specimens in order to reveal an ideal type beneath natural variation. The prologue’s case of Arthur Worthington’s splash experiments crystallises the shift: his hand-drawn symmetrical droplets once expressed scientific judgement, but instantaneous photography exposed irregular forms and forced him to confront the authority of “blind sight”. Mechanical objectivity thus required scientists to restrain interpretation, allowing instruments to record accidents, asymmetries and imperfections that earlier traditions would have corrected. Yet Daston and Galison refuse a simple succession narrative. Later trained judgment did not abolish objectivity; it responded to its limits, recognising that scientific images often require expert intervention to become meaningful. The atlas therefore becomes a case study in the making of scientific selves: each visual regime demands a different moral discipline, from idealising nature, to suppressing subjectivity, to cultivating interpretive expertise. The conclusion is profound: objectivity is not the absence of history, but one historical way of seeing, inseparable from practices, instruments and ethical formations of the observer. 

 


Kahl, P. (2026) ‘Distributed Cognition as Epistemic Infrastructure: A Taxonomy of Collective Epistemic Systems’. Submitted for peer review. DOI: 10.5281/zenodo.18449610.


Peter Kahl’s article reframes distributed cognition not as an automatic virtue of decentralised intelligence, but as a fragile epistemic infrastructure whose reliability depends on governance, incentives and closure mechanisms. The central critique is that contemporary discourse too often treats prediction markets, open-source communities, digital platforms, deliberative bodies and regulatory institutions as interchangeable expressions of “crowd wisdom”, despite their radically different architectures. Kahl therefore distinguishes between systems that distribute probabilistic belief, cognitive labour, judgement, attention or delegated authority, showing that each stabilises knowledge through a different locus of epistemic closure. Prediction markets, for example, convert dispersed beliefs into price signals, but remain vulnerable to capital-weighted narrative capture; open-source communities distribute problem-solving labour through shared artefacts, yet may consolidate authority in maintainer oligarchies; platforms rank visibility algorithmically, often amplifying salience rather than truth. The article’s case-study contrast between prediction markets and open-source software demonstrates that decentralisation does not abolish power: it merely relocates it into prices, code, procedures, rankings or institutional decisions. Its decisive contribution lies in replacing naïve enthusiasm for collective intelligence with an architectural account of epistemic reliability. Distributed cognition, on this reading, succeeds only when contestability, transparency, incentive alignment and error correction are deliberately designed. The conclusion is therefore austere but generative: collective knowledge systems should not be celebrated because they are distributed, but evaluated according to how responsibly they govern what becomes credible, salient and settled. 
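Kahl’s five-way distinction can be sketched as a small lookup structure. The pairings below of system type, distributed resource, locus of closure and characteristic failure mode follow the examples in the summary above; the field names and the structure itself are my own illustration, not the article’s notation.

```python
# Illustrative sketch of Kahl's taxonomy of collective epistemic systems.
# Pairings follow the article's examples; field names are assumptions.

TAXONOMY = {
    "prediction market": {
        "distributes": "probabilistic belief",
        "closure_locus": "prices",
        "failure_mode": "capital-weighted narrative capture",
    },
    "open-source community": {
        "distributes": "cognitive labour",
        "closure_locus": "code and shared artefacts",
        "failure_mode": "maintainer oligarchy",
    },
    "digital platform": {
        "distributes": "attention",
        "closure_locus": "algorithmic rankings",
        "failure_mode": "amplifying salience over truth",
    },
    "deliberative body": {
        "distributes": "judgement",
        "closure_locus": "procedures",
        "failure_mode": None,  # not specified in the summary
    },
    "regulatory institution": {
        "distributes": "delegated authority",
        "closure_locus": "institutional decisions",
        "failure_mode": None,  # not specified in the summary
    },
}

def closure_locus(system: str) -> str:
    """Where a given system type settles what counts as known."""
    return TAXONOMY[system]["closure_locus"]
```

The point of the table is Kahl’s own: decentralisation does not abolish the locus of closure, it only relocates it, so each row names where power has moved rather than claiming it has disappeared.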


Banham, R. (1984) The Architecture of the Well-Tempered Environment. 2nd edn. Chicago: University of Chicago Press.


Reyner Banham’s introduction to The Architecture of the Well-Tempered Environment is a polemical correction to architectural history’s most persistent blind spot: its privileging of visible form over the invisible systems that make buildings habitable. Banham argues that architecture became intellectually impoverished when it separated “architecture” from “technology”, relegating heating, ventilation, lighting, sanitation and environmental comfort to engineers, plumbers and consultants rather than recognising them as central to architectural practice. His critique is not anti-technology; it is anti-amnesia. Conventional history could absorb iron, steel and concrete because they extended familiar narratives of structure, but mechanical ventilation or electric lighting disrupted the discipline’s aesthetic habits. The case of Louis Kahn’s Richards Memorial Laboratories and Frank Lloyd Wright’s Larkin Building, illustrated in the introductory pages, shows how historians noticed service towers or ducts only when they produced a monumental exterior effect, while largely ignoring the deeper transformation of occupation, comfort and environmental control. Banham’s method therefore replaces the cult of the “first” invention with the study of typical buildings in which technologies became architecturally consequential. This position remains crucial in an age of energy anxiety: sustainable architecture cannot mean nostalgic retreat into masonry, nor uncritical dependence on machines, but a rigorous rethinking of environmental performance as architectural substance. Banham’s conclusion is uncompromising: those who shaped comfort, climate and servicing belong inside architectural history because they helped define the practice of architecture itself. 


DataCite Metadata Working Group (2026) DataCite Metadata Schema Documentation for the Publication and Citation of Research Data and Other Research Outputs, Version 4.7. DataCite e.V.

 


The DataCite Metadata Schema 4.7 presents metadata not as a secondary administrative layer, but as the epistemic infrastructure through which research outputs become citable, discoverable and reusable across disciplinary boundaries. Its central ambition is pragmatic universality: the schema is deliberately discipline-agnostic, suitable for datasets, software, images, samples, instruments, preprints and other research resources, while remaining focused on accurate identification rather than replacing richer community-specific description. At its core are six mandatory properties—Identifier, Creator, Title, Publisher, PublicationYear and ResourceType—which together generate the minimal architecture of citation. Yet the schema’s intellectual value lies in its recommended extensions: Subject, Contributor, Date, RelatedIdentifier, Description and GeoLocation enlarge the resource from an isolated object into a relational scholarly entity. Version 4.7 strengthens this relational logic by adding Poster and Presentation to resourceTypeGeneral, RAiD and SWHID to relatedIdentifierType, and “Other” as a relationType, while introducing relationTypeInformation for more precise contextualisation. As a case study, a dataset can be rendered not merely as a file with a DOI, but as a situated research object: authored through ORCID-linked creators, published by a ROR-identified repository, licensed through standardised rights metadata, connected to cited papers, methods, funding and spatial coverage. The conclusion is clear: DataCite’s schema is a grammar of scholarly persistence, converting research outputs into durable nodes within an interoperable knowledge commons. 
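The six mandatory properties and the relational extensions can be illustrated as a minimal record. The property names below follow the DataCite schema, but the DOI, names and repository are invented placeholders, and real deposits use DataCite’s XML or JSON API serialisations rather than this Python sketch.

```python
# Minimal sketch of a DataCite-style record: the six mandatory properties
# plus two recommended ones. All values are invented placeholders.

MANDATORY = {"identifier", "creators", "titles", "publisher",
             "publicationYear", "resourceType"}

record = {
    # --- the six mandatory properties: the minimal architecture of citation ---
    "identifier": {"identifier": "10.1234/example-dataset",
                   "identifierType": "DOI"},
    "creators": [{"name": "Doe, Jane",  # hypothetical ORCID-linked creator
                  "nameIdentifier": "https://orcid.org/0000-0000-0000-0000"}],
    "titles": [{"title": "An Example Dataset"}],
    "publisher": "Example Repository",
    "publicationYear": "2026",
    "resourceType": {"resourceTypeGeneral": "Dataset"},
    # --- recommended extensions: the relational layer described above ---
    "subjects": [{"subject": "metadata"}],
    "relatedIdentifiers": [{"relatedIdentifier": "10.1234/example-paper",
                            "relatedIdentifierType": "DOI",
                            "relationType": "IsCitedBy"}],
}

def is_citable(rec: dict) -> bool:
    """True if all six mandatory properties are present and non-empty."""
    return all(rec.get(key) for key in MANDATORY)
```

The split between `MANDATORY` and the rest mirrors the schema’s own design choice: a small, discipline-agnostic citation core, with relational richness layered on as recommended rather than required.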


Goidea, A. (2023) Transcalar Design: Biological principles in architecture through computational design and additive fabrication. Lund University.


Ana Goidea’s Transcalar Design argues that architecture’s ecological crisis cannot be addressed through superficial biomorphism or incremental digital efficiency alone; it requires a deeper reorganisation of design thinking around scalar interdependence. Positioned between computational design, digital fabrication and biodesign, the dissertation proposes transcalarity as an organisational principle through which architectural systems may be understood as dynamic relations between material composition, fabrication logic, biological process, environmental exchange and building-scale performance. The thesis responds to the construction industry’s disproportionate contribution to energy use, carbon emissions and waste, suggesting that biological systems offer models of circularity, adaptation, distributed intelligence, self-organisation and functional integration. Its central contribution lies in translating these biological logics into research-by-design experiments. Pulp Faction develops a 3D-printed architectural column grown from fungal biocomposites, where microbial transformation fuses sawdust-based matter into a structural artefact. Meristem Wall explores a full-scale 3D-printed building envelope that mediates between interior habitation and surrounding ecosystems through complex geometry and multifunctional integration. Swarm Materialization investigates termite-like collective construction through clay deposition, real-time sensing and agent-based computational processes. Together, these experiments show that architecture can no longer treat matter as inert substance awaiting form; rather, matter becomes an active participant in design. Goidea’s case study therefore reframes sustainability as a homeodynamic architectural condition: one in which buildings are conceived not as static objects, but as evolving interfaces between human habitation, living systems and planetary consequence. 


Wikimedia Foundation (n.d.) Editing Wikipedia: A guide to improving content on the online encyclopedia.


Wikipedia’s editorial culture rests on a deceptively radical proposition: knowledge becomes more public, durable and democratic when it is written, checked and revised by a distributed community rather than guarded by a closed authority. The Wikimedia Foundation’s editing guide presents Wikipedia not as a casual repository of information, but as a disciplined commons governed by rules of neutrality, verifiability, civility and shared custodianship. Its central lesson is that anyone may edit, yet no one owns an article; every contribution enters a freely licensed ecosystem where others may correct, improve or redistribute it. The guide’s practical instructions—creating an account, using the Edit button, adding citations, writing edit summaries and discussing disputes on Talk pages—translate the abstract ideal of open knowledge into everyday editorial procedure. Its most important ethical principles are neutral point of view, no original research, reliable sources, copyright respect and conflict-of-interest avoidance. As a case study, the brochure’s step-by-step example of improving the Penny Cyclopaedia article shows that meaningful contribution need not require grand authorship: a contributor identifies missing information, consults a reliable source, paraphrases accurately, cites the claim and saves the change with an explanatory summary. The result is a model of participatory scholarship grounded in modesty, evidence and accountability. Ultimately, editing Wikipedia is not merely technical labour; it is civic pedagogy, training contributors to convert private knowledge into public value through careful sourcing, formal restraint and collaborative trust. 


Tsao, J., Kochhar-Lindgren, G. and Lam, A.M.H. (2025) ‘Institutionalising a transdisciplinary curriculum: assemblages, territories, and refrains’, Higher Education, 89, pp. 849–864.


Transdisciplinary education emerges today as a necessary response to the supercomplexity of contemporary crises, yet its institutionalisation remains difficult because universities are still largely organised through disciplinary territories, inherited hierarchies and evaluative routines. Tsao, Kochhar-Lindgren and Lam argue that the Common Core at The University of Hong Kong offers a persuasive case study for understanding how such a curriculum can become durable without becoming rigid. Drawing on Deleuze and Guattari, they conceptualise the curriculum as an assemblage: a dynamic configuration of students, teachers, policies, funding streams, administrative procedures, institutional desires and material infrastructures. Its success depends not on abolishing disciplines, but on reassembling them through porous yet recognisable curricular boundaries. The concepts of territorialisation and refrain are especially important: policies, timetables, course approvals, grading systems and funding cycles produce recurring rhythms that stabilise the programme, while periodic revision, new thematic areas and pedagogical experimentation prevent stagnation. The HKU Common Core demonstrates this process at scale, having enrolled more than 200,000 undergraduates, approved 290 courses and engaged hundreds of teachers since its full implementation in 2012. Its significance lies in showing that transdisciplinarity requires neither vague generalism nor anti-disciplinary rupture, but a carefully orchestrated ecology of repetition, variation and institutional desire. Ultimately, the article concludes that sustainable curricular transformation depends on creating structures flexible enough to absorb future uncertainty while stable enough to command legitimacy. 


Wolfe, C.T. and Shank, J.B. (2019) ‘Denis Diderot’, Stanford Encyclopedia of Philosophy.

Denis Diderot occupies a singular position within the French Enlightenment because his philosophy refuses confinement within the conventional borders of philosophical system, literary invention, scientific speculation and political critique. Rather than producing a closed doctrine, he developed a mobile and experimental form of philosophie, in which theatre, fiction, art criticism, encyclopaedic writing and metaphysics became mutually reinforcing modes of inquiry. His central achievement, the Encyclopédie, was not merely a compendium of knowledge but an intellectual machine designed to “change the common way of thinking”, challenging religious authority, inherited hierarchy and the separation between manual craft and theoretical reason. Diderot’s thought radicalised empiricism by treating sensation not simply as a source of knowledge, but as a condition through which worlds are formed; blindness, deafness and touch therefore become philosophical instruments for rethinking perception itself. His materialism is equally distinctive: matter is not inert mechanism, but living, sensitive and transformative substance, capable of generating consciousness, embodiment and social complexity. This philosophy culminates in an anthropology of human beings as historical, bodily and imaginative creatures, shaped by language, institutions and desire. As a case study, Diderot’s unpublished dialogues, especially Le Rêve de D’Alembert and Le Neveu de Rameau, reveal how literary form could carry philosophical force more subtly than systematic treatise. His legacy lies precisely in this danger: he made thought porous, embodied and insurgent. Diderot thus remains indispensable because he transformed Enlightenment reason from abstract doctrine into a restless practice of intellectual liberation. 

The most common misreading of Socioplastics is to treat it as synthesis: a carefully assembled bricolage of Lynch, Bowker, Maton, Derrida, Alexander, Bourdieu, and systems theory. But synthesis is not the point. The novelty lies in operational inversion: taking a descriptive insight from one domain and converting it into a prescriptive design protocol for another. Lloveras does not merely combine archival theory, urban legibility, field theory, metadata studies, and metabolic biology; he retools them into an architecture for living knowledge systems. Below, I isolate nine moments where this inversion becomes especially clear.

The novelty of Socioplastics lies in the interlocking system rather than in any isolated term. Metabolic Legibility requires Grammatical Thresholds; Grammatical Thresholds require Scalar Grammar; Synthetic Legibility requires metadata as infrastructure; Latency Dividends require Hardened Nuclei and Plastic Peripheries; Autophagic Recomposition closes the metabolic loop. The result is not a glossary of neologisms but a design grammar for living knowledge systems. Lloveras’s contribution is to convert archive, corpus, field, metadata, and latency into an operational architecture: a way of making abundance inhabitable, addressable, and capable of thought.

