The emergence of a new field is rarely declared. It is inferred from accumulation, repetition, and relation. Before a discourse acquires a name, a journal, a conference series, or a departmental line, it exists as a distributed set of practices, texts, and affinities that lack a common address. The problem is not production—fields often produce abundantly in their early stages—but legibility. How does an emerging constellation of work become findable, citable, and traversable by others who were not present at its origin? The answer, increasingly, is infrastructure. And among the available infrastructures, none is more peculiar, more powerful, or more misunderstood than the wiki, and specifically its semantic extension: Wikidata.
The mistake is to treat Wikidata as a repository to fill. Its logic is not archival but relational. A field does not become real in Wikidata by volume—by importing every preprint, every blog post, every internal note—but by coherent anchoring. The graph rewards pattern, not abundance. To fix a nascent field in this environment, one must first define its minimal architecture: an identifiable framework item, a clearly attributed author, and an institutional or contextual anchor. These three elements—field, author, institution—form a triangle that stabilises meaning. Without this triad, entries remain isolated data points; with it, they begin to behave as a system. The field is not declared; it is inferred by the graph through the recurrence of relations.
This is the first lesson for any new field approaching a wiki environment: do not import your whole archive. Import your skeleton.
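To make the skeleton concrete, the triad can be written down as a handful of typed statements before anything else is imported. The sketch below reuses the Q-identifiers listed at the end of this piece (Q139530224 for Socioplastics, Q139532324 for Anto Lloveras, Q139504058 for LAPIEZA-LAB) together with standard Wikidata properties; the particular properties chosen here (field of work, affiliation) are one reasonable mapping among several, not a prescription.

```python
# A minimal "skeleton": the field / author / institution triangle expressed as
# (subject, property, value) statements. Q-IDs come from the links at the end
# of this text; property IDs are standard Wikidata properties.

FIELD = "Q139530224"        # Socioplastics (the framework item)
AUTHOR = "Q139532324"       # Anto Lloveras (the attributed author)
INSTITUTION = "Q139504058"  # LAPIEZA-LAB (the institutional anchor)

skeleton = [
    # The author is a person and works in the field.
    (AUTHOR, "P31", "Q5"),           # instance of: human
    (AUTHOR, "P101", FIELD),         # field of work: the framework item
    # The author is tied to the institutional anchor.
    (AUTHOR, "P1416", INSTITUTION),  # affiliation: LAPIEZA-LAB
    # The institution points back at the field, closing the triangle.
    (INSTITUTION, "P101", FIELD),    # field of work: the framework item
]

for subject, prop, value in skeleton:
    print(f"{subject}  {prop}  {value}")
```

Three items and four statements are enough for the triangle to exist as a pattern; everything imported later hangs from it.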
Selective materialisation as epistemic design
The second step follows directly from the first. A field becomes legible under constraint when a few elements repeat a recognisable pattern. In practice, this means choosing a compact set of works (preferably those with persistent identifiers such as DOIs, ORCIDs, or ISBNs), a limited number of structural units (books, series, or numbered nodes), and a handful of internal concepts that serve as semantic anchors. Each item must be described using shared properties—instance of, author, main subject, part of—so that the graph can recognise recurrence. Through repetition, relations accumulate density, and density produces visibility.
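One way to keep that constraint operational is to hold the selected works in a small, explicit table and generate the corresponding statements from it, for instance as QuickStatements commands. The snippet below is a sketch under the assumption that each work becomes a new item; the titles and the DOI shown are placeholders, and the Q-identifiers reuse those listed at the end of this piece.

```python
# Selective materialisation: a compact, typed set of works rather than a full
# archive. Each entry maps onto the same shared properties so that the graph
# can recognise recurrence. Titles and the DOI are placeholders.

AUTHOR = "Q139532324"   # Anto Lloveras
FIELD = "Q139530224"    # Socioplastics

works = [
    {"label": "Example book in the field", "class": "Q571", "doi": None},
    {"label": "Example article in the field", "class": "Q13442814",
     "doi": "10.0000/placeholder.1"},
]

def to_quickstatements(work):
    """Emit QuickStatements v1 commands that create one item for a work."""
    lines = [
        "CREATE",
        f'LAST\tLen\t"{work["label"]}"',   # English label
        f"LAST\tP31\t{work['class']}",     # instance of: book / scholarly article
        f"LAST\tP50\t{AUTHOR}",            # author
        f"LAST\tP921\t{FIELD}",            # main subject
        # Once a series item exists, a P361 (part of) statement would go here.
    ]
    if work["doi"]:
        lines.append(f'LAST\tP356\t"{work["doi"]}"')  # DOI
    return "\n".join(lines)

print("\n\n".join(to_quickstatements(w) for w in works))
```

The table, not the tooling, is the design decision: whatever appears in it is what the graph will be asked to recognise.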
This is the opposite of naive crowdsourcing. It is epistemic design. The emerging field decides what minimal set of entities, properly typed and linked, will allow the wiki to infer its existence. The field does not wait for external validation; it constructs the conditions under which validation becomes possible. In this sense, selective materialisation is a form of infrastructural sovereignty. It does not ask permission to exist. It builds the addressable traces through which existence can be discovered.
Wikidata is particularly suited to this task because its data model is not a tree but a graph. One can assert that a book is instance of scholarly work, that its author is a particular researcher, that its main subject is the nascent field, and that it is part of a series or tome. Each assertion is a weak link, but repeated across several works, across several concepts, across several publications, those weak links harden into a recognisable cluster. The field becomes discoverable not because someone wrote a manifesto, but because the graph’s own query mechanisms—SPARQL queries against the Wikidata Query Service, the entity search, the linked-data interfaces—begin to return coherent results.
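The point can be tested directly. The sketch below asks the Wikidata Query Service for everything whose main subject is the field item, together with any stated authors; the endpoint, prefixes, and properties are the standard public ones, and the contact address in the User-Agent header is a placeholder.

```python
import requests

# Query the Wikidata Query Service (SPARQL endpoint) for all items whose
# "main subject" (P921) is the field item, together with their authors.
ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT ?work ?workLabel ?authorLabel WHERE {
  ?work wdt:P921 wd:Q139530224 .          # main subject: Socioplastics
  OPTIONAL { ?work wdt:P50 ?author . }    # author, if stated
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "field-anchoring-sketch/0.1 (placeholder@example.org)"},
)
response.raise_for_status()

for row in response.json()["results"]["bindings"]:
    work = row["workLabel"]["value"]
    author = row.get("authorLabel", {}).get("value", "(no author stated)")
    print(f"{work} - {author}")
```

The query itself is trivial; what matters is that it returns a coherent cluster only once the statements described above have accumulated.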
The wiki as translation layer, not the field itself
The third and most difficult lesson is that the wiki is a translation layer, not the field itself. A mature epistemic system—say, Socioplastics with its twenty books, two thousand nodes, two cores, and distributed lexicon—is far richer, more precise, and more autonomous than any Wikidata representation could ever be. Its internal relations are topological, not merely taxonomic; its operators carry torsional force; its recurrence is metabolic, not statistical. None of that translates directly. The wiki reduces and standardises. It flattens curvature into categories, compresses gradients into properties, and replaces self-causing recursion with static links.
This reduction is not a bug. It is the condition of interoperability. The wiki allows external systems—search engines, citation indexes, library catalogues, recommendation algorithms—to locate, index, and connect what you have built. It does not understand your field. It does not need to. It only needs to find it. The goal is not completeness but legibility under constraint. A field is fixed when it can be found, traversed, and cited by others without prior knowledge of its internal logic. At that point, it ceases to be private infrastructure and becomes a shared epistemic object.
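Findability in this sense can also be checked from the outside. A minimal sketch, assuming only the standard wbsearchentities endpoint, asks whether the field item is returned from its label alone, with no knowledge of its internal logic:

```python
import requests

# Ask Wikidata's public search API whether the field is findable by label alone.
API = "https://www.wikidata.org/w/api.php"

response = requests.get(
    API,
    params={
        "action": "wbsearchentities",
        "search": "Socioplastics",
        "language": "en",
        "format": "json",
    },
    headers={"User-Agent": "field-anchoring-sketch/0.1 (placeholder@example.org)"},
)
response.raise_for_status()

for hit in response.json().get("search", []):
    # Each hit carries the item ID, its label and, if present, a short description.
    print(hit["id"], hit["label"], hit.get("description", ""))
```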
The ethics of infrastructural patience
None of this happens quickly. A new field that follows the logic of selective materialisation must accept a temporal asymmetry: the work of anchoring is small, but the work of waiting for the graph to accumulate density is slow. Recurrence takes time. Inference takes time. The wiki does not celebrate novelty; it rewards persistence. This is why the method described here is incompatible with the accelerated temporality of platform capitalism. You cannot hack Wikidata. You can only build it patiently, one well-typed statement at a time, and let the relations thicken.
There is an ethics in this. It is the ethics of LAPIEZA since 2009: persistence before recognition, infrastructure before applause, repetition before consecration. The wiki, properly used, is not a shortcut to visibility. It is a long-term support system for fields that intend to outlast the attention cycles that ignore them.
Conclusion
New fields do not need more data. They need better anchors. The wiki is not a warehouse; it is a scaffold. A field becomes real in Wikidata when it accepts that its internal richness will be translated into a sparse, standardised graph—and when it designs that graph with as much care as it designs its own concepts. The triad of field, author, and institution; the discipline of selective materialisation; the acceptance of reduction as the price of interoperability: these are the protocols of epistemic emergence in the age of linked data. A field that masters them does not wait for permission to persist. It builds the conditions under which persistence becomes inevitable.
https://www.wikidata.org/wiki/Q139504058 LAPIEZA-LAB
https://www.wikidata.org/wiki/Q139530224 Socioplastics
https://www.wikidata.org/wiki/Q139532324 Anto Lloveras