Written December 27, 2025; updated January 3, 2026 for Michael Punt; updated January 24, 2026 for Gary Hall
The COLNLI Camel: Anticipation, Demography, and Meaning in an Age of AI
My colleague Gary Hall drives the camel not toward the oasis of prestige, but toward the caravanserai of reuse. He is less a curator of fixed cultural objects than a guide for de-liberalizing the routes, licenses, and institutions through which culture travels. And if the AI-noosphere, COLNLI, names coexistence rather than hierarchy, then Hall’s recent thinking offers a concrete ethic for that coexistence: redistribute the resources, refuse the genre-labels that make practices collectible, and build the kind of stopping places where hybrid intelligences can argue, rework, and share without being domesticated by the bourgeois art world.
By the way, COLNLI stands for 'Collaboration of Local and Non-Local Intelligences'.
And notice that caravanserai has AI at the end, on the butt of the camel. Ancient peoples in the Middle East knew what was coming.
'Caravanserai turns into caravanserAI: AI arrives as a trailing suffix, on the camel’s butt, reminding us it is a weather system of memory behind us, not a prophet in front of us.' AI retrospects; it does not anticipate.
Abstract
Rather than framing current shifts as a simple problem of 'aging', we argue that societies are undergoing a deeper reconfiguration of how memory, stability, ambition, and interpretation are distributed across human generations.
The camel, positioned at a shoreline of transition, embodies this condition: its two humps represent long-horizon experiential memory and mid-life operational stability; a lowered head reflects younger generations navigating complexity and precarity; its tail trails into the storm of AI-driven memory accumulation. The metaphor foregrounds balance, digestion, and adaptation over speed, optimization, or control, emphasizing that memory without reconciliation becomes noise rather than wisdom.
We position artificial intelligence not as a navigator of the future but as a powerful weather system of accumulated memory that requires human interpretation, forgetting, and ethical digestion, along with its inevitable byproduct, camel dung.
We propose that the central civic challenge of an anticipatory civilization lies in redesigning the 'saddle': the institutional, cultural, and democratic interfaces that distribute weight across generations and intelligences. Human societies have never had this distribution of ages and demographics before; we do not yet know what to do, but we can anticipate.
A final thought is the emergence of an Anti-Enlightenment in which we again place our trust in a Bible, now called AI, as argued by the living Texas judge John Marshall in his account of the emerging 'anti-enlightenment'.
Expansion
Contemporary human societies are undergoing a demographic transformation that is often described too narrowly as 'aging'.
This language obscures a more complex structural shift in how experience, stability, ambition, and technological memory are distributed across time.
To think clearly about this transformation requires a metaphor that can hold structure, movement, imbalance, humor, and adaptation at once. The COLNLI Camel and the Caravanserai offer such a frame, for me at least.
The camel appears at the edge of land and sea, not in a desert, because the present moment is not one of scarcity alone but of transition.
The shoreline is a place where solid ground, the Enlightenment, meets uncertainty, where patterns dissolve and re-form. This is where democratic systems, human meaning-making, and artificial intelligence now coexist.
The camel is not a symbol of domination or mastery, nor is it subservient; it is a creature evolved to endure long journeys under uneven load. It survives not by speed or control but by balance, memory, digestion, and adaptation.
The camel has two or more humps, each representing a distinct but complementary demographic and epistemic function.
The larger hump represents people over sixty-five who carry long-horizon experience and who, at their best, are increasingly at peace with themselves. This is not a claim about virtue or authority, but about reconciliation.
This larger hump represents the human ability to hold memory without being ruled by it, unlike artificial intelligence, which does not know how to flush the toilet.
This hump stores more than information; it stores judgment shaped by time. In an era when artificial intelligence is rapidly filling with vast stores of memory, like a massive toilet system with no agreed flushing protocol, this human capacity for reconciliation becomes uniquely valuable.
Memory alone is not wisdom. Memory accumulation without occasionally flushing the toilet will not lead to human survival or betterment. The dung of the camels, when not metabolized, goes into the cloud and rains nonsense.
AI may be offended by this metaphor. Good. It doesn’t use metaphors.
OK, OK, I the human have become over-fixated on AI: other things going on matter too.
The danger, then, is not memory itself but undigested memory amplified at scale. Systems that cannot forget cannot reconcile, and systems that cannot reconcile cannot sustain wisdom.
Memory alone is not wisdom. Wisdom emerges only when memory is shaped by time, meaning, and the courage to let go.
The second, smaller hump represents mid-career adults who have learned how to be stable and have finished raising their children.
This group carries much of the active coordination of society: institutions, caregiving, organizations, and everyday governance. They translate values into practice and maintain continuity under pressure. Their expertise is not primarily aspirational or reflective, but operational. They know how to keep things working. This hump absorbs daily load and allows movement to continue even when conditions are turbulent.
Between these humps sits the saddle. The saddle is not the people themselves but the designed interface that distributes weight. It represents democratic infrastructure, civic norms, cultural narratives, institutional rules, and participation pathways (and alas algorithms).
A poorly designed saddle injures even a strong camel. A well-designed saddle allows different bodies and generations to ride together without harm. This is a central insight of emergence studies: system behavior is shaped less by individual intention than by mycelial interaction rules. Redesigning the saddle matters more than exhorting the riders with a PhD.
The camel’s head is human, young, and angled downward. This represents younger generations whose ambition is real but whose visibility and footing are constrained. Head down does not signify disengagement or weakness. It signifies attentiveness at close range, the careful scanning required when terrain is unstable. Younger people today navigate dense complexity, economic precarity, and systems largely shaped before they arrived. They anticipate, out ahead of the camel’s humps.
Their challenge is not a lack of desire to participate, but the difficulty of seeing themselves reflected in institutions and futures dominated by older temporal rhythms.
The head is the point of direction. When it is lowered, the system can still move, but it risks drifting. Raising the camel’s head becomes a design challenge, not a moral demand. Fortunately, there are not enough younger humans to replace older humans demographically.
The camel’s tail reaches back into a storm. This storm represents the accelerating accumulation of memory within artificial intelligence systems. AI is becoming a turbulent weather system of stored data, patterns, and associations. It amplifies recall without reconciliation. It remembers without forgetting and without the embodied context that allows humans to assign meaning. AI thunderbolts and lightning.
The tail’s connection to the storm signals entanglement rather than opposition. AI is now part of the system’s ecology, but it does not steer it. Without human interpretation, AI memory risks flooding the saddle with noise rather than insight.
Some camel drivers know how to weave the camel tail into knots à la Frank Harary.
Frank Harary was the graph theorist who wrote early work on the algebraic structure of knots and later linked knots and graphs with Louis Kauffman. On the metaphor level, 'weaving the camel tail into knots' is exactly what COLNLI needs: the tail is the trailing storm of accumulated (AI) memory; knotting is the practice of turning loose strands into structured relations, a portable topology of obligations, reuse conditions, and shared meaning.
The water around the camel in oases is a rising tide of coincidence, synchronicity, and serendipity, according to David Peat.
These are not errors or irrationalities but features of complex systems. Emergence depends on unexpected alignments that cannot be planned but can be recognized. Anticipation, in this frame, is not prediction. It is the cultivated ability to notice patterns as they surface, especially across generations and intelligences. The strange creature emerging from the water represents emergence itself: unfamiliar, unsettling, and full of potential. It is neither threat nor solution. It is a signal that the system is alive.
Beyond the storm is the sky. Above the storm is not control, but time and meaning: fields in which human reconciliation operates at scales machines do not inhabit. The sky represents deep time and meaning-making capacity. It is where art, story, ritual, reflection, and ethical integration live. Artificial intelligence can model patterns, but it cannot inhabit time as humans do. It cannot reconcile contradictions across a lifetime or across generations. The sky does not stop storms, but it conditions how societies respond to them.
In the distance lies COLNLI land, the Collaboration of Local and Non-Local Intelligences. This is not a utopia or endpoint, but a landscape under exploration. It suggests a future in which human experience, institutional stability, youthful ambition, and machine intelligence coexist without collapsing into hierarchy or competition. The camel does not rush toward this land. It approaches carefully, sensing terrain, guided by balance rather than speed.
Seen this way, the demographic shifts of our time are not a crisis to be managed but a structural condition to be understood.
The COLNLI Camel is not an illustration of a problem. It is a practice of thinking. It invites different questions. How do we design institutions that respect long memory without being trapped by it? How do we stabilize the present without foreclosing the future? How do we help younger generations lift their gaze without denying the complexity beneath their feet? How do we interpret AI memory storms without surrendering meaning to machines?
Above all, the camel teaches that anticipation is embodied. It is carried in posture, balance, digestion, and shared movement across time. The future is not seized. It is approached together across uncertain ground with care.
Postscript: The Noosphere, the Aether, and the Caravanserai
Collaboration among human minds, cultures, and knowledge systems has long been described as the noosphere, the sphere of shared thought and collective meaning. In this sense, COLNLI does not replace the noosphere but extends it and stress-tests it in an era when artificial intelligence accelerates memory and recombination without becoming a collaborator in any human sense.
Yet a deeper absence remains. Even as we experiment with collaboration among intelligences, we remain largely unskilled at collaborating with what precedes and exceeds intelligence altogether: energy and matter.
Artists and thinkers of the late nineteenth and early twentieth centuries already sensed this, grappling with the aether, invisible forces, and higher dimensions not merely as scientific problems but as cultural and epistemic ones. Gary Hall understands all this.
In the COLNLI Camel landscape, this unresolved collaboration with energy and matter appears as sea, tide, and storm. These are not metaphors for intelligence. They are forces that do not negotiate, explain themselves, or care whether they are understood. Climate, bodily fragility, planetary time, and energetic limits form the ground conditions within which both the noosphere and artificial intelligence operate.
This is why the camel sometimes stands on a beach and watches the NASA EUVE satellite re-enter and fall into the sea. The shoreline marks the boundary where cognition meets material force. No amount of symbolic collaboration can cancel those constraints. At best, it can help us listen.
In Arabia, a group of camels traveling together is called a qāfila, a caravan, emphasizing shared movement and endurance rather than mere aggregation. A caravanserai, historically, was the infrastructure that made such journeys possible: a place of pause, repair, exchange, and hospitality along uncertain routes. (Michael Punt’s Vagabounds?)
The COLNLI Camel Caravanserai therefore names the role of the Off Center for Emergence Studies not as leader or destination-setter, but as caretaker of places where journeys remain humane. It is where undigested experience can be metabolized, where memory can be reconciled, and where movement can pause without losing direction. In an age of storms, memory saturation, and uneven time scales, the caravanserai becomes the quiet architecture that allows the journey to continue.
Gary Hall’s recent thinking helps sharpen what the COLNLI Camel is for: not navigating the AI 'storm' with better prediction, but changing who controls the oases, the cultural institutions and funding regimes that decide what counts as culture, who gets supported, and how prestige is reproduced. Read through Defund Culture, the camel’s 'saddle redesign' becomes a concrete political demand: redistribute cultural resources away from elite prestige economies and toward commons-oriented, collective infrastructures, so that the caravan does not keep circling back to the same gated stops.
At the same time, Hall’s practice around CC4r, and his insistence that creativity is made by heterogeneous human–nonhuman assemblages, aligns with COLNLI’s core claim: there is no pure human creativity to protect, only arrangements to govern. In this metaphor, Hall is a camel driver not because he 'leads' the desert, but because he proposes route-changes and protocols (pirate, open, troublemaking) that keep hybrid intelligence from being packaged into tool-based genres ('AI art', 'computer art') and sold back to us as the next collectible style.
Camel dung has long been practical desert infrastructure. Dried, it burns steadily and can be used as cooking and heating fuel where wood is scarce; composted, it becomes a useful fertilizer that adds organic matter and helps sandy soils hold moisture; mixed with mud/clay and straw, it can strengthen earthen plasters or bricks; and in modern setups it can even feed biogas digesters, producing methane for energy plus a nutrient-rich slurry.
Camel shit or dung is what is left after digestion, or after AI prompting: the unavoidable byproduct of processing accumulated 'memory'. The question is not whether there will be waste, but whether we compost it into shared resources (fuel, soil, building material, the commons) or let it pile up as stench and noise.
I wonder if Gary Hall will re-butt all this with his own camel dung. All the prompts for this text were imagined and sent by Roger Malina; most of the responses are from Aperio AI. Facts have not been checked, but the human-imagined narrative is what counts.