Clare Martin asks what happens when shared inner life is damaged by trauma or violence, and what does it take to safeguard sacred values in an age of exponential tech?

We often hear of Easter Island as a cautionary tale for our time. Its inhabitants deforested the island, leaving it a wasteland. They starved, and all that remained were the toppled statues with their distinctive over-sized heads. 

The truth, though, is probably quite different. According to Paul Cooper (whose Fall of Civilisations podcast I highly recommend), this account of the collapse of Easter Island was largely a confection of the European explorers who eventually colonised the island. Yes, the islanders cut down nearly all the trees on the island. This didn’t in itself make the island uninhabitable. They cleared forest for farmland, as all peoples from the Neolithic age onward have done throughout the world.  What looked like a rocky wasteland to the early explorers was, in fact, agricultural land that was painstakingly mulched with a layer of stones. This was an ingenious way to keep moisture in the soil under hot, dry growing conditions. 

No, what most likely caused the rapid decline in Easter Island’s population was the arrival of the Europeans themselves, who brought with them novel viruses to which the islanders had no immunity. Later, the transatlantic slave trade and European colonisation compounded the harm. 

It’s a harrowing story. One of the especially heartbreaking details is what happened to the statues themselves. These were seen as literal embodiments of the people’s ancestor gods: powerful nature deities who protected the people and the land. When the islanders began falling sick and dying in their dozens and then in their hundreds, they couldn’t understand why life had turned against them. They grew angry at their own gods. Why hadn’t they protected them? Why had nature itself grown so hostile? In their despair, they toppled the statues themselves.

That is one theory, impossible to verify. But something about it rings true, and for me it speaks to questions that I’ve been sitting with recently. What are the impacts on spiritual life of collective experiences of catastrophic loss? Are there inner wounds that can’t be healed? Can a civilisation survive beyond the point where its people have lost trust in their own sacred values? 

Because I think we’re like the Easter Islanders, as they’re described in this version of the story. Like them, we’re imbibing an invisible contagion so far outside our normal experience that we don’t even realise it’s there. Under its influence, we’re in danger of becoming so inwardly dislocated that we may tear to pieces the very fabric of our inner lives. This contagion isn’t a virus. It’s the algorithm. This computational entity – endowed with autonomous powers of calculation and action – is a threat to inner life that is new in the history of humanity.

Many in the tech world name AI as the most serious near-term existential threat we face. Apparently we’re very close to living in a world with autonomous AI weapons. This is beyond terrifying. And yet, the truth is that autonomous AI weapons are already ubiquitously deployed across and within national boundaries. These weapons weren’t designed to cause conflict. Algorithms were designed by a handful of engineers working for social media giants seeking to maximise profit. But where profit equates to user engagement, it was inevitable that algorithms would target our strongest emotions. What scares us? What outrages us? What makes us feel righteous? 

The intention wasn’t conflict: conflict was a consequence of the business model. Nor was the business model particularly new. It was just the natural next step in the capitalist project of industrial extractivism. Only in this case, it isn’t land being intensively farmed for the extraction of food, or mined for energy. Rather, it’s the users themselves who are being farmed. We’re all being farmed, collectively, by a million billion bits of autonomous tech. Capitalism, having exhausted the capacity to extract value out of the material world and convert it into money, has now turned its rapacious appetites to the inner world. 

Think of it like mountaintop removal mining. The point isn’t really to blow up the mountain. The explosion is a side-effect of the primary goal: to extract coal from the substrate. AI algorithms follow the same model. What the algorithm is seeking to exploit – the coal in the mountain – rightly belongs to our inner life: our emotions, feelings and beliefs. These are of interest only because they can be monetised. Through them we can all become addicted to online interactions which are then translated into advertising revenue. This is the extractivism of the inner, by way of exponential tech. But inner life, like the planet, is not an infinite and renewable resource. It can’t be endlessly extracted without becoming horribly damaged.

I don’t claim to have the answer to this problem. But I do think that to combat any threat it’s crucial to recognise it for what it is. Faith leaders, and others who share an interest in the custodianship of inner life, have a role to play in this. So often when faith communities are fighting over issues of identity, politics, or gender, these disagreements feel like arguments between different camps. Between orthodoxy and progressivism, perhaps, or activists and traditionalists. Of course, such conflicts and disagreements exist. They are important in their own terms. And yet, I believe that much of the heat in these conflicts comes from a different source, and that the true adversary is AI itself. Here the best analogy is the false flag attack. 

This is a military strategy intended to provoke conflict by sowing confusion. A combatant strikes a target to make it appear that the enemy has done so, and thus incites an escalation of violence. This is the real source of so many of the conflicts that beset us. Every day, all over the world, a billion false flag attacks are staged by AI algorithms. They lay the mines. When the mines go off, we think it’s our opponent who has laid them. It isn’t. It’s AI.

My wish would be for faith communities to wake up to the existential threat to inner life that AI poses. To become far more disciplined about how they relate to AI-driven conflicts. And to call forth the essential sacred values that no machine can ever be programmed to hold to: values of paradox, mystery, awe and love. 

Once a month we gather in our Bedouin Peace tent for shared contemplation. In November we welcomed contemplatives from the Christian, Sikh, Buddhist, Sufi and Hindu traditions. In an age of widespread AI insurgencies into inner life, there’s something so wholesome about gathering together for this simple act of turning within. Do join us for our next Contemplative practice gathering on Tuesday December 13 at 6.30 pm. We look forward to seeing you there!

Clare Martin


Clare is Co-Director of St Ethelburga’s. Previously Development Director, Clare created and led on the Radical Resilience programme and went on to be the strategic lead on our viewpoint diversity work, before stepping up to co-lead the centre alongside Tarot Couzyn. She brings more than 20 years’ experience facilitating groups for the sake of inner enquiry and outer change, and is interested in how contemplative practices can play a role in cultural repair. She has worked on numerous interfaith projects, most notably for Nisa Nashim, the Jewish Muslim Women’s Network. Prior to this, Clare worked as a communications consultant in the corporate and charitable sectors. Currently she runs a community garden on her Hackney housing estate, where she lives with her husband and 9-year-old daughter. Raised a Christian, Clare has also studied Buddhism and Sufism. You can read her thoughts on the role of visionary imagination in resilience building here, and here is a short piece about contemplation as an antidote to conflict.