De-powering human societies

Replacing monologues and monocultures with dialogues and biodiversity at all levels of scale

  1. Cultural inertia
  2. Attempting to apply the scientific method in complex domains and transdisciplinary contexts
    1. Visualising conceptual causal models as an antidote to sloppy reasoning and invalid statistics
  3. Modularising complex domains
    1. A practical example
    2. The limits of understandability of linear languages
  4. Linear language – the ultimate learning disability
  5. Recovering diversity
    1. Open Space
    2. Multi-solving in Open Space
  6. The tombstone of powered-up civilisations: Consistently Too Little Too Late
    1. The planetary reboot sequence

10,000 years ago, Homo sapiens began farming and producing a grain surplus. This surplus led to the creation of societal and cultural hierarchies that divorced our species from its long relationship with the natural world.

The biggest lie in our culture is the normal language of WEIRD success, finance, legalese, economics and technology.

The current human predicament is the result of the cultural disease of super-human-scale, powered-up civilisation-building endeavours, the origins of which can be traced back to the beginnings of “modern” human history and to the social power dynamics resulting from the invention of interest-bearing debt around 5,000 years ago.

Maybe now is the time to write a new book: “Debt – The Last 20 Years”. Towards a more honest language to navigate the path ahead:

Cultural inertia

The following presentation to investors contains some interesting numbers, and along the way, it reveals the short-term thinking of those who are caught up in the cult of busyness.

Skagen is a Norwegian investment fund.

On the one hand, for the general public, given growing concerns about the climate and the environment, politicians and clean tech companies enjoy pointing to Norway as green and clean, as a poster child for the energy transition.

On the other hand, for investors, Norwegian fund management companies advertise bulletproof investment opportunities in the energy sector – across the board, for further oil and gas exploration, and for mining.

I have not independently verified the numbers, but irrespective of the accuracy of the absolute figures – even if some are off by a factor of two or more – there is one true message:

The so-called energy transition is a feel-good exercise for the general public, and at the same time, it is [marketed as] a rock solid investment opportunity to investors. Norway is one of the biggest oil and gas exporters as well as a poster child for the dream of “clean” energy, propping up the other great investment opportunity, related to mining, globally – and therefore also again for the Norwegian oil and gas that is needed to meet the energy demand resulting from the anticipated extreme increase in mining activities. A win-win-win scenario, with a double win for Norwegian oil and gas.

The presenter correctly points out that mining won’t be able to scale 50-fold and more, to meet the theoretical demand of an “energy transition”, but given the current cultural inertia, this does not diminish the investment opportunities.

The cultural inertia and the optimism of investors are based on the following assumptions, none of which is mentioned anywhere in this presentation:

  1. Societies will continue to rely on GDP as a measure of progress
  2. The amount of greenhouse gases in the atmosphere is going to rise indefinitely, because oil and gas will be consumed indefinitely, without any disruption to the growth in GDP (so-called “wealth”)
  3. Industrialised countries (the WEIRD ones and China) will continue to rely heavily on the convenience of hyper-busy, fossil-fuel-powered, technology-intensive ways of life (the equivalent of several hundred energy slaves per person)
  4. The state of the biosphere and the planetary ecosystem is an irrelevant externality on the road of GDP growth and technological “progress”

Investors are obviously having a good time traveling on the “pedal to the metal” bus: No matter what lies ahead, capitalism is the best religion that has ever been invented.

You don’t need to be a genius to see that over the next 10 to 20 years at least two of the above assumptions will blow up in our face, and of course also in the face of investors. It does not matter which assumptions blow up first; the downstream effects will be massive, putting an end to energy-intensive, i.e. industrialised, ways of life. Globally fungible money issued as interest-bearing debt won’t survive.

In fact, the sooner the bizarre techno optimistic assumptions start to blow up, the more of the planetary ecosystem will still be around, to assist humans in the forced transition to [very] low energy ways of life and in rediscovering that we are part of nature.

Attempting to apply the scientific method in complex domains and transdisciplinary contexts

Visualising conceptual causal models as an antidote to sloppy reasoning and invalid statistics

Modularising complex domains

Modularisation of complex domains and creative collaboration across a diversity of domain boundaries maximises collective intelligence at human scale.

A practical example

The limits of understandability of linear languages

Unfortunately, so far most software has been written in linear languages, resulting in systems that are nearly as opaque, incomprehensible, and subject to surprising and unknown modes of failure as are state-of-the-art artificially intelligent systems. Over the last 50 years, in the busyness of frantic attempts to digitise and automate the tools of civilisation, we have un-learned the art of de-powered dialogue and the art of modularisation – the signature traits of the human species.

Ending the curse of software maintenance

Once we rediscover and appreciate the limits of human scale, we are equipped to replace the busyness of muddling with human dialogue and human comprehensible explanatory models.

Agent-based semantic modelling and validation via instantiation is an urgently needed form of grassroots theory building and theory integration (knowledge archaeology) beyond academia. Wherever there is deep human domain knowledge, wherever people still trust each other and are not corrupted by social power dynamics, many experiments in the social sciences and in engineering can be improved and sometimes entirely replaced by semantic / causal modelling and instantiation.

The semantic approach can be understood as a rapid iterative cycle of theory building and experimentation that surfaces and validates structural and causal mental models. It involves validating a theory, i.e. a formal semantic model, against the collective mental models of all available domain experts, resulting in explicit representations of the domain specific nonlinear multivariate mental models that participants use on a daily basis, usually without even being conscious of it. As an added bonus, instantiation catalyses shared understanding amongst all participants. 

Side note: The Cell Platform is designed to track the number of times categories are instantiated in the context of a particular model artefact, including the reasons for non-instantiation, i.e. unknown information or a category that is not applicable in the context at hand. That’s exactly the metadata that turns a semantic model and a corresponding set of instances into a human scale, i.e. understandable, Bayesian model.
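The mechanics can be illustrated with a small, hypothetical sketch (not the actual Cell Platform implementation): if each category carries tallies of instantiations and of the reasons for non-instantiation, the “not applicable” counts support a simple Beta-Binomial estimate of how widely the category applies, while “unknown” counts are kept separate rather than treated as evidence.

```python
from dataclasses import dataclass

@dataclass
class CategoryTally:
    """Hypothetical per-category tally, accumulated as a semantic model is instantiated."""
    instantiated: int = 0
    unknown: int = 0         # reason: information not available
    not_applicable: int = 0  # reason: category does not apply in this context

    def applicability(self, prior_a: float = 1.0, prior_b: float = 1.0) -> float:
        """Posterior mean of P(category applies) under a Beta(a, b) prior.
        'unknown' outcomes are excluded: missing data is not negative evidence."""
        a = prior_a + self.instantiated
        b = prior_b + self.not_applicable
        return a / (a + b)

tally = CategoryTally(instantiated=8, unknown=3, not_applicable=2)
print(round(tally.applicability(), 3))  # (1+8)/(1+8+1+2) = 9/12 = 0.75
```

With uniform priors and no observations the estimate defaults to 0.5; every instantiation or explicit “not applicable” judgement shifts it, which is one way tallies of this kind become an understandable Bayesian model.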

A big part of the journey of de-powering human societies involves taking an honest look at the relevance and quality of the digital data that we have been busy collecting – treating it as “the new oil”.

The antidote to misuse of mathematics and junk data

Linear language – the ultimate learning disability

How did we get caught up in a cult of busyness? Humans are curious creatures, easily distracted by shiny new toys. At human scale, in a de-powered social environment, our curiosity is a wonderful adaptive trait that generates cultural diversity. Beyond human scale, and especially in powered-up environments, our curiosity degenerates into a maladaptive trait with the potential to cause untold harm.

Are you a model builder or a story teller?

Recovering diversity

Designing filtering, collaboration, thinking, and learning tools for the next 200 years

Open Space

Multi-solving in Open Space

Humans acquire knowledge or become aware of new information as individuals, and as part of groups of different sizes: households, teams, organisations, communities, societies, and with the help of ubiquitous global communication tools, even collectively as humanity. Creative collaboration in Open Space can help us break through the barriers of established disciplines and management structures, and power a continuous SECI (socialisation, externalisation, combination, internalisation) knowledge creation spiral at human scale.

Creativity = multisolving + neurodiversity + thinking tools

The tombstone of powered-up civilisations: Consistently Too Little Too Late

Here is an excellent summary of the human predicament that may come in handy to shorten fruitless conversations with techno optimists and members of other anthropocentric faiths. I only have a few quibbles with the terminology used by Michael Dowd, and I love that he refers to industrialised civilisation as a religious cult, an assessment I fully agree with.

One quibble refers to the use of the word “reality”, as it detracts from the vast diversity of lived human and non-human experiences. Instead I simply refer to the diversity of lived experiences and the commonalities and differences between lived experiences. One of the commonalities of lived experiences amongst sensitive Autistic people is that they tend to reach the doom and post-doom stages of understanding the hubris of anthropocentrism [much] earlier than others, many of whom remain for their entire lives in a state of cognitive dissonance and denial. Ted Nelson beautifully captured the toxicity of the modern technological human arrow of “progress” back in 1999, which is perhaps my favourite quote from the entire industrial era:

A frying-pan is technology.  All human artifacts are technology.  But beware of anybody who uses this term.  Like “maturity” and “reality” and “progress”, the word “technology” has an agenda for your behavior: usually what is being referred to as “technology” is something that somebody wants you to submit to.  “Technology” often implicitly refers to something you are expected to turn over to “the guys who understand it.”

This is actually almost always a political move.  Somebody wants you to give certain things to them to design and decide.  Perhaps you should, but perhaps not.

This applies especially to “media”.  I have always considered designing the media of tomorrow to be an art form (though an art form especially troubled by the politics of standardization).  Someone like Prof. Negroponte of MIT, with whom I have long had a good-natured feud, wants to position the design of digital media as “technology”.  That would make it implicitly beyond the comprehension of citizens or ordinary corporation presidents, therefore to be left to the “technologists” – like you-know-who.

The other quibble I have with Michael Dowd’s terminology is his complete rejection of the word hope, as if the only hope we can have is tied to the longevity of industrial civilisation. In the stage that Michael refers to as post-doom, he acknowledges hope for collective action in-the-small – a focus on minimising suffering, being compassionate, de-powering relationships, practicing mutual aid, etc. – without using the word hope. He even acknowledges the benefits of gallows humour in catalysing mutual trust and social cohesion, but frames it all under the broad umbrella of “acceptance”, which in my mind only makes sense when qualified as acceptance of civilisational collapse – an acceptance that liberates us from a diseased, life-denying culture.

Without trust in our ability to appreciate local life at [small] human scale and the hope that we can minimise suffering, we fail to be part of life.

The planetary reboot sequence

Mosses seem to be a part of the planetary reboot sequence that gets initiated after evolutionary hiccups like the one triggered by industrial civilisation. The next 200 years will reveal whether humans can be part of the reboot sequence. This documentary contains some superb footage and good commentary – if you ignore the last segment about exporting life to Mars.

The open question is how humans will treat each other and our non-human contemporaries on the journey towards being composted and recycled. Experiences may vary depending on the human scale cultures we co-create on the margins.

The antidote to misuse of mathematics and junk data

Depending on who you ask, perceptions of mathematics range from an esoteric discipline with little relevance to everyday life to a collection of magical rituals and tools that shape the operations of human cultures. In an age of exponentially increasing data volumes, public perception has increasingly shifted towards the latter.

On the one hand it is nice to see a greater appreciation for the role of mathematics, and on the other hand the growing use of mathematical techniques has led to a set of cognitive blind spots in human society:

  1. Blind use of mathematical formalisms – magical rituals
  2. Blind use of second hand data – unvalidated inputs
  3. Blind use of implicit assumptions – unvalidated assumptions
  4. Blind use of second hand algorithms – unvalidated software
  5. Blind use of terminology – implicit semantic integration
  6. Blind use of numbers – numbers with no sanity checks

Construction of formal models is no longer the exclusive domain of mathematicians, physical scientists, and engineers. Large and fast flowing data streams from very large networks of devices and sensors have popularised the discipline of data science, which is mostly practiced within corporations, within constraints dictated by business imperatives, and mostly without external and independent supervision.

The most worrying aspect of corporate data science is the power that corporations can wield over the interpretation of social data, and the corresponding lack of power of those that produce and share social data. The power imbalance between corporations and society is facilitated by the six cognitive blind spots, which affect the construction of formal models and their technological implementations in multiple ways:

  1. Magical rituals lead to a lack of understanding of algorithm convergence criteria and limits of applicability, to suboptimal results, and to invalid conclusions. Examples: Naive use of frequentist statistical techniques and incorrect interpretations of p-values by social scientists, or naive use of numerical algorithms by developers of machine learning algorithms.
  2. Unvalidated inputs open the door for poor measurements and questionable sampling techniques. Examples: use of data sets collected by a range of different instruments with unspecified characteristics, or incorrect priors in Bayesian probabilistic models.
  3. Unvalidated assumptions enable the use of speculative causal relationships, simplistic assumptions about human nature, and create a platform for ideological bias. Examples: many economic models rest on outdated assumptions about human behaviour, and consciously ignore evidence from other disciplines that conflicts with established economic dogma.
  4. Unvalidated software can produce invalid results, contradictions, and unexpected error conditions. Examples: outages of digital services from banks and telecommunications service providers are often treated as unavoidable, and computational errors sometimes cost hundreds of millions of dollars or hundreds of lives.
  5. Unvalidated semantic links between mathematical formalisms, data, assumptions and software facilitate further bias and spurious complexity. Examples: Many case studies show that formalisation of semantic links and systematic elimination of spurious complexity can reduce overall complexity by factors between 3 and 20, whilst improving computational performance.
  6. Unvalidated numbers can enable order of magnitude mistakes and obvious data patterns to remain undetected. Example: Without adequate visual representations, even simple numbers can be very confusing for a numerically challenged audience.
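The first blind spot – statistical rituals performed without understanding – can be made tangible with a short simulation: testing many hypotheses on pure noise at a 5% significance level reliably produces “significant” findings. The sketch below uses an exact two-sided binomial test; the numbers of tests and coin flips are arbitrary illustrations, not data from any real study.

```python
import math
import random

def binom_p_two_sided(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test: sum the probabilities of all
    outcomes no more likely than the observed one."""
    probs = [math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    threshold = probs[k] * (1 + 1e-9)  # tolerance for floating point ties
    return min(1.0, sum(q for q in probs if q <= threshold))

random.seed(1)
alpha, n_tests, n_flips = 0.05, 200, 100
false_positives = 0
for _ in range(n_tests):
    # a perfectly fair coin: there is nothing to discover
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    if binom_p_two_sided(heads, n_flips) < alpha:
        false_positives += 1

print(false_positives)  # typically somewhere near alpha * n_tests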

Whilst a corporation may not have an explicit agenda for creating distorted and dangerously misleading models, the mechanics of financial economics create an irresistible temptation to optimise corporate profit by systematically shifting economic externalities into cognitive blind spots. A similar logic applies to government departments that have been tasked to meet numerically specified objectives.

Mathematical understanding and numerical literacy are becoming increasingly important, but it is unrealistic to assume that the majority of the population will become sufficiently proficient in mathematics and statistics to validate and critique the formal models employed by corporations and governments. Transparency – including open science, open data, and open source software – is emerging as an essential tool for independent oversight of cognitive blind spots:

  1. Mathematicians must be able to review the formalisms that are being used
  2. Statisticians must be able to review measurement techniques and input data sources
  3. Scientists and experts from disciplines relevant to the problem domain must be able to review assumptions
  4. Software engineers familiar with the software tools that are being used must be able to review software implementations
  5. Mathematicians with an understanding of category theory, model theory, denotational semantics, and conceptual modelling must be able to review semantic links between mathematical formalisms, terminology, data, assumptions, and software
  6. Mathematicians and statisticians must be able to review data representations

In a zero marginal cost society, transparency allows scarce and highly specialised mathematical knowledge to be used for the benefit of society. It is very encouraging to note the similarity in knowledge sharing culture between the mathematical community and the open source software community, and to note the decreasing relevance of opaque closed source software.

The more society depends on decisions made with the help of mathematical models, the more important it becomes that these decisions adequately accommodate the concrete needs of individuals and local communities, and that the language used to reason about economics remains understandable, and enables the articulation of economic goals in simple terms.

The big human battle of this century

The big human battle of this century is going to be the democratisation of data and all forms of knowledge, and the introduction of digital government with the help of free and open source software.

Whilst the planet’s reaction to the explosion of human activity – climate change and other symptoms – is undoubtedly the largest change process in the physical realm that has ever occurred in human history, the exponential growth of the Internet of Things and digital information flows is triggering the largest change process in the realm of human organisation that societies have ever experienced.

The digital realm


Sensor networks and pervasive use of RFID tags are generating a flood of data and lively machine-to-machine chatter. Machines have replaced humans as the most social species on the planet, and this must inform the approach to the development of healthy economic ecosystems.

Internet of Things

Sensors that are part of the Internet of Things

When data scientists and automation engineers collaborate with human domain experts in various disciplines, machine-generated data is the magic ingredient for solving the hardest automation problems.

  • In domains such as manufacturing and logistics the writing is on the wall. Introduction of self-driving vehicles and just a few more robots on the shop floor will eliminate the human element in the social chatter at the workplace within the next 10 years.
  • The medical field is being revolutionised by the downward spiral of the cost of genetic analysis, and by the development of medical robots and medical devices that are hooked up to the Internet, paving the way for machine learning algorithms and big data to replace many of the interactions with human medical professionals.
  • The road ahead for the provision of government services is clearly digital. It is conceivable that established bureaucracies can resist the trend to digitisation for a few years, but any delay will not prevent the inevitability of automation.

The social implications

Data driven automation leads to an entirely new perspective on the purpose of the education system and on the role of work and employment in society.

Large global surveys show that more than 70% of employees are disengaged at work. It is mainly in manufacturing that automation directly replaces human labour. In many other fields the shift in responsibilities from humans to machines initially goes hand in hand with the invention of new roles and loss of a clear purpose.

Traditional work is being transformed into a job for a machine. Exceptions are few and far between.

Data that is not sufficiently accessible is of very limited value to society. The most beneficial and disruptive data-driven innovations are those that result from the creative combination of data sets from two or more different sources.

It is unrealistic to assume that the most creative minds can be found via the traditional channel of employment, and it is unrealistic to expect that such minds can achieve the best results if data is locked up in organisation-specific or national silos.

The most valuable data is data that has been meticulously validated, and that is made available in the public domain. It is no coincidence that software, data, and innovation are increasingly produced in the public domain. Jeremy Rifkin describes the emergence of a third mode of commons-based digitally networked production that is distinct from the property- and contract-based modes of firms and markets.

The education system has a major role to play in creating data literate citizen-scientists-innovators.

The role of economics

It is worthwhile remembering the origin of the word economics. It used to denote the rules for good household management. On a planet that hosts life, household management occurs at all levels of scale, from the activities of single cells right up to processes that involve the entire planetary ecosystem. Human economics are part of a much bigger picture that always included biological economics and that now also includes economics in the digital realm.

To be able to reason about economics at a planetary level the planet needs a language for reasoning about economic ecosystems, only some of which may contain humans. Ideally such a language should be understandable by humans, but must also be capable of reaching beyond the scope of human socio-economic systems. In particular the language must not be coloured by any concrete human culture or economic ideology, and must be able to represent dependencies and feedback loops at all levels of scale, as well as feedback loops between levels of scale, to enable adequate representation of the fractal characteristic of nature.

The digital extension of the planetary nervous system

In biology the use of electrical impulses for communication is largely confined to communication within individual organisms, and communication between organisms is largely handled via electromagnetic waves (light, heat), pressure waves (sound), and chemicals (key-lock combinations of molecules).

The emergence of the Internet of Things is adding to the communication between human made devices, which in turn interact with the local biological environment via sensors and actuators. The impact of this development is hard to overestimate. The number of “tangible” things that might be computerised is approaching 200 billion, and this number does not include large sensor networks that are being rolled out by scientists in cities and in the natural environment. Scientists are talking about trillion-sensor networks within 10 years. The number of sensors in mobile devices is already more than 50 billion.

Compared to chemical communication channels between organisms, the speed of digital communication is orders of magnitude faster. The overall effect of equipping the planet with a ubiquitous digital nervous system is comparable to the evolution of animals with nervous systems and brains – it opens up completely new possibilities for household management at all levels of scale.

The complexity of the Internet of Things that is emerging on the horizon over the next decade is comparable to the complexity of the human brain, and the volume of data flows handled by the network is orders of magnitude larger than anything a human brain is able to handle.

The global brain

Over the last two centuries, starting with the installation of the first telegraph lines, humans have embarked on the journey of equipping the planet with a digital electronic brain. To most human observers this effort has only become reasonably obvious with the rise of the Web over the last 20 years.

Human perception and human thought processes are strongly biased towards the time scales that matter to humans on a daily basis, up to the time scale of a human lifetime. Humans are largely blind to events and processes that occur in sub-second intervals, and to processes that are sufficiently slow. Similarly, human perception is strongly biased towards living and physical entities that are comparable to the physical size of humans, plus or minus two orders of magnitude.

As a result of their cognitive limitations and biases, humans are challenged to understand non-human intelligences that operate in the natural world at different scales of time and different scales of size, such as ant colonies and the behaviour of networks of plants and microorganisms. Humans need to take several steps back in order to appreciate that intelligence may not only exist at human scales of size and time.

The extreme loss of biodiversity that characterises the anthropocene should be a warning, as it highlights the extent of human ignorance regarding the knowledge and intelligence that evolution has produced over a period of several billion years.

It is completely misleading to attempt to attach a price tag to the loss of biodiversity. Whole ecosystems are being lost – each such loss is the loss of a dynamic and resilient living system of accumulated local biological knowledge and wisdom.

Just like an individual human is a complex adaptive system, the planet as a whole is a complex adaptive system. All intelligent systems, whether biological or human created, contain representations of themselves, and they use these representations to generate goal directed behaviour. Examples of intelligent systems include not only individual organisms, but also large scale and long-lived entities such as native forests, ant colonies, and coral reefs. The reflexive representations of these systems are encoded primarily in living DNA.

From an external perspective it almost seems as if the planetary biological brain – powerful, but thinking slowly in chemical and biological signals over thousands of years – has shaped the evolution of humans for the specific purpose of developing and deploying a faster-thinking global digital brain.

It is delusional to think that humans are in control of what they are creating. The planet is in the process of teaching humans about their role in its development, and some humans are starting to respond to the feedback. Feedback loops across different levels of scale and time are hard for humans to identify and understand, but that does not mean that they do not exist.

The global digital brain is still under development, not unlike the brain of a human baby before birth. All corners of the planet are being wired up and connected to sensors and actuators. The level of resilience of the overall network depends on the levels of decentralisation, redundancy, and variability within the network. A hierarchical structure of subsystems as envisaged by technologist Ray Kurzweil is influenced by elements of established economic ideology rather than by the resilient neural designs found in biology. A hierarchical global brain would likely suffer from recurring outages and from a lack of behavioural plasticity, not unlike the Cloud services from Microsoft and Amazon that define the current technological landscape.

Global thinking

The ideology of economic globalisation is dominated by simplistic and flawed assumptions. In particular the concepts of money and globally convertible currencies are no longer helpful and have become counter-productive. The limitations of the monetary system are best understood by examining the historic context in which money and currencies were invented, which predates the development of digital networks by several thousand years. At the time a simple and crude metric in the form of money was the best technology available to store information about economic flows.

As the number of humans has exploded, and as human societies have learned to harness energy in the form of fossil fuels to accelerate and automate manufacturing processes, the old monetary metrics have become less and less helpful as economic signals. In particular the impact of economic externalities that are ignored by the old metrics, both in the natural environment as well as in the human social sphere, is becoming increasingly obvious.

The global digital brain allows flows of energy, physical resources, and economic goods to be tracked in minute detail, without resorting to crude monetary metrics and assumptions of fungibility that open the door to suppressing inconvenient externalities.

A new form of global thinking is required that is not confined to the limited perspective of financial economics. The notions of fungibility and capital gains need to be replaced with the notions of collaborative economics and zero-waste cycles of economic flows.

Metrics are still required, but the new metrics must provide a direct and undistorted representation of flows of energy, physical resources, and economic goods. Such highly context specific metrics enable computational simulation and optimisation of zero-waste economics. Their role is similar to the role of chemical signalling substances used by biological organisms.

Global thinking requires the extension of a zero-waste approach to economics to the planetary level – leaving no room for any known externalities, and encouraging continuous monitoring to detect unknown externalities that may be affecting the planetary ecosystem.
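One minimal way to make such flow metrics operational is a conservation check over a material-flow network: for every internal node, measured inflows should match measured outflows, and any residual points at an unmeasured flow, i.e. a candidate externality. The sketch below is a toy illustration; all node names and tonnages are invented.

```python
# Toy material-flow network: (source, sink) -> tonnes per year.
# All names and figures are invented for illustration.
flows = {
    ("mine", "smelter"): 100.0,
    ("smelter", "factory"): 60.0,
    ("smelter", "slag_heap"): 35.0,
    ("factory", "products"): 55.0,
    ("factory", "landfill"): 5.0,
}

def imbalance(node: str) -> float:
    """Inflow minus outflow; a nonzero result for an internal node
    flags an unmeasured flow, i.e. a candidate externality."""
    inflow = sum(v for (_, sink), v in flows.items() if sink == node)
    outflow = sum(v for (source, _), v in flows.items() if source == node)
    return inflow - outflow

for node in ("smelter", "factory"):
    print(node, imbalance(node))  # smelter: 5.0 unaccounted, factory: 0.0
```

The same check scales from a single workshop to supply chains; the point is that direct physical metrics make the residuals visible, whereas monetary aggregation hides them.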

The future of human economics

The real benefits of the global digital brain will be realised when massive amounts of machine generated data become accessible in the public domain in the form of disruptive innovation, and are used to solve complex optimisation problems in transportation networks, distributed generation and supply of power, healthcare, recycling of non-renewable resources, industrial automation, and agriculture.

Five years ago Tim O’Reilly predicted a war for control of the Web. The hype around big data has led many organisations to forget that the Web – and social media in particular – is already saturated with explicit and implicit marketing messages, and that there is an upper bound to the available time (attention) and money for discretionary purchases. A growing list of organisations is fighting over a very limited amount of potential revenue, unable to see the bigger picture of global economics.

Over the next decade one of the biggest challenges will be the required shift in organisational culture, away from simplistic monetisation of big data, towards collaboration and extensive data and knowledge sharing across disciplines and organisational boundaries. The social implications of advanced automation across entire economic ecosystems, and a corresponding necessary shift in the education system need to be addressed.

The future of humans

Human capabilities and limitations are under the spotlight. How long will it take for human minds to shift gears, away from the power politics and hierarchically organised societies that still reflect the cultural norms of our primate cousins, and from myopic human-centric economics, towards planetary economics that recognise the interconnectedness of life across space and time?

The future of democratic governance could be one where people vote for human-understandable open source legislation that is directly executable by intelligent software systems. Corporate and government politicians will no longer be deemed an essential part of human society. Instead, any concentration of power in human hands is likely to be recognised as an unacceptable risk to the welfare of society and the health of the planet.



Humans have to ask themselves whether they want to continue to be useful parts of the ecosystem of the planet or whether they prefer to take on the role of a genetic experiment that the planet switched on and off for a brief period in its development.

Quality of service in the digital age

Oh the irony. Last week I wrote an article on the role of service resilience in shaping a positive user experience, and today I’m trying to use a basic digital service to top up a mobile phone with credit before travelling overseas – and receive the following notification, along the lines of:

Dear customer, unfortunately the opening hours of our digital service are top secret.

Not even an indication of when it may be worthwhile trying again. The local 0800 number is also of little help to a traveller. This particular incident is just one example of typical quality of service in the digital realm. Last week, before this wonderful user experience, I wrote:

The digitisation of services that used to be delivered manually puts the spotlight on user experience, as human-to-human interactions are replaced with human-to-software interactions. Organisations that are intending to transition to digital service delivery must consider all the implications from a customer’s perspective. The larger the number of customers, the more preparation is required, and the higher the demands in terms of resilience and scalability of service delivery. Organisations that do not think beyond the business-as-usual scenario of service delivery may find that customer satisfaction ratings can plummet rapidly.

Promises made in formal service level agreements are easily broken. A service provider that operates a monopoly has very little incentive to improve quality of service, and can ignore the full downstream costs of outages incurred by service users.

All assurances made in service level agreements with external service providers need to be scrutinised. Seemingly straightforward claims such as 99.9% availability must be broken down into more meaningful assurances. Does 99.9% availability mean one outage of up to 9 hours per year, or a 10 minute outage per week, or a 90 second outage per day? Does the availability figure include or exclude any scheduled service maintenance windows?
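The arithmetic behind these questions is simple enough to sketch. The following snippet (an illustrative helper, not part of any SLA tooling) converts an availability percentage into its implied downtime budget over different periods, which makes the ambiguity of a bare percentage easy to see:

```python
def downtime_budget(availability_pct: float) -> dict:
    """Translate an availability percentage into allowed downtime per period."""
    unavailable = 1.0 - availability_pct / 100.0  # fraction of time the service may be down
    return {
        "hours_per_year": unavailable * 365 * 24,
        "minutes_per_week": unavailable * 7 * 24 * 60,
        "seconds_per_day": unavailable * 24 * 60 * 60,
    }

budget = downtime_budget(99.9)
# 99.9% availability allows roughly 8.8 hours of downtime per year,
# about 10 minutes per week, or about 86 seconds per day.
```

The same 99.9% figure is compatible with one long annual outage or with daily short interruptions – which is precisely why the SLA must spell out how the downtime budget may be spent.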

My recommendation to all operators of digital services: Compute the overall risk exposure to unavailability of services and make informed decisions on the level of service that must be offered to customers. As a rule, when transitioning from manual services to digital services, ensure that customers benefit from an increase in service availability. The convenience of close to 24×7 availability is an important factor to entice customers to use the digital channel.