A system can be defined as a set of interconnected elements that form a coherent whole with a distinct pattern of behaviour. These elements or agents can be as diverse as animals, cells, humans, organisations or businesses. In contrast to an aggregate, in a system the properties of the elements depend on the systemic context within which they are located. In other words, the system consists of the elements and, in turn, the elements are influenced by the systemic whole (Juarrero, 1999). For example, as part of a community, people shape the way things work in the community, but their individual behaviours are in turn shaped by the rules and norms of the very community they create. This phenomenon is called emergence.
In their paper, Kurtz and Snowden (2003) describe two different types of order in natural systems: ‘directed order’ and ‘emergent order’.
Directed order describes a system where “the relationship between an action and its consequences is knowable by bringing in relevant expertise” (Hummelbrunner & Jones, 2013:2). In this space, solutions can be designed because it is clear what the problem is and agreement can be reached on how to fix it. These systems can be highly intricate and difficult to analyse, in which case they are called complicated. In complicated contexts, the system can be taken apart, defective individual elements can be fixed or optimised, and the system can then be put back together. This can be seen, for example, when a car engine is repaired or when parts of a solar power generation plant are optimised. This works because the functionality of the system is given by the sum of the functionality of the parts. Taking the system apart and fixing or optimising parts individually leads to improved performance of the overall system. If one part fails, however, these systems often malfunction completely.
Emergent order is different. In these systems “there is a fascinating kind of order in which no director or designer is in control but which emerges through the interaction of many entities” (Kurtz & Snowden, 2003:464). Emergent order gives the system abilities that its individual components do not have. Most abilities that we attribute to complex systems are emergent properties, such as consciousness emerging from a system of individually unconscious neurons; intricate patterns in the murmuration of hundreds or thousands of starlings emerging from individuals that follow simple rules and only receive signals from their immediate neighbours; a set of rules and norms emerging from a community of individuals living in close proximity; and so on.
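The starling example can be caricatured in a few lines of code. The following sketch is purely illustrative (the agents, the neighbourhood rule and all parameters are assumptions, not taken from the sources cited here): each agent repeatedly adopts the average heading of itself and its two immediate neighbours, plus a little noise, and global alignment emerges without any agent ever seeing the whole flock.

```python
import random

def step(headings, noise=0.05):
    """One update: each agent adopts the mean heading of itself and its
    two immediate neighbours (on a ring), plus a little noise. The rule
    is purely local -- no agent sees the whole flock."""
    n = len(headings)
    return [
        (headings[i - 1] + headings[i] + headings[(i + 1) % n]) / 3
        + random.uniform(-noise, noise)
        for i in range(n)
    ]

random.seed(1)
# 50 agents start pointing in random directions (headings treated as
# plain numbers, ignoring angular wrap-around for simplicity).
headings = [random.uniform(0.0, 360.0) for _ in range(50)]
spread_before = max(headings) - min(headings)

for _ in range(2000):
    headings = step(headings)
spread_after = max(headings) - min(headings)
# The spread of headings collapses: coordinated movement emerges from
# local interactions alone, with no director or designer in control.
```

The point of the sketch is that the coordination is a property of the interactions, not of any individual rule: removing the loop and inspecting a single agent in isolation reveals nothing about the flock-level pattern.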
Emergence is a process of the elements self-organising into a qualitatively novel state of interrelation, and hence a higher-level order. Emergence occurs when previously uncorrelated elements or processes in the system suddenly become coordinated and interconnected (Juarrero, 1999). An example of this process is the emergence of impersonal exchange in economies. Interrelations between individual market actors over time lead to the establishment of institutions that allow for impersonal exchange. Yet societies have not simply decided to design these institutions and put them in place from one day to the next; rather, they have evolved over time.
Under emergent order, causality is not predictable because the structure of these systems is not fixed but continuously created by the interactions of the actors. The structure changes with the behaviour of the actors in the system, and the behavioural choices in turn depend on the structure. This feedback loop creates continuous, dynamic adaptation. Interventions change the system, so repeating the same intervention will lead to a different result. Hence an understanding of the causal relations behind each change can only be gained in hindsight, not through foresight. Snowden (2011) therefore describes emergent order as being only retrospectively coherent. In other words, the causality between an intervention and its effect can only be assessed once the intervention has been implemented. In such systems, analysis and intervention have to merge into a process of continuous trial, learning and adaptation.
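The feedback loop between structure and behaviour can be reduced to a minimal caricature. In the sketch below, the single `sensitivity` value is a hypothetical stand-in for the system's structure (it is not a model from the sources cited here): the intervention's effect depends on the current structure, and the intervention itself reshapes that structure, so the identical action never meets the same system twice.

```python
def intervene(system):
    """Apply the same action to the system. The effect depends on the
    system's current structure ('sensitivity'), and applying the action
    itself reshapes that structure."""
    effect = system["sensitivity"]
    # The actors adapt: the system halves its sensitivity to this action.
    system["sensitivity"] *= 0.5
    return effect

system = {"sensitivity": 1.0}
first = intervene(system)
second = intervene(system)
# Identical interventions, different effects: the causal relation is
# only assessable in hindsight, once the response has been observed.
```

This is what retrospective coherence looks like in miniature: nothing in the intervention alone predicts its effect; only observing the system's response, and how the system changed in responding, reveals the causal link.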
Typically in these situations, “there is not only considerable disagreement about the nature of the situation and what needs to be done, but also about what is happening and why. The relationship between an action and its consequences is unknowable beforehand, depending considerably on context” (Hummelbrunner & Jones, 2013:2). These systems are called complex systems or complex adaptive systems.
The current overall functionality of the system has emerged because of the way the components currently function or behave, whether they are perceived as working correctly or being broken. Complex systems often continue to work when one component fails as each part continuously adapts to the functioning of the other parts to preserve the overall functionality of the system. Optimising individual parts will have unintended and unpredictable effects on the functioning of the overall system.
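The contrast between complicated and complex failure modes can be sketched with made-up loads and capacities (the functions and numbers below are illustrative assumptions, not drawn from the sources): in the series-like complicated system one broken part stops everything, while in the adaptive system the remaining parts pick up the failed part's load.

```python
def complicated_output(parts):
    """A series system: if any part is broken, nothing works
    (like a car engine with a snapped timing belt)."""
    if not all(p["ok"] for p in parts):
        return 0.0
    return sum(p["load"] for p in parts)

def complex_output(parts, demand):
    """An adaptive system: the working parts absorb the load of a
    failed one, up to their spare capacity, so overall functionality
    is preserved."""
    working = [p for p in parts if p["ok"]]
    capacity = sum(p["capacity"] for p in working)
    return min(demand, capacity)

parts = [
    {"ok": True, "load": 10, "capacity": 15},
    {"ok": False, "load": 10, "capacity": 15},  # one part has failed
    {"ok": True, "load": 10, "capacity": 15},
]
demand = 30

complicated = complicated_output(parts)   # total breakdown
adaptive = complex_output(parts, demand)  # the others compensate
```

The flip side, as the paragraph above notes, is that the same adaptive coupling means "improving" one part in isolation propagates through the others in ways that cannot be predicted from that part alone.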
This description of complexity and complex systems forms the basis of the understanding of the economy presented in complexity and evolutionary economics. Social technologies and effective institutions emerge without a central director or designer and provide an emergent order for human interaction. Effective institutions are the reason humans can achieve capabilities that are not accessible to the individual. For example, institutions are needed to coordinate specialised knowledge in an industry. The institutional landscape co-evolves, and the institutions are consequently strongly interrelated. Optimising them in isolation will have unintended and unpredictable effects on the overall system.
HUMMELBRUNNER, R. & JONES, H. 2013. A Guide to Managing in the Face of Complexity. ODI Working Paper. London: Overseas Development Institute.
JUARRERO, A. 1999. Dynamics in Action: Intentional Behavior as a Complex System. Cambridge, Massachusetts; London, England: MIT Press.
KURTZ, C.F. & SNOWDEN, D.J. 2003. The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42(3): 462-483.
SNOWDEN, D.J. 2011. Good fences make good neighbors. Information Knowledge Systems Management, 10(1-4): 135-150.