What Is Complexity Science?
It Is Really Order-Creation Science

Bill McKelvey
UCLA, USA

Introduction

Order and its synonyms mean to put persons or things into their proper places in relation to each other. Disorder, to natural scientists, means the 2nd law of thermodynamics, namely, inexorable dissipation toward entropy and randomness. Kauffman (1993) and Holland (1995) use the term order in the titles of their books, respectively The Origins of Order and Hidden Order. Mainzer (1997) titles his book Thinking in Complexity, but on page 1 he says: "The theory of nonlinear complex systems is an interdisciplinary methodology to explain the emergence of certain macroscopic phenomena via the nonlinear interactions of microscopic elements in complex systems." Every subsequent chapter starts with a question about the emergence of order in matter, life, brain, computer, and social systems. It is not by happenstance that our journal is titled Emergence!

Views of order creation have changed over the last century, as one might expect. Classical management theorists (Massie, 1965) say that order comes solely from the (rational?) thoughts and actions of owner/managers, captured nicely in the following quote attributed to Henry Ford: "Why is it that whenever I ask for a pair of hands, a brain comes attached?"1 The Darwin-Wallace theory of natural selection (Darwin, 1859) explains speciation in the biological world, that is: Why are there different kinds of organisms? Durkheim (1893) and Spencer (1898) also define order as the emergence of kinds, specifically social entities. Half a century later, however, Sommerhoff (1950), Ashby (1956, 1962), and Rothstein (1958) define order not in terms of entities but rather in terms of the connections among them. Ashby adds two critical observations. Order (organization), he says, exists between two entities, A and B, only if the link is conditioned by a third entity, C (Ashby, 1962: 255). If C symbolizes the environment, which is external to the connection between A and B, environmental constraints are what cause order (Ashby, 1956). This results in his law of requisite variety (Ashby, 1956): For a biological or social entity to be efficaciously adaptive, the variety of its internal order must match the variety of the environmental constraints. Furthermore, he also observes that order does not emerge when the environmental constraints are chaotic (Ashby, 1956: 131-2).

But what causes emergent order and self-organization? What is the underlying generative mechanism or engine of order creation? How is the order-creation process inside firms linked to their competitive context? Science is about finding causes of phenomena (Pearl, 2000; Salmon, 1998). If you start with the Prigogine line of thought (updated in Nicolis & Prigogine, 1989) and continue with Mainzer's (1997) development, it is clear that the only engine of order creation considered in complexity science, so far, is the Bénard process:

  1. Negentropy becomes available because of the energy differential, or adaptive tension, existing between a system and its surroundings; imposed on the microagents within the system, this tension causes emergence.
  2. The 1st and 2nd critical values of R, the measure of tension, define the lower and upper bounds of the region of emergence (self-organization or complexity) sandwiched between the regions of order (slow change) and chaos (dysfunctional change).

Prigogine's basic argument is that the 1st and 2nd laws of thermodynamics would not exist if order had not been created in the first place. Darwin's process of natural selection is irrelevant if order has not been created in the first place. Complexity science, as a general explanation of emergent order, is problematic and inconsistent in accounting for the Bénard process, as is evident in the literature emerging from the physical, biological, and social sciences. Worse, the basic causal process underlying emergence has largely been ignored in most managerial and organizational applications of complexity science.

First, I review explanations of how order in matter (what Gell-Mann calls coarse-graining) emerges from the fine-grained structure of the entangled (correlated) histories of pairs of agents. Then I consider biological systems, dissipative structures, the Bénard process, and order creation in organizations. Following Mainzer (1997), my analysis leads to the inescapable conclusion that complexity science is really order-creation science mistakenly characterized by a relatively extreme end state, complexity.

ORDER-CREATION THEORY IN COMPLEXITY SCIENCE

COARSE-GRAINING

In a book written for popular consumption, Gell-Mann (1994: Chapter 11) uses a few simple terms to explain how electrons interact with one another such that the quantum state of the one is affected by the other; thus, over a series of time intervals, their quantum states are correlated.2 This is referred to as entanglement. The quantum state of a given electron is, thus, a function of its entanglement with all the other electrons with which it is correlated. At any given time, in a sequence of time intervals, each electron has a history of effects from all the other electrons with which it has come in contact. Because of the countless correlations, and the differing quantum states of all the other electrons, each individual history is likely unique. Consequently, quantum theorists cannot attach a probability of occurrence to each individual electron's history. Instead, they use a quantity, D(A, B), to record the relation between the quantum histories of two correlated electrons over time; thus, D is always assigned to pairs of individual electron histories, A and B. Entanglement occurs when the correlation between the histories of a pair of electrons is greater than zero. If the individual histories are correlated, they are said to interfere with each other. Since most histories are correlated with other histories, D is seldom a probability. If histories almost always interfere, and thus D is almost never a probability, the root question is: How can physicists predict with probability, let alone with what seems to most of us virtual certainty?
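
For readers who want the formalism behind Gell-Mann's popular account, the decoherent-histories literature he draws on (see note 4) writes D as a decoherence functional. The two lines below are only a compressed restatement of that standard notation, not a derivation: ρ is the initial quantum state and C_A is the time-ordered chain of projectors that picks out history A,

$$ D(A,B) \;=\; \mathrm{Tr}\!\left[\, C_A \,\rho\, C_B^{\dagger} \right], \qquad C_A = P^{n}_{a_n}(t_n)\cdots P^{1}_{a_1}(t_1). $$

The interference terms are the off-diagonal entries of D; only when they (approximately) vanish does the diagonal behave as an ordinary probability, which is exactly what coarse-graining is meant to deliver:

$$ \operatorname{Re} D(A,B) \approx 0 \ \text{ for } A \neq B \;\;\Longrightarrow\;\; p(A) = D(A,A). $$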

Gell-Mann refers to the world of interference-prone histories as fine-grained structure. Thus, the quantum world is the fine-grained structure, whereas he labels the world of quasiclassical physics the coarse-grained structure. The question then arises: How does coarse-grained structure emerge from fine-grained (entangled) structure? He uses the metaphor of a racetrack. As you get to your seat at the racetrack and consider the odds of your favorite horse winning, you ignore all of the other factors that could affect the race: quality of horse feed and vets, the state of the track, sunlight, temperature, wind, swirling dust, flies, the nature of the other people betting, the track owners, the mental state and health of the jockeys, and a hundred other factors that conceivably could affect the outcome of a race. All other times and the history of everything else in the universe are ignored.

How do the race probabilities emerge from the interference of the fine-grained structure? Gell-Mann says that when we sum over all of the detailed factors left out (not the tips of the noses of the few horses in, say, the fourth race), the interference effects average out at approximately zero. Hence, the myriad tiny correlations among the details have no net effect. The context of our interest in the winning horse causes us to sum over all the other fine-grained correlations. The race-relevant correlations among all the fine-structure effects are focused on and become the coarse-grained structure, whereas all the other detail correlations are summed over and their interference made irrelevant. When this happens, there are really three effects: (1) most of the history quantities, D, are ignored, that is, summed over; (2) the few correlated histories that become important do so because of the particular time and place, meaning that the histories are similar and conjoined, or the horses wouldn't be in the same race at the same place at the same time; and (3) since the interferences among these few correlated histories disappear, they become truly probabilistic and, thus, we can talk reasonably of the probability that one horse will nose out another.
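
The summing-over claim is easy to see numerically. The toy sketch below (mine, not Gell-Mann's) adds up a large number of interference terms with effectively random phases: their sum stays small, of the order of the square root of the count, while the always-positive diagonal terms grow with the count, so the interference contribution washes out in comparison.

```python
# Toy illustration of "summing over" fine-grained detail: cross terms with
# random phases largely cancel, while the non-negative diagonal terms
# accumulate.  Purely illustrative; this is not Gell-Mann's formalism.
import cmath
import random

random.seed(0)
n = 100_000
# Unit-magnitude amplitudes with random phases stand in for fine-grained histories.
amps = [cmath.exp(1j * random.uniform(0.0, 2.0 * cmath.pi)) for _ in range(n)]

diagonal = sum(abs(a) ** 2 for a in amps)                              # grows like n
interference = sum(a * b.conjugate() for a, b in zip(amps, amps[1:]))  # random-phase cross terms

print(f"diagonal sum:       {diagonal:.0f}")           # ~100000
print(f"|interference sum|: {abs(interference):.0f}")  # ~sqrt(n), a few hundred
```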

Gell-Mann says: "A coarse-grained history may be regarded as a class of alternative fine-grained histories, all of which agree on a particular account of what is followed, but vary over all possible behaviors of what is not followed, what is summed over" (Gell-Mann, 1994: 144). Empirical researchers play this game every time they assume that the various effects not specifically hypothesized, or designed into the study as control variables, are randomized. That is, they neutralize each other and are, thus, summed over. The emergent coarse-graining process overcomes the interference-term effect by translating entanglement into probability, what Gell-Mann speaks of as decoherence (Gell-Mann, 1994: 146).3 Recall that the interference terms are the myriad correlations between pairs of particles in the fine-grained structure. Coarse-graining results in selecting out from the myriad the correlated histories of the same kind and the same level of relationship. Gell-Mann says that coarse-graining washes out the interferences among histories in the fine-grained structure (Gell-Mann, 1994: 145-6).

Roland Omnès (1999)4 develops an interpretation that connects better with complexity science. He makes a strong association between irreversibility, dissipation, and decoherence, arguing that "the essential character of decoherence appears to be irreversibility" (Omnès, 1999: 196).

He shows that decoherence is an irreversible dynamical process (Omnès, 1999: 206). Complexity scientists should note the parallel between Omnès's and Prigogine's treatments of time irreversibility (Prigogine & Stengers, 1984). Omnès suggests a total Hamiltonian, H = Hc + He + H1, where Hc is the Hamiltonian of the relevant internal variables of a system, He is the Hamiltonian of the environmental variables (potentially all other variables or degrees of freedom in the universe), and H1 is a coupling of the two systems representing how the environmental variables affect or are affected by the internal variables (Omnès, 1999: 198). He shows that the dynamical suppression of the environmental interferences of the He Hamiltonian almost immediately produces a large decoherence effect (Omnès, 1999: 203). He bases many of his statements on an axiom by the French mathematician Borel (1937) that "one must consider that events with too small a probability never occur" (Omnès, 1999: 84, 236). While probability mathematicians have to take vanishingly small probabilities into account, he summarizes Borel as saying, "this kind of event cannot be reproducible and should be left out of science" (Omnès, 1999: 84).
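
Schematically, and only as a restatement of the decomposition Omnès describes (the reduced-state notation and the exponential decay law below are the generic textbook results of decoherence theory, with all model-dependent detail suppressed), it is the coupling H1 that suppresses interference in the system's reduced state:

$$ H = H_c + H_e + H_1, \qquad \rho_c(t) = \operatorname{Tr}_e\!\left[\, e^{-iHt/\hbar}\, \rho(0)\, e^{iHt/\hbar} \right], $$

$$ \langle x \,|\, \rho_c(t) \,|\, x' \rangle \;\sim\; \langle x \,|\, \rho_c(0) \,|\, x' \rangle \, e^{-t/\tau_d} \quad (x \neq x'), \qquad \tau_d \ll \tau_{\mathrm{dissipation}}. $$

Tracing over the environmental degrees of freedom (Tr_e) and the extremely short decoherence time τ_d capture the sense in which decoherence is irreversible and, for macroscopic systems, effectively instantaneous, which is the point Omnès uses Borel's axiom to make.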

Omnès's view must be taken into account. His introduction of He recognizes that decoherence and emergent coarse-graining, even in quantum theory, are now subject to the regular-to-chaotic forces imposed on these fields. The external force, and its nature, results from the tension created by the Bénard energy differentials recognized by chaos and complexity scientists that foster negentropy and create emergent structure. In the simple Bénard cell, and in the atmosphere, an energy differential causes energy transfer via bulk (current) movements of gas molecules rather than via in-place vibrations and collisions. More broadly, think of an energy differential as producing coarse-graining among histories of the vibrating molecules, or among histories of bottom-level microagents in general. In this view, the energy differentials of complexity theory become the causes of emergent coarse-grained structure from entanglement pools.

COLLAPSE OF CHAOS

Cohen and Stewart (1994) refer to naturally occurring coarse-graining as emergent simplicity and the collapse of chaos. Their explanation of how coarse-grained structure emerges from fine-grained structure is the opposite of reductionism; thus, their explanation is the antithesis of Gell-Mann's. Gell-Mann's laws of nature, to Cohen and Stewart, are Sherlock Holmes stories that scientists use to explain emergent simplicity. That they are predictive, especially in physics, is a fortuitous accident. In their view, laws of nature "are [coarse-grained] features. They are structured patterns that collapse an underlying sea of chaos [the fine-grained entanglement pool], and they are conditioned by context" (Cohen & Stewart, 1994: 433). Their explanation is contextualist rather than reductionist. Their prime example is evolution (Cohen & Stewart, 1994: 418), really co-evolution (Cohen & Stewart, 1994: 420). Cohen and Stewart see emergent order as resulting from several dynamics.

First, there is the emergence of feedback loops that join entities that otherwise could evolve separately. For example, Cohen and Stewart say that DNA sequences live in DNA space, and in the absence of any other influences would wander around dynamically through the geography of DNA space, seeking attractors and settling on them. Similarly [for] organisms [that] live in creature space. They, too, can evolve independently, seeking attractors and settling on them (Cohen & Stewart, 1994: 419). Both DNA and organism could evolve independently of each other. But it is the joining of these two spaces by feedback loops, the co-evolution of hierarchically related spaces, that counts. This parallels Omnès's coupling of Hc and He. More broadly, it is the interaction of heretofore independent spaces that are inherently conflicting, but coupled because of the effect of other influences, that causes coarse-graining (Cohen & Stewart, 1994: 414). Because the attractors in DNA space are likely to differ from those in creature space, once the feedback loop exists, novel structures are apt to emerge. In this example, and indeed in all of the examples that Cohen and Stewart give, the mechanisms for coarse-graining in biology are Darwinian selectionist processes.

Second, Cohen and Stewart argue that entanglement pools are seldom purely random: "really random systems would not possess statistical regularities" (Cohen & Stewart, 1994: 233; their italics). Thus, emergent structure can follow from statistical features. Absent pure randomness, the correlated histories of quanta or higher-level entities (molecules, genes, organisms, etc.) are distributed probabilistically, with the more probable correlations more likely to lead to emergent coarse-grained structure or the observation of same. Instead of Gell-Mann's dependence on photon scattering to create collapsed wave functions in purely random entanglement pools, they argue that many, if not most, pools are not purely random, and therefore coarse-graining is probable.

Third, Cohen and Stewart observe that many kinds of emergence do not stem from statistical distributions:

There is nothing statistical about π, the Feigenbaum number, the Mandelbrot set, or chlorophyll, DNA, or homeotic genes, for that matter … Statistics is just one way for a system to collapse the chaos of its fine structure and develop a reliable large-scale feature. Other kinds of feature can crystallize out from underlying chaos: numbers, shapes, patterns of repetitive behavior. (Cohen & Stewart, 1994: 233-4)

Fourth, Cohen and Stewart identify some kinds of emergence, specifically crystallography, as immune to the state of entanglement (Cohen & Stewart, 1994: 237). Recall that in Gell-Mann's view of quantum mechanics, the correlated histories of quanta result in purely random quantum states and a purely random entanglement pool. And, in his view, coarse-graining is only a function of photon scattering. In contrast, Cohen and Stewart see the correlated histories of atoms as following the rules of deterministic chaos: "since the motion of atoms is chaotic, their precise behavior is sensitive to initial conditions" (Cohen & Stewart, 1994: 236; their italics). They say:

Quantum systems don't exhibit chaos in the conventional sense, but any classical (that is, nonquantum) theory of large numbers of particles certainly does. Quantum systems aren't chaotic because the infinitely fine structures that are important for chaos are forbidden in quantum mechanics, thanks to the uncertainty principle. (Cohen & Stewart, 1994: 236)

But then they say:

Quantum mechanics has its own form of small-scale chaos: genuinely random fluctuations, rather than the deterministic but effectively random fluctuations of conventional chaos. (Cohen & Stewart, 1994: 237)

What emerges is a level-of-analysis effect: in their view, correlated histories of quantum states are purely random, but the correlated histories of atoms (and, derivatively, all higher levels) are deterministically chaotic (Cohen & Stewart, 1994: 236).

Finally, they say: "Crystal lattices are not just immune to small-scale chaos; they are immune to most of quantum mechanics" (Cohen & Stewart, 1994: 237). Why?

The main thing we need to know is that physical systems tend to minimize their energy … This argument in favor of an atomic lattice is independent of the shape of the atoms or their detailed properties; energy minimization is enough … Crystal lattices are not just phenomena that emerge from quantum mechanics. They have a universal aspect; they will emerge from any theory sufficiently close to quantum mechanics that involves identical roughly spherical atoms and energy minimization. This kind of universality is common to many, perhaps all, emergent phenomena. (italics added; Cohen & Stewart, 1994: 237)

Cohen and Stewart focus on the selectionist effect in biology and the chaos and energy-minimization effects in physics at the level of atoms. They recognize that selection effects produce increasing complexity and increasing degrees of freedom. And although they don't use the term, still, in their view, biological organisms are emergent dissipative structures that, once formed, dissipate imported negentropy. In this sense, their collapse of chaos produces coarse-graining far from equilibrium, to use Prigogine's phrase.

DISSIPATIVE PRESSURE

Prigogine uses dissipative structures to explain both the cause of coarse-graining and its disappearance. Dissipative structures are shown to exist far from equilibrium and seemingly counter to the 2nd law of thermodynamics, the entropy law holding that all order in the universe eventually reverts to purely random disorder and thermal equilibrium (Prigogine, 1962). In this classic monograph, he develops a general theory of irreversibility, that is, entropy, demonstrating systematically the process whereby atoms and molecules showing different momenta and coordinates (the qs and ps in a Hamiltonian expression) reduce to a "sea" of highly multiple incoherent correlations (Prigogine, 1962: 8). Having translated the qs and ps into correlated histories, Prigogine sets the stage for carrying his analysis across the seeming discontinuity between atoms and molecules and the lower-level correlated histories that Gell-Mann mentions in his analysis. Prigogine's analysis shows how the coarse-graining apparent in the universe can actually, and eventually, reduce to the random correlated quantum histories in the fine-grained structure.

CONTROL PARAMETERS

Control parameters, as Mainzer (1997) uses the term, refer to external forces causing the emergence of dissipative structures in the region of complexity. He begins with a review of Lorenz's (1963) discovery of a deterministic model of turbulence in weather systems. A discussion of research focusing on Bénard cells follows. Here we discover that critical values of the energy (temperature, T) differential between the warmer and cooler surfaces of the cell affect the velocity, R, of the air flow, which correlates with ΔT. The surfaces of the cell represent the hot surface of the earth and the cold upper atmosphere. The critical values divide the velocity of air flow in the cell into three kinds (a short numerical sketch follows the list):

  1. Below the 1st critical value, heat transfer occurs via conduction: gas molecules transfer energy by vibrating more vigorously against each other while remaining essentially in the same place.
  2. Between the 1st and 2nd critical values, heat transfer occurs via a bulk movement of air in which the gas molecules move between the surfaces in a circulatory pattern. We encounter these in aircraft as up and down drafts.
  3. Above the 2nd critical value a transition to chaotically moving gas molecules is observed.
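
To make the three regimes concrete, here is a small sketch of my own (not Mainzer's) that computes the Rayleigh number, the standard form of the control parameter in the Rayleigh-Bénard literature, which is proportional to ΔT, and classifies the resulting regime. The onset value of roughly 1708 is the textbook result for rigid boundaries; the second threshold used below is only a placeholder, since the transition to chaotic flow depends on geometry and other details.

```python
# Illustrative sketch only: classify a Bénard cell's regime by its Rayleigh
# number.  ra_c1 ~ 1708 is the standard onset-of-convection value for rigid
# boundaries; ra_c2 is a placeholder for the geometry-dependent transition
# toward chaotic flow, not a universal constant.

G = 9.81  # gravitational acceleration, m/s^2

def rayleigh_number(delta_t, depth, beta, nu, kappa):
    """Ra = g * beta * delta_T * depth^3 / (nu * kappa)."""
    return G * beta * delta_t * depth ** 3 / (nu * kappa)

def regime(ra, ra_c1=1708.0, ra_c2=5.0e4):
    if ra < ra_c1:
        return "conduction (below 1st critical value: in-place vibration)"
    if ra < ra_c2:
        return "convection rolls (between critical values: emergent structure)"
    return "chaotic flow (above 2nd critical value)"

# A 2 cm air layer with a 5 K temperature differential (rough property values for air).
ra = rayleigh_number(delta_t=5.0, depth=0.02,
                     beta=3.4e-3, nu=1.5e-5, kappa=2.1e-5)
print(f"Ra = {ra:.0f} -> {regime(ra)}")   # falls between the two critical values
```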

Prigogine's emergent dissipative structures form in the region of emergent complexity in between the critical values. Cramer (1993) observes that the three regions marked off by the critical values define three kinds of complexity: subcritical (below the 1st critical value), critical (between the 1st and 2nd), and fundamental (above the 2nd). His definitions appear in Table 1 below. The algorithmic compressibility characterizing all the laws of classical Newtonian science appears mostly in the subcritical region, but also in the fundamental region of deterministic chaos. Mainzer (1997: 63) says, "mathematical symmetry is defined by the invariance of certain laws with respect to several transformations between the corresponding reference systems of an observer." Thus, symmetry dominates the subcritical region and to some extent also applies to the fundamental region. Furthermore, the invariant laws are reversible (Prigogine & Stengers, 1984). As a control parameter causes R to move across the critical values, however, the consequence is symmetry breaking, at least in part, because the laws of classical physics do not remain invariant.

As Prigogine (1962; Nicolis & Prigogine, 1989) observes, in the region of emergent complexity, emergent dissipative structures are created far from equilibrium as a result of energy being imported into the system (at some rate) as negentropy. Although this process is nonlinear and not subject to symmetry, Cramer (1993) observes that once created, dissipative structures become subject to the symmetry and invariant laws of classical physics. The final state of dissipation, that is, of perfect entropy, is easily describable by a master equation from statistical mechanics; the probable positions of millions of particles subject to Brownian motion can be reduced to minimal degrees of freedom. In reverse, the creation of emergent dissipative structures is in fact a creation of degrees of freedom. As Mainzer puts it, "complexity means that a system has a huge number of degrees of freedom" (Mainzer, 1997: 65).

Table 1  Definitions of kinds of complexity by Cramer (1993)

• “Subcritical complexity” exists when the amount of information necessary to describe the system is less complex than the system itself. Thus a rule, such as F = ma = m d²s/dt², is much simpler in information terms than trying to describe the myriad states, velocities, and acceleration rates pursuant to understanding the force of a falling object. “Systems exhibiting subcritical complexity are strictly deterministic and allow for exact prediction” (Cramer, 1993: 213). They are also “reversible,” allowing retrodiction as well as prediction and thus making the “arrow of time” irrelevant (Eddington, 1930; Prigogine & Stengers, 1984).

• At the opposite extreme is “fundamental complexity,” where the description of a system is as complex as the system itself: the minimum number of information bits necessary to describe the states is equal to the complexity of the system. Cramer lumps chaotic and fundamental systems into this category, although deterministic chaos is recognized as fundamentally different from fundamental complexity (Morrison, 1991; Gell-Mann, 1994), since the former is “simple rule” driven whereas fundamental systems are random, although varying in their stochasticity. Thus, three kinds of fundamental complexity are recognized: purely random, probabilistic, and deterministic chaos. For this article I narrow fundamental complexity to deterministic chaos, at the risk of oversimplification.

• In between Cramer puts “critical complexity.” The defining aspect of this category is the possibility of emergent simple deterministic structures fitting subcritical complexity criteria, even though the underlying phenomena remain in the fundamentally complex category. It is here that natural forces ease the investigator's problem by offering intervening objects as “simplicity targets,” the behavior of which lends itself to simple-rule explanation. Cramer (1993: 215-17) has a long table categorizing all kinds of phenomena according to his scheme.

PHASE TRANSITION

In the points below I trace the sequence Mainzer describes and match his steps with Gell-Mann's coarse-graining process (a minimal numerical sketch of the order-parameter idea follows the list):

  1. Start with an existing dissipative structure behaving according to a Newtonian Hamiltonian, a coarse-grained structure in Gell-Mann's terms.
  2. Just before the 1st critical value is reached (from below), unstable vectors (wave packets, modes, energy, forces, motions) appear along with the stable waves.
  3. As the unstable vectors multiply they begin to enslave the stable vectors, thus eliminating the latter. Degrees of freedom are thereby reduced, as is complexity. Decoherence is crumbling, resulting in interference and entanglement. Consequently, coarse-graining is reduced.
  4. The unstable vectors and their degrees of freedom disappear into a stochastic pool of Brownian motion. This leads to a vast reduction in degrees of freedom. Decoherence has nearly disappeared.
  5. The last few unstable vectors remaining become order parameters, acting to create the emergent dissipative structures as the system tips over the 1st critical value into the region of emergent complexity, meaning that the order parameters surviving across the complete phase transition are totally the result of a stochastic process.
  6. At this juncture, order, complexity, and increased degrees of freedom emerge. The result is decoherence and emergent coarse-graining. This is where context has the greatest impact.
  7. The region of emergent complexity persists until the energy differential is reduced by virtue of the continuing emergence of dissipative structures. That is, coarse-graining continues until the energy differential is reduced. Of course, if the energy differential is continuously renewed at the same rate as, or even faster than, the existing dissipative structures can reduce it, more dissipative structures will continue to emerge, unless the energy differential rises over the 2nd critical value; then chaotic processes take over.
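
As a minimal numerical sketch of steps 3-6 (mine, not Mainzer's formalism; the two-mode equations are the textbook example of Haken-style enslavement, and the parameter values are arbitrary), a fast, stable mode s is enslaved by a slow mode u, the order parameter, and a nonzero u emerges only once the control parameter crosses its critical value:

```python
# Minimal sketch of enslavement and an emergent order parameter.  The control
# parameter eps plays the role of adaptive tension crossing the 1st critical
# value (here eps = 0); gamma >> |eps| makes s the fast, enslaved mode.
# Equations and parameter values are illustrative only.

def relax(eps, gamma=10.0, u0=0.01, dt=1e-3, steps=200_000):
    """Euler-integrate du/dt = eps*u - u*s, ds/dt = -gamma*s + u**2."""
    u, s = u0, 0.0
    for _ in range(steps):
        du = eps * u - u * s
        ds = -gamma * s + u * u
        u, s = u + dt * du, s + dt * ds
    return u

# Adiabatic elimination (s ~ u**2/gamma) predicts u -> 0 below the critical
# value and u -> sqrt(gamma*eps) above it; the integration agrees.
for eps in (-0.5, 0.5):
    side = "below" if eps < 0 else "above"
    print(f"eps = {eps:+.1f} ({side} critical value): order parameter u -> {relax(eps):.3f}")
```

Below the critical value the order parameter dies away; above it a macroscopic value emerges whose size is set by the enslaved mode, which is one way of seeing how degrees of freedom first collapse and then reappear across the transition.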

Mainzer teases out the fine-grained process events just before and after the phase transition at the 1st critical value. Recalling Omnès's (1999) argument that decoherence processes occur more rapidly than can ever be measured, we realize that a physical system passes through the several states outlined above very rapidly, perhaps too rapidly to measure. Nevertheless, we see that emergent structure is stochastically driven by the tail end of the disappearing unstable vectors. By this process, at the phase transition, most of the vectors simply disappear into entanglement. But the trace number at the end collapses the vectors (wave packets), thereby creating the order parameters governing the emergence of dissipative structures. This amounts to an explanation of emergent quantum chaos and the vanishingly small initial order parameters that, like the butterfly effect, eventually influence the forms of emergent dissipative structures of quasi-classical physics.

The Bénard energy differential figures centrally in Mainzer's treatment of complexity theory. Omnès does not refer explicitly to something akin to the Bénard process, but he does focus on an external Hamiltonian and context. Ashby and Rothstein emphasize external environmental constraints as causes of order, but they do not define constraints in terms of anything looking like an energy differential. The latter might be inferred vaguely in the background, perhaps, in the Cohen and Stewart treatment. And energy differentials do not figure in Gell-Mann's account of coarse-graining caused by photon scattering, although the photons do represent the context of an external energy source. However, no mention is made of whether they can appear below, between, or above the 1st and 2nd critical values, although presumably, and perhaps rather obviously, background radiation could be below the 1st and an exploding star well above the 2nd. Nevertheless, Mainzer and Omnès argue that energy differentials could or should be taken into account.

Mainzer views complexity science as an exploration of endogenously created nonlinearities operating in the context of control parameters and threshold effects. His analysis carries this theme across matter, life, and mind (real and artificial), and into economic and other social systems. Whether firms are analogized as biological ecologies governed by Darwinian selection, as brains and distributed intelligence, as economies, or as networks of human and social capital (Morgan, 1997; McKelvey, forthcoming-a), Mainzer's analysis applies. Following Schumpeter, Mainzer identifies innovation and technological change as the primary engine setting both the nonlinear and Bénard processes in motion and, thus, creating dissipative structures and emergent order. He specifically mentions Allen's (1988) discovery of these processes at work in urban development as a social system application. Allen's study of Atlantic fisheries (Allen & McGlade, 1986, 1987) and recent analysis of knowledge management (Allen, forthcoming) also instruct.

An even broader extension of the Bénard process stems from Swenson's work (1989, 1998). His law of maximum entropy production holds that "a system will select the path or assembly of paths out of otherwise available paths that minimize the potential or maximize the entropy at the fastest rate given the constraints … The world will select order whenever it gets the chance … the world is in the order-production business because ordered flow produces entropy faster than disordered flow" (Swenson, 1998: 173; his italics). Consider the Big Bang as the ultimate heat source and outer space as the ultimate heat sink. At some point in time, every particle of matter in the universe will pass through the 1st and 2nd critical values of the Bénard process. Order creation of dissipative structures is pervasive and inevitable. Galaxies, the Sun, and the Earth are all order creations for maximizing entropy creation. Life on the surface of Earth emerged in the context of the giant atmospheric and plate tectonic Bénard processes. Western civilization, including all its social systems, organizations, and firms, is a lesser order-creation device that, in fact, is so effective a dissipative process that it is rapidly depleting the resources on which it depends. Innovations and new technologies create energy and resource disparities in economies (Bénard thresholds), with the result that firms, as order creations, emerge to dissipate the energy/resource differentials. Complexity science applications have now spread to the physical, life, social, and management sciences (Nicolis & Prigogine, 1989; Cowan et al., 1994; Belew & Mitchell, 1996; Arthur et al., 1997; Mainzer, 1997; McKelvey, 1997; Byrne, 1998; Cilliers, 1998; Anderson, 1999; Maguire & McKelvey, 1999a, 1999b), among many others. Complexity science is now pervasive and at its core are endogenous nonlinearities and the Bénard process.

EXPLAINING ORDER IN ORGANIZATIONS

KINDS OF ORDER

Three kinds of order exist in organizations: rational, natural, and open systems (Scott, 1998). Rational systems result from prepensive conscious intentionalities, usually by managers. Natural systems, such as informal groups, typically emerge as employees attempt to achieve personal goals in the context of a command-and-control bureaucracy. Open systems are in various ways defined by external forces. That all three exist goes unquestioned. What remains vague, however, are explanations about how they emerge, co-evolve, come to dominate one another, and collectively affect organizational performance. Specifically, how do these three forces combine to produce the order we see in firms, where order is defined in terms of formal structure and process and other patterns of behavior within and by a firm?

McKelvey (1997) defines organizations as quasi-natural phenomena, caused by both the conscious intentionality of those holding formal office (rational systems behavior) and naturally occurring structure and process emerging as a result of co-evolving individual employee behaviors in a selectionist context (natural and open systems behavior). With respect to the latter, two general order-causing effects appear in firms: selectionist microcoevolution (McKelvey, 1997, 1999a, 1999c; forthcoming-a); and complexity catastrophe (Kauffman, 1993; McKelvey, 1999a, 1999c). More broadly, according to thick description researchers (Geertz, 1973) and relativists and postmodernists (Burrell & Morgan, 1979; Lincoln, 1985; Reed & Hughes, 1992; Hassard & Parker, 1993; Weick, 1995; Chia, 1996), naturally occurring order in firms emerges from the conflation of the inherent stochastic idiosyncrasies of individuals' aspirations, capabilities, and behaviors, the social scientists' analog of entanglement, I argue.5

Where to look for developing a theory of natural order emergence in firms? Complexity science, of course.6 Management writers mostly emphasize chaos and complexity theories as a means of better understanding the behavior of firms facing uncertain, nonlinear, rapidly changing environments (Maguire & McKelvey, 1999b). This view is somewhat off the track (McKelvey, 1999b). As demonstrated above, going back to the roots of complexity science in quantum physics and Prigogine's work, we see more accurately that complexity science is fundamentally aimed at explaining order creation. Much of normal science focuses on equating energy translations from one form of order to another, working under the 1st law of thermodynamics. This is all in the context of the order within existing dissipative structures. The 2nd law of thermodynamics focuses on the inevitable disintegration of existing order. Also, I have argued that complexity science aims to explain the emergence of order; it is order-creation science.

DECOHERENCE AND EMERGENCE

Using complexity science, I have outlined the idea that quantum wave packets are collapsed by external forces and particularly by imposed energy differentials, following the Modern Interpretation. Not to have done this would have left entanglement, and the decoherence of it via the human observer (Mermin, 1991; or Mills' (1994) watcher of the universe), solidly in the hands of relativists and postmodernists who decry normal science because everything that is ostensibly and objectively detected by science is interpreted subjectively by the human observers: what we see is nothing more than the result of wave packets collapsed by subjective human observers. This would encourage the subjective, loose, metaphorical treatment of the term entanglement, as it is applied to social systems.

I can now remind organization scientists that the most fundamental question of complexity science is: What causes order creation? Complexity theory applications to firms rest on environmental constraints in the form of Bénard energy differentials as the engines of order creation, defined as the emergence of both entities and connections constrained by context. The latter, when applied to firms, are best thought of as adaptive tension parameters (McKelvey, forthcoming-a). Going back to the Bénard cell, the hot plate represents a firm's current position; the cold plate represents where the firm should be positioned for improved success. The difference is adaptive tension. This tension motivates the importation of negentropy and the emergence of adaptation-fostering dissipative structures, assuming that the tension lies between the 1st and 2nd critical values.

My review of entanglement, decoherence, and coarse-graining, modified by reference to complexity science and ranging from quanta to social systems, uncovers the second fundamental question in applying complexity science to firms, so far totally unrecognized: Emergence from what? Organization scientists and managers about to apply complexity science to firms cannot willy-nilly assume that entanglement exists uncorrupted in a given firm. Absent entanglement, altering adaptive tension parameters could produce maladaptive results. The nature of the initial pool of entangled particles appears essential to the coarse-graining process. In Gell-Mann's view, coarse-grained structure emerges from entangled fine-grained structure as a result of external influences. Remove the external influence and macro structure disappears: the bulk currents in the Bénard cell cease, and coarse-grained quanta disappear back into wave packets. If energy differentials are viewed as causes of coarse-graining, four critical differences appear:

  1. Given an initially pure, uncorrupted, or untampered-with pool of entanglements, the first coarse-graining resulting from an imposed energy differential could alter entanglement in an irrevocable fashion, whether in physical, biological, or social entanglement pools.
  2. Whereas in the Newtonian physical world (Cramer's 1993 subcritical complexity) of quanta and molecules the energy-differential effect is time reversible, in the biological and social worlds, as Prigogine would say (Prigogine & Stengers, 1984), it is a time-irreversible process. Omnès includes the physical world as well.
  3. As a consequence, especially in biological and social entanglements, any subsequent coarse-graining starts with some vestige of the prior coarse-graining effects remaining in the entanglement pool. This means that complexity science in the biological and social worlds is fundamentally different from that in the physical world.
  4. In the social world, and particularly in the world of firms, there is the possibility, if not the actual advantage or necessity, of constantly managing to preserve or recreate one or more pools of fine-grained entanglements as primordial bases from which subsequent energy differential-caused coarse-grained structures emerge.

To summarize, the logic sequence, in agent7 terms, is as follows (a toy agent-based sketch follows the list):

  1. There is some level of correlation between the histories of all possible pairs of agents in the fine-grained structure.
  2. Because each agent interferes with all the others, probabilities of how one agent affects another cannot be assigned; their destinies, thus, are entangled.
  3. Coarse-graining washes out interference terms in the fine-grained structure, that is, coarse-graining washes out entanglement and results in probabilities, and probabilistic natural laws, rather than interferences.
  4. Energy differentials, that is, adaptive tension, impinging on agents can, therefore, cause coarse-graining and the creation of probable outcomes emerging from the pool of entangled agents.
  5. In addition to causing coarse-graining, the likelihood that the energy-differential field effect will disrupt the entanglement pool so as to corrupt the purity of entanglement, so to speak, increases, going from physical to biological to social worlds.
  6. Because of the feedback effect, the interrelation of entanglement and adaptive tension in social systems sets them apart from physical and to some extent biological systems, although I would not rule out the effect in physical systems. For example, in a Bénard cell, if one removes the energy differential the molecules revert to the conductivity state and it is as if there had been no emergent structure. With organizations, however, successive emergent orders leave an accumulated legacy that usually does not disappear if the adaptive tension is removed, although it could easily deteriorate into a somewhat different coarse-graining.
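
To make points 4-6 concrete, the following is a deliberately toy agent-based sketch; every name, rate, and rule in it is hypothetical, invented only for illustration. An imposed adaptive tension pulls a disordered pool of agents toward an emergent order, and a "legacy" parameter distinguishes physical-like pools, which revert when the tension is removed, from social-like pools, which retain an accumulated legacy.

```python
# Toy agent-based illustration of the logic sequence above.  Agents start as
# an uncorrelated "entanglement pool"; an imposed adaptive tension pulls their
# states toward an emergent order; when the tension is removed, physical-like
# agents revert to their baselines, while social-like agents have updated
# their baselines and so retain a legacy.  All values are illustrative only.
import random

def run(n_agents=200, steps=50, order_target=1.0, legacy_rate=0.0, seed=1):
    rng = random.Random(seed)
    baselines = [rng.uniform(-1, 1) for _ in range(n_agents)]  # disordered pool
    states = list(baselines)

    # Phase 1: adaptive tension imposed -> states pulled toward emergent order,
    # and (for social-like agents) baselines slowly follow the states.
    for _ in range(steps):
        for i in range(n_agents):
            states[i] += 0.3 * (order_target - states[i]) + rng.gauss(0, 0.02)
            baselines[i] += legacy_rate * (states[i] - baselines[i])

    # Phase 2: tension removed -> states relax back toward their baselines.
    for _ in range(steps):
        for i in range(n_agents):
            states[i] += 0.3 * (baselines[i] - states[i]) + rng.gauss(0, 0.02)

    return sum(states) / n_agents

print("physical-like agents (no legacy): mean state after relaxation =",
      round(run(legacy_rate=0.0), 2))   # ~0: emergent order vanishes
print("social-like agents (with legacy): mean state after relaxation =",
      round(run(legacy_rate=0.1), 2))   # ~1: emergent order persists
```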

Given the definition of complexity science presented here, what should managers worry about? I don't have space for details (see instead McKelvey, forthcoming-a, forthcoming-b), but some key elements are as follows:

  1. Before emergent order creation has any chance of being efficacious, the uncorrupted entanglement pools from which order emerges must already exist or be created. These are the initial conditions.
  2. Goal setting becomes a context-identification process: a process of identifying which kinds of adaptive tension parameters should be the center of attention. Besides being identified, incentives for paying attention to them have to be put in place. A classic example is Jack Welch's "Be #1 or #2 in your industry or your division will be sold." It defines context, an adaptive tension, and motivation all in one short phrase. This sets up the Bénard process.
  3. Focus on enlarging the region of emergent complexity (order creation). Some firms cycle between bureaucracy and chaos, because the region of emergence is virtually nonexistent. Focus on lowering the 1st and raising the 2nd critical values. This increases the probability of Bénard processes and emergence.
  4. Agency problems and other noxiants need to be avoided (via strange-attractor management) so that emergence does not head in directions clearly not in a firm's best interest.

CONCLUSION

My review suggests the following:

The root question in quantum theory expands, in complexity science, into a multidisciplinary concern about the engine that causes order creation in matter, life, brains, artificial intelligence, and social systems (Mainzer, 1997). Is there one primary engine working up and down the hierarchy of phenomena, from matter to social systems, or are there several, and do they differ across disciplines? From all of this, I draw out two key elements that seem particularly relevant in the application of complexity theory to organizations: the notion of correlated histories between pairs of agents, that is, entanglement, as the initial condition; and the Bénard process as the main engine of order creation so far discovered that applies across the hierarchy of phenomena, and down into organizations, in addition to the Darwinian selectionist process and human rationality, of course, that we already know about.

NOTES

  1. Quoted in Hamel (2000: 102).
  2. I have double-checked everything Gell-Mann says against the recent modern interpretation by Omnès (1999), whom Gell-Mann cites with approval. The Omnès treatment is more technical and treats at book length what Gell-Mann covers in one chapter. Their views are consistent, but, for example, they do view the collapse of the collective wave packet(s) that is Mars in somewhat different ways. In addition, Omnès holds that decoherence in the universe is so pervasive and instantaneous that decoherence has occurred long before any observer happens on the scene; thus observers such as the watcher (Mills, 1994) are superfluous.
  3. Omnès (1999: 75) defines decoherence as the absence of macroscopic interferences.
  4. It is worth noting that Gell-Mann (1994: 138, 140) says of Roland Omnès as follows: "Among those who have made especially valuable contributions are Robert Griffiths and Roland Omnès, whose belief in the importance of histories we [referring to James Hartle and himself] share … Hartle and I, like Griffiths and Omnès, make use of the fact that the questions always relate ultimately to alternative histories of the universe." (A history is merely a narrative of a time sequence of events, past, present, or future.)
  5. See McKelvey (forthcoming-c) for further discussion of the marriage of postmodernist ontology and normal science epistemology.
  6. Sociologists have studied the process of emergent social order since Durkheim (1893) and Spencer (1898). For recent examples, see Ridgeway & Berger (1986, 1988); Berger et al. (1998), and Mark (1998). Ridgeway and Berger focus on power legitimation. For them, differentiation follows from the influence of forces external to the social system. Mark focuses on information effects. For him, however, differentiation can emerge in totally undifferentiated systems without the effect of external forces.
  7. In agent-based computational models, an agent can represent any microentity, such as electrons, atoms, molecules, cells, organisms, species, language/process/conversation elements, individuals, groups, divisions, firms, etc. I use it in this catch-all sense here.

REFERENCES

Allen, P. M. (1988) Self-organization in the urban system, in W. C. Schieve & P. M. Allen (eds), Self-Organization and Dissipative Structures: Applications in the Physical and Social Sciences, Austin, TX: University of Texas Press: 142-6.

Allen, P. M. (forthcoming) A complex systems approach to learning, adaptive networks, International Journal of Innovation Management, 5.

Allen, P. M. & McGlade, J. M. (1986) Dynamics of discovery and exploitation: The Scotian Shelf fisheries, Canadian Journal of Fisheries and Aquatic Sciences, 43: 1187-200.

Allen, P. M. & McGlade, J. M. (1987) Modelling complex human systems: A fisheries example, European Journal of Operational Research, 30: 147-67.

Anderson, P. (1999) Complexity theory and organization science, Organization Science, 10: 216-32.

Arthur, W. B., Durlauf, S. N., & Lane, D. A. (eds) (1997) The Economy as an Evolving Complex System, Proceedings of the Santa Fe Institute, Vol. XXVII, Reading, MA: Addison-Wesley.

Ashby, W. R. (1956) An Introduction to Cybernetics, London: Chapman & Hall.

Ashby, W. R. (1962) Principles of the self-organizing system, in H. von Foerster & G. W. Zopf (eds), Principles of Self-Organization, New York: Pergamon: 255-78.

Belew, R. K. & Mitchell, M. (eds) (1996) Adaptive Individuals in Evolving Populations, Proceedings of the Santa Fe Institute, Vol. XXVI, Reading, MA: Addison-Wesley.

Berger, J., Ridgeway, C. L., Fisek, M. H., & Norman, R. Z. (1998) The legitimation and delegitimation of power and prestige orders, American Sociological Review, 63: 379-405.

Borel, E. (1937) Valeur Pratique et Philosophie des Probabilités, Paris: Gauthier-Villars.

Burrell, G. & Morgan, G. (1979) Sociological Paradigms and Organizational Analysis, London: Heinemann.

Byrne, D. (1998) Complexity Theory and the Social Sciences, London: Routledge.

Chia, R. (1996) Organizational Analysis as Deconstructive Practice, Berlin, Germany: Walter de Gruyter.

Cilliers, P. (1998) Complexity and Postmodernism, London: Routledge.

Cohen, J. & Stewart, I. (1994) The Collapse of Chaos: Discovering Simplicity in a Complex World, New York: Viking.

Cowan, G., Pines, D., & Meltzer, D. (eds) (1994) Complexity: Metaphors, Models, and Reality, Proceedings of the Santa Fe Institute, Vol. XIX, Reading, MA: Addison-Wesley.

Cramer, F. (1993) Chaos and Order: The Complex Structure of Living Things, D. L. Loewus (trans.), New York: VCH.

Darwin, C. (1859) On the Origin of Species, London: John Murray.

Durkheim, E. (1893) De la Division du Travail Social: Étude sur l'Organisation des Sociétés Supérieures, Paris: F. Alcan.

Eddington, A. (1930) The Nature of the Physical World, London: Macmillan.

Geertz, C. (1973) The Interpretation of Cultures, New York: Basic Books.

Gell-Mann, M. (1994) The Quark and the Jaguar, New York: Freeman.

Hamel, G. (2000) Reinvent your company, Fortune, 141(June 12): 98-118.

Hassard, J. & Parker, M. (1993) Postmodernism and Organizations, Thousand Oaks, CA: Sage.

Holland, J. H. (1995) Hidden Order, Reading, MA: Addison-Wesley.

Kauffman, S. A. (1993) The Origins of Order: Self-Organization and Selection in Evolution, New York: Oxford University Press.

Lincoln, Y. S. (ed.) (1985) Organizational Theory and Inquiry, Newbury Park, CA: Sage.

Lorenz, E. N. (1963) Deterministic nonperiodic flow, Journal of the Atmospheric Sciences, 20: 130-41.

Maguire, S. & McKelvey, B. (1999a) Complexity and management: Moving from fad to firm foundations, Emergence, 1(2): 19-61.

Maguire, S. & McKelvey, B. (eds) (1999b) Special issue on complexity and management: Where are we?, Emergence, 1(2).

Mainzer, K. (1997) Thinking in Complexity: The Complex Dynamics of Matter, Mind, and Mankind, 3rd edn, New York: Springer-Verlag.

Massie, J. L. (1965) Management theory, in J. G. March (ed.), Handbook of Organizations, Chicago: Rand McNally: 387-422.

McKelvey, B. (1997) Quasi-natural organization science, Organization Science, 8: 351-80.

McKelvey, B. (1999a) Avoiding complexity catastrophe in coevolutionary pockets: Strategies for rugged landscapes, Organization Science, 10: 294-321.

McKelvey, B. (1999b) Complexity theory in organization science: Seizing the promise or becoming a fad?, Emergence, 1(1): 3-32.

McKelvey, B. (1999c) Self-organization, complexity catastrophe, and microstate models at the edge of chaos, in J. A. C. Baum & B. McKelvey (eds), Variations in Organization Science: In Honor of Donald T. Campbell, Thousand Oaks, CA: Sage: 279-307.

McKelvey, B. (forthcoming-a) Dynamics of new science leadership: Strategy, microcoevolution, distributed intelligence, complexity, in A. Y. Lewin & H. Volberda (eds), Mobilizing the Self-Renewing Organization, Thousand Oaks, CA: Sage.

McKelvey, B. (forthcoming-b) Emergent order in firms: Complexity science vs. the entanglement trap, in E. Mitleton-Kelly (ed.), Organizations Are Complex Social Systems, Amsterdam: Elsevier.

McKelvey, B. (forthcoming-c) From fields to science, in R. Westwood & S. Clegg (eds), Point/Counterpoint: Central Debates in Organization Theory, Oxford, UK: Blackwell.

Mermin, D. N. (1991) Is the moon there when nobody looks? Reality and the quantum theory, Physics Today, in R. Boyd, P. Gasper, & J. D. Trout (eds), The Philosophy of Science, Cambridge, MA: MIT Press: 501-16.

Mills, R. (1994) Space, Time and Quanta: An Introduction to Contemporary Physics, New York: Freeman.

Morgan, G. (1997) Images of Organization, 2nd edn, Thousand Oaks, CA: Sage.

Morrison, F. (1991) The Art of Modeling Dynamic Systems, New York: Wiley Interscience.

Nicolis, G. & Prigogine, I. (1989) Exploring Complexity: An Introduction, New York: Freeman.

Omnès, R. (1999) Understanding Quantum Mechanics, Princeton, NJ: Princeton University Press.

Pearl, J. (2000) Causality, New York: Cambridge University Press.

Prigogine, I. (1962) Non-Equilibrium Statistical Mechanics, New York: Wiley Interscience.

Prigogine, I. & Stengers, I. (1984) Order Out of Chaos: Man's New Dialogue with Nature, New York: Bantam.

Reed, M. & Hughes, M. (eds) (1992) Rethinking Organization: New Directions in Organization Theory and Analysis, London: Sage.

Ridgeway, C. L. & Berger, J. (1986) Expectations, legitimation, and dominance behavior in task groups, American Sociological Review, 51: 603-17.

Ridgeway, C. L. & Berger, J. (1988) The legitimation of power and prestige orders in task groups, in M. Webster Jr. & M. Foschi (eds), Status Generalization: New Theory and Research, Stanford, CA: Stanford University Press: 207-31.

Rothstein, J. (1958) Communication, Organization and Science, Indian Hills, CO: Falcon's Wing Press.

Salmon, W. C. (1998) Causality and Explanation, New York: Oxford University Press.

Scott, W. R. (1998) Organizations: Rational, Natural, and Open Systems, 4th edn, Englewood Cliffs, NJ: Prentice-Hall.

Sommerhoff, G. (1950) Analytical Biology, London: Oxford University Press.

Spencer, H. (1898) The Principles of Sociology, New York: D. Appleton.

Swenson, R. (1989) Emergent attractors and the law of maximum entropy production: Foundations to a theory of general evolution, Systems Research, 6: 187-97.

Swenson, R. (1998) Spontaneous order, evolution, and autocatakinetics: The nomological basis for the emergence of meaning, in G. van de Vijver, S. N. Salthe, & M. Delpos (eds), Evolutionary Systems: Biological and Epistemological Perspectives on Selection and Self-Organization, Dordrecht, Netherlands: Kluwer: 155-80.

Weick, K. E. (1995) Sensemaking in Organizations, Thousand Oaks, CA: Sage.