Kennesaw State University, USA
It may be intuitively appealing to draw parallels between the implications of complexity science and postmodernism, but it is harder to substantiate them. I will argue in this article that the respective explicit and implicit assumptions of complexity science (or complexity theory, or just complexity for short) and postmodernism are, for the most part, anchored in different ontologies and epistemologies and that they use different sets of methods. I will also argue that stronger parallels can be found between the assumptions of complexity theory and postpositivism.
Young (1991a, 1991b) and Cilliers (1998) argue that chaos and complexity theories are postmodernist. I will briefly address Young's arguments here and return to Cilliers's more detailed arguments at the end of the article. Young contends that chaos theory decenters all claims of perfection, finality, normality, or historical necessity and alters the mission of science to the extent that it is no longer a quest for universal laws yielding prediction, uniformity, certainty, and stability of findings. Thus, it provides a theoretical envelope for a postmodern science.
I disagree. As Price (1997) points out, chaos and complexity theories recognize the need for a modification of the reductionist classical model of science, but they remain grounded in the scientific tradition. Postmodernism, on the other hand, rejects scientific and empirical methods altogether. Complexity theory renders the Newtonian notion of universal laws questionable, but it still offers generalizations about natural and social phenomena. Postmodernism, however, abhors totalizations. While postmodernists stress the arbitrary, conventional, historically contingent, situationally specific, and linguistically constructed nature of social realities, complexity theorists aim to make sense of a world that exists objectively, beyond our language games, although they admit that complex interactions may obscure reality.
Currently there is no unified or coherent framework for complexity theory. The complexity theory I discuss in this article is my reconstruction of the literature to which I have had access. I will use the terms Newtonian/positivist science and postmodernism/poststructuralism without making distinctions between Newtonian and positivist or postmodernist and poststructuralist. These distinctions may be warranted in historical analyses and more elaborate discussions, but for my purposes here they are not necessary.
The term “postpositivism” is used to refer to the loosely connected criticisms of positivism during the second half of the twentieth century, including Popper's falsificationism, Berger and Luckmann's social constructionism, Kuhn's paradigmatic explanation of scientific revolutions, and Feyerabend's anarchistic philosophy of science. These theorists criticized the objectivist and empiricist epistemological assumptions of positivism and contributed to the understanding that knowledge, including scientific knowledge, is contextual, presupposed, constricted, and/or constructed. More recent postpositivist theorists (e.g., Lincoln & Guba, 1985; Fischer, 1995) make similar assertions and draw on hermeneutic and phenomenological propositions, such as that reality is subject to interpretations and that the subject-object distinction positivists make is either unsustainable or problematic. Postmodernists/poststructuralists agree with these criticisms and assertions in general, but stretch them to the point where knowledge becomes a closed system, a language game.
In this article, I will summarize the respective ontological, epistemological, and methodological assumptions of Newtonian/positivist science and postmodernism/poststructuralism, and compare them with those of complexity science. Table 1 summarizes my argument. It shows that the assumptions of complexity theory are closer to those of Newtonian/positivist science than to those of postmodernism/poststructuralism. Both Newtonian/positivist science and complexity theory are realist in their ontological assumptions, but while the former depicts a strictly deterministic picture of reality, the latter adopts both deterministic and indeterministic characterizations of reality. The question of determinism is not even meaningful for postmodernism/poststructuralism, because it is nominalist and sees reality as nothing but a text that is subject to multiple readings, none of which constitutes a true reading.
| Newtonian/positivist science | Complexity science | Postmodernism/poststructuralism |
|---|---|---|
| **Realist ontology**<br>• Discrete entities and events<br>• Linear causality<br>• Total predictability | **Realist ontology**<br>• Co-existence of determinism and indeterminism<br>• Nonlinear relations<br>• Limited predictability<br>• Reality as an emergent whole<br>• Simple-complex: blurred<br>• Phase transitions | **Nominalist ontology and epistemology**<br>• Dissolution of subject-object distinction<br>• Dissolution of subject<br>• No objective reality or knowledge<br>• Knowledge as a language game<br>• No form of knowledge has epistemological privilege |
| **Positivist epistemology**<br>• Subject-object distinction<br>• Objective knowledge<br>• Correspondence theory of truth<br>• Fact-value distinction<br>• Universal laws<br>• Instrumentalism | **Postpositivist epistemology**<br>• Subject-object distinction: problematic<br>• Endophysical (contextual) nature of knowledge<br>• Limited generalizations, or laws of complexity | |
| **Reductionist/analytical models**<br>• Primacy of quantification | **Holistic methods (simulations)**<br>• Some use of analytical and deductive methods<br>• Qualitative and quantitative methods | |
Complexity theory distinguishes itself from Newtonian/positivist science mainly by its epistemological assumptions. While Newtonian/positivist science is empiricist, complexity theory is potentially postpositivist; at least, a group of complexity theorists recognize the contextual nature of knowledge, as I will discuss below. Complexity rejects the Newtonian/positivistic notion of universal generalizations, but accepts that contextual and local generalizations can still be made. Both Newtonian/positivist scientists and complexity theorists use analytical/reductionist and deductivist methods of investigation, but the latter also use holistic simulation methods. Postmodernists/poststructuralists do not require or desire any methodology; their only legitimate method is the deconstruction of texts.
The term “Newtonian science” is usually used in reference to developments in the scientific worldview in the seventeenth century. Its philosophy is still pervasive in most of today's scientific practices. Positivism is a term used interchangeably with “empiricism,” “behaviorism,” “naturalism,” and “science” (Hughes, 1990: 16), but it mainly refers to Auguste Comte's positivist philosophy and the principles of the Vienna Circle of logical positivists. Some other theorists who did not call themselves positivists, such as J. S. Mill and Durkheim, shared its foundational beliefs (Giddens, 1995: 136-46). I will use the terms “Newtonian science” and “positivism” together, because for the most part their philosophical assumptions are the same, or similar.
Newtonian/positivist science is realist; its most fundamental ontological assumption is that reality exists independently from the knowing subject. It also makes the assumption that reality is deterministic in nature. The deterministic worldview is composed of multiple layers of assumptions, as Kellert (1993: 49-76) points out. I will use a modified version of Kellert's layers of determinism in this article. The first layer of deterministic assumptions is that reality is composed of discrete entities and events that can be aggregated hierarchically. Newtonian physics is based on the idea that the universe is made of discrete entities: Matter is composed of molecules, which are in turn composed of atoms, which are composed of subatomic particles, and so forth. The nature of physical reality can be understood by breaking it down to these constituent parts. Such assumptions are echoed in the social sciences, primarily in J. S. Mill's Newtonian view of society. He considered individuals as the elements of society and suggested that the laws of society could be deduced from the study of individuals (Keat & Urry, 1975: 76).
The second layer of deterministic assumptions is that entities and events are causally connected. Philosophers conceptualized the nature of causality in a variety of ways, which are beyond the scope of this article. It is important to emphasize, however, that Newtonian/positivist science adopted Descartes' interactionist view of causality, which suggests that material events have material causes (i.e., that causal relations are independent of influences or perceptions of the knowing subject). The Newtonian/positivist notion of causality is also mainly linear (i.e., that cause-effect relations are proportionate). Newtonian/positivist scientific methods are designed to detect linear relations between variables. Linear equations constitute the backbone of Newtonian/positivist scientific practice; typically, nonlinearities are ignored or considered as afterthoughts (Kellert, 1993: 134).
The third level of deterministic assumptions is that the universe is completely determined and totally predictable. Even if there are practical difficulties in knowing all the details of all the events in the universe, they are predictable in principle “by an all-powerful intelligence or computational scheme, given complete information of instantaneous conditions and the complete set of physical laws” (Kellert, 1993: 60). Total predictability was the founding myth of Newtonian science, as Kellert points out.
Newtonian/positivist science separates the knowing subject from the object of their knowledge. It is assumed that this separation makes objective knowledge possible. The truthfulness of any knowledge can be determined by empirically testing its correspondence to reality (this principle is called the correspondence theory of truth). To make sure that knowledge corresponds to reality, Newtonian/positivist science attempts to separate facts (empirically testable knowledge) from the values of the knowing subject (inexplicable subjective states of their mind or heart).
Newtonian/positivist science aims to discover the universal laws of nature and society. Newton and his followers believed that the universe was an orderly place, and that its governing rules transcended space and time. Newton himself articulated a series of general laws of physics, such as the law of universal gravitation. Later, logical positivists codified the Newtonian notion that universal laws are meant to hold true for all times and places (Keat & Urry, 1975: 9-26).
In Newtonian/positivist science, knowledge is used as an instrument for manipulating realities to accomplish predetermined goals (Bobrow & Dryzek, 1987). Auguste Comte transferred this instrumentalist notion of knowledge into the study of social phenomena. He suggested that the basis of a positive knowledge was its practical applicability and that science was an instrument of control over both physical and social conditions (Keat & Urry, 1975: 71-95).
The methodology of Newtonian/positivist science is reductionist and analytical. If reality is composed of discrete elements and events, then it is assumed that reality can be broken down to its parts, which in turn can be isolated and analyzed to determine the relationships between them. Descartes, a major contributor to Newtonian/positivist scientific thinking, articulated this reductionist view of science. He complemented reductionism with the ideas that logical deduction and quantification were the primary components of the scientific method. He credited mathematics with the ability to set standards of certainty (Hughes, 1990: 17-18). The logical positivists of the early twentieth century also aimed to establish logical deduction and mathematics as the main methods of science and refined the applications of both (Giddens, 1995: 158).
The central problem of hermeneutics is whether and to what extent the true meaning of a text can be found out. This is important for the comparisons I make in this article, because the roots of both postmodernist/poststructuralist thinking and some versions of postpositivist thinking are in this problem.
Hermeneutics has traditionally been concerned with the interpretation of meaning in communications, but contemporary hermeneutic scholars have expanded its scope to include all linguistic forms (speech as well as written text) and other forms of symbolic interaction (rituals, architectural style, and cultural artifacts). Hermeneutic philosophers pose the question whether, or to what extent, a reader can understand the experience of the author of a text or a cultural artifact. Is there a true meaning in the text? Alternatively, does the reader impose their own meaning on a text?
On this issue, hermeneutic theories can be roughly grouped into three. First, there is the “objectivist hermeneutics” of E. D. Hirsch, which suggests that there is a true meaning in a text (i.e., the meaning intended by the author) and that a reader can understand the meaning as it was intended. With this assumption, objectivist hermeneutics comes close to the position of positivist sociological theories. Positivist sociologists look for true meanings in social structures or functions, with the assumption that trained sociologists can interpret them objectively.
The “radical hermeneutics” of Stanley Fish takes the opposite view and posits that there is no predetermined (true) meaning in a text. According to this theory, the original experience and intent of the author are shaped in a historical and cultural context, which cannot be known by the reader. The interpretation of a meaning is a product of the reader's own historical and cultural contexts. The radical hermeneutic position constitutes the basis of the epistemology of postmodernism/poststructuralism.
The “phenomenological hermeneutics” of Hans-Georg Gadamer and others admits that a reader's context influences the meaning they attribute to a text, but the text also has autonomy. According to Gadamer, a good hermeneutic understanding is the result of an authentic dialog between the past (authors) and present (us). This position, I suggest, encapsulates the foundational belief in postpositivist epistemology. Postpositivist epistemology is based on the notion that the contents of all forms of knowledge are contextual, constricted, or constructed. And this is the epistemological implication of complexity theory, as I will discuss later.
Postmodernist/poststructuralist philosophies not only counter the realist ontological assumptions of Newtonian/positivist science, but also dissolve the distinctions between ontological and epistemological questions. These philosophies are inspired by Husserl's dissolution of the subject-object distinction (Madison, 1988: xiii), but they go beyond his position.
Husserl posits that the subject and object are not only relational, but also mutually constituted. There is no such thing as an “objective knowledge of the world,” because what some may call an “objective reality” is nothing but the object of the intentional acts of human consciousness, according to Husserl (Madison, 1988: 10-11). Poststructuralists take another step and dissolve the subject as well. They argue that there cannot be a coherent or unified subject. The modernist notion of a subject is that they are an agent independent of social relations. This is a mystification according to poststructuralists (Madison, 1988: 42-6). The subject, they contend, is structured within language (Sarup, 1989: 4).
Once the subject-object distinction and the subject are dissolved, no objective knowledge is possible. According to poststructuralists, social reality is constituted as it is interpreted. They deny primacy, in determining the meaning of a text, to even the author of a text (Rosenau, 1992: 37). As I mentioned above, poststructuralists adopt Fish's radical hermeneutic theory, which suggests that both the author's and the reader's meanings are structured within their respective cultural and historical contexts, therefore the meaning imposed by the reader cannot match the meaning intended by the author.
Furthermore, poststructuralists consider all forms of social communication or action as “language games.” The meaning in a situation is determined completely within the language, and language does not refer to anything outside of itself (Rosenau, 1992: 35). In Saussure's and Derrida's poststructuralist theories, linguistic categories do not have any extralinguistic referents (Cilliers, 1998: 38-45). There is no inherent structure in the language that could determine the relationships between linguistic categories (“signifiers”) and the concepts to which they refer (“signified”). Therefore, the interpreter who reads a text cannot rely on a common methodology to discover, extract, or understand any true meanings. Thus, poststructuralism denies any “epistemological privilege” to any interpreter and levels the playing field for all forms of interpreting reality. Scientific discourse is no more privileged than anyone's belief system, for example.
Then there is no need for a specific system of scientific methods. The only method that is considered legitimate by postmodernists/poststructuralists is deconstruction. The main goals of deconstruction are to expose the problematic nature of “centered discourses” (those discourses that depend on concepts such as truth, presence, origin, etc.) and overturn metaphysics by displacing its conceptual limits (Makaryk, 1993: 25). Poststructuralists use deconstruction as a tool to expose the power bases of the seemingly rational and scientific discourses. By doing so, they attempt to de-legitimize the socially constructed bases of authority and elevate the status of alternative, underprivileged discourses. Since all texts can be deconstructed, no privileged position is allowed to any (Schram, 1993).
Where do the assumptions of complexity science stand vis-à-vis those of Newtonian/positivist science and postmodernism/poststructuralism? An important difficulty in answering the question lies in determining what complexity science is. As complexity theorists admit, there is no agreement on the definition of the term “complexity” and the basic principles of a complexity theory (Anderson, 1999; Marion, 1999). Arguably, a theoretical perspective is emerging (Dent, 1999), but there still are different schools of thought and terminological preferences. I define complexity science to include chaos theory, nonlinear dynamics, complex systems theory, and self-organization theory. In my discussion below I will summarize both the commonly accepted propositions of complexity theory and the differences between its schools of thought.
Complexity science is realist in general, but not all complexity theorists see reality as entirely independent from the knowing subject. Some argue that human knowledge is contextual and that the subject-object distinction is problematic. Even if they see reality as having an independent existence, complexity theorists define its nature somewhat differently from Newtonian/positivist science. Complexity theorists describe emergent holistic systems, whose properties are not reducible to those of their parts. These systems are integrated into and co-evolve with their environments. Like Newtonian/positivist science, complexity theorists look for causal relations between events and elements. In general, complexity theory accepts a deterministic ontology. But some complexity theorists (e.g., Prigogine & Stengers, 1984) acknowledge that determinism and indeterminism co-exist in the universe. Even those who do not acknowledge indeterminism as a characteristic of the universe depict the nature of deterministic relations somewhat differently from Newtonian/positivist science. They suggest that the nature of physical reality is mostly nonlinear. Such nonlinear relations allow only for a limited predictability of future events.
There are two main conceptualizations that tie together the different schools of thought in complexity studies. First, the distinctions between “simple” and “complex” become blurred. Second, the notions of “time” and “evolution” are considered essential in all theories. Consequently, systems are seen as undergoing phase transitions from simple to complex or vice versa. With some degree of oversimplification, two schools of thought can be defined in complexity theory: chaos theory and self-organization theory. Phase transitions from simple to complex (or from order to chaos) are the focus of studies in the chaos theory school of thought. Prigogine's and Kauffman's theories of self-organization describe the transitions from complex to simple (or from chaos to order; Goldstein, 1995: 44).
The core concepts of chaos theory (nonlinearity, sensitivity to initial conditions, evolution, phase transitions, and order within chaos) are illustrated best in the properties of the logistic equation. When the logistic equation is solved recursively, it can be seen that the simple set of relations defined in the equation is capable of generating chaotic patterns, simply due to changes in the seed values and the parameter of the equation. It can also be seen that there is a pattern (order) in the seeming randomness of the chaotic regime that emerges from these recursive solutions. Thus chaos theory conceptually combines the Newtonian/positivist opposites: chaos and order.
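These properties are easy to demonstrate numerically. The sketch below (illustrative Python, not part of the original article; the parameter and seed values are my own choices) iterates the logistic equation x_{n+1} = r·x_n·(1 − x_n) and shows both sensitivity to initial conditions in the chaotic regime and convergence to a stable fixed point at a lower parameter value.

```python
def logistic_map(r, x0, n):
    """Iterate the logistic equation x_{n+1} = r * x_n * (1 - x_n),
    returning the whole trajectory starting from seed value x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Chaotic regime (r = 4.0): two trajectories that start a hair's
# breadth apart diverge rapidly -- sensitivity to initial conditions.
a = logistic_map(4.0, 0.2, 50)
b = logistic_map(4.0, 0.2 + 1e-9, 50)
print(abs(a[-1] - b[-1]))  # the tiny initial difference is amplified

# Stable regime (r = 2.0): the same equation settles to a fixed point.
c = logistic_map(2.0, 0.2, 50)
print(c[-1])  # converges toward 0.5
```

The same recursion thus produces either orderly or chaotic behavior depending only on its parameter, which is the sense in which chaos theory combines order and chaos.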
The central concepts of the self-organization school of complexity theory are emergence, bifurcation points (threshold of complexity), autocatalysis, and co-evolution. Self-organization theorists suggest that order emerges out of chaos through phase transitions at bifurcation points. According to Prigogine and Stengers (1984), order emerges out of chaos under what they call “far-from-equilibrium” conditions: “new dynamic states of matter may originate” in interaction of a given system with its surroundings (Prigogine & Stengers, 1984: 12). Kauffman (1995) uses a similar concept: the “threshold of complexity.” This threshold is the point where matter springs into life; it does not happen as an accident of random variation, but it is inherent in the very nature of life (Kauffman, 1995: 43). Life is a natural property of complex chemical systems. When the number of complex molecules passes a certain threshold, an autocatalytic mechanism takes over (i.e., molecules catalyze each other without an external interference). Life emerges as a phase transition; it is a self-organizational phenomenon. Once life emerges, it cannot be explained by reducing its properties to those of its components (molecules); it is an irreducible and emergent whole. Complexity theory shows how a global structure emerges not just from local interaction but also from interactions based on relatively simple rules; complexity emerges from simplicity.
According to Kauffman (1995), evolution is always co-evolution. Organisms co-evolve with other organisms and with a changing abiotic environment. Species live in the niches afforded by other species.
There are no definitive or comprehensive works on the epistemology of complexity science, but two epistemological orientations can be identified in its applications. The first is strictly positivistic, as I discuss below. The researchers in this group apply the mathematical methods of complexity deductively, merely as analytical tools. In the second epistemological orientation, theorists specifically address two issues: contextuality of knowledge and the possibility of discovering universal laws.
Prigogine and Stengers (1984), Rössler (1986), and Casti (1994) emphasize the contextual nature of knowledge. Prigogine and Stengers argue that science should move away from the Newtonian understanding that an observer can step outside the physical reality they are observing toward a new scientific understanding in which the physicist is situated within the observed world. They call it a “new conception of objectivity,” which is “subject to intrinsic constraints that identify us as part of the physical world we are describing” (Prigogine & Stengers, 1984: 218). Similarly, Rössler (1986) emphasizes that a shift toward a new scientific paradigm should include a movement “from the usual detached, ‘exophysical' way of looking at one's model worlds to an understanding of ‘endophysical' one” (Rössler, 1986: 320). Casti (1994) points out that knowledge of a system depends on the position of the knowing subject (i.e., whether they are inside or outside the system). But it is more than being inside or outside, because the nature of a system is partly defined by the subject: “Whatever complexity such systems have is a joint property of the system and its interaction with another system, most often an observer and/or controller” (Casti, 1994: 269).
This new, endophysical view of objectivity brings complexity theory close to the position of phenomenological hermeneutics. The phenomenologist Heidegger argues that our knowledge is thoroughly embedded in history and language (Makaryk, 1993: 91). Complexity theorists' endophysical view complements Heidegger's: Our knowledge is embedded also in our physical existence. Thus the subject-object distinction becomes problematic in complexity theory. Does this place complexity theory in the same camp as postmodernism/poststructuralism? I think not. Complexity theorists still believe that an objective knowledge is possible, although it may be constricted by the situation of the knower. They do not see science as a language game, nor do they use deconstruction as their only or primary method.
Prigogine and Stengers (1984) and Kauffman (1995) disagree with the Newtonian/positivist notion of universal laws. However, unlike postmodernists/poststructuralists who think that no knowledge is generalizable, these complexity theorists believe that generalizations can be made, at least to an extent. Prigogine and Stengers (1984: 60-61) argue that the evolutionary (irreversible) nature of time lies at the origin of self-organization, which in turn nullifies the notion of determinism in the form of predictability of future behavior. Thus, universal generalizations about systems' behaviors are not possible; however, Prigogine and Stengers point out that generalizations can be made about the “islands of determinism” that exist within the mostly indeterministic universe.
Kauffman (1995) takes a different approach. He argues that the laws of complexity can be discovered. This brings his understanding closer to the Newtonian notion of universal laws. However, his notion of law has an important difference. According to Kauffman, the order in the biological world emerges because of the laws of complexity, and the emergent biological whole exhibits collective properties that are not readily explained by understanding the parts; they should be understood in their own right (Kauffman, 1995: vii-viii). Thus, Kauffman breaks up the Newtonian connection between reductionist/analytical thinking and universal laws (i.e., that once the components of a system are understood, the whole system can also be understood by aggregating pieces of information).
Complexity theory does not change the Newtonian/positivist instrumentalist notion of knowledge. Complexity theorists still seek knowledge for its practical applicability and for controlling or modifying natural phenomena. The knowledge of complexity has been applied in engineering problems for some time now. More recently, policy scientists began exploring ways of using complexity models in solving public policy problems (see, for example, Kadtke & Lempert, 2000).
Complexity researchers use a mixture of methods. Some of them use positivist methods analytically and deductively, without questioning their philosophical implications or underpinnings. Others apply simulation methods that represent a holistic approach.
To “detect” chaos in time-series data, chaos theorists use Lyapunov exponents, Fourier power spectrum analysis, calculation of fractal dimensions, spatial correlation, nonlinear and polynomial regression, and phase diagrams (for detailed descriptions of these methods, see Bergé et al., 1984; Casti, 1994; Guastello, 1995). All of these methods are deductive: data are tested for their degree of fitness to an abstract mathematical model. As such, these applications are very much in the spirit of Newtonian/positivist science.
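As a concrete instance of such a deductive test, a Lyapunov exponent for the logistic map can be estimated by averaging the logarithm of the map's derivative along a trajectory (an illustrative Python sketch; the parameter choices are mine, not drawn from the cited sources). A positive exponent signals chaos; a negative one signals a stable regime. The data are, in effect, tested against an abstract mathematical criterion, which is what makes these methods deductive.

```python
import math

def lyapunov_logistic(r, x0=0.1, n=10000, burn=100):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    by averaging log |f'(x)| = log |r*(1 - 2x)| along a trajectory.
    A burn-in period discards transient behavior first."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n

print(lyapunov_logistic(4.0))  # positive (chaotic regime; theory gives ln 2)
print(lyapunov_logistic(2.9))  # negative (stable regime)
```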
The epistemological implications of the applications of phase diagrams (Poincaré return maps) are mixed. While, for example, Priesmeyer and Davis (1991) utilize phase diagrams in a deductive manner to test the fitness of ideal models, Kiel and Elliot (1992) use them as heuristic tools.
Simulation studies are different from the deductive methods of model testing. They describe the mechanisms and processes of emerging structures and encourage qualitative interpretations of quantitatively generated results. Agent-based simulations (i.e., cellular automata, neural networks, and genetic algorithms, whose differences from each other are of no significance for this discussion) have important implications. In the applications of these methods, the behaviors of agents are simulated using specialized computer software. Collective forms emerge as results of the evolution of agents' behaviors. In other words, “local rules” generate emergent holistic systems (Casti, 1994: 214-19). This approach represents a significant departure from the deductive methodology of Newtonian/positivist science. As Anderson (1999) puts it, causal deductive models strive to simplify reality with a small number of variables, while agent-based simulations generate complex and unique results that probably are more realistic, but are not easy to simplify or make generalizations from.
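A minimal example of “local rules” generating an emergent global pattern is an elementary cellular automaton (a standard textbook illustration, not a method taken from the article). Each cell updates using only its own state and those of its two immediate neighbors, yet a structured global pattern emerges that is not stated anywhere in the rule itself.

```python
def step(cells, rule=110):
    """One update of an elementary cellular automaton: each cell's next
    state depends only on itself and its two neighbors (a 'local rule').
    The rule number's bits encode the output for each 3-cell pattern."""
    n = len(cells)
    out = []
    for i in range(n):
        left, me, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (me << 1) | right  # 0..7
        out.append((rule >> pattern) & 1)
    return out

# Start from a single live cell and watch structure emerge.
cells = [0] * 31
cells[15] = 1
history = [cells]
for _ in range(15):
    history.append(step(history[-1]))

for row in history:
    print(''.join('#' if c else '.' for c in row))
```

No cell “knows” the global pattern; the collective form is a product of iterated local interactions, which is the point Casti (1994) and Anderson (1999) make about agent-based simulation.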
Neither the discernible ontological or epistemological implications of complexity, nor the nature of the methods that complexity researchers use, supports the thesis that complexity science is postmodernist. In fact, its ontological assumptions (realism and determinism) and the deductive methods that chaos theorists use position complexity science closer to Newtonian/positivist science than to postmodernism/poststructuralism. The position of complexity theory can be better characterized as postpositivist, however. The epistemological assumption that knowledge is contextual (embedded, endophysical) brings complexity science closer to postpositivism in general, and to phenomenological hermeneutics in particular. The holistic, emergent, and qualitative/interpretive implications of agent-based simulations also indicate a postpositivist orientation.
As mentioned above, both poststructuralism and postpositivism have their philosophical roots in hermeneutics, and the differences between the phenomenological and radical versions of hermeneutics are not always clear cut. But, as Fischer (1995: 26) points out, we can still draw a demarcation line between the two. While postpositivists see a chance of establishing a common basis for a valid discourse (a basis for truthfulness of interpretations) among or across perspectives, postmodernists/poststructuralists do not. Postmodernism/poststructuralism denies any “epistemological privilege” to any one interpretation over another.
I want to address Cilliers's (1998) argument in this context. Cilliers finds parallels between the connectionist models of the mind (neural network models) and Saussure's and Derrida's poststructuralist descriptions of language. He argues that the poststructuralist descriptions of meaning generation in language and the connectionist models of the mind share the characteristics of complex systems (Cilliers, 1998: 37). The parallels that Cilliers draws between poststructuralism and complexity are insightful, but there are two problems with his approach. First, his conceptualization of complexity is confined to connectionist models. As mentioned above, there is a whole range of theoretical perspectives and methodological applications that can be legitimately placed under the term “complexity science.” Even if connectionism fits into a poststructuralist framework, most other forms of complexity science will not.
Second, Cilliers's explanations do not solve the “relativism problem.” He specifically addresses the criticism of postmodernism that it is relativistic. He wishes to “steer clear of those postmodern approaches that may be interpreted as relativistic,” and argues that Derrida's approach, which he adopts, is not relativistic (Cilliers, 1998: 21-2). The criticism of relativism cannot be dismissed so easily, however. As Cilliers illustrates elaborately, in both Saussure's and Derrida's theories language is seen as a closed system in which meaning is generated. There are no external referents of Saussure's “signifiers”; nor is there a “natural link” between signifiers and concepts within language (Cilliers, 1998: 38-40). In other words, the meaning-making process does not have any structural rules; it is a language game. All forms of metanarratives and discourses are generated solely within language. Consequently, and inevitably, all metanarratives or discourses are denied any “epistemological privilege” over one another in the interpretations of meanings. From this perspective, no external reference is accepted to gauge the validity, or relevancy, of any interpretation, metanarrative, or discourse.
Even if we assume that the poststructuralist argument may be acceptable for the interpretation of social realities, I do not think that it would be reasonable to suggest that all discourses and metanarratives are equal when it comes to the interpretation of natural phenomena. Complexity theorists' notion of endophysical knowledge challenges the Newtonian/positivist correspondence theory of truth and the naïve notions of representation and objective knowledge, and thus problematizes the relationship between knowledge and reality, but it does not indicate that the scientific discourse is not epistemologically privileged, or that scientific signifiers do not have any referents in nature.
Cilliers is aware of the problem: “When we deny the possibility of a theory of representation [as Derrida does], the question concerning the relationship between the distributed system and the world does not, however, disappear” (Cilliers, 1998: 83). His answer to the question is that there must be a “generalized reference” to an outside world in the mind (or in the language) (Cilliers, 1998: 82). He cites Baudrillard, who replaced the notion of “representation” with “simulation”:
A simulation does not attempt to represent some essential abstraction of something real; it rather attempts to repeat it, thereby undermining the distinction between the real and the simulated. (Cilliers, 1998: 84)
The idea that knowledge provides a generalized representation, a simulation, is a reasonable solution. Postpositivists and most complexity theorists would agree with it. However, what postmodernist theorists really mean by “generalized reference” and “simulation” is not clear. These points may be clarified by a dialog between postpositivists and postmodernists/poststructuralists. For the time being, however, Fischer's (1995) demarcation between postpositivism and postmodernism (while the former sees a chance of establishing a common basis for truthfulness of interpretations, among or across perspectives, the latter does not) is a good reference point for discussion. Complexity science aims to establish some degree of truthfulness. If postmodernists agree with that, they still carry the burden of explaining their positions.
Anderson, P. (1999) “Complexity theory and organization science,” Organization Science, 10: 216-32.
Berge, P., Pomeau, Y., & Vidal, C. (1984) Order Within Chaos: Towards a Deterministic Approach, New York: Wiley.
Bobrow, D. B. & Dryzek, J. S. (1987) Policy Analysis by Design, Pittsburgh, PA: University of Pittsburgh Press.
Casti, J. L. (1994) Complexification: Explaining a Paradoxical World Through the Science of Surprise, New York: Harper Perennial.
Cilliers, P. (1998) Complexity and Postmodernism: Understanding Complex Systems, London: Routledge.
Dent, E. B. (1999) “Complexity science: A worldview shift,” Emergence, 1(4): 5-19.
Fischer, F. (1995) Evaluating Public Policy, Chicago: Nelson-Hall Publishers.
Giddens, A. (1995) Politics, Sociology and Social Theory: Encounters with Classical and Contemporary Social Thought, Stanford, CA: Stanford University Press.
Goldstein, J. (1995) “The Tower of Babel in nonlinear dynamics: Toward the clarification of terms,” in R. Robertson & A. Combs (eds), Chaos Theory in Psychology and the Life Sciences, Mahwah, NJ: Lawrence Erlbaum Associates: 39-48.
Guastello, S. J. (1995) Chaos, Catastrophe, and Human Affairs: Applications of Nonlinear Dynamics to Work, Organizations, and Social Evolution, Mahwah, NJ: Lawrence Erlbaum Associates.
Hughes, J. A. (1990) The Philosophy of Social Research, 2nd edn, London: Longman.
Kadtke, J. & Lempert, R. (2000) “Complex Systems and Policy Analysis: New Tools for a New Millennium,” RAND Corporation workshop, September 27-28, www.rand.org/centers/stpi/Events/Complexity/index.html.
Kauffman, S. A. (1995) At Home in the Universe: The Search for Laws of Self-organization and Complexity, Oxford, UK: Oxford University Press.
Keat, R. & Urry, J. (1975) Social Theory as Science, London: Routledge & Kegan Paul.
Kellert, S. H. (1993) In the Wake of Chaos: Unpredictable Order in Dynamical Systems, Chicago: University of Chicago Press.
Kiel, L. D. & Elliot, E. (1992) “Budgets as dynamic systems: Change, variation, time, and budgetary heuristics,” Journal of Public Administration Research and Theory, 2(2): 139-56.
Lincoln, Y. S. & Guba, E. G. (1985) Naturalistic Inquiry, Newbury Park, CA: Sage.
Makaryk, I. R. (ed.) (1993) Encyclopedia of Contemporary Literary Theory, Toronto, Canada: University of Toronto Press.
Madison, G. B. (1988) The Hermeneutics of Postmodernity: Figures and Themes, Bloomington, IN: Indiana University Press.
Marion, R. (1999) The Edge of Organization: Chaos and Complexity Theories of Formal Social Systems, Thousand Oaks, CA: Sage.
Price, B. (1997) “The myth of postmodern science,” in R. A. Eve, S. Horsfall, & M. E. Lee (eds), Chaos, Complexity, and Sociology: Myths, Models, and Theories, Thousand Oaks, CA: Sage: 3-14.
Priesmeyer, H. R. & Davis, J. (1991) “Chaos theory: A tool for predicting the unpredictable,” Journal of Business Forecasting, 10(3): 22-8.
Prigogine, I. & Stengers, I. (1984) Order out of Chaos: Man's New Dialogue with Nature, New York: Bantam Books.
Rosenau, P. M. (1992) Post-modernism and the Social Sciences: Insights, Inroads, and Intrusions, Princeton, NJ: Princeton University Press.
Rossler, O. E. (1986) “How chaotic is the universe?”, in A. V. Holden (ed.), Chaos, Princeton, NJ: Princeton University Press: 315-20.
Sarup, M. (1989) An Introductory Guide to Post-structuralism and Postmodernism, Athens, GA: University of Georgia Press.
Schram, S. F. (1993) “Postmodern policy analysis: Discourse and identity in welfare policy,” Policy Sciences, 26: 249-70.
Young, T. R. (1991a) “Chaos and social change: Metaphysics of the postmodern,” Social Science Journal, 28: 289-305.
Young, T. R. (1991b) “Chaos theory and symbolic interaction theory: Poetics for the postmodern sociologist,” Symbolic Interaction, 14: 321-34.