In the West we have always refused to think intensity. Most of the time it has been reduced to the measurable and to the play of equalities; Bergson, for his part, reduced it to the qualitative and the known. Deleuze now liberates it in and through a thought that will be the highest, the sharpest, and the most intense.
Michel Foucault, Dits et Ecrits
What is the role of time? … Time prevents everything from being given at once … Is it not the vehicle of creativity and choice? Is not the existence of time the proof of indeterminism in nature?
Henri Bergson, The Possible and the Real
This article argues that any attempt to understand the nature of complexity, and complexity's potential impact on other branches of knowledge, that does not give a central role to complexity's notion of time necessarily produces mixed results. Moreover, such an attempt should also have a contextual backing for its arguments; otherwise words such as “time” and “modernity” run the serious risk of being nothing but convenient terms deprived of any meaning. The article argues that Foucault's enquiry into forms of knowledge (epistemes1) could and should provide such a context.
Much of the debate surrounding the question of whether complexity could be seen as a manifestation of “postmodern” science is focused on vacuous notions of “postmodernity” and “complexity.”2 The Foucauldian framework presented here has the advantage of giving a coherent context for the understanding of modernity's ethos, without necessarily “defining” it, while at the same time allowing us to consider as “postmodern” (strictly in this Foucauldian sense) any construct of knowledge that does not adhere to modernity's conceptualization of time, for modernity is conveniently described as the “age of (a precise form of linear) History.”
As many authors have noticed, defining what complexity science is exactly meant to be constitutes a daunting task. Many perceive complexity as being simply the manifestation of what Lakatos would call a “progressive paradigm.” The emergence of notions such as “deterministic chaos” suggests that complexity is nothing more than a more sophisticated tool for the unfolding of real patterns that just happen to appear to be completely random.3 Others suggest that complexity, in a number of fields, has changed many paradigmatic assumptions to the point of legitimately claiming to be a proper scientific revolution in the Kuhnian sense.4
This article suggests that complexity science—understood as the set of knowledge practices that, as in Prigogine's case, postulate the central importance of time irreversibility—could be understood as an epistemic revolution. Thus Foucault, through his precise analysis of modernity and temporality as interrelated forms of knowledge, allows us to put some order in the debate by characterizing complexity not simply as that which looks for indeterministic, as opposed to deterministic, scientific laws; not simply as a field of knowledge that calls for interdisciplinary inquiry and recognizes the nonlinear character and the interconnectedness of all things; but as a set of practices that set out in their quest for knowledge from the “postmodern” assumption that the arrow of time exists, and that time is irreversibly a constitutive element in the formative processes of things and not simply a convenient parameter. Foucault indicates that Henri Bergson replaces the modern concept of time with one of the most innovative examples of nonlinear, and nonmetric, temporality.
It is precisely this characterization of an intensive, as opposed to a metric and extensive, notion of time that is described in De Landa's (2002) chapter “The actualisation of the virtual in time.” In this exciting work, De Landa acknowledges the link between complexity science and Deleuze within a general problematic of temporality. More importantly, Deleuze himself draws much of his inspiration from Foucault and Bergson. Thus, after quoting a passage of Prigogine in which a reference to Bergson is made, De Landa notes that complexity's link with some fundamental aspects of the Deleuzian ontology “is not a coincidence, since Deleuze was greatly influenced by those philosophers (such as Henri Bergson) who were the harshest critics of the reversible and uncreative temporality of classical science.”5 However, De Landa does not situate these links within a general Foucauldian framework, despite indicating that Deleuze himself drew much influence from Foucault's description of successive epistemic ruptures, declaring
I simply take the idea that there are recurrent features in … classificatory practices … but not that these form a global entity called an “episteme”. I do not believe that such global entities or totalities exist. (De Landa, 2002: 193)
Despite the potential pitfalls of these “totalitarian” epistemic classifications, this article wishes to express the idea that these can be very useful in the study of complexity. Moreover, one does not necessarily have to regard these epistemes as strictly defining knowledge boundaries within periods of time: Foucault himself has argued that the boundaries of knowledge are extremely porous.
FOUCAULT, BERGSON, AND PRIGOGINE
As Michel Foucault (1970) proceeds to describe the birth of the modern episteme,6 the reader cannot fail to notice that something is missing. Foucault (1970: 245) explains that as the era of representation withers, paving the way for the modern episteme, the realm of philosophy finds itself divided into three distinct areas of enquiry:
The criticism—positivism—metaphysics triangle of the object was constitutive of European thought from the beginning of the nineteenth century to Bergson.
Why Bergson? Are we to understand that Henri Bergson was the first thinker to push western philosophy beyond the modern organization of knowledge? Foucault does not say, at least not within the book. Presumably, our novice reader should have enough knowledge of Bergson to gather the significant differences between this philosopher and the recently presented modern episteme.
At this point, our reader opts for a temporary withdrawal from Foucault's acclaimed oeuvre. Time is needed to explore Bergson.
However, another book accidentally catches his eye: The End of Certainty. In this exciting work Ilya Prigogine, a flagship writer on the so-called complexity science, explains why and how the deterministic scientific era, characterized by Newtonian principles of certainty embedded in the reversibility of time, is coming to an end. Prigogine, incidentally, declares that his work in the natural sciences has been deeply influenced by philosophical inquiry. He even goes on to declare: “The dream of my youth was to contribute to the unification of science and philosophy by resolving the enigma of time” (Prigogine, 1996: 72). This is a unification of two realms of knowledge that, as we shall see, Foucault argues have been divided as a consequence of the end of representation. And, more interestingly still, he argues that the single most important phenomenon that characterizes the collapse of representation and the consequent birth of the modern episteme is precisely centered on the emerging notions of temporality and history.
But that which really interests our reader is that the problematic of time within Prigogine's arguments brings us back to... Bergson. Consider:
The results of nonequilibrium thermodynamics are close to the views expressed by Bergson ... Indeterminism, as conceived by Whitehead, Bergson, and Popper, now appears in physics. (Prigogine, 1996: 72, 108)
This article is a modest attempt to begin reacting to these tricky ideas. First, it will put them in order. To do so, a general framework of understanding is needed. This is the principal reason why the present work will begin with a general introduction to Foucault's description of the modern configuration of knowledge. This framework should allow us fully to grasp the significance of Bergson's philosophy. A second and final part will explore first the links that exist between Bergson himself and scientists such as Prigogine, and then, with the help of authors such as Deleuze and Dillon, the relevance of these links for our framing of complexity as a “postmodern” science.
The nucleus of the argument is that Foucault describes modernity precisely through the birth of temporality, and that it is this changing notion of time that Foucault argues has been revolutionized by Bergson, who, in turn, is fundamental for the thoughts of Prigogine and philosophers such as Deleuze. Bearing this in mind, the question that follows the reading of Prigogine's arguments, which explain how Bergson's ideas are now being introduced into physics, is whether or not physics is surpassing the modern episteme. The interest in this question is somewhat reinforced when one considers that the path undertaken by modern science broadly corresponds to Foucault's description of modernity. On the one hand, “representational” empiricism seems to have been outmaneuvered by “metaphysical” theorizing, while on the other hand, history appears in modern science but has a strictly linear and homogenizing character, much like the time envisaged by Ricardo. I shall also be tempted to affirm, following Foucault, that both developments are linked, for The Order of Things offers a very detailed explanation of why modernity requires the birth of linear temporality.
“THE PHILOSOPHY OF FREEDOM”
That is how Gary Gutting (2001) describes French philosophy in the twentieth century. He offers an account that describes how “from Bergson through the existentialists and poststructuralists, individual human freedom was itself a radical starting point that required no philosophical foundation” (Gutting, 2001: 380). This can explain why, for example, the French philosophy of science occupied such an elevated status. The desire to demonstrate that deterministic science was compatible with human individual freedom was indeed a major motivation for philosophers such as Boutroux, Poincaré, Le Roy, and others (Gutting, 2001: 26-40). As we shall see, Bergson's ideas emerged from initial debates within this current of thought. The need to relativize the determinism with which the scientific method is often associated perhaps explains the fact that authors such as Bachelard and Canguilhem preceded Thomas Kuhn's description of scientific revolutions by more than 30 years.7 Foucault's The Order of Things was also, to a certain extent, a history of epistemic revolutions. Within these contexts, he explored how “scientific” disciplines such as biology were constituted.8
THE MODERN EPISTEME
Foucault (1970: 217) defines the era of the modern episteme as the “age of History.” While in the classical age elements were classified according to their identities and differences—that is, by the spatial position they occupied in the ordered tables constituted by taxinomia and mathesis9—the collapse of representation obliges elements to be classified according to their proximity in the temporal succession of events. We therefore witness the substitution of the principle of the classical tables—Order—with the principle inherent in organic structures—History. As Foucault (1970: 219) puts it: “History gives place to analogical organic structures, just as Order opened the way to successive identities and differences.” This sudden mutation is responsible for a double fragmentation of knowledge: Not only will modes of enquiry differ according to how they deal with the concept of temporality, but emerging empirical “sciences” or disciplines will be formed around the perceived “history” of the elements that compose them. This second event leads to the birth of organic structures as fields of knowledge, fields that will be centered on historical transcendental notions.
The classical configuration of knowledge did not require a concept of temporality:
there was no important distinction between analytic (a priori) and synthetic (a posteriori) knowledge. An analysis of representations in terms of identities and differences was at the same time a connection (synthesis) of them in the ordered tables that express their essential reality. (Gutting 1989: 184-5)
Nonetheless, with the decline of the classical episteme, representation cannot be regarded as the sole foundation of knowledge. Thus occurs a crucial schism. Foucault explains that once the representational foundations of thought fade, analytic disciplines come to rest on epistemological grounds that fundamentally differ from those on which synthetic disciplines are based. The result is that
on the one hand we have a field of a priori sciences, pure formal sciences, deductive sciences based on logic and mathematics, and on the other hand we see the separate formation of a domain of a posteriori sciences, empirical sciences, which deploy the deductive forms only in fragments and in strictly localised regions. (Foucault, 1970: 246)
The split between a priori and a posteriori grounds for understanding is itself explicit in the triangular configuration that characterizes modern philosophy.
The general problem is that if representation ceases to suffice as an explanatory source per se, this explanatory source can only be located within an enquiry that either claims to deal with the essential reality of things (i.e., with an essence that precedes any other reality) or with their synthetic nature. It is precisely this problematic that forces western thought to formulate philosophical justifications for these transcendental notions.
Philosophy, especially after Kant, is given the task of resolving the conflict between the emerging transcendentalisms and the world of representation. Representation, in this sense, is “essentially the relation between a subject and the object it thinks and experiences” (Gutting, 1989: 185). This is how an embryonic conceptualization of temporality is introduced within the general frameworks of knowledge. For this relation can be analyzed either in terms of the conditions that precede and therefore ground the possibility of any representation, or in terms of how the experiencing subject stands in relation to already represented objects (metaphysics). The first solution corresponds to the creation of a transcendental philosophy of the experiencing subject, while the second leads to a transcendental philosophy of the experienced objects. Both solutions respectively correlate to the appearance of analytic and synthetic sciences. Finally, both solutions equally assume that it is necessary to connect representations in terms of which our experience occurs to either a subject or an object that lies outside that experience but grounds its possibility. Philosophical criticism will therefore question representation from the a priori requirement of an experiencing subject, while metaphysics will seek to understand representations in terms of the experienced objects. In both cases, representation is not a causal source of explanation in itself.
There is, however, a third possibility. This consists in accepting what Kant calls the phenomenal world of empirical experience as the only ground for the attainment of knowledge. Positivism is simply defined as a rejection of all transcendental notions.10 But to what extent, then, is positivism not merely a return to the classical method that regards representation, as opposed to both alternative forms of transcendentalism, as the sole grounds for understanding? Gutting (1989: 185) answers that the epistemic difference lies simply in the fact that during the classical era, phenomenal and representational knowledge were the sole grounds for all understanding, while with modernity they constitute only some among other options. Precise ways of understanding have to be situated within an epistemic context that, analyzed as a whole, dictates the evolution of these.
The sciences of economics, biology, and philology adopt a particular position within the modern organization of knowledge. These fields clearly enter the category of synthetic sciences that seek to understand the world of representations through transcendental philosophies of objects. As Foucault (1970: 244) put it:
Labour, life, and language appear as so many “transcendentals” which make possible the objective knowledge of living beings, of the laws of production, and of the forms of language. In their being, they are outside knowledge, but by that very fact they are conditions of knowledge.
This corresponds to the point enunciated above explaining that a transcendental philosophy of objects connects representations in terms of which our experience occurs to an object that lies outside our experience but grounds its possibility. Nonetheless, this philosophy necessitates a crucial analytic of finitude to operate.
Let us take the analysis of the discipline of economics to illustrate the point. Foucault argues that in the classical age only purely representational sources of value were commonly considered. The value of things was established through the notion of exchange; this was a purely representational system, for the value of an object was always represented in the value of another. Things did not really have a proper value as such. That is why in the classical age economic “history” did not exist:
Value is always related to the system of exchange, in which any changes in value are always correlated with changes in other values. Since the system always involves a finite number of correlated values, changes can only be cyclical. (Gutting, 1989: 187)
Foucault sees Smith as the first step toward modernity. Smith introduced the idea that labor, and not exchange, was the source of value. Nonetheless, Smith considered that this value still necessitated a representational system of exchange to operate. Indeed, Smith's thesis is that a thing can purchase another object that embodies a relative quantity of labor: The value of labor is determined by exchange. It is Ricardo who effects the decisive break from the classical era. After Ricardo, labor is considered to be the sole source of value, regardless of how much this labor can represent in the system of exchange. From now on, “a theory of production must always precede a theory of circulation” (Foucault, 1970: 254). But the transcendental notion of labor, the object through which the subject economic “man” will be analyzed, necessitates two fundamental notions of finitude.
First, the notion of anthropological finitude ensures that the subject “man” is studied through the object labor. While the technocrats analyzed how men stood in relation to representations,
modern economics deals with factors that cause men to form such representations—for example, with the bareness of nature that forces us to work, with the bodily deficiencies and the external threats that limit our ability to produce … modern economics is ultimately based … on an “anthropology” of human finitude. (Gutting, 1989: 188)
A problem remains, however. How do we regard “man” simply as a constituted subject? How can we stop this man from building alternative forms of production? How can we make sure that a transcendental philosophy of the object does not become a transcendental philosophy of the subject? This is precisely the problem that the realm of philosophy must try to resolve, and that is how “the study of man, precisely as a constituting subject, is the central concern of modern philosophical reflection” (Gutting, 1989: 199). What is usually referred to in the social sciences, for example, as the “structure and agency debate” is an inherent problematic within the modern episteme.
Historical finitude is crucial for placing contingency upon the newly born “man.” Ricardo demonstrates that there is a historical precedent in all forms of production and capital accumulation that cannot be transcended. The culmination of this historical contingency is expressed in Marx's maxim: Men make their own history, but not in conditions of their own choosing. Furthermore, precisely because historical finitude is meant to place contingency, the emerging notion of history must necessarily be unidirectional. That is how Foucault can argue that “we witness the birth of this grand linear and homogeneous series which is the process of production” that introduces “the possibility of a continuous historical time,” which Ricardo, for instance, thinks of “in terms of a persistently slower axis of time, and eventually … a total suspension of history” (Foucault, 1970: 253-63, emphasis added).
If a quote could summarize the analytic of finitude within the modern episteme, it would have to be the following:
What is essential is that at the beginning of the nineteenth century a new arrangement of knowledge was constituted, which accommodated simultaneously the historicity of economics (in relation to the forms of production), the finitude of human existence (in relation to scarcity and labour), and the fulfilment of an end to History … History, anthropology, and the suspension of development are all linked together in accordance with a figure that defines one of the major networks of nineteenth century thought … Finitude, with its truth, is posited in time; and time is therefore finite. (Foucault, 1970: 262-73, emphasis added)
The developments within the other two fields studied by Foucault, biology and philology, follow similar patterns of linear historicism.
HENRI BERGSON AND TIME
Gutting notes that the philosophers of the Third Republic had an intense interest in science. Indeed, it would seem that while we “can think of philosophy's premodern period as the time, before the scientific revolution, when it was identical with science, when philosophy was simply the enterprise of understanding the world in all its aspects” (Gutting, 1989: 49), the task of philosophy after the triumph of the “scientific revolution” is relegated to justifying a role for itself. As Gutting puts it, the argument is that after the incredible success of scientific discovery, philosophy had to demonstrate that a number of phenomena were not explainable through the “scientific method” alone.
Philosophy's task was to show that the abstraction that modern science must necessarily adopt ignores certain nonquantifiable aspects of our experience. Philosophy, thus, must show that it “can and should root itself in an experience with an immediacy or concreteness that escapes the abstractions of modern science” (Gutting, 1989: 50). And, as we shall see, these abstractions are produced because of the metaphysical nature of modern science: its drive to formulate a posteriori transcendental philosophies of objects. Note how this corresponds to Foucault's arguments regarding the fragmentation of knowledge and the new role of philosophy as a discipline that seeks to recover the lost uniqueness of wisdom (Foucault, 1970: 217-49).
Henri Bergson is in this respect one of the most important figures within the spiritualist tradition, a tradition that advocated the presence of a distinctive philosophical experience. Bergson followed with a great deal of interest the developments that were taking place within the philosophy of science. More precisely, Poincaré's initial doubts concerning the infallible objectivity of science, and the reinforcement of these doubts by his disciple Le Roy, must have played an essential role in the constitution of Bergson's thought. At the core of Bergson's philosophy of science lies the conviction that the scientific method adopts a cinematographical view of temporality, which implies, as Gutting puts it, “that science views reality not as a continuous flux (the duration that in fact is) but as a series of instantaneous ‘snapshots' extracted from this flux” (Gutting, 1989: 51).
Science's cinematographical view of duration is due to the fact that it is primarily concerned with action. As thought that is primarily concerned with practice, science must abstract from the concrete reality that we experience, in which temporality is not simply another form of space, but a “wholly qualitative multiplicity, an absolute heterogeneity of elements which pass into the other” (Gutting, 2001: 57). For Bergson, in the real continuum of duration there are no distinct elements that precede or follow real points in “time.” In this context, it becomes meaningless to speak of an a priori or an a posteriori: Bergson envisages a notion of temporality as a
continuous flux of novelty in which nothing is ever fixed, complete, or separate. In this flux, anything that we can say exists “now” also incorporates into a qualitative whole everything we can say is “past,” a whole that is itself being incorporated into the new synthesis of the “future.” (Gutting, 2001: 57)
The distinction between the synthetic and the analytic disappears in the flux of time, for it is precisely this continuous temporal vortex that represents the formation of things and their essential reality. This is the main postulate of what has been referred to as Bergson's “superior empiricism” (more on this below). And again, it is precisely this refusal to deal with transcendentalisms that characterizes Bergson's drive for an immanent reality that can be experienced while refusing to be cut into bits and abstracted. Clearly, the emergence of such an ontology revolutionizes the basis of the modern episteme. And it is precisely by demolishing the modern episteme's understanding of all possible paths and conditions for the attainment of knowledge that Bergson issues a challenge to Kant's Copernican revolution in philosophy and in science:
unlike the transcendental procedure of Kant, [Bergson's philosophy] does not refer to the conditions of all possible experience; rather, it is moving toward “the articulations of the real” in which conditions are neither general and abstract nor are they broader than the conditioned … Bergson insists upon the need to provide a genesis of the human intellect. (Pearson, 2002: 11-13)
Bergson equally argues that it is misleading to see modern science, through the differential calculus, as a method that alters the basically static notion of time. Modern science differs from classical science in so far as the former democratizes the notion of temporality. While classical science privileged some “moments” over others, modern science seeks to explain phenomena from any temporal standpoint. In this sense, because all temporal standpoints are considered, time “appears” in modern science. Nonetheless, the assumption that the flux of time is divisible into isolated elements remains unaltered. However, the idea that modern science has given time a fundamental role can be reinforced by the argument that it has made temporality an independent variable of deterministic mathematical equations, and this represents a major qualitative shift from Aristotle's static physics. Bergson, in contrast, argues that the problem lies precisely in turning temporality into a parameter. The flux of time is not merely a quantifiable parameter that can be isolated and used as a variable, but a constitutive process characterized by sheer heterogeneity and multiplicity.11
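Bergson's complaint can be made concrete with an elementary textbook example (an illustration in standard classical mechanics, not drawn from Bergson's own text). In a deterministic equation of motion, time enters only as a parameter, and the whole trajectory is given at once:

```latex
% Uniform motion: the trajectory is a complete, already-given whole
x(t) = x_{0} + v\,t
```

Every “moment” is recovered simply by substituting a value of $t$; the future is as fully given as the past, and nothing in the formula distinguishes passage from mere position. This is the spatialized, parameterized time that Bergson contrasts with duration.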
We could make use of another argument here. One of the main contributions of twentieth-century philosophy of science was to demonstrate how the scientific method is not characterized, as it was generally thought, by the mere objective observation and experimentation of reality. If we apply Foucault's definition of the modern organization of knowledge, arguments such as Popper and Kuhn's make us realize that what we often brand as “science” is not located within the analytic or positivist epistemic poles. Rather, the “scientific method” is centered on presuppositions, or conjectures, that have more to do with metaphysical transcendental theorizing than with positivistic experimentation or mathematical/logical essences.
In this sense, the “scientific method” should be regarded as a transcendental philosophy of the object, for the subjects of scientific enquiry are usually examined by how they stand in relation to the basic presuppositions of the paradigm, and not vice versa. Indeed, Kuhn (1970: 24) demonstrates that those elements that do not comply with paradigmatic assumptions are often discarded. The study of reality seems to follow from metaphysical assumptions. Dupré, in a book with a very pertinent title (The Disorder of Things), seeks to illustrate the “dependence of modern science on metaphysics” and argues that “science itself cannot progress without powerful assumptions about the world it is trying to investigate … without … a prior metaphysics” (Dupré, 1995: 1-2). Although the argument cannot be articulated at any length here, “scientific” theoretical assumptions are not that different from the a posteriori transcendental notions that Foucault argues cause the necessary linearity of history.
Having established the fact that science fails to tackle the issue of real temporality, Bergson argues that philosophy might have been expected to occupy this empty ground. However, this was not to be. Much as Foucault has argued, philosophy has attempted to resolve problems inherent to the modern episteme, instead of attempting to surpass modernity. Modern philosophy, Bergson argues, has not challenged the view of time as “nothing more than a fourth spatial dimension, which could readily be viewed as having no creative efficacy, as merely the vehicle for the automatic unrolling of a nomologically determined sequence” (Gutting, 2001: 57). This modern scientific vision of time, as Prigogine maintains, is all but dead.
A philosophy that sought to adopt a more immanent approach to temporality would reveal how the very notion of time is inherent to any conceptualization of human freedom. To illustrate the point, Bergson attempted to present just how problematic a divisible notion of time is when applied to the analysis of psychological states. In his doctoral thesis, Bergson posited the problem as follows. Accepting the notion that our decisions, feelings, and emotions occur at identifiable, isolated “spaces in time,” the determinists will maintain that human will is never free because determining causes exist. On the other side of the spectrum, nondeterminists either deny the existence of these causal determinants (libertarians) or argue that their presence does not infringe human freedom (compatibilists). Bergson's thesis is that to view the self as a constitution of separate psychological states, which might be linked by causal connections, leads us to an unsolvable paradox of human freedom. On the contrary,
a description of the self that accords with the immediate givens of consciousness shows it to be an organic whole that creatively produces its future, and this production is precisely what we mean by freedom. (Gutting, 2001: 58)
Now we can start to understand how Bergson goes beyond the modern episteme. His conceptualization of temporality refuses the a priori/a posteriori distinction on which the modern organization of knowledge is based. His refusal of transcendentalism, coupled with his insistence on the realm of immanence, has produced challenges to the modern notions of abstraction, temporality, empiricism,12 science, and freedom.
THE SCIENCE OF FREEDOM
There are at least two ways in which the scientific developments elaborated by Prigogine can be broadly understood within a Bergsonian framework. First, Prigogine readily acknowledges the cinematographical nature of pre-complexity science; second, by indicating how this view stems from treating time as an illusion, he relates the problematic of the reversibility of time to the conceptualization of human freedom. For Prigogine it is precisely the positing of temporality as the creative drive inherent to the living and inert elements of nature that pushes a number of thinkers, including himself, to equate complexity science with “freedom.”
Prigogine articulates the idea that contemporary science uses a static notion of time principally by making reference to Einstein and Hawking. Einstein famously declared that “time is an illusion.” More recently, “in his Brief History of Time, Hawking introduces “imaginary time” to eliminate any distinction between space and time” (Prigogine, 1996: 58). Thus classical and modern physics have maintained a spatialized vision of time that is basically static, and even today scientists assume as a “matter of faith that as far as the fundamental description of nature is concerned, there is no arrow of time” (Prigogine, 1996: 58).
For Bergson, as we have seen, the cinematographical view of time is adopted because of science's primary concern with action. To a certain extent, Prigogine agrees. For it is through the desire for immediate practical results that the technique of reductionism is adopted. Looking at things on the smallest scale, it is thought, confers a deeper and therefore practical knowledge concerning the object under study. However, it is not principally reductionism that facilitates the immobilization of time. Fundamentally, what reductionist and “practical” science requires in order to justify the idea of static temporality is… a metaphysical dimension. In the following arguments the relevance of Foucault's description of the modern frameworks of knowledge and Dupré's description of the metaphysical nature of science should become apparent.
Prigogine argues that “Nature involves both time-reversible and time-irreversible processes” (Prigogine, 1996: 18). Reversible processes deny temporality any constitutive role in the process. Examples can be found in Newton's formulation of classical physics and in Schrödinger's basic equation of quantum mechanics: in both cases, the equations are invariant with respect to time inversion. By contrast, time-irreversible processes break time symmetry: in these processes, temporality shapes how the general rules of motion play out in the system in a precise temporal context. More importantly, time irreversibility produces entropy. The processes described by the second law of thermodynamics are examples of time-irreversible processes.
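The contrast can be made concrete with standard textbook equations (a conventional sketch, not Prigogine's own formulas):

```latex
% Time-reversal invariance: Newton's second law
m\,\frac{d^{2}x}{dt^{2}} = F(x)
% Under t \mapsto -t the second derivative is unchanged, so the
% reversed motion x(-t) is also a solution: the law has no arrow of time.

% Schrödinger's basic equation of quantum mechanics
i\hbar\,\frac{\partial \psi}{\partial t} = \hat{H}\psi
% Under t \mapsto -t combined with complex conjugation \psi \mapsto \psi^{*},
% the equation keeps its form (for a real Hamiltonian): again time-reversible.

% The second law of thermodynamics, by contrast,
\frac{dS}{dt} \geq 0
% becomes dS/dt \leq 0 under t \mapsto -t: the growth of entropy S
% singles out a direction of time that the reversible laws above cannot see.
```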
However, Prigogine argues that time reversibility is produced first because we agree to reduce the analysis to an elementary level, and second because we abstract: “Reversible processes correspond to idealizations: We have to ignore friction to make the pendulum work reversibly” (Prigogine, 1996: 18). In Foucault's words, these metaphysical transcendentalisms adopt reductionism because they “must deploy the deductive forms only in fragments and in strictly localised regions.” Once Prigogine dismisses the idea that entropy might be caused by insufficient data or faulty examination, the ideas that follow from his arguments suggest that, if we bear Foucault in mind, time reversibility is a particular product of a transcendental philosophy of objects; that is, a metaphysical system that, as Kuhn puts it, ignores elements that do not cohere with the basic premises of a paradigm. This produces the need to discard incompatible elements (precisely such as the second law) on the grounds of humanity's imperfect observational capacities or the inadequacy of its instruments:
The results presented thus far show that the attempts to trivialize thermodynamics … are necessarily doomed to failure. The arrow of time plays an essential role in the formation of structures in both the physical sciences and biology. (Prigogine, 1996: 71, emphasis added)
Time irreversibility becomes undeniable once, on the one hand, we adopt a more immanent approach to nature, and on the other, we look at populations and not at single elements that compose them. In a sense, this approach is very similar to what Deleuze has described as “Bergson's superior empiricism.” Pearson describes one of its features as follows:
Instead of chopping up experience into atomistic sensations, which can then only be brought into union with one another in terms of a purely abstract principle that swoops down upon them from on high and folds them in its own conjunctive categories, [Bergson's empiricism] recognizes a continuity and concatenation between things. (Pearson, 2002: 12)
The apparently contradictory pulls toward immanence and connectivity are resolved through the continuum of time and this notion of empiricism.
To present this point, we must examine Prigogine's proposals for a unified theory of quantum mechanics. Quantum theory is in a very paradoxical state: despite providing some remarkable predictions since its formulation about 60 years ago, its scope and meaning are still widely discussed. As Prigogine explains, this is unprecedented in the history of science. The ambiguity of quantum theory can be exposed through its principal calculus, the Schrödinger equation. This equation is both time reversible and deterministic, yet Prigogine explains that it presents us with a paradox:
The basic assumption of quantum theory is that every dynamical problem can be solved at the level of probability amplitudes exactly as every dynamical problem in classical mechanics was traditionally associated with trajectory dynamics. But strangely, in order to attribute well-defined properties to matter, we have to go beyond probability amplitudes; we need probabilities themselves [immanent actualities]. (Prigogine, 1996: 47)
Thus, by presenting an example using the stated equation, Prigogine shows that
initially we started with a single wave function ψ, but we still end up with a mixture of two wave functions, u1 and u2. This is often called the “reduction” or “collapse” of the wave function. The Schrödinger equation paradoxically seeks to transform a wave function into another wave function, but it ends up moving from a “pure state (the wave function) to an ensemble, or mixture.” (Prigogine, 1996: 48)
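The paradox can be sketched in standard quantum-mechanical notation (a textbook reconstruction, not Prigogine's own derivation):

```latex
% Before measurement: a pure state, a superposition of two eigenfunctions
\psi = c_{1}u_{1} + c_{2}u_{2}
% The amplitudes c_1, c_2 evolve deterministically and time-reversibly
% under the Schrödinger equation.

% After measurement: a mixture, described by a diagonal density matrix
\rho = |c_{1}|^{2}\,|u_{1}\rangle\langle u_{1}|
     + |c_{2}|^{2}\,|u_{2}\rangle\langle u_{2}|
% The squared moduli |c_1|^2 and |c_2|^2 are the probabilities proper:
% this step from amplitudes to probabilities is the "collapse" that the
% Schrödinger equation alone cannot produce.
```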
As Prigogine sees it, the problem is that “we need to move from potentialities described by the wave function ψ to actualities that we can measure.” This is a problem that De Landa describes quite well as he presents the processes characterizing the shift from the nonmetric virtual to the metric real within Deleuzian ontology (see Chapter 3 of De Landa, 2002). As both Deleuze and Prigogine argue, these actualizations occur through the breaking of time symmetry. We should remark that the problem of moving from a potentiality, expressed in probabilistic rather than essentialist terms, to an immanent reality corresponds to what Gutting considers French philosophy's call, to which Bergson vividly adhered, for a form of thought that “can and should root itself in an experience with an immediacy and concreteness that escapes the abstractions of modern science.”
Prigogine's solution for the dual status of quantum theory thus involves a more immanent approach to nature as well as an apparently contrary trend toward anti-reductionism. This is, in the fullest meaning of the term, a flat ontology: The whole is treated as a unit, but without higher, transcendental levels of analysis such as motionless essences or metaphysical assumptions. Prigogine proposes to substitute the function that results from Schrödinger's equation with Poincaré's resonances; the crucial point being that these resonances involve populations and not individual wave functions. In Deleuze's thought, these populations are referred to as virtual multiplicities. In Prigogine's words,
Through Poincaré's resonances, we achieve the transition from probability amplitudes to probability proper without drawing on nondynamical assumptions … Only by going beyond a reductionist description can we give a realistic interpretation of quantum theory. There is no collapse of the wave function, as the dynamical laws are now at the level of ρ [populations], the density matrix, and not [individual] wave functions ψ. Moreover, the observer no longer plays any special role. The measurement device has to present a broken time symmetry. For these systems, there is a privileged direction of time, exactly as there is a privileged direction of time in our perception of nature. It is this common arrow of time that is a necessary condition of our communication with the physical world; it is the basis of our communication with our fellow human beings. (Prigogine, 1996: 54)
The crux of all this is that the “probabilizing revolution” (which characterizes the proposed Poincaré resonances) ends up demonstrating that probabilistic results do not follow from imperfect human knowledge but represent a real state of affairs in nature. This situation, Prigogine keenly maintains, produces an end of certainty, which should be welcomed as enabling a new conceptualization of human freedom that has not been possible since the famous Epicurus' Dilemma (Prigogine, 1996: 10). And, more importantly still, time plays a crucial role in this new conceptualization of the basic laws of nature and of human freedom. Prigogine is happy to demonstrate that his conclusions concord with Bergson's conceptualization of temporality:
I'm certainly not the first one to have felt that the spatialization of time is incompatible with both the evolving universe, which we observe around us, and our own human experience. This was the starting point for the French philosopher Henri Bergson, for whom “time is invention or nothing at all” … I mentioned one of Bergson's later articles, “The Possible and the Real” … where he expressed his feeling that human existence consists of “the continual creation of unpredictable novelty”, concluding that time proves that there is indetermination in nature. (Prigogine, 1996: 59)
What is important to understand is the relevance of all of this to Deleuzian and Bergsonian ontology. For Deleuze, the wave function would not represent merely “probabilities”; it would be a striking example of the virtual, and the virtual, which expresses itself through the breaking of time symmetry, is real: “The virtual is real without being actual, ideal without being abstract” (Pearson, 2002: 1). In short, Pearson (2002: 1) describes such ontology as follows:
With Deleuze and Bergson we have the distinction between virtual (continuous) multiplicities and actual (or discrete) multiplicities, a conception of the evolution of life as involving an actualisation of the virtual in contrast to the less inventive or creative realization of the possible.
Interestingly enough, Bergson declared that the new science of spacetime required a concept of “immediate and constantly varying duration” if it was to avoid remaining abstract and deprived of any meaning (Pearson, 2002: 11). In short, with this ontology we understand that the virtual multiplicities are real without necessarily having been actualized.
Numerous “poststructuralist” philosophers have also been enormously inspired by the Bergsonian time/freedom nexus. Among these, one of the more striking examples is Deleuze. While the links between Bergsonianism and Deleuze have been covered extensively elsewhere,13 as indeed have the relations between Deleuze/Guattari and complexity science,14 the purpose of this article is to situate all these relations within a Foucauldian framework. And it is precisely Foucault's concise definition of “modernity” as an organization of knowledge that will provide some parameters for the understanding of these relations. It seems obvious, for instance, that a great deal of debate concerning the nature of complexity science is flawed precisely because it lacks a general philosophical landscape within which notions such as “modernity” can be situated.
As an example of the utility of these Foucauldian arguments, we could consider Dillon's important contribution. In an audacious attempt to understand the relation between complexity and poststructuralism, Dillon's (2000) main line of argument is that the two share important intellectual grounds, but that what ultimately divides them is still fundamental. Dillon does say that “defining” complexity is as difficult (and perhaps as pointless) as defining poststructuralism, but if we attempt to capture their respective ethos we will realize that what these two currents of thought share is a commitment to the “anteriority of radical relationality.” This means that “nothing is without being in relation, and that everything is—in the way that it is—in terms and in virtue of relationality” (Dillon, 2000: 4). Nonetheless, this commitment differs depending on whether this anteriority of radical relationality is seen as being simply that, as is the case for complexity, or whether
the anteriority of radical relationality is relationality with the radical nonrelational. Here the radical non-relational is the utterly intractable, that which resists being drawn into and subsumed by relation albeit it transits all relationality as a disruptive movement that continuously prevents the full realization or final closure of relationality, and thus the misfire that continuously precipitates new life and new meaning. (Dillon, 2000: 5)
For Heidegger, the radical nonrelational is death. For Levinas, it is the Other. For Derrida, it is alterity, while for Lacan, it is the real. Ultimately, it is because of the presence of these radical nonrelationals that poststructuralism assumes a poetic character, while complexity is still entangled with its technical quest for an “implicate orderliness—the orderliness as such even if the notion of order is developed in novel ways—of the anteriority of radical relationality” (Dillon, 2000: 4).
However, we can argue that, following Dillon's description of the radical nonrelational, complexity does have a perfect candidate for this role: temporality. The arrow of time is constitutive of and present in all formations, but the virtual that the arrow of time represents is never fully actualized; the process never stops. As De Landa describes the ontologies of Deleuze and Prigogine, these are philosophies of becoming without being.15 And it is precisely because no process ever fully “becomes” something that, following Prigogine, the arrow of time can ensure “the continuous precipitation of new life and new meaning.” Furthermore, it is precisely the presence of temporality as a radical nonrelational that makes complexity “poetic.” Guattari has explored the notions of chaos theory in relation to his own work in psychoanalysis, and has argued that, following complexity,
In this conception of analysis, time is not something to be endured: it is activated, orientated, the object of qualitative change … in these conditions, the task of the poetic function, in an enlarged sense, is to recompose artificially rarefied, resingularised Universes of subjectivation. (Guattari, 1995: 18-19)
Foucault's categorization of epistemes allows the observer to understand the centrality of the notion of time as the feature that characterizes the qualitative difference inherent in complexity science.
Other attempts to relate complexity with “postmodernism” have mainly dealt with issues concerning connectivity (Cilliers, 1998) and/or the possibility of effective prediction (Morçol, 2002), but have generally failed to address the central notion of temporality, which, as we have seen, should be central to any discussion surrounding the nature of “modern” or “postmodern” knowledge.16
This article has sought to present Foucault's archaeology of knowledge in order to trace past and present developments within the philosophy of science and their potential impact on the understanding of complexity theory. It has demonstrated that the notion of temporality is crucial for the organization of elements within the modern organization of knowledge. The article is an initial attempt to resolve contemporary debates on the nature of complexity science by framing ideas within the notion of epistemes as elaborated in Michel Foucault's The Order of Things.
Many of the debates surrounding complexity deal with the issue of whether the new sciences can be regarded as a “postmodern” form of knowledge. The article argues that complexity should be neither defined nor categorized, but that in order to capture its innovative ethos it is important to consider the work of Foucault, especially as it concerns an alternative, and emerging, notion of temporality. The article thus suggests that temporality is the most important issue around which any consideration of complexity science should be formulated, and that, incidentally, the notion of a new temporality is the crux of Foucault's description of modernity.
The article concludes that, if we bear in mind the defining characteristic of modernity (linear temporality), the challenges that Bergson-Deleuze pose to such modernity, and the links between Prigogine and Bergson-Deleuze, complexity could indeed be viewed as a “postmodern” science.
Foucault describes an episteme as the ensemble of the underlying rules that characterize the general configuration of knowledge within a precise historical context. See Foucault (1970).
For instance, problems with such a categorization are evident in Rasch & Wolfe (2000).
Byrne, for instance, considers complexity exclusively for its potential in quantitative social science, and believes that the new science will “eradicate post-modernism.” See Byrne (1998).
For complexity as a paradigmatic revolution, see Gleick (1998: 37-9).
De Landa (2002: 84). However, Deleuze's relation to Bergson is a complex one, and cannot be fully detailed here. K. A. Pearson, for instance, correctly reminds us that “It is inadequate to describe Deleuze as a Bergsonian, not simply because of the many and varied sources he draws upon, but rather because of the highly innovative character of his Bergsonism” (Pearson, 2002: 167-204).
Foucault describes an episteme as the ensemble of the underlying rules that characterize the general configuration of knowledge within a precise historical context. See Foucault (1970).
Gutting (2001: 87). Similarly, when Foucault was accused of omitting Kuhn's arguments from his The Order of Things, he replied that since his book made several references to Canguilhem, further references to Kuhn were not necessary. See Gutting (2001: 39, n.16).
Foucault has undoubtedly been influenced by the work of historians of science, especially Canguilhem and Bachelard. Indeed, Gutting believes that the ideas of these two thinkers are crucial for the understanding of Foucault's arguments. Moreover, Canguilhem proposed the very concept of “Archaeology of Knowledge” to Foucault. See Gutting (1989: 1-54).
For a description of taxinomia and mathesis, consider Chapter 3 in The Order of Things.
Indeed, Foucault states that “if rejection of a transcendental approach means that one is a positivist, then I'm quite happy to be one.” Cited in Gutting (1989: 242).
The argument is rather different when it comes to Bergson's position on relativity. See Pearson (2002: Chapter 2).
For Bergson's “superior empiricism,” see Pearson (2002: 12).
Deleuze himself wrote a book on Bergson, see Deleuze (1988). For other works on Deleuze and Bergson see, for example, Constantin (1996); Douglass (1992a; 1992b). For an interesting piece in Italian, consider Restuccia (1983).
See De Landa (2002). Guattari's last book (1995) deals with the relations between his and Deleuze's work and complexity theory.
As opposed to essentialist practices of being without becoming (De Landa, 2002: 84). Note that this is reflected in the title of Prigogine's From Being to Becoming (1980).
Byrne, for example, comments with surprise that Cilliers' exploration of complexity and postmodernism did not entail an absolute relativism. See the JASSS book review, http://jasss.soc.surrey.ac.uk/2/2/review1.html.