Full spectrum analysis:
Practical OR in the face of the human variable1

Graham Mathieson, Defence Science and Technology Laboratory, UK

Abstract

This paper draws together material from Operational Research (OR) and Human Sciences (HS) conferences. It explores the need for OR to embrace the full range of knowledge available from HS and for HS, in turn, to integrate and express that knowledge in a form that OR can use. The paper discusses humans as a source of individual and collective variability in complex, socio-technical systems and exposes the challenges facing OR in advising on interventions in them. It then outlines practical approaches to those challenges based on balanced problem formulation, modelling systems ‘naturally’ and embracing uncertainty, leading towards ‘full spectrum’ analysis.

Introduction

This paper is based on presentations given at the 20th International Symposium on Military Operational Research2 and NEC - The Human Dimension3, combined with reflections on the presentations and discussions from Human Factors of Decision Making in Complex Systems4. The paper explores the need for Operational Research (OR) studies to embrace the broad range of variability arising from human participation in the operations and systems studied, and the consequent need for the human science communities to integrate and present their knowledge in forms which are amenable to exploitation by OR practitioners. Although the paper is principally concerned with the exploitation of human sciences by OR, there are clear implications for cross-disciplinary integration across the full spectrum of the human and system sciences as a prerequisite for satisfying OR’s needs. The paper is, therefore, a call to action for the various scientific and technical communities relevant to OR.

The nature of OR

OR is concerned with the analysis of interventions in the operation of systems or organizations of interest to executive decision-makers, who are the customers of OR studies. Since OR tackles real-world problems of interest to human executives, and since the systems involved are usually embedded in human organizations, it can fairly be asserted that OR is principally concerned with the analysis of socio-technical systems. (NOTE: for the rest of this paper the terms ‘system’ and ‘system of interest’ will be assumed to refer to socio-technical systems.)

Consequently, it is important for OR methods to be able to deal with socio-technical factors and issues. This raises several challenges for OR methods which have not, to date, received enough attention, but which can be addressed by practical steps to improve the state of practice.

Challenges for OR

The key challenges for OR can be categorized into the problems of modelling, data, prediction and intervention.

The problem of modelling

Good OR studies begin with a problem formulation stage involving the construction of a conceptual model of the system of interest. This problem model captures the joint understanding of the analysts and their client about the system (and intervention options), serves as a common description between the analyst and providers of expert knowledge, and is a key factor in selecting the analysis methods (NATO, 2002).

It is critical that the problem model be requisite, i.e., that it faithfully represents real world factors, structures, processes and effects that significantly impinge on the study problem. A small study team cannot hope to have detailed knowledge of all of the disciplines required to model a socio-technical problem, and must rely upon knowledge from specialists in a range of disciplines, including: information technology, systems engineering, organizational psychology, management science, economics, cognitive psychology, anthropology, etc. It is critical that such knowledge be trustworthy, comprehensible and usable.

This challenges providers of specialist knowledge to ensure that their conceptual models and theories are adequately comprehensive, comprehensible and coherent across the variety of disciplines needed to model the OR problem. For example, Farrell and Lichacz (2003) emphasize the need to have a more rigorous framework to analyze complex systems. Using multiple perspectives is risky if the perspectives have inconsistent models. Although an element of post-modern thinking is useful to OR, the problem model must be kept self-consistent to avoid misleading the executive decision maker.

The problem of data

Constructing a requisite problem model depends critically on having data describing the key variables of the system, their likely values (or distributions) and the nature of the relationships between them, including the current operation of the system, its structures and processes. Such data are not easy to acquire, particularly if the system of interest is not in continuous operation (like a production line) but is only called into action on a contingency basis (like a military capability). In the latter case even the constituents of the system may be unclear in advance of the contingency. In this context the OR practitioner needs either generic data or potential data distributions in order to transform the conceptual problem model into a generative model which can support inference about the system. Data are often less accurate, precise, reliable or available than would be ideal, but OR has evolved robust methods for dealing with such data while still producing insights and advice which improve upon the executive’s intuitive understanding. OR methods can make use of logical, descriptive or numerical data, although numerical data are preferred for a variety of reasons. As with modelling, the reliability of data is paramount, and this implies coherence across the domains of expertise providing it.
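As a concrete illustration of this point, the following is a minimal sketch (in Python; every distribution and parameter is an invented assumption, not data from any study) of how assumed distributions can stand in for missing point data and turn a conceptual model into a generative one:

```python
import random

# Minimal sketch: a generative stand-in for missing point data. The system of
# interest is a contingency task whose staffing level, per-person processing
# rate and workload are not known precisely, so each is described by an
# assumed distribution rather than a single observed value.

def sample_completion_time():
    staff = random.randint(3, 8)              # assumed plausible team size
    rate = max(random.gauss(2.0, 0.5), 0.1)   # tasks/hour per person (assumed)
    workload = random.uniform(40, 80)         # assumed total tasks in the contingency
    return workload / (staff * rate)          # hours to complete the workload

# Inference the purely conceptual model cannot support: the chance of
# finishing within a 6-hour window, under the stated assumptions.
runs = [sample_completion_time() for _ in range(10000)]
p_on_time = sum(t <= 6.0 for t in runs) / len(runs)
print(f"P(complete within 6 h) ~= {p_on_time:.2f}")
```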

The problem of prediction

At the heart of OR is the presumption that the system of interest can be analyzed to produce insights which the executive will use to intervene in order to produce desired effects. This, in turn, presumes some measure of predictability, either in the system response to intervention or in some more abstract properties which influence future system behavior. Even a good conceptual model with adequate data is no guarantee of predictability. Socio-technical systems tend to be complex adaptive systems (CAS), as described by Allen (1988), which presents a fundamental problem for forecasting macro-behavior, even with complete system knowledge. CAS may exhibit simple and stable macro-behaviors despite micro-level complexity and variability, or vice versa, and they may become chaotic, responding so sensitively to minute variations in the detail that the macro-behavior appears effectively random. They may flip between different modes of behavior in response to apparently insignificant changes, or even with no apparent change at all.
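A toy illustration of this sensitivity (not drawn from the paper; the logistic map simply stands in for a CAS) shows how two histories differing by one part in a million soon bear no resemblance to each other at the macro level:

```python
# Toy illustration of sensitive dependence: the logistic map
# x[n+1] = r * x[n] * (1 - x[n]) in its chaotic regime (r = 3.9).
# Two runs start one part in a million apart and rapidly diverge,
# so the macro-behavior looks effectively random to an observer.
r = 3.9
x_a, x_b = 0.500000, 0.500001

for step in range(1, 41):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: run A = {x_a:.4f}, run B = {x_b:.4f}")
```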

This difficulty has led many analysts to declare that prediction is not possible in CAS. However, drawing useful inferences about the consequences of executive intervention does not require precise prediction of system behavior. Executives are prepared to take risks based on general trends or statistical forecasts; anything that gives them information with which to take a better gamble. OR is, so to speak, ‘loading the dice’ in favor of success. For example, Mintzberg (1979) and others have identified relationships between organizational structure and task environment. This knowledge can be used as a generic model to give insights, in broad terms, about the likely response of an organization to a forced change of structure or environment.

However, one must remain aware of the assumptions behind a theory and be constantly skeptical about whether those assumptions hold. For example, one feature of Industrial Age organizations identified by Mintzberg is that their structure evolves in the face of environmental variety in ways which limit the demands on individual managers. In the Information Age, there is the promise of empowering managers to cope with more complex tasks, allowing greater freedom to create more complex organizations. If so, then perhaps the empirical basis of Mintzberg’s findings becomes less valid.

Historical data alone are not a sufficient basis for executive action without at least the prediction that the data used will remain valid in the timescales of the proposed intervention.

The problem of intervention

The final challenge for OR, and for the providers of its underpinning domain knowledge, is the fact that socio-technical systems are all, to a greater or lesser extent, self-aware and liable to behave reflexively in the context of interventions perpetrated upon them. Such complex adaptive reflexive systems (CARS) have an extra dimension to their response, in which individual and collective decision-making become a critical feature. CARS can generate behavior which either reinforces or undermines the executive’s intentions, and this behavior can be pre-emptive, driven by perceptions of the executive’s intent prior to substantive executive action or its effect. Thus bizarre situations can occur, such as organization members reacting to a false perception of a hostile executive intent and taking mitigating actions, which the executive falsely perceives as hostile, leading to the adoption of a hostile executive intent where none previously existed. Human affairs are full of such self-fulfilling prophecies, and OR needs to allow for the social processes involved in them.

The human sciences (HS) have a key role to play in providing OR practitioners with the knowledge needed to take account of such reflexive behaviors and the subsidiary interventions needed to avoid or mitigate them as required. OR practitioners, for their part, need to understand the possibilities for reflexive response, on top of the other challenges, and to think of executive intervention as a multi-cycle process, with interactions between intentioned and motivated actors, rather than as an event with consequences. An essential element of the understanding required by OR practitioners is a clear concept of what a socio-technical system is and what it implies.

What is a socio-technical system?

The concept of socio-technical system has a long history in the literature (see for example, Salvendy, 1987: 454). However, for the purposes of this paper, only a very basic distinction needs to be made between technical, social and socio-technical systems.

A system is an interacting collection of parts. If all of the parts of a system are non-human technologies then one has a purely technical system, for example an autonomous robot or an unmanned production facility. For the present purpose, socially aware artificial intelligences are neglected (as are systems of non-human animals).

A purely social system is an interacting collection of humans in which non-human technologies are either not present or not significant to system operation. A community of people doing something like talking, for which technology is not really an issue, might be considered a purely social system although, in modern societies, such technology-free activity is rare.

By extension, a socio-technical system is a collection of human and non-human parts interacting in an integrated way, in which overall system behavior arises from multiple cycles of interaction within and between the human and non-human parts. This implies that socio-technical systems are also likely to be CARS.

The assertion that all systems of interest to OR are socio-technical is especially, though not exclusively, relevant in the military domain. Military conflict is essentially a social affair, but one in which technology is deeply and inseparably embedded. Technology is so important to modern military affairs, especially with the increasing use of automation, ‘smart’ munitions and unmanned vehicles, that it has been tempting for military OR practitioners to consider the technical component alone. It is equally tempting, and equally misguided, to fixate on the human component of conflict to the exclusion of all else.

Whilst much about the impact of humans on systems is difficult to predict or understand, the one dependable fact is that humans bring variability to systems - they are, so to speak, the constant variable - a fact which presents challenges for systematic analysis. A consideration of human variability and its impact is, therefore, a good starting point for the dialogue between the HS and OR.

Humans: The constant variable

Any system with humans involved will change and adapt. Research clearly shows that there is significant variability between individuals and groups and also within individuals and groups over time. Understanding sources of variability will allow OR to include them in its methods and models. This paper will, therefore, spend some time cataloguing and discussing the nature and sources of human variability and the consequences for OR’s exploitation of HS.

Individual human variability

Humans differ from each other in ways which affect how they behave and perform tasks inside systems. Individual humans change over time as a consequence of learning and experience, or in response to changing context.

Human cognition and behavior generation are still poorly understood, but many things are known with some certainty. Studies using brain-scanning technology have begun to unravel some of the richness and complexity of human cognition. For example, clinical studies of patients with specific types of brain damage (Carter, 1998: 122-123) have shown that the affective component of cognition is deeply implicated in higher reasoning and the formation of belief. Further, the study of left-right brain duality (Carter, 1998: 50-51) provides evidence that the two halves of the human brain, far from being just parallel processors, are capable of thinking different thoughts and holding different aspirations and goals. It is reasonable to conclude that human reasoning is more likely the result of a complex interaction of multiple, possibly competing, thought processes, and that the coherence of behavior which results is more akin to an emergent property of sub-conscious processes than the product of a conscious and coherent directing mind.

This role of affect in belief and reasoning has implications for the understanding of awareness and for the exploitation of work such as that of Endsley (2003). It is equally important to recognize the sub-conscious component in decision-making when interpreting the work of Klein (2003).

The pioneering OR modelling work of Moffat (2002) makes some use of human science theories such as those of Klein, but synthesizes ideas from Janis and Mann (1977) to justify treating decision making as a rational process driven by coherent concepts of utility. The most recent understanding of individual human variability indicates that such a synthesis may be inconsistent. It is important for the human science community to synthesize its own literatures to clarify current knowledge before OR practitioners can rely upon that literature to inform executives.

Variation between individuals

Given exactly the same situation, under exactly the same conditions, two typical people will react and behave differently. How differently depends on many things. For example, imagine that you are reading this paper seated on a park bench when a man in soldier’s field uniform carrying a rifle comes up and stands in front of you. How will you react? What will you make of the situation? What will you ‘see’ standing there in front of you? One person may see a strong protector of freedom and security, and perhaps wonder if there is a threat nearby, a terrorist bomb or a riot. Another person will see the soldier himself as a threat, a menacing representative of repression and injustice. Yet another may see a young, immature fool, suckered into a dangerous profession by propaganda and the promise of a trade. Each of these views could legitimately be held by a citizen of the UK, depending on their past experiences with soldiers. A resident of certain streets in Belfast will see the soldier quite differently from the landlord of a public house near the army barracks in Bordon, Hampshire, and a quite different view again may come from a proud old war veteran.

Such different perceptions arise from a host of sources - memories of past experiences, cultural norms instilled since childhood, self-perceptions and the way a person ‘spins’ their position in relation to the world. It has been shown experimentally (Malish et al., 2003) that personality can significantly affect how different military commanders choose to act given the same situation and information. Work by Sicard et al. (2003) suggests the intriguing possibility that risk-taking behavior is related to a need to regulate some internal risk ‘thermostat’ which is linked to physiological responses in the brain.

At a more basic level, what each person ‘sees’ is the result of a complex cycle of perception, attention and recognition, involving the imposition of previously formed categorizations or symbolizations onto the ‘sensory wash’ and the construction, from remembered fragments, of a story to “explain” the juxtaposition of those symbolic representations. This idea is supported by the work of Klein (2003). Indeed, the need to ‘make sense’ of the world in this way may even result in the construction of quite fictitious explanations and the neglect of countervailing perceptions in order to preserve the current ‘mental model’. The best optical illusions work because we use a subset of the image to trigger model building and then persist in our belief in the model despite contrary evidence, even at the cost of disturbing cognitive dissonance.

This combination of history, culture, politics, psychology and physiology provides many ways for people to differ, and the interaction of the different causes can make it difficult to identify a clear pattern or distribution from which to generalize. Some clues might be had from past observations, such as those in Bolia et al. (2003), or from profiling of various sorts. If one is interested in advising soldiers on how to conduct themselves in the course of peace-keeping operations, for example, it is vital to understand what knowledge is relevant to the analysis and how the various disciplines interact.

Collective human variability

In most systems of interest to OR humans are involved collectively, invoking a whole range of additional sources of variability. Social networking, co-operation and competition, collective self-awareness and reflex, and interactions between system structure and function produce macro-level behavioral variability which is not always clear in its origins. Systems differ in their formally appointed structures, goals, strategies and processes. They also have different histories and collective experiences, which lead to wide variety in informal structures, goals and processes, even where their formal expressions are similar.

Two teams given the same task in the same context are likely to diverge in their approach to task execution and in the many non-task-related goals and behaviors which arise in any human collective. In general, OR modelers look only at formal structures and processes and neglect to account for the informal, perpetuating the comfortable myth that the things the executive controls dominate organizational behavior. Research such as Siemieniuch and Sinclair’s (2003) indicates that in dynamic situations role structures will adapt in ways which depend upon individual team members’ capabilities. Salas (2003) highlights how team competence is more complex than the combination of individual competencies.

It is generally true that OR practitioners (and many HS researchers) obtain their data on organizational processes by elicitation from organization members, an approach which often tends to reproduce the formal rather than the actual processes and structures. In this context, OR needs to exploit the techniques developed by disciplines such as behavioral psychology to acquire knowledge of human behavior that does not rely entirely upon self-report.

Practical OR responses

The wide spectrum of sources of variability in systems outlined above demands a response from OR if the discipline is to retain its credibility. One possibility is to retreat from analysis into the direct facilitation of executive decision making in complex systems. However, this would remove a main source of the value of the OR discipline, namely, the ability to derive insights about the consequences of intervention which add value to the executive’s intuitive understanding.

OR’s practical response to the challenge of human variability should include a broadening of scope, at all stages of the analysis process, to cover the full spectrum of significant factors. In particular, attention needs to be paid to a balanced problem formulation, a more ‘natural’ approach to OR modelling, the explicit treatment of uncertainty, and the synthesis of advice from analysis results.

Balanced problem formulation

The recently updated NATO Code of Best Practice for C2 (Command and Control) Assessment (NATO, 2002), emphasizes the importance of completeness in problem formulation and the treatment of practical constraints, such as data availability, as modifiers rather than drivers of the problem model. The Code also advises an open and adaptable approach to problem bounding and assumption setting.

In considering human decision making in complex systems it is critical for OR to account for all sources of variability which might prove significant rather than, as is often done, restricting the analysis to those variables which are readily observable. Since model development is often a capital-intensive project, it is also important to design adaptability into models to facilitate future development in the face of new understandings. This requires an approach to modelling which does not adhere blindly to the KISS principle (‘Keep It Simple, Stupid’), but adopts the more holistic KISMET principle proposed by Maeers et al. (2004) in the margins of the ISMOR conference2. KISMET stands for ‘Keep It Specific, Manageable, Exploratory and Testable’ and seeks to evoke a more open-ended, iterative and exploratory approach. Such an approach will tend to produce more ‘natural’ models with a more explicit treatment of system variability and its sources.

Modelling systems ‘naturally’

One early text (Air Ministry, 1963) defines OR as “numerical thinking about operations, with the aim of formulating conclusions which, applied to operations, may give a profitable return for a given expenditure of effort”. Today, OR practitioners, particularly those in the military domain, are predominantly drawn from hard science, mathematics or engineering backgrounds. Consequently, systems are typically modelled from a rationalist perspective. Even where humans are treated explicitly, a rational construct based on utility theory and choice optimization is often used. In the military OR domain, models of decision making typically assume that command decisions are driven by the full set of information in the commander’s situation display, and that multiple candidate courses of action are considered before the ‘best’ course is chosen to meet the operational goal. This thinking is the usual interpretation of the ubiquitous OODA construct (Observe, Orient, Decide, Act) defined by USAF Colonel John Boyd, although Boyd’s original was much more cognitively and socially contextualized than current usage implies5.

It is now widely accepted in cognitive psychology that a natural description of decision making, based on expertise, situation recognition and a satisficing strategy, is more representative (Klein et al., 1993). A natural model of the decision making process should include concepts like attention, attribution, construction, recognition, limitations in working memory (with consequences for cognitive strategies), and learning.
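A minimal sketch of the contrast (in Python; the options, scores and aspiration level are invented for illustration and are not taken from Klein’s work) between the classical optimizing rule and a satisficing rule:

```python
# Sketch contrasting two decision rules. The option scores and the
# aspiration threshold are invented purely for illustration.
options = [
    ("defend bridge", 0.62),
    ("screen flank", 0.71),
    ("withdraw", 0.55),
    ("counter-attack", 0.80),
]

def optimize(opts):
    """Classical rational model: evaluate every option, pick the best."""
    return max(opts, key=lambda o: o[1])

def satisfice(opts, aspiration=0.7):
    """Naturalistic model: consider options in the order they come to mind
    (list order stands in for recognition) and accept the first one that
    is good enough, without evaluating the rest."""
    for name, score in opts:
        if score >= aspiration:
            return (name, score)
    return opts[0]  # fall back to the first recalled option

print("optimizing :", optimize(options))    # ('counter-attack', 0.80)
print("satisficing:", satisfice(options))   # ('screen flank', 0.71)
```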

Current OR models (at least in the military domain) tend to assume that organization members share formal goals and faithfully follow formal processes. In military capability investment appraisal it is widely assumed that improving information sharing will significantly improve shared situation understanding, which will in turn greatly enhance operational effectiveness. In current OR models this assumed causality is usually embedded in the model itself, rendering it incapable of effectively questioning or challenging investment options involving information technologies. A more natural model of organizations would probably be centered on social networks rather than formal structures, dealing explicitly with the effects of multiple, unshared beliefs and goals, informal and ad-hoc processes, emergent roles and rules, the interaction of multiple cultures, and organizational adaptation as outlined above.
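A toy sketch of the difference this makes (the ten-member organization, its formal chart and its social ties are all invented for illustration): information spreading through the formal hierarchy alone versus the same organization with a few informal ties added.

```python
from collections import deque

# Sketch: the same ten-person organization described two ways. The formal
# chart is a strict hierarchy; the informal description adds a few social
# ties (invented for illustration). Breadth-first search counts how many
# exchange steps it takes for news originating with member 7 to reach everyone.

formal = {
    0: [1, 2], 1: [0, 3, 4], 2: [0, 5, 6],
    3: [1, 7], 4: [1, 8], 5: [2, 9], 6: [2], 7: [3], 8: [4], 9: [5],
}
informal = {k: list(v) for k, v in formal.items()}
for a, b in [(7, 8), (8, 9), (4, 5)]:      # assumed informal social ties
    informal[a].append(b)
    informal[b].append(a)

def steps_to_reach_all(network, source):
    seen, frontier, steps = {source}, deque([(source, 0)]), 0
    while frontier:
        node, depth = frontier.popleft()
        steps = max(steps, depth)
        for nxt in network[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return steps

print("formal chart only  :", steps_to_reach_all(formal, 7), "steps")
print("with informal ties :", steps_to_reach_all(informal, 7), "steps")
```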

Modelling systems more naturally will initially make models more complicated, but will enhance OR’s ability to absorb human science knowledge and provide a sound basis for model evolution and adaptation in the light of new knowledge.

Some of the improvements in OR modelling implied by the ‘natural’ approach could be implemented in the short term. Representations of perception and attention, situation recognition (already demonstrated by Moffat, 2002), satisficing strategies for decision making, the impact of internal and external moderators on cognition and, at the collective level, multiple goals and social network influences on collaboration could all be added or improved with current knowledge and in the context of current models. In the longer term, new decision making algorithms based on constructed mental models, adaptivity in organizational structures and processes and a treatment of self awareness and reflex are possible with limited additional methodological research.

Embracing uncertainty

Expanding the scope of analysis to encompass the full spectrum of variables and factors significant to the behavior of CARS will bring in variables and factors for which solid empirical data are not available. OR practitioners already have tried and tested techniques for dealing with such unknowns. A combination of sensitivity analysis, stochastic modelling and risk-based reasoning will allow credible and useful advice to be generated even in the face of high uncertainty, provided it is embraced rather than ignored or suppressed.
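As a sketch of what such an approach can look like in practice (all numbers are illustrative assumptions, not study data), a simple sensitivity sweep asks not “what is the exact effectiveness?” but “does the preferred option change across the plausible range of an uncertain human factor?”:

```python
# Sketch of a sensitivity analysis over an uncertain human factor.
# Two notional investment options are compared while an unknown
# 'team adaptability' parameter is swept across its plausible range;
# the useful output is whether the preferred option is robust.

def effectiveness(option, adaptability):
    if option == "more sensors":
        return 0.5 + 0.2 * adaptability   # modest gain, weak human dependence
    return 0.3 + 0.6 * adaptability       # "better training": strong human dependence

for adaptability in (0.2, 0.4, 0.6, 0.8):
    sensors = effectiveness("more sensors", adaptability)
    training = effectiveness("better training", adaptability)
    preferred = "more sensors" if sensors > training else "better training"
    print(f"adaptability={adaptability:.1f}: "
          f"sensors={sensors:.2f}, training={training:.2f} -> {preferred}")
```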

The treatment of uncertain knowledge is quite different in the OR and HS communities. HS research tends to be very skeptical even in the face of statistically significant experimental results, because of the imperative to be conservative in adding knowledge to the scientific canon. Often, uncertain results are the trigger for further research proposals. Conversely, OR practitioners tend to be happier to use data with low statistical significance provided only that it appears to contribute to the discrimination of investment options in the context of the immediate decision problem. Time for further research is a rare luxury for OR.

Analysts and researchers need to understand each other’s views on the significance and usefulness of uncertain data before there can be a free flow of useful knowledge between them.

Synthesis

Synthesis is the often unregarded twin of analysis. Even those human and organizational issues which have to be excluded from the analysis through lack of capability or usable data can be re-introduced during synthesis (provided they were explicitly identified at the start). It is recognized best practice for problem formulation not only to provide problem segments amenable to analysis, but also to provide a clear and valid mechanism for meaningful synthesis, yielding coherent knowledge about the original, larger problem (NATO, 2002).

Full spectrum analysis

Dealing with socio-technical issues will challenge existing OR capabilities. The immediate response of OR practitioners should be to use multiple methods, possibly even multiple theoretical bases, to address issues raised by a properly scoped problem formulation. The longer term response must be to seek synthesis of theories so that multiple methods become compatible.

Multi-disciplinary teams are also vital to the effective treatment of human decision making in complex systems. The hard science bias in the OR community needs to be removed and the recruitment of human scientists into OR teams should be a priority in coming years. The effective integration of scientific knowledge across traditional disciplinary boundaries is a core task for the OR community, one in which the human science community should participate eagerly, since it will render much of their hard-won knowledge more exploitable.

Summary

This paper has sought to explore the challenges faced by OR as it tries to support interventions in complex, socio-technical systems. It asserts that the validity (i.e., fitness for purpose) of OR depends upon a balanced treatment of factors and that this means a significant broadening of the scope of models used by OR to predict the consequences of interventions in systems.

The paper emphasizes the importance of the human sciences as a basis for understanding variability in systems, and notes that the wide range of relevant disciplines is not yet integrated as a coherent body of knowledge. Specific proposals are made for the improvement of OR models and for an ongoing programme of integration to produce usable knowledge across the full spectrum of scientific disciplines. Together these developments will produce a capability for ‘full spectrum analysis’, capable of helping executives to intervene effectively in complex adaptive reflexive systems.

Notes

1 © Crown Copyright, Dstl/2004. Published with the permission of the Controller of Her Britannic Majesty’s Stationery Office, Reference Dstl/JP08610. The opinions expressed in this paper are those of the author and do not necessarily represent the views of the UK Ministry of Defence or HM Government.

2 20th International Symposium on Military Operational Research, a conference held in August 2003 by the UK Ministry of Defence at Eynsham Hall, Oxford, UK. http://www.rmcs.cranfield.ac.uk/infoserv/ISMOR/ismor2003.htm.

3 NEC - The Human Dimension, a conference held in November 2003 by the Royal Military College of Science, Shrivenham, UK, on behalf of the UK Industry/MoD Command and Digitization Group.

4 Human Factors of Decision Making in Complex Systems, a conference held in September 2003 by the University of Abertay Dundee at Dunblane Hydro, Scotland, UK. http://staff.tay.ac.uk/bstmjc/web/home.htm.

5 Decision Making: OODA Loop. Retrieved February 4th, 2004 from www.mindsim.com/MindSim/Corporate/OODA.html.

References

Air Ministry (1963). The origins and development of operational research in the Royal Air Force, Air Publication 3368, HMSO, London.

Allen, P.M. (1988). “Dynamics of evolving systems,” System Dynamics Review, 4:109-130.

Bolia, R., Vidulich, M., Nelson, W. and Cook, M. (2003). “The use of technology to support military decision-making and command & control: A historical perspective,” presented at [3].

Carter, R. (1998). Mapping the mind, Weidenfeld & Nicolson, London.

Endsley, M. (2003). “Designing for situation awareness in complex systems,” presented at [3].

Farrell, P. S. E. and Lichacz, F. (2003). “LOE 02: Canadian human factors analysis,” poster presentation at [3].

Janis, I. L. and Mann, L. (1977). Decision-making: A psychological analysis of conflict, Free Press, New York.

Klein, G. (2003). “A data/frame model of sensemaking,” presented at [3].

Klein, G. A., Orasanu, J., Calderwood, R. and Zsambok, C. E. (eds.) (1993). Decision making in action: Models and methods, Ablex Publishing Corporation, NJ.

Maeers, P., Mathieson, G. and Rose, G. (2004). Private communication.

Malish, P., Mathieson, G. and Berry, A. (2003). “Contribution of the human element to command effectiveness: The impact of information on command effectiveness,” submitted to Journal of the OR Society, November.

Mintzberg, H. (1979). The structuring of organizations, Prentice-Hall, Englewood Cliffs, NJ.

Moffat, J. (2002). Command and control in the information age: Representing its impact, HMSO, London.

NATO, (2002). NATO code of best practice for command and control assessment, reprinted by US Department of Defense, Command and Control Research Programme. (see http://www.dodccrp.org).

Salas, E. (2003). “Why training team decision-making is not as easy as you think,” presented at [3].

Salvendy, G. (ed.) (1987). Handbook of human factors, John Wiley & Sons, New York.

Sicard, B., Jouve, E. and Blin, O. (2003). “Decision-making and extreme risk-taking,” presented at [3].

Siemieniuch, C. and Sinclair, M. (2003). “Changes in organizational roles when disaster strikes,” presented at [3].

