LEVERAGING THE HIDDEN ORDER OF SYSTEMS

July 24, 2002

 

Earl H. McKinney Jr.
Bowling Green State University

 
ABSTRACT


Systems theory is better conceived as a collection of disparate theories, a system of systems theories.  This perspective would improve the use of systems theory in MIS.  With a variety of systems theories to draw on, investigators, both academic and professional, can better reveal order in a wide range of MIS topics.  Currently, the principles of the various systems theories are not well understood, and their basic tenets conflict.  Moreover, the term “systems theory” is frequently used without qualification, without specifying basic assumptions or principles.  To address this misuse, this paper offers a classification of four systems theories.  Ongoing MIS research is placed within this classification to suggest the validity of the descriptors and to indicate the scope of systems use.  The limitations of systems theories and a brief reading list are also presented.


CONTENTS
  1. INTRODUCTION
  2. SYSTEMS PRINCIPLES
  3. DIFFERENCES AMONG SYSTEMS THEORIES
  4. EVALUATION OF THE CLASSIFICATION:  EXAMPLES FROM MIS
  5. SUMMARY
  6. SUGGESTED READINGS
  7. REFERENCES

1. INTRODUCTION

“The whole is more than the sum of the parts”--Aristotle

This synergistic phrase, long a central tenet of systems theory, is often cited to account for the recent phenomenal growth of many information systems. It is invoked to explain the growth of the Internet itself, networks of all sizes, P2P applications such as Napster, supply chain management, consumer trust, e-commerce, and globalization.  Though central and common to these current IS phenomena, this principle of systems theory is not, at times, well understood (Klir, 1991; Porra, 1999; Senge, 1990).  Moreover, this limited understanding is not isolated to this particular tenet; a number of the principles of systems theory could be better understood. To address this problem, the purpose here is to systematically clarify the tenets of systems theory in order to improve its use. 

Systems theory is better conceived as a collection of disparate theories.  By highlighting both the variety and commonality of systems theories, this paper suggests a classification scheme of four systems theories, and posits that systems theory is better conceived as a system or taxonomy of these four specific theories. 

The systems view, long established in some circles (Ackoff, 1971), is gaining support elsewhere in the social sciences:  in human factors (Meister, 1989), in sociology (Kling, 2001), in organizational communication (Miller, 1995), in teamwork (McGrath, 1984), and in management, where Senge’s application of systems has recently been recognized as a seminal work in 75 years of management research (Harvard Business Review, 2000).  Elsewhere, with advances in computational power, systems theory is being reborn in the natural sciences where aspects of it are labeled chaos theory, complex systems, and non-linear systems (Bohm and Hiley, 1993; Campbell, 1989; Gell-Mann, 1994; Prigogine, 1980; Waldrop, 1992). 

Within information systems, applications of systems theory are growing as ambitious information systems become increasingly complex (Xu, 2000; Zeigler, 1996). For example, IS research relies on systems principles to investigate modularization and bundling (Shilling, 2000), systems design (McLeod, 1995; Xu, 1995; Zhu, 1998, 2000), decision support systems (Eom, 1998; Takahara and Shiba, 1996, 2000; Zhu 1998, 2000), and networks (Xu, 2000).  Moreover, the growing use of systems theory is also evident in introductory textbooks (Alter, 1999; O’Brien, 1999; Oz, 1998; Zwass, 1998), as well as highly regarded academic journals (Lewin, 1999; Porra, 1999).  Further, a new discipline of Social Informatics invokes systems theory to highlight the pernicious and lasting social impacts of information systems (Kling, 2001).  Unfortunately, this growing impact of systems theory on IS has not been well demonstrated until recently (Eom, 1998, 2000; Kirby, 1993; Xu, 2000).  This inattention may contribute to IS graduates’ lack of ability to think systemically, a key shortcoming identified by employers (ISCC ’99, 1999).

Systems theory is particularly valuable for IS during this current period of growth and change.  First, systems theory provides the language and perspective to understand adaptation, the key to understanding MIS use in organizations (Anderson, 1999). Second, the explosive growth in technology has fueled concomitant growth in the complexity of IS applications.  Systems theory is well suited to finding the hidden order at any level and structure in large complex systems.  Third, technology growth has also spurred new organizational structures of networks and alliances.  Systems theory, as evidenced by Senge’s work, is a robust language for these new structures and the IS that support them.  Fourth, the field of IS has shifted from production to the identification and delivery of information.  Systems theory, with its fundamental input-process-output framework, is well suited to model this new role of IS.  Finally, the economics of information differs from the economics of physical goods, where single physical assets have singular values. Systems theory, based on non-linear combinations of value, is better able than reductionism to model information economics.  In sum, a detailed understanding of systems theories is essential for IS professionals and scholars confronting today’s systems.

As the systems view gains popularity, it is necessary to reconceive the idea of systems as a collection of systems theories.  Some 30 years ago, Ackoff (1971) proposed an initial system of systems concepts, an integrated set of ideas or concepts about systems theory.  He organized over 30 principles of systems, definitions, and concepts into one comprehensive framework in order to sharpen definitions that had become dull with use and to aid understanding.  Suggested here is a system of theories rather than a system of concepts, but for similar reasons. 

The need to once again clarify is evident.  No mainstream introductory MIS text acknowledges that more than one systems theory exists, and well-known popular business books based on systems theory (Wheatley, 1992; Senge, 1990) also fail to distinguish the variety of systems theories.  Porra (1999), who explicitly qualifies the use of systems theory, is a rare exception to this trend.  Behind this indiscriminate use is the impression that the underlying principles are well accepted, that only one systems theory exists. However, as argued here, while systems theory in general has several well-accepted principles, it has at least four variations whose underlying assumptions about knowledge and philosophy are in conflict.

2. SYSTEMS PRINCIPLES

Before turning to the classification of the four variations, a brief review of principles common to all systems theories is necessary.

Systems theory is “the transdisciplinary study of the abstract organization of phenomena, independent of their substance, type, or spatial or temporal scale of existence.  It investigates both the principles common to all complex entities, and the models which can be used to describe them” (Heylighten, 2001). 

This definition suggests that however complex or diverse the world we experience, we will always find different types of organization in it, and that such organization can be described by concepts and principles independent of the domain.  This search for hidden order emphasizes the interactions and connectedness of the components of a system.  Further, these organizational patterns or relationships are typically governed by laws that cannot be derived from the laws that govern the lower levels of the system. Systems, then, are nested hierarchies of subsystems where each level of the system is characterized by unique order (Checkland, 1981).  As stated by Polanyi,

You can not derive a vocabulary from phonetics; you cannot derive the grammar of language from its vocabulary; a correct use of grammar does not account for good style; and a good style does not produce the content of a piece of prose…it is impossible to represent the organizing principles of a higher level by the laws governing its isolated particulars. (Polanyi, 1967; p. 210)

Finally, open systems, the object of most system study, cannot survive without continuously exchanging matter and energy with the environment.  This exchange leads systems to continually adapt to their environment.

Consistent with this definition, examples of systems abound: economic and ecological phenomena, evolutionary biology, and chaotic structures are classically systemic, as are the immune system and central nervous system (Holland, 1992; Waldrop, 1992).  More specifically, at a low level the electrical laws of physics govern the central nervous system.  At a higher level, cellular biology provides the order necessary to understand the behavior of nerve endings.  The essence of the CNS is not observable at these lower levels.  The CNS is an open system, processing inputs of energy at the sense organs into information output for the brain. Further, the CNS is a hierarchy of subsystems that adapts to an environment, tuning itself to the more important signals in that environment.  A second example comes from Holland (1995), who uses an ecological case to argue for the usefulness of systems theory:

Ecosystems exhibit overwhelming diversity; they are continually in flux and exhibit a wondrous panoply of interactions such as mutualism, parasitism, biological arms races, and mimicry.  Matter, energy and information are shunted around in complex cycles.  Once again the whole is more than the sum of the parts.  Even when we have a catalog of the activities of most of the participating species, we are far from understanding the effects of changes in the ecosystem. (p. 3)

Systems theory is more completely understood by considering its antithesis: reductionism.  According to reductionism, the laws governing parts determine or cause the behavior of the whole.  Reductionism reduces a system to its elementary elements in order to study them in detail and understand the types of interaction that exist between them.  By modifying one variable at a time and conducting repeatable, scientific experiments on parts, it tries to infer general laws that will enable one to predict the properties of a system under very different conditions.  Conclusions are based on linear extrapolations via the superposition principle (the whole is the sum of the parts), a method that has had great success in the natural sciences (Gell-Mann, 1994).  Reductionism seeks to explain and predict the world by searching for regularities and causal relationships between elements or parts (Burrell and Morgan, 1979).
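
To see what the superposition principle asserts, consider a minimal illustration (not from the original text): if a system's response f is linear, its behavior on any weighted combination of inputs is fully determined by its behavior on the parts,

    f(a x_1 + b x_2) = a f(x_1) + b f(x_2).

Systems theorists argue that the properties of interest in wholes are precisely those for which this equality fails, so that results obtained on isolated parts cannot simply be summed to recover the behavior of the whole.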

There are a number of significant critiques of systems theory in general.  First, systems theory does not address what it is that makes the whole more than the sum of its parts, that is, the central intuition of wholeness (Fuenmayor, 1991; Varela and Goguen, 1978).   Wholeness is often invoked without specifying what a whole is: is it roles, relationships, perceptions, or something else?  Moreover, is it unique to a particular whole, or do all wholes share a common quality?  A second key criticism of systems theory is its emphasis on control.  By striving for consensus it imposes an infrastructure and robs itself of its natural and powerful diversity.  These issues require extended consideration and are well beyond the scope of this paper.  They are included here to suggest the boundaries of our understanding of systems theory, and to partially explain why variants of the theory have developed.

 

3. DIFFERENCES AMONG SYSTEMS THEORIES

 

“Systems thinking, if anything, should be carried out systematically.” (Ackoff, 1971)

Taxonomy, the science of classification, dates to Plato (Mayr, 1969).  The purpose of a classification is to describe the structure and relationships of the constituent objects, to achieve economy of memory and facilitate communication, and to ease observation and retrieval of information (Fleishman and Quaintance, 1984; Jarvenpaa, 1988). A classification is a first step in organizing knowledge; it can facilitate comparison, highlight differences, limit generalization, expose gaps in knowledge, assist theory development, aid in extrapolating results to practice, lead to new research questions, point out where extensive research has been done or where more is required, and identify a critical set of characteristics and definitions for those characteristics (Fleishman and Quaintance, 1984; Melton, 1964; Sokal, 1974).  For these reasons, systems theory needs a classification.  Further, classifications typically arise in heavily researched fields where the impetus has come from an overwhelming cascade of unorganized facts and a metalanguage to describe the field's concepts is needed.  In sum, a classification is a system that identifies regularities and order at a high level of organization.

To these ends, the following classification is offered.  Table 1 presents four variations of systems theory--hard, complex, soft, and critical--and their descriptors.  Each commits to a number of philosophical assumptions in an attempt to be more coherent and useful than the more general systems theory described to this point. 

 


Table 1:  Differences Among Systems Theories

Descriptors:  dominant metaphor, epistemology, key principle, purpose, methodology, sociology, domain

Hard:      mechanistic metaphor; positivism; key principle: teleology; purpose: normative, prediction; methodology: nomothetic & simulation; sociology: regulation; domain: well defined

Complex:   mechanistic metaphor; positivism; key principle: emergence; purpose: normative, prediction; methodology: nomothetic & simulation; sociology: regulation; domain: well defined

Soft:      organic metaphor; interpretivism; key principle: indeterminacy; purpose: descriptive, argument; methodology: ideographic; sociology: regulation; domain: poorly defined

Critical:  organic metaphor; interpretivism; key principle: power; purpose: descriptive, argument; methodology: ideographic, pluralistic; sociology: radical change; domain: poorly defined

(Hard and complex systems share the mechanistic and positivist descriptors; soft and critical systems share the organic and interpretivist descriptors.)

Hard Systems

Hard systems theory employs quantitative techniques from a positivist epistemology similar to the traditional sciences.  Epistemology, or the grounds of knowledge, examines how we understand and communicate knowledge about the world (Burrell and Morgan, 1979).  Positivism is based on formal propositions, quantifiable measures of variables, hypothesis testing, and the drawing of inferences; it creates a cumulative growth of knowledge as in the natural sciences (Klein & Myers, 1999).  What makes hard systems different from traditional science is that its level of analysis is more holistic; the object of inquiry is typically large-scale systems in operation.  Labeled systems management, management science, and operations research, it is teleological: it assumes the existence of purposeful, global goal-seeking functions that it seeks to optimize (Forrester, 1971).  The aim is to predict the behavior of the system within a framework of self-control, optimization, and objectivity.  The method of research is nomothetic (the study of cases or events as universals, an emphasis on measurement and identification, and a view to formulating general laws), and is rapidly becoming entirely quantitative.  It is important to note the evolution of this field.  Operations research and management science have evolved from their roots in hard systems and are now almost entirely reductionistic; in a sense, hard systems is no longer practiced.  System dynamics is the label ascribed to the most systemic versions of current operations research and management science.  Hard systems was originally developed to solve the complex problems of logistics and resource management in World War II (Banathy, 1996).
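
To make the teleological, goal-seeking stance concrete, the following minimal sketch (an illustration under assumed parameter values, not an example drawn from the paper) optimizes a single global objective: a classic economic-order-quantity cost function for a hypothetical inventory system.

def total_cost(order_quantity, demand=1000, order_cost=50.0, holding_cost=2.0):
    # Classic economic-order-quantity cost: ordering cost plus holding cost.
    # All parameter values here are assumptions chosen for illustration.
    return (demand / order_quantity) * order_cost + (order_quantity / 2.0) * holding_cost

# The hard systems stance: the system is assumed to have one well-defined,
# global objective, and analysis consists of finding the input that optimizes it.
best_q = min(range(1, 1001), key=total_cost)
print("optimal order quantity:", best_q, "total cost:", round(total_cost(best_q), 2))

The framing, not the arithmetic, is the point: a well-defined problem, a single normative objective, and a search for the global optimum.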

Complex Systems

Within the past 15 years, this school of thought has emerged sharing many tenets with hard systems, but extending their common mechanistic, positivist-nomothetic, predictive, regulative approach to non-purposeful domains in the natural and artificial sciences. A key descriptor is the non-linear relationship between the whole and key “subroutines”.  These subsystems are complex functions, which tend to be performed in few locations and result in emergent behavior, or indirect effects, in the overall system.  These localized subprocesses explain and predict emergent properties at the level of the whole.  For example, properties of the immune system such as disease response and memory emerge from changes in a few specific cellular processes; the emergent property of memory can be explained by subprocesses in spin glasses (disordered magnetic materials) (Campbell, 1989).  Increasingly, the term adaptive is used to describe complex systems. 

This group has proposed emergent property models for the immune system, evolutionary biology, spin glasses, computational physics, dynamical functions, ecosystem dynamics, and chaos theory (Devaney, 1990; Kauffman, 1993; McNaughton, 1989; Mitchell, 1995; Stein, 1989b; Zurek, 1990).  This school is concerned with explanation and prediction via pattern recognition, modeling agent interaction, and understanding local goal seeking ("niching") rather than teleological global optima.  Further, complex systems behavior is thought to be highly dependent on initial conditions; small variations in these conditions have significant non-linear impacts on system performance.  The complex school is critical of the hard systems approach as inadequately addressing complexity and emergent phenomena, overly relying on simplifying linear approximations, and being unsuited to the inherently dynamic, iterative, interactive nature of complex systems that produces the emergent phenomena (Santa Fe Institute, 2001).  It attempts to quantitatively predict system-wide behavior by building mathematical but non-linear models of the system's components.  (Linear functions, by contrast, predict model behavior based on weighted sums of input values.)  One of the key differentiators of the complex framework is methodology: the specification and use of its own distinct computational tools. The dominant method is simulation.  Models of the system's components and their interactions are programmed.  Initial conditions, drawn from random number generators, are varied, and the quantitative patterns or symmetries that develop over multiple iterated runs are evaluated. 
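
As a concrete illustration of this simulation method (a minimal sketch, not taken from the sources above), the following Python fragment iterates a simple non-linear update rule, the logistic map standing in for a programmed model of component interaction, from several randomly drawn initial conditions and compares the long-run behavior across runs; small differences in the starting point produce markedly different trajectories.

import random

def step(x, r=3.9):
    # A simple non-linear update rule (the logistic map), standing in for
    # the programmed interaction of a system's components.
    return r * x * (1.0 - x)

def run(x0, iterations=200):
    # Iterate the model from one initial condition and return its trajectory.
    trajectory = [x0]
    for _ in range(iterations):
        trajectory.append(step(trajectory[-1]))
    return trajectory

# Vary the initial conditions using a random number generator, as the
# complex systems method prescribes, and inspect the patterns that develop.
random.seed(7)
for _ in range(5):
    x0 = random.random()
    tail = run(x0)[-3:]
    print("x0 = %.6f -> final values %s" % (x0, ["%.4f" % v for v in tail]))

In an actual complex systems study the update rule would model many interacting agents rather than a single map, but the workflow of programming the components, sampling initial conditions, and evaluating patterns over iterated runs is the same.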

The development of this framework was concomitant with growth in network computing technology.  These technological advances in the 1990s enabled the type of distributed computing and other tools that characterize the complex systems approach.

Soft Systems

Soft systems was developed to complement the hard systems approach, differing in epistemology, key principle, purpose, and method.  It arose from a need to better address complex contemporary social issues (Flood and Jackson, 1991).  Its interpretivist epistemology holds that subjects or groups construct knowledge because of selection pressures from the environment (Heylighten, 2001).  The interpretivist suggests that our knowledge of reality is gained only through social constructions such as language, consciousness, shared meanings, documents, tools, and other artifacts.  Moreover, various stakeholders have unique and valid views of the problem space.  Further, problem identification and selection are largely idiosyncratic:

The social world is perceived (and constructed) by men according to the particular world-views.  This is a cultural mechanism which maintains desired relationships and eludes undesired ones.  The process is cyclic and operates like this: our previous experiences have created for us certain standards or norms, usually tacit; the standards, norms and values lead to readiness to notice only certain features of our situations; they determine what facts are relevant; the facts noticed are evaluated against the norms or standards, so that the future experiences will be evaluated differently. (Vickers, 1983; p. 17)

Another fundamental difference of soft systems is the idea that goals may be ambiguous, conflicting, non-quantifiable, and indeterminate.  That is, the ambiguity of problems is not a result of underdeveloped analysis tools; it is how things are.  Thus, problems involve judgment, the weighing of moral issues, and the creation of form (Checkland, 1981).  As a result, solutions do not emerge from one decision but over time, where repeated action and refinement have a better chance of success.  Direct cause and effect is rejected, and a more indeterminate problem space is considered more realistic; as a result, this approach is often described as organic.  Therefore, social problems rich in complexity and change need to be managed rather than decided or solved; the predict-and-control framework of complex and hard systems yields to design and invention (Flood and Jackson, 1991).

The ideographic method of soft systems is founded on the premise that the world is understood only by first hand knowledge of the subject (Burrell and Morgan, 1979).  It encourages participants to accept multiple realities, multiple worldviews of a problem.  That is, participants are shown the idiosyncratic nature of their own worldview and how this affects problem identification and solution.  As a result, theory and practice are inseparable; practitioners attempt to help participants in social problems see themselves within the higher-level system or context (Flood and Jackson, 1991).

Finally, validation in soft systems methodology is difficult if not impossible.  External validity in an interpretivist epistemology depends on improved behavior of participants.  However, this opportunity for improvement assumes stakeholders are guaranteed free and open discussion about changes to be made.  That may be unrealistic to assume.  In reality, powerful participants in the process are unlikely to risk their dominant position and submit their privileges to the vagaries of others' ideal demands (Jackson, 1991).  This critique leads to the critical systems position.

Soft systems emerged in response to the failure of hard and quantitative tools to model messy social problems in the 1960s.  It represented a shift away from mathematical modeling to understanding the process of building consensus viewpoints.

Critical Systems

The critical approach takes its name from the radical humanist paradigm of sociology.  It is committed to the moral concepts of individual progress and emancipation from constraining paradigms and traditions.  Sharing foundations of interpretivism, ideographic methodology, and purpose with the soft approach, it views soft and hard systems as regulative approaches, unaware of their own conservativeness and, more generally, of the role of power in shaping social action and meaning.  The critical approach shares with the radical humanists the view that consciousness is dominated by the social infrastructures in which an individual operates (Burrell and Morgan, 1979).  Hard systems explicitly, and soft systems implicitly--although it claims to be politically and ideologically neutral (Flood and Jackson, 1991)--take organizational mission and needs as a given.  Problems are resolved to return the system to equilibrium.

 According to Jackson (1991), Ulrich (1991), and Schecter (1991), the critical approach is founded on critique, emancipation, and plurality.

Critique is a commitment to questioning the methods, practice, theory, normative context and limits of rationality of all schools of thought.  It requires a never-ending attempt to uncover hidden assumptions and conceptual traps.  The commitment to emancipation is a commitment to human beings and their potential for full development via free and equal participation in community with others.  The commitment to pluralism insists that all systems approaches have a contribution to make and that no single approach is adequate to address the full range of problematic situations. (Schecter, 1991; p. 211)

One example of the critical school's methodology is presented in Ulrich (1991).  He argues that problem selection and identification require numerous boundary judgments of what is relevant, judgments beyond the control of logic or reason.  Specifically, participant consensus on issues relevant to the problem should be motivated by considering "what should be" rather than "what is" in order to avoid overlooking hidden boundary judgments.  Four general issues should be discussed: what should be the sources of motivation, of control, of expertise, and of legitimization in the domain of the problem space. 

In general, critical theory attempts to question the objectives toward which discussions are directed.  It disagrees with the sociology of the soft approach, doubting that free and open debates are ever possible, and it points to the weakness of soft systems theory's attempt to resolve the plurality of ideas via exchange.

In the end, validation is possible,

... only via the social actors involved in the process.  The analyst's success is measured by the extent to which the patient recognizes himself in the explanations offered and becomes an equal partner in the dialogue with the analyst.  The actor in the social world very often suffers false-consciousness and does not truly comprehend his situation in that social world.  It is incumbent, therefore, on the critical theorist to employ a social theory capable of explaining the alienated words and actions of oppressed groups in society. (Jackson, 1991; p. 133)

Critical systems theory is a manifestation of the movement from confidence to skepticism in philosophy and sociology.  This growing trend in philosophy in the 1960s created the conditions for the emergence of radical sociology, which provides the philosophical support for the critical school of the 1990s.

Summary of Major Differences

Each theory differs in unique ways from the others.  However, the main differences between the top two theories in Table 1 (hard and complex) and the bottom two (soft and critical) illustrate how significantly these frameworks vary. 

It is evident that to the soft and critical frameworks, systems is a philosophic choice; to hard and complex, it is a methodology.  That is, for the critical and soft perspectives, systems theory explains how individuals perceive and conceive their world.  This epistemological role for systems contrasts with the hard and complex perspectives, which assume the positivist epistemology of traditional science and view systems as a method to investigate objective phenomena through a non-reductionistic process within that accepted epistemology. 

This methodological difference is a manifestation of a key epistemological one.  The interpretive systems theories (soft and critical) insist that an objective viewpoint outside of a system is not available; the observer or interpreter is always a part of the whole system under scrutiny.  The positivist theories (hard and complex) separate subject and object and hold that, with appropriate methodology, the object system can be understood.

4. EVALUATION OF THE CLASSIFICATION:  EXAMPLES FROM MIS

A classification is valid to the extent it addresses several criteria.  First, its descriptors should provide adequate coverage of the relevant attributes and these descriptors can be assigned reliably.  Second, its descriptors should be both mutually exclusive and exhaustive, although these criteria may be unrealistic for initial efforts at classification (Dubin and Champoux, 1970; Fleishman and Quaintance, 1984).  Third, the scheme should make sense to an informed reviewer (E. Miller, 1969; R. Miller, 1967).  Fourth, it should achieve objectives for which it is designed (in this case the objective is improved understanding about systems within MIS).  Finally, and ultimately, the classification is valid to the extent it is used (Fleishman and Quaintance, 1984).

That said, it is incumbent on this paper to demonstrate how to use the classification.  To this end, the following four sections place MIS research within each of the four categories.  The primary aim of the review is to improve understanding of the descriptors and, as a result, to suggest that they are reliable, exclusive, and exhaustive.  Secondary to this aim, the review facilitates comparisons of MIS research within and across the categories.  Finally, the review also provides an opportunity to highlight the variety of current applications of systems within MIS.

MIS and Hard Systems

Many MIS topics can trace their theoretical heredity to the principles underlying operations research and management science: quantitative, large-scale, regulative, and positivist.  Prime examples include network control, telecommunications, and database management.  Other more recent examples are inventory/supply chain optimization (Kumar and Christiaanse, 1999; Salam, Rao, and Bhattacharjee, 1999), information retrieval/knowledge management (Abraham and De, 1999; Zhu, Ramsey, Chen, Hauck, Ng, and Schatz, 1999), and manufacturing control including enterprise engineering, detail design, and information flows (Cheng and Tang, 2000; Ladet and Vernadat, 1995; O’Sullivan, 1990; Xu, 2000).  Hard systems principles are also evident in cybernetics, expert systems, and simulation.  Each of these MIS domains examines well-defined problems within a teleological, normative, regulative framework, via global goal-seeking algorithms.  At times, opinions and subjective elements are included; however, these inputs are quantified and treated with the same assumptions positivists apply to other variables.

MIS and Complex Systems

The unique descriptor of complex systems is the principle that identifiable, dynamic, and iterative local subsystems lead to emergent properties evident in the whole system.  Several MIS issues share this non-linear subsystem principle, including virus dissemination, e-commerce trust, taxation policy, and value determination.  In each, a critical mass metaphor is often used to explain how iterating seemingly minor changes in a few local variables produces disproportionate long-term results.  Other examples of complex systems MIS include technology adoption and diffusion, auctions (Mbarika, 1999), pattern matching, and search engine optimization (Glezer and Yadav, 1999).

The most widespread example of an MIS research topic using complex systems principles is genetic algorithms (Chaudhry, Varano, and Xu, 2000; Xu, 2000).  Genetic algorithms employ a fitness assessment process, which measures the mutual acceptability of a system and its context (Alexander, 1964).  Genetic algorithms, and more generally all adaptive production systems, have the capability of constructing new productions. This self-constructed fitness subroutine shifts its response to changes in the environment until a fit solution is determined. Understanding the overall behavior of the system is only possible with an understanding of this local fitness subsystem, the hallmark of complex systems.
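
For readers unfamiliar with the mechanics, the following is a minimal, illustrative sketch of a genetic algorithm (a toy example under assumed values, not drawn from the cited studies): a hypothetical target bit pattern stands in for the environment, the fitness function measures the acceptability of each candidate to that environment, and the population shifts toward fitter solutions through selection, crossover, and mutation.

import random

TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # hypothetical "environment" the solutions must fit

def fitness(candidate):
    # Mutual acceptability of candidate and environment: count of matching positions.
    return sum(1 for c, t in zip(candidate, TARGET) if c == t)

def crossover(a, b):
    # Combine two parent solutions at a random cut point.
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(candidate, rate=0.1):
    # Occasionally flip bits, introducing new productions.
    return [1 - bit if random.random() < rate else bit for bit in candidate]

random.seed(0)
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]

for generation in range(50):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break                       # a fully fit solution has been found
    parents = population[:10]       # select the fitter half
    offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                 for _ in range(10)]
    population = parents + offspring

best = max(population, key=fitness)
print("best solution:", best, "fitness:", fitness(best))

The point for the classification is that the system-level outcome, a population converging on a fit solution, can only be understood by examining the local fitness subroutine, not by inspecting any single candidate in isolation.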

Some recent research on the social and human aspects of software development also fits the complex systems tradition (Dolado and Moreno, 2000).  Just as in biology little can be said about the genotype by studying the phenotype, in software systems it is impossible to recover the original specifications from the final product.  Further, the product code does not convey much knowledge of the process used to build it.  As a result, the final product of the software process is more clearly understood by examining a group of dynamic processes applied to a small set of initial specifications (Dolado and Moreno, 2000).

MIS and Soft Systems

The essential descriptors of soft systems theory are indeterminacy and interpretivism.  Research on ethical issues (McManus, 1999) and on marketing are current exemplars of MIS research that employs the soft systems approach.  Another well-known example is inquiring systems (Courtney, Croasdell, and Paradice, 1998).  An inquiring system produces valid knowledge given a set of underlying assumptions about input and process.  Common to many inquiring systems is the interpretative process of generating knowledge from the inputs.

Walsham (1995) suggests that the interpretative soft systems view within MIS is being used to examine systems design, organizational intervention, and the management of IS.  For example, research in systems design addresses interpreted communication, self-regulation, and organizational characteristics (Mahmood, 1987; van Gigch and Le Moigne, 1990). In addition, the soft systems view is now helping frame implementation, user interfaces, model management issues, and group decision making (Eom, 1998; Takahara and Shiba, 1996; Srite and Ayers, 1999; Zhu, 2000).  One example of the soft approach to group decision making and support systems is the recent work on wicked group problems, group problems with no stopping rule and with conflict among stakeholder groups; formulation of the problem is the problem (Elgarah, Courtney & Haynes, 2002).  Moreover, research that views systems designers as a part of the system they are developing is using a soft systems perspective (Walsham, 1995).  This paper is also an example of that type of soft systems work. 

MIS and Critical Systems

Critical systems theory suggests that MIS can be viewed as an exceptionally powerful control mechanism; the key descriptors are power and radical change.  This perspective argues that the purpose of an MIS is often regulatory, a controlling mechanism whose stifling power goes unnoticed by those in authority.  Colonial systems (Porra, 1999) and teledemocracy (Lee, 1999) are prime examples of using the tenets of critical systems theory to argue for change in a poorly defined social environment.  In addition, the recent MIS debates on privacy and copyright law are enlightened by the critical systems view (Kling, 2001).  Moreover, work on the wired gap or digital divide (Nickell, 1998) between IT haves and have-nots in our society employs critical systems assumptions about IT, power, and the need for radical change.

The new field of social informatics is an example of MIS work using the principles of the critical systems theory.  Social informatics examines the social aspects of computerization, including the roles of IT in social and organizational change (Kling, 2001).  Telecommuting, higher education, and exchange of medical information are topics examined within social informatics that use the critical systems paradigm.

Summary of the Classification:  A System of Systems Theories

If the classification is valid, then the four systems theories can be conceived collectively as a system.  This meta systems theory suggests that the group of four distinct theories forms a system at a higher level.  This meta system should be governed by its own principles, unavailable to the specific theories, and it should reveal its own organization, or hidden order.  One meta principle is the new opportunity to deliberately select a systems theory based on the match between the theory’s assumptions and the phenomena under scrutiny.  Another meta principle is that some limits and critiques of each of the individual systems theories are compensated for by matching strengths in the other theories.  A final meta principle is understanding; the meta system improves understanding of the constituent theories.  That is, an understanding of the meta systems theory reveals the dimensions that differentiate the theories.  By understanding these differences, knowledge and use of each systems theory is improved.  These meta principles suggest that the meta theory is more than the sum of the four parts.

In addition to new principles, this classification of systems theories generates organization, or hidden order, at the meta level.  First, this meta theory is a system because it is well ordered, or explained, by a systems theory; that is, the meta theory is a system organized by the assumptions of the soft systems approach.  A second example of new order found at the meta systems theory level is that a new sequence of questions emerges, questions not expressible at the individual theory level: which systems theory should be employed to study a given phenomenon, which theory will best reveal the hidden order of the system at hand, and which theory have others used to search for order?  Systems theory claims order can be found in any organization; this classification, or system of systems theories, is an example of that organization. 

5. SUMMARY

Systems theory often means different things to different people, a malady that precipitated the development of classification schemes in psychology and biology (Fleishman and Quaintance, 1984; Klir, 1991).  To help preserve the utility of systems theory concepts for MIS, this classification starts ontologically and develops epistemological, methodological, sociological, and other key descriptors. Again, the goal here was to clarify the variety of systems theories in order to make each of them more useful, and to suggest that a meta theory of systems theories is a more appropriate conception.

To be sure, systems theory is not a panacea; a useful understanding demands consideration of its limits.  In addition to the main philosophical criticisms mentioned earlier, it is difficult to test hypotheses in a conventional sense using systems theory.  It cannot compete with what traditional reductionistic science has become over the centuries: a consistent, reliable language and a valid way of knowing.  The questions systems theory poses often do not translate into the form of specific testable hypotheses common in traditional science.  As a result, its language may appear fuzzy or vague; its appeal is as a different way to think about the world, not as a clear way to test it.  To its critics, it is a nice concept that cannot do anything useful. With soft and complex systems theories, each user determines validity, making standardization, training, and regulation difficult.  Systems theory, a language without a sentence, pays for its generality with validity.  It is maddeningly broad and vague, generating curious and intriguing insights that disappear like a Cheshire cat when examined in the light of traditional science. 

These limits are not taken lightly or dismissed.  What then should be done with systems theory?  This paper, using a soft systems perspective, suggests reestablishing a theoretical foundation by identifying systems theory as four specific variations, while also recognizing that taken as a whole, these four comprise a classification or a system of systems theories. 

In addition to this classification, this review of the variations of systems theory aims to unhinge long-held and unquestioned epistemological views.  Ideally this will lead to closer scrutiny of systems theories, and to stimulating communication that establishes better understanding and research. 

In closing, systems theory has evolved as it suggests all systems do.  Systems theory, in response to the pressures of use, has adapted from a singular general theory into four separate and more coherent independent theories.  Now, treating these four as a system, systems theory again adapts as the use of the four independent theories reveals a hidden order, a pattern resolved only at the taxonomic level.  

What is valuable to MIS is that these four schools of systems thinking are becoming established in their own right, with important implications for MIS.  As MIS continues to seek philosophical support (Courtney, 2000), it would do well to build on the foundation these four pillars provide. 

"The more science becomes divided into specialized disciplines, the more important it becomes to find unifying principles.”  (Haken, 1988)

6. SUGGESTED READINGS

            General Systems Theory

The Principia Cybernetica website http://pespmc1.vub.ac.be/nutshell.html.

Bertalanffy, L. (1968)  General Systems Theory.  George Braziller, New York.

Ackoff, R. (1971)  "Toward a System of Systems Concepts",  Management Science, 17(11), 661-671.

            Hard Systems Theory

Klir, G. (1991)  Facets of Systems Science,  Plenum Press, New York.

Meister, D. (1989)  Conceptual Aspects of Human Factors,  Johns Hopkins, Baltimore.

Simon, H. (1969)  The Sciences of the Artificial,  MIT Press, Cambridge MA.

Wiener, N. (1961)  Cybernetics,  MIT Press, Cambridge MA.

            Complex Systems Theory

Bar-Yam, Y. (2000)  Unifying Themes in Complex Systems, Perseus Books, Cambridge MA.

Gell-Mann, M. (1994)  The Quark and the Jaguar,  Freeman, New York.

Holland, J. (1995)  Hidden Order,  Addison Wesley, Reading MA. 

Waldrop, M. M. (1992)  Complexity,  Simon & Schuster, New York.

Wheatley, M. (1992)  Leadership and the New Science,  Berrett-Koehler, San Francisco.  

Emergence website.  http://www.emergence.org

            Soft Systems Theory

Checkland, P. (1981)  Systems Thinking, Systems Practice,  Wiley and Sons, New York.

Churchman, C. W. (1968)  The Systems Approach,  Dell Publishing, New York.

Senge, P. M. (1990)  The Fifth Discipline,  Doubleday, New York.

Vickers, G.  (1970)  Freedom in a Rocking Boat, Harper and Row, London.

 

            Critical Systems Theory

Flood R. and Jackson, M. (1991) Critical Systems Thinking, Directed Readings, Wiley and Sons, New York.

The Social Informatics website.  http://www.slis.indiana.edu/SI/

Mitroff, I. and Linstone, H. (1993)  The Unbounded Mind,  Oxford Press, New York.

Oliga, J. C. (1992)  Power, Ideology and Control,  Plenum, New York.

 

 

Thanks to two anonymous reviewers whose comments and suggestions significantly improved this paper.  Also thanks to the editorial process that was remarkably efficient and timely.

REFERENCES

Abraham, D., and De, R. (1999) “Adapting a process model of initial representation formation to a knowledge management application”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 450-452.

Ackoff, R. (1971)  "Toward a System of Systems Concepts",  Management Science, 17(11), 661-671.

Ackoff, R. L. (1974)  Redesigning the future, John Wiley, New York.

Alexander, C. (1964) Notes on the Synthesis of Form, Harvard University Press, Cambridge, MA.

Anderson, P. (1999) “Complexity theory and organization science”, Organization Science, Vol 10 No 3, pp. 216-232.

Alter, S. (1999) Information Systems, Addison-Wesley, Reading.

Banathy, B. (1996)  Designing Social Systems in a Changing World, Plenum Press, New York.

Beer, S.  (1972) The Brain of the Firm, Allen Lane, London.

Bohm, D. and Hiley, B. (1993)  The Undivided Universe:  An Ontological Interpretation of Quantum Mechanics, Routledge, London.

Burrell, G. and Morgan, G. (1979)  Sociological Paradigms and Organisational Analysis, Heinemann, Portsmouth New Hampshire.

Campbell, J.  (1989)  “Introduction to nonlinear phenomena”, in Stein, D. (Ed.), Complex Systems, SFI studies of complexity.  Addison-Wesley, Reading MA.

Chaudhry, S. Varano, M. and Xu, L. (2000) “Systems research, genetic algorithms and information systems”, Systems Research and Behavioral Science, Vol 17, pp. 149-162.

Checkland, P. (1981)  Systems Thinking, Systems Practice, Wiley and Sons, New York.

Churchman, C. W. (1968)  The Systems Approach,  Delta, New York.

Cheng, C. and Tang, B. (2000). “A robust control method with applications in integrated information systems”, Systems Research and Behavioral Science, Vol 17, pp. 173-181.

Courtney, J. F. Croasdell, D. T. and Paradice, D. B. (1998)  “Inquiring Organizations,” Foundations of Information Systems, Sept 14.

Courtney, J. F. (2000) “Mini-track on the philosophical foundations of information systems”, in Proceedings of the Sixth Americas Conference on Information Systems, Long Beach CA August 10-13.

Devaney, R. L. (1990) Chaotic Dynamical Systems 2nd Edition, Addison-Wesley, Reading MA.

Dolado, J. and Moreno, A. (2000) “Assessing software organizations from a complex systems perspective”, in Yaneer, B. (Ed.), Unifying Themes in Complex Systems,  Perseus Books, Cambridge MA.

Dubin, R. and Champoux, J. (1970)  “Typology of empirical attributes dissimilarity linkage analysis”,  Office of Naval Research Contract N00014-69-A-U200-9001, Irvine, CA.

Elgarah, W. Courtney, J. and Haynes, J. (2002)  "A dialectical methodology for decision support systems design", Hawaii International Conference on System Sciences, Waikoloa Hawaii, January.

Ericsson, K. A.  (1996)  “The acquisition of expert performance:  An introduction to some of the issues”, in Ericsson, K. (Ed.), The Road to Excellence,  Lawrence Erlbaum, Mahwah, NJ, pp. 1-47.

Eom, S. B. (1998)  “Relationships between the decision support system subspecialities and reference disciplines:  an empirical investigation”, European Journal of Operational Research, Vol 104, pp.1-45.

Eom, S. B. (2000)  “The contributions of systems science to the development of the decision support system subspecialties:  an empirical investigation”,  Systems Research and Behavioral Science, Vol 17, pp. 117-134.

Fan, W. Gordon, M. D. and Pathak, P. (1999)  “Automatic generation of matching function by genetic programming for effective information retrieval”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 49-50.

Fleishman, E. A. and Quaintance, M. K. (1984)  Taxonomies of Human Performance, Academic Press Orlando, FL.

Flood, R. L. and Jackson, M. C.  (1991) “Total systems intervention: A practical face to critical systems thinking”, in Flood, R. and Jackson, M. (Eds.), Critical Systems Thinking, Directed Readings,  Wiley and Sons, New York, pp. 321-337.

Forrester, J. (1971) World Dynamics, Wright Allen, Cambridge, MA.

Fuenmayor, R. (1991)  “Between systems thinking and systems practice”, in Flood, R. and Jackson, M. (Eds.), Critical Systems Thinking, Directed Readings,  Wiley and Sons, New York, pp. 227-243.

Gell-Mann, M.  (1994)  The Quark and the Jaguar, Freeman, New York.

Gibson, J. J. (1979)  The Ecological Approach to Visual Perception.  Houghton Co, Boston.

Glezer, C. and Yadav, S. (1999) “A conceptual model of an intelligent catalogue search system (ICSS)”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 435-437.

Haken, H. (1988) Information and Self-Organization:  A Macroscopic Approach to Complex Systems, Springer-Verlag, New York.

Harvard Business Review (2000) “Advances in Management”, Sept/Oct pp. 20-24.

Heylighten, C. (2001) “Overview of Principia Cybernetica”, http://pespmc1.vub.ac.be/nutshell.html, date accessed 20 November 2001.

Holland, J. (1992)  Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, 2nd Ed., MIT Press, Cambridge MA.

Holland, J. (1995)  Hidden Order.  Addison-Wesley, Reading MA.

ISCC ‘99 (1999)  Educating the Next Generation of Information Specialists in Collaboration with Industry, NSF, available at www.iscc.unomaha.edu.

Jackson, M. and Keyes, P. (1991) “Toward a system of systems methodology”, in Flood, R. and Jackson, M. (Eds.), Critical Systems Thinking, Directed Readings,  Wiley and Sons, New York, pp 139-158.

Jackson, M. (1991)  “Modernism, post-modernism and contemporary systems thinking”, in Flood, R. and Jackson, M. (Eds.), Critical Systems Thinking, Directed Readings,  Wiley and Sons, New York, pp. 287-301.

Jarvenpaa, S. (1988)  “Tasks in IS research, A proposed taxonomy”,  Unpublished manuscript, University of Texas.

Kauffman, S. A. (1993) The Origins of Order,  Oxford Univ Press, New York.

Klein, H. and Myers, M. (1999) "A set of principles for conducting and evaluating interpretive field studies in information systems", MISQ, Vol 23, pp. 67-94.

Klir, G. (1985) Architecture of Problem Solving.  John Wiley and Sons, New York.

Kling, R. (2001)  “Social Informatics”, http://www.slis.indiana.edu/si/si2001.html, date accessed 20 November 2001.

Kirby, M. A. R. (1993)  “Improving the impact of systems thinking on information systems development”, in Stowell, F., West, D. and Howell, J. G. (Eds.), Systems Science:  Addressing Global Issues, Plenum Press, New York, pp. 379-384.

Kumar, K. and Christiaanse, E. (1999)  “From static supply chains to dynamic supply webs”, in Proceedings of the Twentieth International Conference on Information Systems, De, P. and DeGross, J. (Eds.), Charlotte, NC, December 13-15, pp. 300-306.

Ladet, P. and Vernadat, F. (1995)  Integrated Manufacturing Systems Engineering, Chapman & Hall, London.

Lee, O. (1999) “Critical social theory and teledemocracy”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 169-171.

Lewin, A. Y. (1999) “Application of complexity theory to organizational science”, Organization Science, Vol 10 No 3, pp. 215-216.

Mahmood, M. A. (1987)  “System development methods: a comparative investigation”,  MIS Quarterly, Vol 11 No 3, pp. 293-311.

Mayr, E. (1969)  Principles of systematic biology, McGraw-Hill, New York.

Mbarika, V. (1999) “An experimental research on accessing and using information from written versus multimedia systems”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 502-504.

McGrath, J. (1984) Groups:  Interaction and performance,  Prentice Hall, Englewood Cliffs, NJ.

McLeod, R. (1995) “Systems theory and information resources management:  integrating key concepts”,  Information Systems Management Journal, Vol 8 No 2, pp. 5-14.

McManus, Y. (1999) “Ethics and technology in the workplace”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 644-646.

McNaughton, B.  (1989)  “The neurobiology of spatial computation and learning”, in Stein, D. (Ed.), Studies in the Sciences of Complexity,  Addison-Wesley, Reading MA.

Meister, D. (1989) Conceptual Aspects of Human Factors, Johns Hopkins Press, Baltimore.

Melton, A. W. (1964)  “The taxonomy of human learning:  Overview”, in Melton, A. (Ed.), Categories of Human Learning.  Academic Press, New York.

Miller, K. (1995) Organizational Communication, Wadsworth, New York.

Miller, E.  (1969)  “A taxonomy of response processes”, (Technical report 69-16), Human Resources Research Organization, Sept., Fort Knox, KY.

Miller, R. (1967)  “Task taxonomy:  Science or technology?”, in Singleton, W., Easterly, R. and Whitfield, D. (Eds.), The Human Operator in Complex Systems.  Taylor & Francis, London.

Mitchell, M. (1995) An Introduction to Genetic Algorithms.  MIT Press, Cambridge MA.

Mitroff, I. I. and Linstone, H. H. (1993)  The Unbounded Mind, Oxford Univ Press, Oxford.

Nickell, J. (1998) “The Digital Divide”, http://www.wired.com/news/politics/0,1283,14069,00.html, date accessed 20 November 2001.

O’Brien, J. A. (1999)  Management Information Systems,  Irwin McGraw-Hill, Boston.

O’Sullivan, D. (1990)  “Integrated manufacturing systems design”, in Proceedings of the Second International Conference on Factory 2001:  Integrating Information and Material Flow, Cambridge, UK, p. 11.

Oz, E. (1998)  Management Information Systems, Course Technology, Cambridge.

Polanyi, M. (1967)  The Tacit Dimension, Routledge and Kegan Paul, London.

Porra, J. (1999) “Colonial systems,” Information Systems Research, Vol 10 No 1, pp. 38-69.

Prigogine, I.  (1980)  From Being to Becoming,  W. H. Freeman, San Francisco.

Rasmussen, J.  (1993)  “Deciding and doing: Decision making in natural context”, in Klein, G., Orasanu, J., Caulderwood, R. and Zsambok, C. (Eds.), Decision Making in Action: Models and Methods, Ablex, Norwood.

Salam, A., Rao, H., and Bhattacharjee, S. (1999) “Internet-based technologies: value creation for the customer and the value chain”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 538-540.

Santa Fe Institute (2001) http://www.santafe.edu/  date accessed 20 November 2001.

Schecter, D. (1991)  “Critical systems thinking in the 1980s: A connective summary”, in Flood, R. and Jackson, M. (Eds.), Critical Systems Thinking, Directed Readings,  Wiley and Sons, New York, pp. 213-226.

Senge, P. M. (1990) The Fifth Discipline.  Doubleday, New York.

Simon, H. (1969)  The Sciences of the Artificial.  MIT Press. Cambridge MA.

Sokal, R. R. (1974)  “Classification:  Purposes, principles, progress, prospects”,  Science, Vol 185, pp. 1115-1123.

Srite, M. and Ayres, B. (1999) “Positive affect and group decision making”, in Proceedings of the Fifth Americas Conference on Information Systems, Nazareth, D. and Goodhue, D. (Eds.), Milwaukee, WI, August 13-15, pp. 346-366.

Stein, D. (1989a) “Preface, complex systems”, in Stein, D. (Ed.), Studies in the Sciences of Complexity, Addison-Wesley, Reading MA, pp. 389-437.

Stein D. (1989b) “Spin Glasses”,  Scientific American, Vol 260 No 7, pp. 52-59.

Stewart, K. J. (1999) “Transference as a means of building trust in world wide web sites”, in Proceedings of the Twentieth International Conference on Information Systems, De, P. and DeGross, J. (Eds.), Charlotte, NC, December 13-15, pp. 459-464.

Takahara, Y. and Shiba, N. (1996) “Systems theory and systems implementation:  case of DSS”, in Oren, T. and Klir, G. (Eds.), Computer Aided Systems Theory: CAST’94, Springer, Berlin, pp. 388-408.

Ulrich, W. (1991)  “Systems thinking, systems practice and practical philosophy: A program of research”,  in Flood, R. and Jackson, M. (Eds.), Critical Systems Thinking, Directed Readings,  Wiley and Sons, New York, pp. 245-268.

Van Gigch, J. P. and Le Moigne, J. L. (1990)  “The design of an organization information system”,  Information and Management, Vol 19 No 5, pp. 325-331.

Varela, F. J., and Goguen, J. A. (1978)  “The arithmetic of closure”, in Trappl, R. (Ed.), Progress in Cybernetics and Systems Research, Vol 3, John Wiley, New York.

Vickers, G. (1983) Human Systems Are Different,  Harper and Row, London.

Waldrop, M. (1992)  Complexity,  Simon and Schuster, New York.

Walsham G. (1995)  “The emergence of interpretivism in IS research”, Information Systems Research, Vol 6 No 4, pp. 610-634.

Woods, D. (1993)  “Process-tracing methods for the study of cognition outside of the experimental psychology laboratory”, in Klein, G., Orasanu, J., Caulderwood, R. and Zsambok, C. (Eds.), Decision Making in Action: Models and Methods, Ablex, Norwood. 

Xu, L. D. (1995) “Systems thinking for information systems development”, Systems Practice, Vol 8 No 6, pp. 577-589.

Xu, L. D. (2000) “The contribution of systems science to information systems research”,  Systems Research and Behavioral Science, Vol 17, pp. 105-116.

Zeigler, B. (1996) “Fundamental systems concepts:  ‘the right stuff’ for 21st Century technology”, in Oren, T. and Klir, G. (Eds.), Computer Aided Systems Theory: CAST’94, Springer, Berlin, pp. 28-33.

Zhu, Z. (1998) “Confucianism in action:  recent developments in oriental systems methodology”, Systems Research and Behavioral Science, Vol 15 No 2, pp. 111-130.

Zhu Z. (2000) “WSR:  A systems approach for information systems development”, Systems Research and Behavioral Science, Vol 17, pp. 183-203.

Zhu, B. Ramsey, M. Chen, H. Hauck, R. Ng, T. and Schatz, B. (1999) “Support concept-based multimedia information retrieval:  A knowledge management approach”, in Proceedings of the Twentieth International Conference on Information Systems, De, P. and DeGross, J. (Eds.), Charlotte, NC, December 13-15, pp. 1-14.

Zurek, W. (1990)  Complexity, Entropy, and the Physics of Information,  Addison-Wesley, Reading MA.

Zwass, V. (1998)  Foundations of Information Systems, Irwin McGraw-Hill, Boston.