Cognitive Systems Engineering

Erik Hollnagel

Ph.D., Professor, Professor Emeritus

 

CSE - Cognitive Systems Engineering

CSE: RIP

The first publication that outlined Cognitive Systems Engineering (CSE) was a Technical Report from the Risø National Laboratory by David Woods and me (Hollnagel & Woods, 1982).

The idea came from our numerous discussions (in those days person to person, whenever we met in Denmark and Norway) about the ways in which human-machine research was developing. The time was a few years after TMI (the Three Mile Island accident), when suddenly everyone had become a human factors expert. Personal computers had started to appear, including the first Macintosh, but were not part of everyday life and were far from ubiquitous. Computers, typically minicomputers, were part of many professional work environments, such as industrial control rooms, but were embedded in the supervision and control systems rather than directly accessible to the operators. There was a strong and growing interest in man-machine systems (what we now call human-machine systems), which of course had been the remit of human factors from the very beginning. CSE was concerned, however, about where these developments were heading, as expressed by the introduction to the report:

This paper presents a new approach to the description and analysis of complex man-machine systems, called Cognitive Systems Engineering. In contradistinction to the traditional approaches to the study of man-machine systems (MMS) which mainly operates on the physical and physiological level, CSE operates on the level of cognitive functions. Instead of viewing an MMS as decomposable by mechanistic principles, CSE introduces the concept of a cognitive system: an adaptive system which functions using knowledge about itself and the environment in the planning and modification of actions. Operators are generally acknowledged to use a model of the system (machine) they are working with. But similarly the machine has an image of the operator, whether implicit or explicit. The designer of an MMS must recognize this, and strive to obtain a match between the machine’s image and user characteristics on a cognitive level, rather than just on a physical level. The paper gives a presentation of what cognitive systems are, and of how CSE can contribute to the design of an MMS, from the cognitive task analysis to the final evaluation.

The Technical Report was published as a journal paper the following year, with only minor stylistic changes (Hollnagel & Woods, 1983).

What both the report and the paper tried to express, although not as clearly as it can be done today, was the need to look at systems and how they function rather than at components and component interactions. This was expressed by the three main themes of CSE (coping with complexity, joint cognitive systems, and the use of tools/artefacts), which were offered as guidance for how CSE should develop.

CSE was proposed at a time when the enthusiasm for human-computer interaction (as opposed to human-machine systems) was just beginning. (The first of many SIGCHI conferences was organised in 1982. And in 1994 the venerable International Journal of Man-Machine Studies became the International Journal of Human-Computer Studies – not quite the same thing.) The unhealthy preoccupation with ‘human error’ was not yet very strong either, but that changed when James Reason published his book on Human Error in 1990. These, and other trends, were noticeable, but it was still possible to go in a different direction.

Unfortunately, the emphasis on systems rather than components, and on functions rather than structures, was not strong enough to withstand the leading trends, which favoured looking at how components (human and machine) interact rather than at how they function together as a system. The two books that followed much later, Hollnagel & Woods (2005) and Woods & Hollnagel (2006), both referred to joint cognitive systems and thereby tried to make clear that it was the ‘jointness’ rather than the ‘cognition’ that was important. But by then the window of opportunity had closed.

The dilemma can be illustrated by considering two ways of parsing CSE. One parsing is C(SE), meaning cognitive (systems engineering), or systems engineering from a cognitive point of view. The other is (CS)E, meaning the engineering of (cognitive systems), or the design and building of joint (cognitive) systems. Our intention was clearly the latter, but it was the former interpretation that won. During the 1980s and 1990s it became common – and in some cases almost de rigueur – to use ‘cognitive’ as a prefix to other terms. This has led to names such as cognitive human factors, cognitive ergonomics, cognitive decision making, cognitive reliability, cognitive errors, cognitive work analysis, and even cognitive resilience, most of which sound great but few of which have any clear meaning.

These misgivings aside, the more serious problem is that the focus on the interaction between humans and something, be it -machine, -computer, -environment or something else, reduced the problems to a dyadic relationship. This completely missed the point that we cannot really understand what takes place unless we adopt a genuine system perspective and hence look at the joint system, or the whole, rather than its parts. CSE tried to make this point from the start, but it was consistently overlooked. (In hindsight it would of course have been better to refer to anti-entropic systems, as we did in the two books, than to joint systems. CSE might then have been EAS – the engineering of anti-entropic systems.)

From my perspective, and I am willing to accept that this is rather idiosyncratic, CSE – or rather, C(SE) – is no longer relevant, if ever it was. This is not to deny the value of the study of cognition, as it is done by cognitive psychology. But the study of systems, and indeed the engineering of systems, cannot be based on the study of the cognitive processes that are assumed to take place within arbitrary system components. (CS)E, on the other hand, is still very much relevant, not least if we understand the ‘C’ to mean the system’s ability to modify its own behaviour on the basis of experience, rather than the faculty for the processing of information.

References

Hollnagel, E. & Woods, D. D. (1982). Cognitive systems engineering: New wine on new bottles (Risø-M-2330). Roskilde, Denmark: Risø National Laboratory. (Note that the cover page uses ‘on’ while the summary uses ‘in’.)

Hollnagel, E. & Woods, D. D. (1983). Cognitive systems engineering: New wine in new bottles. International Journal of Man-Machine Studies, 18, 583-600.

Hollnagel, E. & Woods, D. D. (2005). Joint cognitive systems: Foundations of cognitive systems engineering. Boca Raton, FL: CRC Press.

Reason, J. (1990). Human error. Cambridge: Cambridge University Press.

Woods, D. D. & Hollnagel, E. (2006). Joint cognitive systems: Patterns in cognitive systems engineering. Boca Raton, FL: CRC Press.