Ethnographic Workplace Studies

and Computer Supported Cooperative Work


published as:


Jordan, Brigitte

1996           Ethnographic Workplace Studies and Computer Supported Cooperative Work. Pp. 17-42 in: The Design of Computer-Supported Cooperative Work and Groupware Systems. Dan Shapiro, Michael Tauber and Roland Traunmüller, eds. Amsterdam, The Netherlands: North Holland/Elsevier Science.


Brigitte Jordan


Xerox Palo Alto Research Center


Institute for Research on Learning





Developing tools and methods that help us better understand complex work and learning situations has been a major interest of the interdisciplinary groups I have been a part of at Xerox Palo Alto Research Center (PARC) and the Institute for Research on Learning (IRL). Some of us have been pursuing this interest with an explicit focus on technology design, others have been more concerned with a holistic understanding of work practices, the organization of communities of practice within work settings, or work process redesign. Most of us move back and forth between these foci as projects allow. No matter what the focus, we tend to draw heavily on participant observation, in-situ question-asking, and micro-analytic methods of analysis — methods that grew out of anthropological ethnography on the one hand and ethnomethodology on the other.  I refer to the mix as ethnographic methods, more as a convenient shorthand than as a claim to disciplinary purity, comprehensiveness, or exhaustiveness. Just as there are many sociologies (Hughes and King 1992), there are also many anthropologies and many ways of doing ethnography. As an anthropologist, I shall be particularly concerned with the anthropological roots of ethnographic methods and with the adaptations they have undergone in the service of workplace research. An excellent introduction to the sociological antecedents of ethnography can be found in Hughes et al. (1993).


Workplace research, whether in the interest of CSCW-design, organizational learning, or productivity studies is in the process of undergoing a paradigm shift. As researchers and designers we are increasingly involved in having to understand very complex settings where multiple activities are carried out by multiple actors with multiple agendas. In such situations earlier research paradigms that emphasized the individual agent and individual quantifiable variables that could be subjected to statistical hypothesis testing no longer seem to work all that well. The requirements of these settings compel us to look for new research paradigms — paradigms that are, in many ways, orthogonal to the classic scientific paradigms we have been familiar with.


In particular, research that focuses on work practice requires a radical conceptual switch from seeing knowledge as a property of the individual, as a kind of quantity that can be measured, assessed, and “transferred”, to seeing knowledge and meaning as socially constructed within ongoing communities of practice. Taking this view seriously means to investigate the ways in which people in the workplace “co-construct” knowledge and skill by drawing on the social and material resources available to them. Attempts to design CSCW technologies, then, must be grounded in a thorough understanding of ongoing work practices and how they are supported (or not) by the physical layout, artifacts, information systems and data bases, as well as the social relationships and arrangements of the workplace. A distributed cognition approach that sees knowledge as socially and materially situated is particularly suited to exploring the ways in which we can provide better support for webs of collaboration rather than merely for individuals.


Historically, CSCW and the notion of Groupware arose out of the realization that most information technology has been designed too narrowly, primarily with the individual user in mind. In response, much work has been done in the last dozen years or so to elicit explicit information about group work that could feed back into technology support for such groups — not only in the CSCW community but also in socio-technical systems design, business process reengineering, workflow analysis, and the like. We believe that there is yet another dimension that needs to be explored and that is the knowledge that is not only group-based but also tacit, implicit, embodied, and not articulated. For this we need to adapt research methods that allow the investigator to become part of the community in order to understand what is not usually made explicit. Our research goal then becomes exploring the relationships and tensions between what in cognition and learning is tacit and what is explicit, what is under the purview of the individual and what cannot be understood except as a group-based phenomenon (see Figure 1). In order to move towards that goal, we also need to rethink our research methods. This paper is intended to present some of our thinking and experience in that arena.



Figure 1

Knowledge in Action: A Conceptual Map

[Diagram not reproduced. Its labels range from explicit, articulated forms of knowledge (subject matter knowledge, number tables, skills and “knowing that”, rules of thumb, process models of work, work flow representations, best practices) to tacit, embodied forms (expertise and “knowing how”, street smarts, common sense, good judgment), and mark as terra incognita the domains of work practice, organizational learning, and work culture.]





In the last few years, ethnographic methods have taken on increasing importance within CSCW, as indicated by the increase in workshops, tutorials, and papers in CSCW and HCI conferences and journals (Bentley et al. 1992, Heath and Luff 1991, Luff et al. 1992, Hughes et al. 1992, Mogensen and Trigg 1992, Orr and Crawford 1992). Relatedly, detailed field studies have also taken on a major role within the sphere of workplace studies (Baba 1991, Bishop 1993, Blomberg et al. 1993, Brun-Cottan 1991, Brun-Cottan et al. 1992, Button 1993, Darrah 1992, Engeström 1992, Engeström and Middleton 1993, Goodwin and Goodwin 1993, Greenbaum and Kyng 1991, Heath and Luff 1993, Hughes et al. in press, Hutchins 1990, Hutchins and Klausen 1993, Jordan 1992a, 1992b, Kukla et al. 1990, Manning 1992, Middleton and Engeström 1993, Orr 1990, Sachs in press, Scribner and Sachs 1990, 1991, Suchman 1987, Zuboff 1988). Our work draws on, and hopefully informs, the work of our colleagues.


Different Research Paradigms for Different Kinds of Work


Most of us have grown up with a particular way of doing science, a deductive, hypothesis-testing approach to research which I will here call the analytic paradigm. I want to contrast it with another equally legitimate way of doing research, one that will give us better data for the kinds of objectives we have in mind when we do workplace analysis. I will call that emerging research paradigm the systemic paradigm. The two paradigms address different issues, yielding different kinds of knowledge, and ought to be seen as complementing and enriching each other rather than excluding each other (Salomon 1991).


The familiar analytic paradigm is appropriate in situations where we have well-developed theories regarding the phenomena we are interested in. From these theories we deduce specific hypotheses which are used to test relationships between discrete, well-defined, operationalizable variables. Statistical hypothesis testing is valuable in that it provides a systematic procedure for ruling out rival hypotheses. It is used to good advantage in fields such as experimental psychology, pharmacology, or ergonomics, where the researcher knows ahead of time what variables he or she wants to test. The outcome of research here is a statement of statistical significance of the difference between the variables tested. But while the analytic paradigm is well suited to study what can be made to happen in controlled laboratory situations, it is not well suited to study what happens in complex, mundane work places that are not subject to experimental manipulation.


As it turns out, in most situations of interest in CSCW and workplace research, we are dealing with very messy, dynamic situations that can’t be held constant. Trustworthy theories and well-defined variables are hard to pin down. In such situations it makes as little sense to ask how much a particular information technology contributes to work activity as it makes sense to ask how much the violin contributes to the performance of an orchestra. Rather, we have to find ways of describing the whole process in an integrated way. Work practice has to be understood as a dynamic system where changes in one aspect produce multiple changes that reverberate throughout. Each component, event, or action has the potential of affecting the entire system. Furthermore, as researchers in functioning workplaces, we are rarely in a situation where we can do controlled experimentation. Rather, we are typically dealing with a moving target, a system that doesn’t stand still long enough to be checked out, but one that is in constant flux, undergoing constant self-reorganization, even when there is no systematic, planned-from-the-outside change visible. How, then, can such a complex setting be studied and what questions ought its study be able to answer?


In dynamic social situations, applying the conventional deductive analytic framework is likely to produce invalid results. A better approach is to radically switch paradigms and rely on the systemic paradigm, working inductively and grounding our emergent understanding of work practices in iterative recourse to the data.


For the researcher this means giving up all kinds of customary control. It means, above all, that the research design is not fixed, but evolves in response to our increasing understanding of the situation. While in the analytic paradigm the researcher attempts to act as an independent observer who tries not to contaminate the research situation, in the systemic paradigm the researcher has to be involved; s/he cannot treat the subject as an object to be studied; rather the former “subject” becomes a collaborating expert, a co-analyst with a privileged point of view, and a co-designer of the research and of solutions to the problems the research identifies. These, of course, are the principles of the participatory design approach pioneered by the Scandinavian design community (Floyd et al. 1989, Greenbaum and Kyng 1991, Mogensen and Trigg 1992).


Communities of Practice


At IRL and PARC, we have been developing the idea that most (if not all) work gets done in “communities of practice”, or COPs (Lave and Wenger 1991, Wenger in press). A COP approach allows us, indeed forces us, to focus on how work is accomplished as a collaborative enterprise.


Communities of practice are naturally occurring groups that arise more or less spontaneously around a particular task, technology, or enterprise. COPs are ubiquitous and every person is a member of multiple communities, at work and elsewhere, dipping in and out of them as the situation requires. COP members may be co-located in face-to-face interaction, or they may work together remotely through a variety of communication technologies.


At times, COPs line up more or less with the organization chart; other times they do not, crosscutting and superseding the official organizational structure. COPs are self-organizing; they emerge in response to changing conditions and opportunities in the workplace, but can’t be created by fiat from above. However, they can be nurtured or stymied by particular technologies, spatial arrangements or work requirements, and this is precisely why they are of interest to CSCW design.


A COP then could be any group of people who work together to accomplish some activity (their practice), usually involving collaboration between individuals with different roles and experience. It is the basic social unit that gets the work done. We have analyzed airlines operations, the work of midwives and obstetricians, claims processing in an insurance company, investigating customer inquiries and many other kinds of work as the accomplishment of communities of practice.


A COP view of the work environment implies some shifts in the ways we usually look at work practices. For one thing, it means that we pay attention to the ways in which knowledge and meaning are constructed and distributed in particular work situations. This enforces a view of knowledge and information that is different from the standard cognitive science view. Knowledge here is thought of as the ability to participate meaningfully; learning is seen as the process of becoming a member of the working community of practice; and tools are reconceptualized as resources that facilitate integration and interaction within the group so that it can carry out its business.


The point is not to decide whether a particular work group is or is not a community of practice. Rather, the idea of COP provides an analytic framework which enables us to look at the ways in which COP activities are (or are not) supported by appropriate technologies and social arrangements. Using communities of practice as an analytic framework also directs us to look at the competencies required to get business done, and to consider the role which artifacts, technologies, documents, facilities, communication links and information systems play in the acquisition and display of such competence. We are then less concerned with individuals and the knowledge they hold than with the ways in which knowledge and meaning are constructed and distributed in particular work environments. This reconceptualization has important consequences for training and testing.


For shedding light on CSCW issues, understanding how individuals become competent members of ongoing work communities is crucial. Workplaces are full of learning opportunities and every newcomer is keen to get socialized into the locally appropriate attitudes and practices. This, for neophytes, takes the form of acquiring the necessary work skills as well as figuring out how to tell a story, get along with co-workers, manage one’s appearance, and even, in some environments, how to be uninvolved with their work by creating “identities of non-participation” (Wenger in press).


Another important aspect of communities of practice is their adaptation (either easily and with little effort, or in a painful, destructive way) to changes in their own environment. Our research attempts to track how communities respond to changes in technologies, in personnel and social structures, in physical layout (moving to new facilities), in world view (new ideas), external constraints (new federal regulations) and so on. We attempt to find out what kinds of changes are easy and productive and what kinds are hard and costly—financially, emotionally, commitment-wise, etc. — and what facilitates and impedes positive changes.


With this approach, the challenging question for CSCW becomes providing resources that progressively and reflexively facilitate and enable the activities of the community. The emphasis shifts toward giving more attention to the people in the system and how they function as a community, and attending to ways that CSCW technologies can potentially enhance the community’s practices.


Key Points about Research Methods


In regard to all questions of methodology, we proceed from the premise that the choice of methods for data collection and analysis should be determined by what “work” the data need to do for us, that is to say, what aspect of the real world they need to illuminate for us. We take our research methods extremely seriously because we know that the choice of methods is crucial for the success of our projects. The wrong methods will produce the wrong results, which means unusable or invalid data. Given the variety of our interests, we expect to use a rich array of methods in our projects, but we mean to remain accountable to two questions at all times: 1) what are these data for; and 2) are these the most appropriate data to collect for the purpose we have in mind. One of the methodological challenges, then, is to determine the right mix of methods that will allow us to answer our research questions.


We think of methods simply as ways of transforming the real world into data. If this is so, different methods will give us different kinds of data which will do different kinds of “work” for us. We can envision methods as producing different windows on the world, different slices of reality, some bigger some smaller, some microscopic; some hitting only the proverbial elephant’s trunk while others produce a more complete, though less detailed, picture. The method of “counting”, for example, gives us numbers, and numbers are good for doing “number work”, such as computing ranges, averages of various kinds, standard deviations and so on, which are useful for many specific purposes. The method of “asking a question”, on the other hand, gets us data in the form of verbal answers which we can categorize as such or, in turn, transform into numbers, i.e. frequencies. The method “taking a picture” gives us yet another representation of the world. But note that in each case we are not really interested in the representation, the “data”, but in the inferences these data allow us to draw about the world.


To make things clearer, let us consider the work activities carried out in an order processing organization. If we are interested in better understanding such a workplace, we might apply various kinds of methods. For example, we might do some counting. This will give us data consisting of numbers, such as the number of people present at a particular time, the number of work stations in use, the number of telephone calls made, the number of orders processed, the number of interruptions experienced by the workers, the number of jokes cracked in the course of a working day. Or, we might make a sketch of the workspace and note the distribution of people and objects. The method “making a sketch” would give us another kind of data. Or we could listen to what people say and write it down, or we could ask them questions and write down the answers, or we could observe what they do: who speaks to whom, who distributes documents, what activities always follow other activities and the like. We might also take people’s heart rate before and after the day’s work or observe changes at times of stress. In an alternative vein, we might make a painting, compose a poem, or write a fiction story about the work of order processing. All of these are methods for transforming the experienced world of work into data about the work.


It is immediately apparent that different kinds of data belong to different symbol systems, each of which has its own rules regarding allowable transformations and operations. Thus number-type data must be treated according to the properties of numbers, poem-type data must be read according to the rules for reading poems, sketch-type data must be interpreted according to the conventions for making sketches.



Figure 2

Methods: Transforming the Experienced World into Data

[Diagram not reproduced. It shows people in interaction with each other and with a socio-material field, transformed through methods such as “ask questions”, “make story”, “take picture”, and “make poem” into data governed by the rules of distinct symbol systems: rules for manipulating numbers, rules for making and hearing stories, rules for reading photographs.]



Once we have achieved a symbolic representation of the world through the application of methods, we impute the structure of the symbol system to the real world. For example, if we use number kinds of methods, what we know about the world are number kinds of things: quantities, averages, frequencies, ranges, etc.


We could argue, then, that the application of particular methods translates the experienced world into a system of symbols whose structure determines what kind of access our data give us to the real world. But note that what these data “mean” does not depend on the method in any simple sense but is based in a community of practice. The conventions for performing operations within these symbol systems are achieved social conventions, conventions that develop and stabilize within communities as practitioners do their work, and that may change, atrophy and fall out of use (cf. the procedural conventions of alchemy; pulse-reading in medieval medicine, etc.).


The downfall of particular methods is usually explained by lack of fit between the symbolic representation and the real world. E.g., we don’t read chicken entrails anymore because the information gained from that procedure did not predict the state of the world very well. On the other hand, there are numerous examples of methods that remained in vogue for a very long time which did not produce significant purchase on the real world.


The important point for our work is that the choice of methods should be determined by what kind of data we need for our purposes. In the case of workplace analysis, the question we want to ask is: what sorts of methods are best, most insightful, most appropriate to deliver the data that allow us to understand work practices in complex systems, and that give us information about tacit as well as fully articulated kinds of knowledge that have to be supported? We have found a powerful set of methods for workplace analysis in a combination of ethnographic fieldwork for the context and video-based Interaction Analysis for the detailed understanding of people’s activities on the micro level (Jordan and Henderson 1995).


Ethnographic Field Methods


Ethnography was developed by anthropologists early in this century for the study of exotic tribes and communities. Away from civilization, early researchers often found themselves in situations where the rules of their own society didn’t hold, where it was no longer clear what was edible and what was not, who was marriageable, what obligations you had to kinsman or stranger — even the language people spoke was not known in many cases. Anthropologists learned to learn not by explicit instruction but by participating in the routine activities of people’s daily lives and by immersing themselves in the events of the community, thereby coming to appreciate what the world looked like from the point of view of “the natives.” Today, there are few unstudied exotic peoples left, but ethnographic methods have been found to be exceedingly useful whenever one needs to understand complex functioning systems within a holistic perspective.


Starting in the fifties, ethnographic methods were adapted by sociologists, educational researchers and social psychologists for studies of scientific practice, for classroom research and for the investigation of medical work, and even more recently social and behavioral scientists have used ethnographic methods successfully in the study of complex communities and processes in industrialized settings. Within CSCW, the usefulness of the ethnographic preoccupation with the “native” view (here the user’s perspective) has gained increasing recognition and it is now taken for granted that careful in-situ participant observation before the design phase and user participation during the design phase are indispensable for the success of a project.


Working with Ethnography


What, then, does ethnography consist of? How does one do ethnography? As noted, ethnographic field work typically involves participant observation, a combination of observation and in-situ question asking, carried out while participating in the ongoing activities and events of a community (Bernard et al. 1984). Participant observation is not unproblematic. To be able to function simultaneously as observer and participant is a complex skill that requires training, experience and continuous reflection on the process. Depending on the degree of involvement with the community it also raises particular ethical problems of responsibility for the welfare of study participants (Gilbert et al. 1991). In workplace research it has to be possible to assure workers that none of the data collected for research purposes will be accessible to management for evaluation of individuals or in any other form that could be damaging to them. Such assurances and safeguards need to be worked out ahead of time and must be scrupulously adhered to by all participants.


The Participant Observer


When anthropologists study entire communities, they more likely than not become residents of the community. Since our work tends to involve working communities that are not necessarily residential communities, we simply try to become as much a part of the scene as local conditions allow. In practice this can range from taking up residence and participating in the full panoply of local activities to a position more akin to the proverbial fly-on-the-wall.


The participant observer has a difficult double role to play: her (or his) primary attitude is that of a novice who tries to become a part of the life of the community; at the same time, s/he needs to maintain enough distance to record her observations and reflect on her evolving understanding of the situations she encounters. As Blomberg points out, she needs to produce tentative and partial formulations of what is going on, which are revised in successive encounters until her understanding is no longer challenged by events in the world. Ethnographic fieldwork thus involves an iterative approach to understanding, wherein early formulations are continuously revised as new observations challenge the old, and where adjustments in research strategy are made as more is learned about the particular situation at hand (Blomberg et al. 1993).


In some settings, the researcher may be able to apprentice herself (or himself) to a focal member of the study population; in others, not much more than physical co-presence in the situation may be tolerated. (Where even that is lacking, of course, ethnography is not the method of choice.) An apprentice is the quintessential “legitimate peripheral participant” (Lave and Wenger 1991) who is specifically assigned the role of learner and in that capacity is allowed to be nosy, ask questions, “get into interesting situations”, and the like. We have found that role particularly powerful as a “way in” to embodied practice and the tacit knowledge of a community that does not get articulated but has to be absorbed. For example, in earlier work with village midwives in Mexico, I was able to apprentice myself to a Maya partera who taught me much of her obstetric knowledge and skills, not through formal explanation but by letting me participate in her daily activities (Jordan 1989, 1993). In a current project with a large health maintenance organization, on the other hand, such hands-on involvement in medical work will not be possible. In high tech situations, keeping the researcher away from the machinery is often justified. In that case, one can still participate in routine work activities as much as is unproblematic: remaining for entire shifts, accompanying workers who leave the work site, participating in meetings, meals and story telling sessions, and the like, in addition to investigating work activities at the site. While in exotic settings one might learn how to build a canoe, weave a hammock, make rain, or deliver a baby, in industrial work places one gets insights into the intricacies of returning customers’ phone calls, adjusting a piece of machinery, smelling a vat of chemicals, or negotiating with co-workers about a useful software strategy (Sachs in press, Kukla et al. 1990, Zuboff 1988).


Participants’ Categories


A distinguishing feature of ethnographic work is that it is concerned with understanding what the world looks like from the point of view of participants. How do they describe and make sense of their world and their activities; how do they talk about what is going on; what is important and significant to them and what is not; what resources in their environment do they use; what categories, models and representations are relevant and meaningful to them for solving problems and carrying out their work. What the descriptive and explanatory resources of study participants are is not known in advance, of course, but must be discovered through research. This concern with “the natives’ point of view” to some extent overlaps with the ethnomethodological focus on members and members’ sense-making.


Anthropologists make a distinction between two kinds of data: data collected in categories relevant to participants are “emic” data, while those collected from an outside (the analyst’s) perspective are “etic” data. For example, one could imagine that we want to know what activities different types of persons engage in, in a particular workplace such as an airport. We initially use our own etic categories for the airport, i.e. pilots, passengers, etc., for observation and elicitation, a process that produces etic data. Once we have found out how persons are classified by people in the workplace, however, we can gather emic data. For example, “ramp rat” and “blue goon” are emic categories used by airlines personnel. One can easily imagine that data collected on, say, job performance, could look quite different depending on whether they were collected as emic or as etic data.


The distinction between emic and etic data is useful because much of what passes for work practice studies among management consultants is based on a purely etic framework that assumes certain structures and relationships without understanding what the work looks like from the point of view of participants. For example, in one of our current projects with a large American telephone company, we found that a highly touted workflow simulation tool was unable to model such things as a job held in abeyance for lack of information, or several people working on aspects of the same job in parallel. The tools forced a particular set of predetermined categories and ignored others that later emerged as important in ethnographic fieldwork.


In earlier days, when anthropologists went out to live in isolated villages, a simple distinction between the “inside” (emic) view of the natives and the “outside” (etic) view the anthropologist brought with her or him, made sense. In the normal course of traditional fieldwork, the anthropologist would hold her own views in abeyance as much as possible and, through a protracted period of participant observation, would attempt to come to understand how the natives saw their world. The final step would be the writing of an ethnography, in which they explained the emic system to an etic audience, that is to say, to anthropologists and other academics.


Today, we recognize that there is no unitary “outside view,” no single etic. Rather, for every workplace there exists a multiplicity of communities of practice who have an interest in it, a view of it, and ways of conceptualizing its problems and possibilities — be that management, the corporate level, labor organizations, business journal writers, management consultants, or CSCW designers. Thus the translation process has also become much more complicated. It becomes important to be clear about the “audience” to which our ethnographic insights will need to be conveyed, in other words, to which etic framework we have to reconstructively translate our findings.


In-Situ Question-Asking


Ethnographic fieldwork rarely makes use of formal questionnaires, relying either on informal interviews (“conversations”) or, even more typically, on questions asked on-the-fly, as they arise in actual situations. This is an appropriate strategy in situations where questions cannot be made up ahead of time because not enough is known to do so. It is also appropriate as a deliberate exercise in holding one’s etic analytic framework in abeyance. Rather, questions are constructed in the interplay between the researcher’s evolving understanding and the set of activities he or she is participating in. Questions asked in this manner produce data of greater “ecological validity” than laboratory data, i.e. they are likely to apply more directly to the situation from which they came than those procured by questioning in a dissociated locale and situation. They also avoid many of the pitfalls of survey interviews (Suchman and Jordan 1991).


As the ethnographer becomes familiar with local practice and local ways of thinking, s/he may become concerned about the generalizability of analytic insights. S/he might then develop a set of standard topics that are covered in each new situation in order to provide adequate coverage.


Observation — When Asking Questions Is Not Enough


Probably the most distinguishing feature of ethnographic research is the extensive reliance on observation of naturally occurring activities in real world settings. In workplace research, the question often arises: Why do we need to observe? Why can’t we just go and ask people what they do? Or better yet, ask their managers?


It turns out that managers are frequently ignorant of the actual details of the work lives of the people whom they supervise and for whom they devise policies. But workers, too, are not always able to provide the details of their activities. They often solve problems without being aware that a problem has occurred. They devise “workarounds” for problematic technologies without realizing it; they continuously cover for each other, provide information, and build redundancy into the system without having cogent ways of discussing these things. Researchers’ failure to capture information of this sort is hardly ever a question of workers consciously and willfully withholding information, given that ethnographic participation presupposes the development of a trusting relationship. It is much more likely that workers are unaware of what they do, and are able to carry out tasks expertly and skillfully without being able to talk about that expert and skillful execution. To draw attention to this disjunction is not to take the attitude that people lie or cannot be trusted, but rather to recognize that the tellable and remarkable characteristics of their activities are something different from the activities themselves.


But even when people have an explicit view on something and are prepared to talk about it, we still need to be careful. What people think and say they do and what they actually do are two different things. The relationship between events (what people did) and accounts of events (what people say they did) is an empirical question that must be determined by research. It should never be the case that we mistake one for the other. Yet this is one of the most common confusions in workplace research: mistaking attitude for action, ideology for activity, what people report (accounts of behavior) for the actual behavior, in spite of the fact that this fallacy has been decried over and over again. We note, for example, that most workplace research relies on the accounts of participants or their managers, provided through structured interviews, focus group interaction, and the like. Failure to pay attention to the say/do distinction is common and rarely questioned, but is likely to produce data that are invalid in the technical sense, i.e. data that do not measure what we intend to measure.


Working with Interaction Analysis


The power of ethnographic fieldwork is substantially extended through video-based Interaction Analysis. Interaction Analysis is the in-depth micro-analysis of how people interact with one another, their physical environment, and the documents, artifacts, and technologies in that environment (Jordan and Henderson 1995). It has roots in many social science disciplines, including kinesics (the study of body language), proxemics (the study of space and territoriality in social interaction) and ethology (the study of animal behavior), but has been shaped most consequentially by conversation analysis and ethnomethodology. Like ethnography in general, it looks for orderliness and patterns in people’s routine interactions but operates at a finer level of detail than ethnographic observation (Suchman and Trigg 1991).


In our practice, video-based Interaction Analysis does not stand alone, but is used in conjunction with other ethnographic techniques. Participant observation in the field identifies “hot spots”—problematic sequences in the routine work life which are not easily understood by simple observation or interviewing. Detailed micro-analysis of these hot spots then focuses systematically on the ways in which participants use the social and technological resources available to them for carrying out their work, allowing for an in-depth understanding of the work process as an interactive system. Ethnographic information thus furnishes the context against which Interaction Analysis is carried out, while the detailed understanding provided by the micro-analysis of interaction, in turn, informs our ethnographic understanding.


In our projects, videotapes are typically analyzed in collaborative, interdisciplinary working sessions. During these sessions, the tape is played and replayed many times over, as researchers develop grounded hypotheses about what is happening on the tape. The hypotheses developed in these joint sessions are then further explored against evidence on other tapes, so that a tentative understanding of the work setting emerges. Sometimes questions arise that can only be answered by further field observation or interviewing, by digging into archives and other sources, or by more videotaping, but as time goes by an increasingly sophisticated understanding of the problems in this particular workplace, and of potential resources for their solution, emerges.


Video Review Sessions


Whenever possible, individuals from the work setting are invited to participate in the analysis and contribute their own special insights. Such data represent the participants’ perspective, their (emic) view of the world, which may contrast substantially with the analysts’. Most of the value of these sessions, however, may lie in the opportunity they provide for participants and researchers to make fine adjustments in their joint understanding of what is going on. Review sessions provide researchers with occasions for asking detailed questions about work practice which are often impossible to entertain in pressured work settings. Methodologically, videotape-based answers have the advantage of staying much closer to the actual events than if one were to ask questions removed from the activity of interest. For example, instead of interviewing software programmers about their practices (or, even more removed, asking them to fill out a questionnaire), one might ask them to look at a videotape of themselves or of other programmers at work and ask questions about that work as they arise from the activity being viewed. Data elicited in this manner are likely to have greater “ecological validity,” that is to say, are more readily generalizable to real conditions of work than data generated under the more artificial circumstances of a focus group or laboratory. In spite of their great benefits, video review sessions are difficult to arrange in practice, primarily because of time constraints and the frequent movement of personnel.


Camera Effects


A question often arises about the degree to which people are influenced by the presence of a camera. This is, above all, an empirical question that cannot be decided in principle but must be investigated on each occasion of camera work. Evidence that the camera mattered to participants can sometimes be found on the tape itself in the form of visible monitoring of, or remarks about, the camera. Our experience in many different kinds of settings shows, however, that people habituate to the camera surprisingly quickly, especially if there is no operator behind it. Wherever people are intensely involved in what they are doing, the presence of a camera is likely to fade out of awareness quite rapidly. It may be worth noting that we always let participants know that their managers will not be able to see their tapes without their explicit permission and that they can request erasure of any tape, either on the spot or later—an assurance that may substantially contribute to people’s comfort with the camera.


The Interaction Analysis Laboratory (IAL)


A special resource for interaction-analytic work at PARC and IRL is the Interaction Analysis Laboratory, an interdisciplinary group of social and cognitive scientists and designers from industrial labs and academic institutions who meet weekly for collaborative analysis of members’ research tapes. It provides a forum for members’ and guests’ video-based research while at the same time contributing to further development of the methodology. As such it constitutes an important node in an emergent global network of practitioners of the method.


Advantages of Video-Based Interaction Analysis


It may be useful to summarize why we find video-based Interaction Analysis (IA) such a powerful component of ethnographic workplace analysis.


•   IA creates a permanent data corpus:  A key characteristic of video data is the permanence of the primary record in all its richness, which makes possible an unlimited number of viewings and listenings, in IAL-like settings as well as by individual analysts. This allows researchers to understand details of interactions that remain opaque without repeated replay. In addition, a videotape can be played in slow or accelerated motion, thereby exposing otherwise invisible patterns in the movements of persons or artifacts.


•   IA provides access to behavior invisible without replay technology:  Many of the phenomena of interest to us in workplace analysis emerge only on repeated viewing. We use video analysis precisely because there are good reasons to suspect that our powers of observation and memory are not trustworthy for this kind of fine detail; nor are the memory and awareness of the workers themselves. Such skills as picking relevant information out of the “sonic soup” of an airlines operations room are not identifiable from any other kind of record (Jordan and Henderson 1995).


•   IA captures complex data: Even for a trained observer, it is simply impossible to keep track of the overlapping activities of several persons with any accuracy or any hope of catching adequate detail. In multi-operator workplaces, ethnographers are forever frustrated by the necessity to decide whom to focus on. In such complex work settings as airlines operations rooms, a single investigator with paper and pencil is simply overwhelmed by the necessity to keep track of several people interacting not only with each other but also with headquarters, pilots, ground crews, maintenance people, baggage handlers, gate agents, and so on. Furthermore, the essence of most manipulative procedures — be they those of a pair of air traffic controllers arranging flight-status strips on their work panels or those of a secretary moving a cursor while text editing — is almost impossible to capture in words, both because of the density of behavioral details and because we lack a ready descriptive vocabulary for bodily behavior which could capture such behavior in notes. It is for these reasons that we need the level of detail and specificity that video data can provide.


•   IA counteracts bias of recorder:  Fieldworkers invariably highlight “important” aspects and pass over what they consider to be at the time “unimportant” aspects of the activities they observe. The camera, on the other hand, records events as they happen, with a level of detail that is unattainable for any method that relies on note-taking or on-the-spot observational coding.


•   IA counteracts bias of the individual analyst:  Multidisciplinary group analysis, as it is practiced in our labs, is particularly powerful for neutralizing preconceived notions on the part of individual researchers. It provides a critical corrective to the tendency to see in the interaction what one is conditioned to see or even wants to see. We have again and again “observed” behaviors which on replay were not present. Errors of this sort are invisible in a paper-and-pencil record because there is no opportunity to go back and check what happened, but a tape segment can be played over and over again, and questions of what is actually on the tape versus what individual observers think they saw can be resolved.


•   IA avoids the say/do problem:  As noted earlier, what people say they do and what they do in fact are not necessarily the same. Video recordings approximate the characteristics of direct observation of an event. The mechanical audiovisual fixation of activities produces data much closer to the activities themselves than other kinds of representations such as fieldnotes or interview data. We believe, therefore, that one situation for which video provides optimal data is when we are interested in what “really” happened rather than in particular accounts of what happened, including people’s recollections and opinions.


•   IA provides access to members’ categories and world view:  Given that analysts have strong preconceived notions of what the world is like, it is often difficult to see when those notions differ from those of workplace participants. For example, it was through analysis of videotapes that we first became aware of the routine, automatic orientation of ops room workers to a bank of video monitors that provide information on the state of planes at the gates. Similarly, it became apparent only with video analysis that it is concerns about baggage, and not passengers, that drive the work of the ops room (Jordan 1992a).


•   IA exposes mechanisms and antecedents: Since video records the phenomenon of interest in context, it is possible to ask about antecedents, about varieties of solutions produced on different occasions, and about what led up to any particular state. Since we get process data rather than snapshot data, it is often possible to draw direct design implications. For example, in workplace redesign situations, a standard method is to ask people what they want, or whether they want x or y. Contrast such data with the results obtainable by the analysis of actual work practice, where one might see, for example, that collaboration becomes difficult when people run out of counter space for their joint paper work. Basic assumptions about conditions of work are generally not articulated, but they become visible on videotape. Interaction Analysis thus enables designing to the actual conditions of work, thereby avoiding costly design mistakes.


In spite of these considerable advantages, it should be pointed out that, at present, video-based Interaction Analysis is time consuming and expensive, primarily because video records are clumsy to access, annotate, and integrate with other materials such as observer notes, keystroke data, or physiological or state measurements. However, emerging annotation and synchronization technologies promise to make this type of analysis more easily available in the future.


To include ethnographic techniques in our methodology for the study of the workplace, then, is important for CSCW design because it provides assurances against several major pitfalls: we guard against irrelevance by studying naturally occurring work activities in real world settings; we are able to see what the work looks like from the workers’ point of view by doing ethnographic fieldwork that involves us in the life of their community of practice; and we are mindful of the distinction between what people say and what they in fact do, which is crucial for any considerations of effective and beneficial workplace redesign.


Systematic Data Collection


In ethnographic work, as in any form of data collection, we have to ensure that data are gathered systematically and not haphazardly if we want to generalize from our observations rather than having them stand only as descriptions of a particular event at a particular time. In the analytic paradigm, this requirement is taken care of by a research design which specifies the nature and size of the sample ahead of time. In the systemic paradigm, how many and what kinds of observations we have to make emerges in the course of the research, as we become increasingly confident that we understand what is going on in a particular workplace. One criterion for judging the extent of our understanding is the “surprise index.” We can be certain that we have made enough observations when there are “no more surprises,” i.e. the events that happen in the workplace are predictable and familiar to the researcher the same way they are (or are not) to the workers. Note that with this criterion the variability of the phenomenon of interest determines the length of the period of participant observation: more time is needed in highly variable situations, less in highly patterned ones. Since we often don’t know how “full of surprises” a new setting will be, we are often unable to state ahead of time how long a field period will be needed for obtaining adequate data—a source of great frustration, if not for ourselves, then definitely for our funders.


Though the number of data points is not easily predicted within the systemic research paradigm, it is nevertheless extremely important to ensure that data collection is systematic, since without adequate coverage of the phenomenon of interest nothing worthwhile can be concluded in the end. There are many ways of making data collection systematic. I will discuss here only two of the most important strategies: attention to coverage and attention to focus.


Systematic Coverage


What I call “coverage” here is in some ways equivalent to the role of sampling in the analytic paradigm. Adequate coverage involves, minimally, spatial, temporal, and personnel considerations. For example, if we are interested in the work of software programmers, we would want to make sure that we collect data in all situations where they carry out work, which is probably in a fair number of locations in addition to their offices. We would also want to make sure that we cover programmers’ work temporally as much as possible, i.e. across shift changes, unscheduled midnight work, and the impromptu discussions that happen at 1 a.m. in the parking lot. In regard to personnel, one would want to be concerned not only with people who have the official title “programmer” but also with the support staff, the spouses who kibitz, the official and unofficial consultants who have input, etc. The idea is to get coverage across all types of participants, so that, sooner or later, the relevant observations get made, and the key questions asked, of everybody involved in the work. While we can think ahead about likely areas that will need coverage, it may be quite impossible to specify them rigorously in advance, as is done in the analytic research paradigm. What the necessary areas of coverage need to be depends primarily on what we find out in the beginning stages of fieldwork.


Foci of Investigation


Another way in which we can ensure that data are collected systematically is by methodically focusing on particular aspects of the work process. I will briefly discuss here four types of focused records: the Person-Oriented Record (POR), the Object-Oriented Record (OOR), the Setting-Oriented Record (SOR), and the Task- (or Process-) Oriented Record (TOR).


•   The Person-Oriented Record (POR): With the POR, we are interested in coming to understand what the working existence of a particular person is like for her or him: the sequence of activities in their daily round, whom they interact with, what problems and rewards they routinely encounter, what they perceive as typical and exceptional in a particular day’s work, and so on.


Note that we are actually not interested in the particular informant. What we are after is persons like her or him who do that kind of work; in other words, what we want to understand is the work practices, not the psychology or experience of the individual qua individual. Which types of persons we need Person-Oriented Records for depends on our assessment of the relevant types of persons in the work setting.


•   The Object-Oriented Record (OOR): Here we are interested in following the career of an object, artifact, or document through the system. Again, our interest is not in the particular object but in its typicality, in the fact that it can inform us about the career of objects like it: who owns it, what it is likely to come in contact with, who has rights to manipulate it, change it, store it, dispose of it, and so on. Crucial artifacts and documents are studied by compiling an OOR.


•   The Setting-Oriented Record (SOR): The SOR chronicles what happens in a particular location through time—throughout a shift, a day, week, or whatever the temporal cycle relevant to the particular work site might be. For example, one might want to track activities in an airline’s dispatch room, a hospital’s OR, a research institute’s conference room, a company’s lobby, a factory’s parking lot, and the like. SORs can often provide a snapshot picture of a work setting’s linkages to other parts of the system. They are also extremely useful for understanding micro-settings, like the water cooler or the fax machine.


Many kinds of work activities are spatially distributed. If videotaping is part of the data collection, doing a SOR might require placement of additional audio recorders and sometimes additional cameras in off-site locations in order to allow tracking interaction with remote participants.


•   The Task- (or Process-) Oriented Record (TOR): The TOR details the activities, settings, facilities, and persons necessary to get a particular task or process within the total scope of work done. Again, we are interested in the recurring, regular features of typical tasks such as opening up for business, handling a complaint, making out a ticket, dispatching a plane, conducting a meeting, and so on.


Other ways of systematizing data collection are possible. What is important to realize is that attention to systematic data collection is as crucial within the systemic research paradigm as it is in the analytic paradigm. It is simply less formulaic and therefore requires more conscious attention and active, informed decision-making as the research unfolds.


The Co-Design of Productive Work Environments


Many of us are involved in orchestrating cultural and technological change in the workplace. Most of us would argue that such changes should be based on collaborative co-design that has the interests of workers as much in mind as those of management. We would further argue that design needs to be data-based, not a “design from fantasy.” We believe that attention to the methods by which we do the research that produces the data that feed into the design is of crucial importance in this enterprise.


Acknowledgments   It is in the nature of this kind of paper that it owes a tremendous debt to colleagues and collaborators. Of particular influence in shaping my thinking have been the members of the Workplace Project at Xerox PARC: Françoise Brun-Cottan, Cathy Forbes, Charles Goodwin, Marjorie Goodwin, Lucy Suchman and Randy Trigg. I have also benefited enormously from my collaboration with Bill Clancey at IRL, a computer scientist and system designer who continuously pushes me to shape my methodological interests into something that speaks to systems design. Debra Cash, Paul Duguid, Meg Graham, Randy Trigg and two anonymous reviewers provided most useful comments. I am indebted to Susan Stucky for the precursor to Figure 1. An earlier version of these materials was presented as a working paper at the Conference on Productivity in Knowledge-Intensive Organizations, held April 6-10, 1992, in Grand Rapids, MI, and was published in the proceedings. I particularly thank the participants of that conference for their contributions to my current formulation of these issues.




Baba, Marietta

1991           The Skill Requirements of Work Activity: An Ethnographic Perspective. Anthropology of Work Review XII:3:2-11.


Bentley, Richard, John Hughes, David Randall, Tom Rodden, Pete Sawyer, Dan Shapiro and Ian Sommerville

1992           Ethnographically-Informed System Design for Air-Traffic Control. Pp. 115-122 in: Sharing Perspectives. Proceedings of the 1992 Conference on Computer Supported Cooperative Work. New York: ACM Press.


Bernard, H. Russell, Pertti J. Pelto, Oswald Werner, James Boster, A. Kimball Romney, Allen Johnson, Carol R. Ember and Alice Kasakoff

1986           The Construction of Primary Data in Cultural Anthropology. Current Anthropology 27:4:382-396.


Bishop, Elizabeth

19               Tinker, Taylor, Soldier - Why?: Renegotiating the Employment Relation when Organizations Introduce Computer Networks. Doctoral Dissertation, Department of Economics, University of California at Berkeley.


Blomberg, Jeanette, Jean Giacomi, Andrea Mosher and Pat Swenton-Wall

1993           Ethnographic Field Methods and Their Relation to Design. Pp. 123-155 in: Participatory Design: Perspectives on System Design, Schuler and Namioka, eds. Hillsdale, NJ: Lawrence Erlbaum.


Brun-Cottan, Françoise

1991           Talk in the Workplace: Occupational Relevance. Research on Language and Social Interaction 24:277-295.


Brun-Cottan, Françoise, Kathy Forbes, Charles Goodwin, Marjorie Goodwin, Brigitte Jordan, Lucy Suchman and Randy Trigg

1991           The Workplace Project: Designing for Diversity and Change (Videotape). Palo Alto, CA: Xerox Palo Alto Research Center.


Button, Graham, ed.

1992           Technology in Working Order: Studies of Work, Interaction and Technology. London: Routledge.


Darrah, Charles

1992           Workplace Skills in Context. Human Organization 51:3:264-273.


Engestrom, Yrjo

1992           Developmental Work Research: A Paradigm in Practice. Quarterly Newsletter of the Laboratory of Comparative Human Cognition 13:4:79-80.


Engestrom, Yrjo and David Middleton, eds.

1993           Communication and Cognition at Work. New York: Cambridge University Press.


Floyd, Christiane, Wolf-Michael Mehl, Fanny-Michaela Reisin, Gerhard Schmidt and Gregor Wolf

1989           Out of Scandinavia: Alternative Approaches to Software Design and System Development. Human-Computer Interaction 4:4:253-350.


Gilbert, Jean M., Nathaniel Tashima and Claudia C. Fishman

1991           Ethics and Practicing Anthropologists’ Dialogue with the Larger World: Considerations in the Formulation of Ethical Guidelines for Practicing Anthropologists. Pp. 200-210 in: Ethics and the Profession of Anthropology: Dialogue for a New Era, Carolyn Fluehr-Lobban, ed.


Goodwin, Charles and Marjorie Goodwin

1993           Seeing as a Situated Activity: Formulating Planes. In: Y. Engestrom and D. Middleton, eds. Communication and Cognition at Work. New York: Cambridge University Press.


Greenbaum, Joan and Morten Kyng, eds.

1991           Design at Work: Cooperative Design of Computer Systems. Hillsdale, NJ: Lawrence Erlbaum Publishers.


Heath, Christian and Paul Luff

1991           Collaborative Activity and Technological Design: Task Coordination in London Underground Control Rooms. Proceedings of the Second European Conference on Computer-Supported Cooperative Work, Amsterdam, The Netherlands. Kluwer.


Heath, Christian and Paul Luff

1993           Collaboration and Control: The Use and Development of a Multimedia Environment on London Underground. In: Cognition and Communication at Work, Yrjo Engestrom and David Middleton, eds. Cambridge, MA: Cambridge University Press.


Hughes, John and Val King

1992           Sociology for Large Scale System Design. Paper for CRICT Conference on Software and Systems Practice: Social Science Perspectives. Reading, Nov. 30 - Dec. 1.


Hughes, John, Val King, David Randall and Wes Sharrock

1993           Ethnography for System Design: A Guide. ‘COMIC’ paper 2-4, Department of Sociology, Lancaster University, LA1 4YL, UK.


Hughes, John, David Randall and Dan Shapiro

1992           Faltering from Ethnography to Design. Pp. 115-122 in: Proceedings of CSCW 92:  Sharing Perspectives, the Fourth International Conference on Computer Supported Cooperative Work. New York: ACM Press.


Hughes, John, Dan Shapiro, Wes Sharrock, Richard Harper and David Randall

in press      Ordering the Skies: The Sociology of Coordinated Work. London: Routledge.


Hutchins, Edwin

1993           The Technology of Team Navigation. Pp. 191-220 in: Intellectual Teamwork: Social and Technological Foundations of Cooperative Work, J. Galegher, R.E. Kraut and C. Egido, eds. Hillsdale NJ: Lawrence Erlbaum Associates.


Hutchins, Edwin and T. Klausen

1993           Distributed Cognition in an Airline Cockpit. In: Cognition and Communication at Work, Yrjo Engestrom and David Middleton, eds. Cambridge, MA: Cambridge University Press.


Jordan, Brigitte

1989           Cosmopolitical Obstetrics: Some Insights from the Training of Traditional Midwives. Social Science and Medicine 28:9:925-944.


1992a        Technology and Social Interaction: Notes on the Achievement of Authoritative Knowledge in Complex Settings. IRL Technical Report No. IRL92-0027.  Palo Alto, CA: Institute for Research on Learning.


1992b        New Research Methods for Looking at Productivity in Knowledge-Intensive Organizations. Pp. 194-216 in: Productivity in Knowledge-Intensive Organizations: Integrating the Physical, Social, and Informational Environments, H. V. D. Parunak, ed. Ann Arbor, MI: Industrial Technology Institute.


1993           Birth in Four Cultures: A Crosscultural Investigation of Childbirth in Yucatan, Holland, Sweden, and the United States. Fourth edition, revised and expanded by Robbie Davis-Floyd. Prospect Heights, IL: Waveland Press.


Jordan, Brigitte and Austin Henderson

1995           Interaction Analysis: Foundations and Practice. The Journal of the Learning Sciences 4:1:39-102.


Kukla, Charles, Elizabeth Clemens, Robert Morse and Debra Cash

1990           An Approach to Designing Effective Manufacturing Systems. Paper presented at the Conference on Technology and the Future of Work, Stanford University, March 28-30.


Lave, Jean and Etienne Wenger

1991           Situated Learning: Legitimate Peripheral Participation. Cambridge, UK: Cambridge University Press.


Luff, Paul, Christian Heath and David Greatbatch

1992           Tasks-in-Interaction: Paper and Screen-based Documentation in Collaborative Activity. Pp. 163-170 in: Sharing Perspectives. Proceedings of the 1992 Conference on Computer Supported Cooperative Work. New York: ACM Press.


Manning, Peter K.

1992           Technological Dramas and the Police: Statement and Counterstatement in Organizational Analysis. Criminology 30:3:327-346.


Middleton, David and Yrjo Engestrom, eds.

1993           Distributed Cognition in the Workplace. London: Sage.


Mogensen, Preben and Randall H. Trigg

1992           Using Artifacts as Triggers for Participatory Analysis. Proceedings of PDC ‘92: The Participatory Design Conference, Cambridge, MA.


Orr, Julian

1990           Sharing Knowledge, Celebrating Identity: War Stories and Community Memory in a Service Culture. Pp. 169-189 in: Collective Remembering: Memory in Society. David S. Middleton and Derek Edwards, eds. London: Sage.


Orr, Julian and Norman Crowfoot

1992           Design by Anecdote: The Use of Ethnography to Guide the Application of Technology to Practice. Proceedings of PDC ‘92: The Participatory Design Conference, Cambridge, MA.


Sachs, Patricia

in press      Thinking through Technology. Information Technology and People.


Salomon, Gavriel

1991           Transcending the Qualitative-Quantitative Debate: The Analytic and Systemic Approaches to Educational Research. Educational Researcher 20:6:10-18 (August/September).


Scribner, Sylvia and Patricia Sachs

1991           Knowledge Acquisition at Work. Technical Paper #22. New York: Institute on Education and the Economy, Teachers College, Columbia University.


1990           A Study of On-the-Job Training. Technical Paper No. 13. New York: National Center on Education and Employment.


Suchman, Lucy

1987           Plans and Situated Actions: The Problem of Human-Machine Communication. Cambridge, UK: Cambridge University Press.


Suchman, Lucy and Brigitte Jordan

1991           Validity and the Collaborative Construction of Meaning. Pp. 160-178 in: Questions about Questions: Inquiries into the Cognitive Bases of Surveys. Judith Tanur, ed. New York: Russell Sage.


Suchman, Lucy and Randall H. Trigg

1991           Understanding Practice: Video as a Medium for Reflection and Design. Pp. 65-89 in: Design at Work: Cooperative Design of Computer Systems, Joan Greenbaum and Morten Kyng, eds. Hillsdale, New Jersey: Lawrence Erlbaum Associates.


Wenger, Etienne

in press      Communities of Practice. Cambridge: Cambridge University Press.


Zuboff, Shoshana

1988           In the Age of the Smart Machine: The Future of Work and Power. New York: Basic Books.