Silvia Gherardi.
Organizational knowledge: The texture of workplace learning.
London; Malden, Mass.: Blackwell, 2005.
cloth, 265 p., ISBN 1405125594, US$79.95.
Blackwell: http://www.blackwellpublishing.com/

Silvia Gherardi's new book, Organizational knowledge: The texture of workplace learning, is an excellent theoretical treatment of learning in organizations. Gherardi is an Italian sociologist at the University of Trento. The book is part of a body of work known as practice-based research, in which analyses of cognition and mind are replaced by analysis of practice. Definitions vary, but practice has to do with stable, routine enactments in which human and non-human elements interact.
While some aspects of the practice-based approach are problematic (of which more in a moment), I first want to say how extraordinarily nuanced and insightful Gherardi's analysis of workplace learning is. Gherardi studied safety practices in the Italian construction industry, focusing primarily on tradesmen, but also considering organizational approaches to safety. She used a case study methodology, collaborating with a young male researcher who could apprentice in the industry (which, as a female, she could not readily do).
Gherardi developed the idea of the "situated curriculum," by which she meant learning as a localized practice in which content is closely related to the specific set of local, material, economic, symbolic, and social characteristics of the field of practices and work activities. She contrasts this view with Lave and Wenger's (1991) "learning curriculum," which is a more generic concept concerning the learning practices of a profession at large.
Gherardi stated that "The situated curriculum ... exhibits an erratic, context-dependent, redundant, event-based and largely non-linear sequence where what to do, how to do it (and how to do it skillfully) are taught on an experiential basis and the novice 'learns the ropes' of the trade by imitation and contact with practitioners." It is in the minutiae of specific work events, enacted in contact with practitioners, that workplace learning happens.
This understanding nicely undercuts the illusion of a practice of knowledge management that could bottle up knowledge in databases. As corporations lay off workers or outsource operations to less experienced and more isolated workers, they hope to cash in on the expertise workers have built up over the years by preserving it in databases. Gherardi shows that this is not realistic.
Gherardi's analysis contains the surprising revelation that many senior workers did not want to reveal their expertise within the situated curriculum. And one novice was given the same task over and over again, limiting his learning. This lack of sentimentality was a breath of fresh air. Much of the analysis from the various situated camps deletes such dislocations, endorsing informal learning as superior to other forms. Gherardi showed how the formal training of the construction workers was a complement to the informal. She presented the situated curriculum honestly, warts and all, making the concept much more useful for application as well as theory.
The larger project of advancing a practice-based sociology seems less useful than the case study. In the service of slaying the demons of the rational decision maker and AI, the practice-based approach eschews concepts of mind and subject. Pretty drastic surgery for a non-fatal condition.
Gherardi said that practice-based analysis gives priority to practices over individuals. We must attend to sets of seeing, doing, and saying. The concept of intentionality is rejected — perhaps it is too much like the rational decision maker? Gherardi advises that with the practice-based approach we need not worry about what's going on in people's heads or what they intend. Communication and language must yield to "discursive practices," which are identifiable patterned verbal interactions.
In practice-based theory we have lost our minds but gained our bodies, which are made visible in the theoretical discourse. For example, Gherardi said, "Craft trade requires trained bodies: ones, that is, which have incorporated an expertise or connoisseurship. It is through the body that an eye for something [is developed]." The trained body is seen as an antidote to cognitivism and rationality.
While I agree that craft trades require trained bodies, I have some problems with this statement. The mind and body are not separate. Except for processes such as digestion, the mind is required for what the body does. Practice-based theorists are engaged with the level of human activity at which the mind virtually always comes into play, working inseparably with the body. The practice approach has, rather amazingly, introduced mind-body dualism as a core principle.
An emphasis on the bodies of others on the part of scholars who clearly have a vibrant mental life is somewhat unnerving. Those of us in the chattering classes know well that it is our abilities to spin original-sounding utterances that are not just the same old discursive practices that earn our bread and butter. So another dualism seems to be inadvertently advanced: those whose bodies must be trained for work and those who work with their minds.
Gherardi says we can't know others' intentions and therefore they should be outside analysis. But how can we know a trained body? Gherardi quoted construction workers who invoked the body in their descriptions of how they worked. Surely the workers' words about their bodies came from their minds.
I sense a sort of puritanism in the practice-based approach. Mind is a natural phenomenon, yet we must give it up as though it were an unspeakable excess, along with subject, communication, and language. Gherardi is not alone here; see The practice turn in contemporary theory, an edited collection by Schatzki, et al. (2001), for other practice-based analyses that ask similar sacrifices of us.
I find myself wondering if maybe we should just stick to mid-range theories grounded in ethnographic observation. It is there that so much insightful work seems to be done, such as Gherardi's development of the concept of the situated curriculum. Though I disagree with the practice-based approach in many ways, I will nonetheless be using Gherardi's more grounded concepts in my research on learning in online games.
Gherardi is extremely well-read, and you can learn a lot by reading Organizational knowledge: The texture of workplace learning. You'll find literature that no one else seems to cite, which Gherardi not only cites but also explains and contextualizes. Gherardi's courteous but critical analyses of many bodies of literature are woven through the text in a lovely pattern, providing insightful discussions of varied streams of research. She makes it look easy, but years of careful thought and reflection have gone into this book. Organizational knowledge: The texture of workplace learning is itself as richly textured as the workplace learning practices Gherardi documents. Bonnie Nardi, Donald Bren School of Information and Computer Sciences, University of California, Irvine.
References
J. Lave and E. Wenger, 1991. Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.
T.R. Schatzki, K.K. Cetina, and E. von Savigny (editors), 2001. The practice turn in contemporary theory. London: Routledge.
Torin Monahan.
Globalization, technological change and public education.
London: Routledge, 2005.
cloth, 213 p., ISBN 0415951038, US$26.95.
Routledge: http://www.routledge-ny.com

Responding to the inducement of grant monies, members of a Puente Piedra College (pseudonym) committee, during the Fall 2005 semester, squabble over defining, and then creating outcome-based measures for, the institution's official notion of information literacy. In the end, they formulate the narrowest of all possible definitions and the attendant outcome measures, duly cataloged and embedded across specific courses. Peering at the emailed MS Word table composed by a tenured full professor in the social sciences, I blanch, slightly, as the Jehovah-esque mini-proclamations-on-a-template, distributed through that discipline's curriculum, imperceptibly flicker on the LCD screen: "The student shall learn how to construct an Excel spreadsheet," or "The student will be required to gain a working knowledge of PowerPoint." While the development of modest technical proficiency in Microsoft Office modules is useful, defining such narrow competences as "information literacy" is, at best, an ersatz appellation.
For this committee, crafting an official definition that fostered habits of critical reflection on, and assessment of, emerging information sources and representational formats was not of sufficient import. Also absent was an open acknowledgement of how technological ensembles, because they differentially constitute bodies, identities, discourses, networks and relations of power, are deeply, profoundly and inherently political. The committee's final product reinforced an industrial-era, de facto Fordist sensibility of working-class students as malleable objects, subjected to power, rather than agents actively exercising power. What was left on the committee's cutting-room floor was as revealing as what was meticulously cleansed and shrink-wrapped for presentation to the institution's internal and external clients.
Torin Monahan's Globalization, technological change and public education frequently engages such definitional issues. Pace Foucault, Monahan redirects these questions from the typical fussing over performativity issues (How do we document whether, and how well, this or that works?) to how information technologies, and their accompanying ideologies and operational definitions, produce and reproduce social relations and networks of power. Although Monahan's object is the Los Angeles Unified School District (LAUSD), a massive K-12 system with a very different history and demography than Puente Piedra, the external forces and internal logics reshaping public education are so broad, systematic and deep that his ideas and experiences are recognizable, across all levels of U.S. public education, in the first decade of the twenty-first century.
Key to Monahan's definitional approach is, in part, the combination of two notions. First, Monahan discards the narrow and often self-serving fiction that information technology ensembles are but mere pedagogical tools. Instead, Monahan draws on the more productive analytical frame of technologies as social actants (derived from the field of science and technology studies) and, more specifically, as forms of media. Second, as social actants and embedded media, information technologies are also artifacts of a macro-political economy. For Monahan, how information technologies and their enabling ideologies (and accountability routines) are deployed in the LAUSD illustrates a primary social fact: in the manner they are often aggressively touted and formally implemented across public education sites, information technologies are often the distilled products of a massing of early twenty-first century corporatist social control and capital extraction mechanisms, typical of globalization and its enabling ideology, neoliberalism. The major effect is a rapid and intensive re-engineering of the mitochondria of U.S. public education. As Carlos Alberto Torres notes in his foreword, Monahan's conclusion is
... that the built pedagogy of technologized schools shuffles traditional relations between students and teachers, [as] it dramatically increases [the nourishment of the private sector while intensifying] bureaucratic control over both groups while aggravating their institutional vulnerability. (p. x)

How this happens, in Monahan's multi-site ethnography of the LAUSD, is a layered, complex, fragmented, resisted, sometimes contradictory and always overdetermined process. It is to Monahan's credit that he does not gloss over the profuse details. Fortunately, he's able to corral a significant element of the phenomena, across various institutional actors, networks, and spaces, by weaving through his narrative the notion of "fragmented centralization." As a meta-governance element, the logic of fragmented centralization spreads across spatial and institutional strata, from classrooms to boardrooms. What does Monahan mean by fragmented centralization?
[Despite] the ostensible decentralizing valences of information technologies, a form of centralized control persists ...
[In] fragmented centralization ... decision-making authority is ... more centralized [reviewer's emphasis] while accountability for centrally made decisions is ... more distributed [reviewer's emphasis] down the hierarchy ... This ... gives organizations the appearance of responsible management but simultaneously decreases worker autonomy while intensifying workloads ... Centralization is now a stealth endeavor hidden in the seemingly apolitical settings of specifications and standards while risk and responsibility are fragmented and copiously distributed to those on multiple peripheries ... . (pp. 94, 106)
Composed from a variable mélange of Fordist and post-Fordist economic and organizational logics, fragmented centralization is reinforced by historical, structural and operative factors. These include pre-existent physical and social architecture; durable organizational fealty systems; the preconditions set by external funding and granting agencies; day-to-day corporatist specification and operational standards; risk-avoidance restrictions imposed by actuarial planners; and so on. So, rather than the protean "flexible specialization" attributed to an idealized and mythical lean post-Fordist organization, Monahan argues that this (below) is what we get:
[LAUSD] mutates in response to changing perceptions of the role of education in society; it accepts the responsibilities given to it by funding agencies, industry, and the public ... it performs elaborate rituals of disclosure and restructuring ... it develops many cooperative relationships ... [while] feed[ing] the global economy with generous industry contracts and pliable workers and consumers. In other words, [LAUSD] flexibly adapts to the global economy but does not provide a flexible environment for workers or students. This current form is the paradigm for a post-Fordist organization. (p. 109)

As I was reading Monahan's description, Puente Piedra's four-color glossy Annual Report arrived, via post. The report's themes reinforced Monahan's observations. For example, at the top of page 12, in large type, capitalized and bolded, was the assertion: "We have a responsibility to be agile, dynamic and responsive." The text below this particular headline reads, in part, as follows:
Puente Piedra's success is predicated on our ability to be proactive and nimble. It also demands that, on occasion, we change our ways of thinking ... to further marshal our resources and coordinate our endeavors so we can be wherever we are needed ... The people we serve told us that new thinking was needed to ... ensure consistency ... The [endowment] has grown ... This reliable, private stream of revenue has helped ... [offset] ... fluctuating state appropriations.

The Annual Report enthusiastically emphasizes technological development and expansion:
Perhaps nothing speaks to Puente Piedra's innovative spirit as much as the continued growth and use of technology in teaching and learning. A full (sic) 100 percent of first-year students came to campus in 2004 with laptop computers ready to connect to our wireless infrastructure ... (p. 5)
To walk into one of our classrooms is to be amazed ... the majority [have] ... faculty computing stations, LCD projectors and high-speed wireless network transceivers ... (p. 6)
Although it's easy to be entranced by technological toys, Monahan reminds us of a basic lesson, one that I frequently encounter in the classroom: "Computers do not signal equity" (p. 154), nor genuine flexibility, in institutional environments, any more than standardized test scores reflect deep learning. Whether at the LAUSD or Puente Piedra, information technology deployments in public education are often part of a condition where "self-governance and self-adaptation to system rigidities become naturalized" (p. 149). What are the common systemic internal rigidities that have an elective affinity with the swarming of information technologies in U.S. public education? According to Monahan, they are as follows:
First, the rapid growth of information technologies in schools is tightly paired with an equal or greater growth in audit culture, imposed in the name of global competitiveness and enforced by various levels of accreditors. Audit cultures are micro-accountability routines that reformat institutional practices and individuals as auditable. Audit culture produces forms of self-policing and obsessive documentation that multiply like rabbits.
Second, Monahan discusses the ramping up of regulation culture, a term that refers to the many laws ... policies, tests and other mechanisms that govern practices within public spaces. Pace Foucault, these can be seen as the formal and informal intensification of governmentality effects that shape the "conduct of conduct." Most interesting is a third term, coined by Monahan's LAUSD informants: firewall culture. Firewall culture can be described as a form of individual and institutional reaction against the unwanted visibility produced by audit and regulation regimes. Generally, such reactions consist of setting up communication outside the institution's zone of recordability. The results of such behavior can be positive, negative and, at times, paradoxical. (For example, a firewall strategy, such as the use of a non-institutional email address, may shield low-level actors from excessive scrutiny. However, the same strategies may also be used by those who actively construct and police zones of visibility, when these auditors, regulators or other policing agents desire the same invisibility.) Writ large, the result is an increasingly self-policed, suspicious and restricted communicative environment that Monahan tags as inherently inimical to deep learning.
Clearly, much in Monahan's book is starkly recognizable. In its format, the text's conceptual organization is stratified. Monahan unpacks his ideas by semi-compartmentalizing different levels of his object, moving from the physical spaces of classrooms, to the analysis of the distortions and exclusions in the neoliberal verbal tropes that promote information technologies, and then to the tensions between local IT personnel in schools and their nemesis at the LAUSD central office. The tensions between the standardization and control ethos of the central office and the variant, site-specific needs and coping strategies that typify the multiple, distributed periphery of specific schools are usefully reframed in subsequent chapters (on "Fragmented Centralization," "Policy Games," and "Flexible Governance," where ethnography meets political economy and contemporary theory). In the next-to-last chapter, "Future Imaginaries," Monahan offers up counter-strategies to mitigate the restrictiveness imposed by early twenty-first century techno-pedagogical practices, even as he shows us a frightening rendition of an ideal learning environment, sketched by one LAUSD student, that resembles the architecture of a Nazi concentration camp. In the brief final chapter, Monahan discusses the role of neoliberal ideology as the master narrative for the intensification of social control via technological means. Ultimately, Monahan charges his readers with the task of redeploying information technologies to ferret out, nurture, amplify and invent practices of freedom. In public education, we should, and we must, he says, foster creative, productive and deeply human interactions, mediated via information technologies, rather than use these ensembles to subject students, faculty and staff to the new ghosts of excessive social control embedded in the distributed machinery of capital.
Monahan has generated a useful, troubling and thought-provoking set of ideas and observations in his first book. For this reader, the author was something of a proverbial Jacob, wrestling with the angel (the LAUSD, his object of analysis). The resistant angel finally responded by flooding Jacob with a tumult of connections and ideas. The author worked diligently to shape this cornucopia into a semi-pliant narrative. But this is such a rich, complex and important topic that the controlled profusion is, on balance, far more stimulating than distracting. (Monahan's deft use of tropes is a substantial asset in this context.) If there is an area of the text that is underdeveloped, it is in the truncated references to public relations and marketing culture. With U.S. public education increasingly driven by a notion that education is a private, rather than a public, good, public relations and relationship marketing discourses and practices have become a more prominent part of institutional strategies. Across the U.S., top administrators and their Chief Information Officers (CIOs) diligently police the details of self-representation in furtherance of positive brand identification. However, in all fairness to Monahan, the phenomenon of branding in public education is a separate, if closely aligned, object of inquiry. With this book, Monahan has already given us plenty of material to chew on. In its own way, the book's overarching message is an academic equivalent of Buffalo Springfield's Vietnam-era song, "For what it's worth": "I think it's time we stop, hey, what's that sound, everybody look [at] what's going down." Dion Dennis, Department of Criminal Justice, Bridgewater (Mass.) State College.
J.R. Okin.
The technology revolution: The not-for-dummies guide to the impact, perils, and promise of the Internet.
Winter Harbor, Maine: Ironbound Press, 2005.
cloth, 286 p., ISBN 0976385724, US$26.95.
Ironbound Press: http://www.IronboundPress.com

By the early 1990s the late Michael Hauben had coined the term "netizen" to denote those he saw as seeking to promote the development of the larger Internet community in positive ways: Netizens were truly network citizens because they were much more than mere nationals working with computer networks [1]. They grasped the significance of the global Internet and understood, at an early time, its capacity to change cultures, societies and governments for the better. Their active participation was bringing about a communication shift that would have extraordinary consequences, far beyond what even they envisioned. As Hauben states:
These people understand the value of collective work and the communal aspects of public communications. These are the people who discuss and debate topics in a constructive manner, who email answers to people and provide help to newcomers [sic], who maintain FAQ files and other public information repositories, who maintain mailing lists, and so on. [Netizens] are people who understand it takes effort and action on each and everyone's part to make the Net a regenerative and vibrant community and resource. Netizens are people who decide to devote time and effort into making the Net, this new part of our world, a better place. [2]

The early netizens' efforts are little known or appreciated in most of today's Internet community. But their impact, and that of many others on the Net, is remembered and honored as part of the complex story continued in J.R. Okin's The technology revolution: The not-for-dummies guide to the impact, perils, and promise of the Internet. This is the third and final volume of Okin's series on the Internet, the other two being The Internet revolution: The not-for-dummies guide to the history, technology, and use of the Internet, and The information revolution: The not-for-dummies guide to the history, technology, and use of the World Wide Web. Together they are intended to represent a more-or-less comprehensive overview of the Internet, its history, structures, resources and manifold impacts.
Through use of an admittedly arbitrary selection of chapter topics for his final volume, Okin provides good illustrations of how the Internet's basic design was adapted to handle various innovations, as well as how that evolution posed problems its designers did not initially foresee. The Java programming language, for example, was inherently suited to the Internet. Yet its development placed it alongside commercial imperatives, directly shaping how it evolved and how it ultimately arrived in the form we use today. Java's emergence was co-evolutionary with other players as well. IBM was undergoing transformation from a mainframe-based company to one whose future, like that of most businesses, would be increasingly dependent on the Internet. IBM's and others' adoption of Java, as Okin points out, "... eliminated the barriers between the resources that composed the Internet, the information that composed the Web, and pretty much everything in between." [3] IBM was one of many beneficiaries of Java, alongside consumers; Okin handles this discussion very well.
The Java story illustrates how difficult it often has been for the Internet to accommodate innovations and demands coming from a wide variety of developers and constituencies. When Hauben proffered the concept of netizens, the Internet was a complex melange of email, news groups, and many separate, difficult-to-use protocols. But back then it also consisted of a more cohesive environment of users, who helped inculcate values of trust and civility. That all changed when the Web caused public and commercial interest in the Internet to explode at a level and pace none of the early netizens could foresee. Additionally, the Web brought with it an environment that could be exploited to invade our privacy and to jeopardize security across the Internet, both topics given separate chapters by the author.
His chapter on the dot.com bubble of the late 1990s is especially fascinating, not least because it incorporates his own experiences with one of the hundreds of companies seeking to capitalize on the Web's growth during that period. This is a frank account of his own rewards and frustrations in an organization; it contains many passages that will be familiar to anyone who has worked with programmers or in a programming environment. While hindsight is always clearer, one wonders whether there would have been fewer failures during the dot.com boom if the kinds of basic structural improvements and checks Okin tried to implement with his employer had been followed. He deserves high marks for candor and honesty in his appraisal of his own performance at that company.
A chapter on the digital divide separating the haves and have-nots of Internet access, and another on the nature of Internet communities, seem less engaging because their subjects have been so thoroughly examined over the years. The Cleveland Freenet, and freenets in general, are curious omissions given their historical significance in the development of the Internet community.
The final chapter represents the author's summary observations on the Byzantine story of the Internet. Entitled "Synergies," it is a review of communication history, the convergence of media through networking and computers, and the ever-increasing pace of individualization in communication via the Net. Though sometimes repetitive, both within the chapter and of points made elsewhere in the book, this is an interesting read, full of good points and persuasive arguments. As Okin acknowledges, no one has been good at predicting what will happen with the Internet; many have been proven wrong over and over. But he presents a very good review of how much the Net has altered our sense of community and our traditional face-to-face awareness of others, and of its interlinkages with our increasingly ubiquitous methods of electronic communication. From this he extrapolates what may happen in the future, though via a discussion that is sprawling at times.
Few readers will have reason to quarrel with his observations on the Internet's empowerment of the individual, or on its unique capability to provide users with a presence and residence on the network. The identities we achieve there, even using only email, have transformed our entire understanding of social interaction: "Consider the information you routinely access on the Internet. Now think back to where this information came from and how you accessed it before you started to use the Internet. The differences are striking." [4] Increasingly, of course, more and more people will not be able to appreciate these differences as pre-Internet communication and information acquisition become less and less a matter of direct experience.
The text concludes with an appendix on milestones related to privacy and security, including sections on netiquette and jargon.
It seems odd to include the not-for-dummies subtitle in all three of Okin's titles. For one thing, it isn't the case that much of the material is beyond a neophyte's grasp. Okin's expository style is on the whole very clear, and by no means is much of what he discusses too complex for beginners. One would think that such a subtitle would therefore discourage a readership which, presumably, is much larger than the more experienced group one infers is the focus. And for the latter, much of that same material will simply not be very challenging or will represent already well-known concepts and issues. In this reviewer's opinion it would have been better to drop the not-for-dummies subtitle, as it is difficult to see how it would expand readership, given the content realities of the series.
That aside, the current volume and its predecessors form an achievement that is especially impressive for one author. While there are omissions, on the whole Okin has handled an extremely complex subject very well indeed, and has done so in readable terms. Those wishing more than a casual exposure to Internet history, structure and issues should read all three. Douglas Kocher, Chair, Department of Communication, Valparaiso University.
Notes
1. Hauben's seminal concept, originally posted online, was eventually published in Michael and Ronda Hauben, 1997. Netizens: On the history and impact of Usenet and the Internet. Los Alamitos, Calif.: IEEE Computer Society Press; excerpts appeared in the July 1998 issue of First Monday at http://www.firstmonday.org/issues/issue3_7/.
2. Michael Hauben, "What is a Netizen?", special issue, The Amateur Computerist, 1 May 2002, volume 11, number 1 (available at http://ais.org/~jrh/acn/Back_Issues/Back_Issues%5b1998-2002%5d/ACn11-1.txt). This issue is devoted to commentary on Hauben's significance and is very worthwhile reading for anyone interested in Internet history. For further information on The Amateur Computerist itself, see http://www.columbia.edu/~hauben/acn/.
3. Okin, The technology revolution, p. 93.
Copyright ©2006, First Monday