On following indexes as ethnographic methodology

Ethnography, like most (all?) scientific methods, must initially proceed on the postulate that there is, over there, some “it” to write about.  All critiques of ethnography have succeeded in demonstrating that, for human phenomena at least, this postulate cannot stand.  Anthropologists, as Geertz put it, do not study villages, they study “in” villages (1973: 22).  The question that has not been answered is: what do they do when they arrive in a village, if they are not going to study “it”?  Geertz suggests they might study “colonial domination” but does not quite explain what that might be. I suspect Geertz would say this is an ideal-type (Weber 1949: 89-95). Parsons might say it is a “formal category.” In either case, the anthropologist is just as much at a loss as when Malinowski or Boas told her to record “everything.”

I venture to say that most anthropologists of the past half-century have, uncomfortably, proceeded “as if” there were some there there, and I have often proceeded in such a manner—or let students proceed as if they would find objects to write about.

We have to find a clear way of stating what one is to do in the absence of a postulated ‘it’.  Many have argued for what they call “multi-sited” ethnographies which, I think, are intended to account for ethnographic activities when the ethnographer moves from one setting to the next in an attempt to … do what?  I have not been convinced.  First, there is the danger that one is led back to the initial problem: what is a “site” that there can be several?  Second, there is the matter of the selection of the sites.  Does one make this selection on the assumption that there is a population of sites from which one selects a sample?  How else might one proceed?

Working with students this past academic year, planning various research projects, and continuing to think about webs/networks, polities, etc., has re-opened this question with some urgency.  At this moment, I am exploring the following in an expansion of Garfinkel as compounded by Latour.  The fundamental methodological principle is: trust the people to tell you what makes them make differences in their lives and those of others.  The people, too, are trying to figure ‘it’ out, and, in the process, ‘constitute’ it.  All we have to do is follow them.  This, of course, is not easy since the constraints on their methods (getting their work acknowledged as relevant to the task in such a way that the task is accomplished) are not those under which we operate (getting work acknowledged as ‘social science research’).  So:

  1. start with a salient phenomenon in some population (cohort). That is, start with a local (national) topic of conversation among the population. The more contentious this conversation, the better for our purposes.
    1. NCLB, indigeneity, autism would be such phenomena (to mention ongoing work by Jill Koyama, Jeff Schiffer, Juliette de Wolfe). These are salient in the United States or Canada. They generate a lot of talk. And that talk is easy to find in many settings.
      1. Do not attempt to define, say, “autism” or “indigeneity.” The participants, in their talk, may make it look as if it is an ‘it’. You can remain agnostic while accepting that the practices in which the participants engage are very real and produce concrete consequences.
      2. Do not attempt to define the setting either. Again, the people will tell you its boundaries and reach through their own practices.
    2. Postulating a web means that one can start anywhere convenient.
  2. The danger is to take this starting point as THE core point. To prevent this, it may be best to start with a setting “obviously” peripheral.
  3. Listen carefully for indexical sequences (e.g. “We are doing this against our better judgement because they make us do it.”)
    1. These sequences are going to be included within larger conversations and will include many indexes to the current conversation and cohort, as well as to other conversations and cohorts.
    2. Again, the more contentious the conversation, the more likely it is that linked matters will be indexed.
    3. The indexical sequences need not be verbal, though one can start with the verbal as it may be easiest to access.
  4. Follow the indexes to the next convenient setting/moment.
  5. Repeat until exhausted (or a year has passed—though the temporality of such a search needs also to be addressed).
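For readers who think in such terms, the five steps above can be sketched, purely as an analogy, as a traversal of a web whose nodes are settings/moments and whose edges are the indexes heard there. The setting names and links below are entirely hypothetical, and the fieldwork itself is of course not mechanical:

```python
from collections import deque

# Hypothetical field data: setting -> settings indexed in talk heard there.
indexes_heard = {
    "peripheral classroom": ["district office", "parent meeting"],
    "district office": ["state agency", "peripheral classroom"],
    "parent meeting": ["district office"],
    "state agency": [],
}

def follow_indexes(start, max_settings=None):
    """Follow indexes outward from a deliberately peripheral starting
    setting, stopping when no new setting turns up (or time runs out)."""
    visited = []
    queue = deque([start])
    seen = {start}
    while queue and (max_settings is None or len(visited) < max_settings):
        setting = queue.popleft()
        visited.append(setting)
        for nxt in indexes_heard.get(setting, []):
            if nxt not in seen:  # a setting already followed need not be revisited
                seen.add(nxt)
                queue.append(nxt)
    return visited

print(follow_indexes("peripheral classroom"))
# → ['peripheral classroom', 'district office', 'parent meeting', 'state agency']
```

Note that no setting in this sketch is “THE core point”: the traversal reaches the district office and the state agency only because the peripheral setting indexed them.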

More to come.

more on intentionality

One of the most puzzling aspects of facing for the first time G.H. Mead’s (and all other pragmatists’) considerations about meaning is what happens to “intentionality” when the emphasis is placed on the openness of what is said until the “third turn” when the interpretant kicks in and that which has been done is settled, at least for a time. Students understand the argument (thanks to the “what time is it” illustration) but at least one will be upset and ask the question: “are you really discounting the intention of the first actor?” I have to say ‘yes’ to that, but this cannot be the end, if only because it does not quite satisfy the common sense of the student(s) who remain convinced that action is founded on intentions and that research should emphasize those, if only to preserve the autonomy of the individual, “agency.” Students may not realize the ideological grounding of the insistence on intentionality, but they do insist.

Thinking about this, I also realized that another aspect of my teaching Mead, or Garfinkel, is my insistence that cultural patterns (the ensemble of an identifiable three-turn sequence) can be oppressive to all the individual participants (culture disables). At that moment, I have re-introduced the individual as a separate entity. As Ray and I have written several times, the individual can be a “unit of concern” (1998: 217).

But, if this is so, then the individual is also a unit possibly suffering because of the gap between that which she intended and that which is now publicly acknowledged to have happened.

I tried to say this in Chapter 8 of Successful Failure but I never got the sense that this piece was successful. Even I, sometimes, have a hard time reading it. So it needs to be said more simply. The power of Mead’s analysis lies in his insistence that we can indeed talk about an ‘I’, that the ‘I’ refers to an experience even though it cannot be identified without doing the violence to it that Ray and I attribute to “culture.”

So, it is not that individuals do not have intentions, but that their intentions are not the primary motor of the constitution of an event so that it gets known as having happened. Society cannot be explained by intentions (or by learning, etc.). Interaction can produce acknowledged (canned) intentions, and one of the purposes of social analysis can be to distinguish between such labeled intentions and the always-to-remain mysterious ‘I’ that moved others to label the actor.

The acting ‘I’ is free.

on taming the ignorant powerful …

In the earlier post, I marveled at Oprah’s pedagogy … as an anthropologist interested in education. As such I am puzzled and need to figure out how the new skepticism about medical authority is actually performed, by whom and when. So I was fascinated by an editorial by Joel Stein of Time Magazine about “The Vaccination War.”

In an earlier post, I asked a question, with a tongue in my cheek: “how could we tame Oprah?” I did not specify who ‘we’ are, on what grounds ‘we’ should try to tame her, and whether taming Oprah (and others like her) is something that could be done.  After all, they are wonderfully extra-Vagant (as Boon, 1999, might put it) and likely to escape most forms of social control.

I leave the questions open for the moment in order to expand the puzzle triggered by a critique of the advice Oprah dispenses on matters like vaccination.  There is ample evidence that, from the ‘official’ public health point of view, her shows can be dangerous, particularly when she discusses vaccination.  She may endanger the health of individual children not getting vaccinated, as well as the health of the public as these children get sick and may sicken others.  At least this is what we, sober-headed experts in public health driven by medical research, might say (and have said).  As a highly schooled expert myself, and someone who generally accepts what other experts tell me, I am uncomfortable at any challenge to my expertise, particularly when it comes from someone as powerful as Oprah.  But I am not writing to complain.  I continue here to puzzle.

In the earlier post, I marveled at Oprah’s pedagogy (the way she formats the discussions) and at her use of a classic American formula where highly schooled experts are pitted against the common sense of the non-expert.  As a cultural anthropologist, I cheer at a particularly striking improvisation on an old theme.

But I am also an anthropologist interested in education.  As such I am puzzled and need to figure out how the new skepticism about medical authority is actually performed, by whom and when.  So I was fascinated by an editorial by Joel Stein of Time Magazine about “The Vaccination War” and how it played out in a “liberal, wealthy [couple] of L.A.” and their immediate network.  He assumed his newborn would be vaccinated; his wife said ‘No’.  So:

To try to be open-minded and stop our fighting, I went to a seminar about inoculation at Cassandra’s yoga center.  Along with about 50 other people, we paid $30 each to listen to Dr. Lauren Feder.

This Dr. Feder explained why inoculation is not necessary, and how not immunizing might possibly be healthier for the child.  I am tempted to write about this as a case of deliberate mis-education and induced ignorance among the prosperous and powerful.  I could also write it as a case of resistance among some of the powerful (the people who might appear in Larry David’s Curb Your Enthusiasm) against some other powerful people (health experts).  Unless of course it is just a case of mass hysteria.

Again, I do not want to complain, but the problem remains: “how are we to tame the mis-guided powerful?” (if we decided to do so).

“Educational research” has produced much that has documented the apparent ignorance of the poor and powerless, particularly as compared to the apparent knowledge of the prosperous and powerful.  This literature assumes that the latter know (more) and that they pass this knowledge on to their children (“cultural capital”).  But what if the prosperous also mis-educate their children, if not on matters relating to schooling at school-relevant moments (e.g. high-stakes exams), at least on a lot of other matters, and at other times?  And what if the prosperous’ success in school tests is not quite a reflection of what they end up “knowing”?  What if the prosperous, often, just play (pretend, lie) at passing (school tests)?  There is some evidence that the children of the poor sometimes fail at tests because they refuse to do what it takes to pass.  We need to see if, in some cases and for certain purposes, the children of the powerful succeed at the same tests because all they have done is what it takes to pass, without learning in the sense of understanding or accepting the answers they gave on the tests.  When test-taking is done, then what they actually learned over the course of their schooling, or what they continue to educate themselves about as they grow older, may reveal itself as something altogether different from what school people might have expected.

Joel Stein, in the same column, says something very interesting and challenging to us school people about the source of his adult knowledge:

Even in the age of Google and Wikipedia, we still receive almost all of our information through our peers.  I believe in evolution not because I’ve read Darwin but because everyone I know thinks it’s true.

Now, that is a challenge for all educators, and for all researchers on education.

how to tame Oprah …

What sort of policies might be established to control the kind of systematic education that is not controlled by experts in pedagogy or curriculum? How are professional teachers to control people like Oprah Winfrey when they push what experts consider “mis-“education?

In the space of two days, I watched a PBS biography of Benjamin Franklin and a journalistic discussion of the Oprah Winfrey Show in Newsweek (June 8, 2009).  Of course, the scholars who discussed Franklin mentioned again and again his place in American history as one of the first and most powerful advocates and practitioners of self-education, both through one’s own efforts and through the efforts of people like himself to teach in such a way as to make it easier for people to self-educate.  The journalists reporting on the Winfrey show emphasized the way she presents health issues with an emphasis on personal vs. expert knowledge.  Arguably, both Franklin and Winfrey belong to the same American tradition that Franklin helped institutionalize.  Both establish their authority by calling on the common sense skepticism about the wisdom of legitimized authority.  It has been said that P. T. Barnum used similar rhetorical moves as he presented the displays for which he was famous as occasions to exercise the skepticism of the audience. One did not have to take Barnum at his word, but one should come in and check for oneself.  Winfrey uses the same argumentation in the statement she released to Newsweek when she was asked to respond to their story: “People are responsible for their actions.  The information presented on the show is … not an endorsement … My intention is for the viewers to … engage in a dialogue with their medical practitioners.”  In other words: “Do not believe me! Judge for yourself!”

Early in my career I would have stopped here with a quip like “only in America!” and possibly mentioned other famous contemporary figures who use similar forms. The symbolic successes of Sarah Palin can certainly be linked to the same broad cultural pattern; and so can the refusal to accept the usual narratives about U.S. history, or evolution; and so perhaps can much of what has made the Internet powerful as a tool for cultural production (education).  In the inner cities, the Bible Belt, and probably everywhere else as well, the call to trust common sense over expertise remains what it has been—and Hollywood is happy to profit from it.  As Franklin said of his Almanac: “I endeavored to make it both entertaining and useful and … I reaped considerable profit from it” (as quoted in Cremin, 1970: 374).

But today, I will not stop with an altogether admiring wonder at American quirks.  I am involved in an academic struggle with the definition of what is to count as educational (school reform) policy in the United States and, particularly, with the search for a method to control what experts have determined should be taught by American schools so that it is learned by all American children (so that no child, indeed, is left behind).  I have argued that this attempt is unlikely to work if it focuses solely on schools.  Actually I am not even sure that going beyond schooling to understand education would be enough.  This is where Winfrey comes in.  Provocatively, how might “we,” expert scientists from schools of education, tame Winfrey and come to control her health education curriculum?  Should “we”? And who are the “we” who claim the right to control this curriculum?

And so, I was reminded again of Rancière and the argument that education cannot be tamed.  I thank Dr. Linda Lin for thinking of the paradigm of the wild/domesticated/tamed as best capturing Rancière’s argument as it must be generalized.  Education cannot be tamed and all the efforts to tame it are in vain—and this is a very good thing for the production of new cultural forms.

But that leaves those of us who are quite sure of our expertise with a dilemma. How do we reach the settings where licensed teachers do not have direct authority, from families to the media, from street corners to churches or mosques? How do we tame Oprah?


Cremin, Lawrence. American Education: The Colonial Experience.  New York: Harper & Row, 1970.

beyond “conviction” as the product of social constructions

(social) constructions do not “convince” or “make it appear to” individuals; they are just that: objects, things that individuals must then deal with in the settings where they encounter them.

This is a coda to my last post about the semiotic (interactional, conversational) aspects of all collective processes (including science, schooling, etc.).  Lim, in his answer to the exam question about Bourdieu’s response to Latour, quotes a statement about “how a fact takes on a quality which appears to place it beyond the scope of some kinds of sociological and historical explanation” and how a “laboratory is a system of literary inscription, an outcome of which is the occasional conviction of others that something is a fact” (Latour and Woolgar 1979: 105).

As Latour would now well know, this statement is possibly dangerous as written if one takes literally or seriously words like “appearance” or “conviction.”  This is the kind of writing that allows for the critique of much “constructivism” for assuming more or less explicitly that the product of “social constructions” is a mental state that will unnoticeably confuse the person now psychologically “convinced” that some (scientific, etc.) statement is indeed a “fact” (or any other relevant entity).  What Latour demonstrated elsewhere (Latour 1987: 14) is that a statement like “the DNA molecule has the shape of a double helix” has multiple lives as it moves from setting to setting.  For example, the statement’s social power will differ depending on whether it is being produced for the first time at some unique moment in the past, or whether it appears in other settings such as scholarly papers demonstrating or critiquing it, conversations among scientists, textbooks, or, now, works in the sociology of science.  In his work, Latour has started giving some indication of what happens as a statement is “taken on the collective mode” (Lévi-Strauss 1969).  But this conversational process which, after a while, produces an intertextual web of multiple consequences, is one that must be traced ethnographically through the various settings within which the statement appears.  Above all one must not prejudge what will happen to the many human beings who will hear the statement and then incorporate it in their own practices.

A fun version of this, which is also a call for anthropological proposals, can be found in Jorge Cham’s cartoon about “the science news cycle” (2009).

And, of course, this is what I am trying to do with Jill Koyama, Ray McDermott and Aaron Hung.

on approaching reality through signs: the responsibility of anthropology

that essential reality can only be approached through signs and in conversations that challenge earlier representations, and thus on the possibility of science and the responsibility of anthropology to explicate further how signs in conversation can dis/en-able.

One of the questions for the final exam in my class on “Technology and Culture” asks for a discussion of one of Bourdieu’s rants against “constructivism,” particularly as it applies to the sociology of science.  Once, he singled out Latour and Woolgar’s ethnography of a biological research lab for possibly “reducing scientific demonstrations to mere rhetorical exhibitions”  (1998 [1994]: 94).  Joseph Lim, one of the students in this June 2009 class, took this question on.  He started with Baudrillard’s discussion of simulacra where the “sign replaces the real.”  Then, of course, science-as-sign ceases to be “real.”  Lim argues against this—as indeed one should, for Baudrillard altogether misunderstands that signs are the only way for human beings to approach reality.  And then Bourdieu joins Baudrillard without noticing that, arguably, what Latour has been doing is putting analytic teeth in his own emphasis on, precisely, practice.

This made me think further about what Latour and, of course, Garfinkel before him, had done.  In a way they moved science (and all other “it’s” [things, epistemes?]) from the realm of platonic ideas to the realm of human practical productions that transform human conditions.  In this move they did not quite demonstrate that there may not be essences that human beings cannot directly apprehend.  Rather, they demonstrated that human beings, in their metaphorical cave, do not simply contemplate the shadows and wonder what they might be shadows “of.”  Human beings, always, work hard together to figure out what to do with their actual conditions in the cave.  In the process, they transform their cave and indeed their methods for figuring out the things that may be making shadows.  In this process, as Merleau-Ponty had understood (1969), (Saussurian) signs are the only, as well as the most powerful, tools at the disposal of human beings.  Signs never substitute themselves for reality.  As anthropologists had to learn, though they did so early on, no human beings, together, have ever mistaken a prayer for a successful hunt with the successful hunt itself.  So, I am quite sure, no scientist will ever mistake a statement of fact (a sign), or an argumentation that a fact is factual, with the fact itself (the experience the sign cannot quite capture).  Or, more precisely, in the collective conversations scientists have with each other (and this is to bring in the pragmatist correction to Saussurian structuralism), whether a statement of fact is to be taken as a fact “for all intents and purposes” will be a practical achievement that will last until it is demonstrated that the semiotic process was somehow invalid.

Methodologically, this means that, to access what we have found out our current representations do not quite catch, one does not proceed from deduction and definition to observation.  Rather, one proceeds through another look at the practices that have led to any “it” (taken-for-granted-so-far) and the new skepticism about its “it-ness.”  “Perhaps,” we can imagine men discussing, “this is not the way to hunt this beast… Perhaps another set of hunting practices might be more successful and, by the way, do you notice that this other set might also allow us to ……”  Having looked carefully at the practices, one can then propose new statements of (practical) fact: “this process is (dis-)abling in these specific ways.”

And so, approaching science (schooling, etc.) in a semiotic (interactional, conversational) way is not to critique the standing of science as a particular form of human knowledge, and probably a privileged one for certain human purposes.  It is, on the contrary, to participate in its further development as, precisely, “science.”  My favorite example is to be found in Jane Goodall’s work on chimpanzees: by highlighting certain semiotic processes (men looking at males and privileging their activity), and then by shifting these (as a woman looking at females), our collectivity (polity, community of practice) was led to a more scientific view of chimpanzee social structure.

Thus the responsibility of anthropologists is to demonstrate just how signs-in-particular-conversation proceed, and to do so in such a way that scientists in other fields find their work useful for their own.

experimenting with formats for the representation of anthropological analyses

First attempts at representing graphically our work on schooling in America

First, I want to thank Dr. Aaron Chia-Yuan Hung for all his help with the visualisations.  Without his imagination in translating my often inchoate ideas, not much of this would be happening.

The experiment in representation that I introduced in my June 11th entry is taking us in several directions.  We are

  • summarizing the links between settings, moments, and people, implied or explored in Jill Koyama’s dissertation, for example:
    • a private corporation lobbying Congress to ensure that for-profit entities can provide “Supplemental Education Services”;
    • principals and teachers facing an error made by the New York City Department of Education that identified them as a “School in Need of Improvement.”
  • finding possible graphic means to represent the links;
  • populating the Web of NCLB Consequences (and sub-webs)
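As a minimal sketch of what “populating” such a web might involve (my own illustration, not the project’s actual tooling), settings/moments can be treated as nodes and the followed indexes as labeled, directed links. The two linkages below are the Koyama examples listed above, with simplified labels:

```python
from collections import defaultdict

# setting -> list of (linked setting, the index that points there)
web = defaultdict(list)

def add_link(source, target, index):
    """Record that talk or action in `source` indexed `target` via `index`."""
    web[source].append((target, index))

add_link("private corporation", "Congress",
         "lobbying for for-profit Supplemental Education Services")
add_link("NYC Department of Education", "principals and teachers",
         "erroneous 'School in Need of Improvement' designation")

# A flat listing like this can be handed to any graph-drawing tool.
for source, links in web.items():
    for target, index in links:
        print(f"{source} --[{index}]--> {target}")
```

Even this toy version makes the “Requests for Research” visible: any node with no recorded links is a setting about which we lack ethnographic evidence.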

What has been interesting so far is that the exercise is obliging me

  • to be much more specific than the ‘paper’ format allows
  • to face up to the need to imagine linkages for which we do not have good ethnographic evidence (these, when mentioned, are really Requests for Research)
  • to push the evidence that each moment/setting is itself
    • a web
    • a source for further indications (indexes) of un-imagined linkages (this could be the most useful aspect of all this)

This raises a whole set of new analytic problems which I will address in another post.

on experimenting with anthropological representation

In 1972, Geertz asked “what is it that we, anthropologists, do?”  He answered, provocatively but not quite rhetorically: “we write.”  Actually, he said more in this vein, which I summarize:
1) The ethnographer ‘inscribes’ social discourse … In so doing he turns it from a passing event into an account … which can be reconsulted (1973: 19)
2) We inscribe … only that small part of which our informants can lead us into understanding (1973: 20)

I interpret this, anachronistically given everything we know about the tradition of symbolic and interpretive anthropology with which he is now associated, as an introduction to the ethno-methodology of all (educational) searches to discover what is going on and what can be done with it.  Technically, as (ethno-)anthropologists, we transcribe carefully documented interaction, through audio or video tape if we can, through detailed field notes if we cannot.  And then we pay close attention to everything we can see on this record to figure out, through the activity of the people and only through this activity, how a particular sequence might make sense in the world that the people had made (if not in ours).  Geertz, famously, analogized this task to that of a literary critic. But, in one text at least, he specified that he thought of the critic’s task as explicating what an author might have indexed that a modern reader might not get: “you [cannot] know what a catcher’s mitt is if you don’t know what baseball is” (1976: 221).

Following such leads to find out how people constitute the world they must then deal with has been magnificently productive for our understanding of face to face interaction among small groups.  From telephone conversations, to the telling of jokes (Sacks 1974), to the discovery of pulsars (Garfinkel and Livingston 1981), we have been able to move significantly further in the anthropological task.

But it has been much more difficult to follow this program with larger emerging units in which it seems evident that interactions in Setting A are somehow linked to conversations in Setting C through the interactions in Setting B.

For example, school based research can easily argue that some troubles a principal may have with teachers and students are directly related to directives from the superintendent’s office which was only passing on directives from the State or the Federal Government.  In other words, the principal and the immediate significant others in a school indicate through their speech and acts what it is that they cannot escape, what they are making locally with this, and where we, as anthropologists, might go to trace where what-they-cannot-escape comes from.

In recent years, Latour has become famous for insisting that social scientists work at tracing these connections.  He writes about “networks” and how his Actor-Network (non-) Theory might be useful (Latour 2005).  But he does not give much guidance as to how this might get done, that is, as to how we might systematically inscribe and then identify the links, and then inscribe the identified links as suggested by the participants, and in such a way that our critics can identify errors or omissions.

This is why I found Jill Koyama’s dissertation so important.  Not only did she dare make New York City her unit of analysis, she began tracing systematically the linkages on matters of providing “supplemental educational services” across many settings.  The next question, for me, then became: how might we represent these linkages?

As a starting point, I am reverting to the “web” metaphor Geertz borrows from Weber (1973).  It is a cozy metaphor about being “suspended” which I am turning more ominous by writing that we are “caught in webs of (practically consequential and enacted) meaning.”

[image: spider web]

And then I am pushing the metaphor by actually “suspending” people-in-their-moments in the hope that it can allow us to see the proposed connections and then plan further research on this basis.  Here is what this might look like.

on ‘Lost’ as educator

if one can teach oneself about, say, the philosopher John Locke, has one learned anything? How would ‘we’ know? Does one know something if no one has certified that she does?

There is a part of me that is half-ashamed of the pleasure I take in such shows as the TV series Lost.  It is of course gratifying to know that many people share it.  More interesting is the discovery of what so many of these people are doing with the show.  Let me join them.

I will leave to a student the task of tracing the full extent of what people are actually doing with Lost.  Given its success, I am sure someone has started doing this.  So I will give this student one more issue to trace: Lost can also be explored as a site for education.  I build here on a journalistic piece written for the web site of Christianity Today.  I will do so to highlight the educational aspect of the show and how it fits with what I have been writing about in recent years (2008, 2009).

This piece (posted 5/18/2009) is written by Tyler Charles, a freelance writer.  He mentions some of what people are doing with Lost as they investigate the scientific, literary, philosophical and religious hints the show gives.  Charles lists the books and philosophers mentioned, the major philosophical issues revealed, as well as other matters.  He does not mention anyone exploring the political aspects of the show and yet, particularly in the first season, a major issue was the nature of leadership and the organization of government (“Who made you the leader, Jack?”, “A leader can’t lead until he knows where he is going.” – Episode 5).  Actually, this issue has been reopened with the (divine?) appointment of Locke as leader by “the island.”  I suspect some people are also exploring this since it can open conversations about the very grounding of democracy.

Charles reports that people are following these leads.  They seek to find out more about the physics of time travel and electromagnetism, or what might make John Locke or David Hume important enough persons to have characters named after them.  Of course, there is a wiki site where one can start exploring all this: lostpedia.wikia.com.

The French philosopher Jacques Rancière (who still does not have a character named after him on Lost…) brought out of obscurity an until-now minor figure of the French Revolution, Joseph Jacotot.  Rancière recounts how Jacotot demonstrated, to his satisfaction at least, that anyone, particularly someone who did NOT know the subject matter, could “teach” this matter.  Even more challenging, he argued that all the material they needed was one book, Télémaque, written by Fénelon.  It was not because the book was a source of universal wisdom but because, to say all this more carefully:

anyone (not quite a teacher) can produce a situation
…… where someone else (not quite a student) can learn
………. what this person must have the will to learn, and that,

given this will,

…… anything (and not necessarily Télémaque)

………. can start this person on the way to find out for herself what she wants to know.

This actually might be the basis for a Lost episode.  Unless it is the point of the whole show, where everyone has to figure out, again, what to do next given what has happened to them in the past.

In that perspective, Lost (like probably Star Trek, Star Wars, etc.) is becoming a Télémaque for a generation of people using it as a point of departure to teach themselves about physics, religion, human relationships, etc., not to mention of course many aspects of what used to be called “literary criticism” and would now fall under the purview of “popular culture studies.”

The question, for a faculty member at Teachers College, Columbia University, is: when someone teaches herself anything, does she “learn” it?  Does one know something if no one has certified that she does?

on researching autism as “cultural fact”

on modeling “autism” as a cultural fact, that is as an enabling and disabling resource for all those who cannot escape it, whether as “child with autism,” parent or teacher of such a child, administrator of a school with special education classrooms, or as policy maker devising new regulations about how to deal with all the above.

There is a cliche in the sentiment that one of the best parts of being a professor is being faced by great students challenging one’s pet ideas.  But a cliche can also be true, as I experienced again when Juliette de Wolfe, at the end of a seminar, told me that she was anxious about using one of my favorite conceits.  For close to 20 years, McDermott and I have been writing about such matters as learning disabilities as “cultural facts.”  De Wolfe, who is starting a project on the processes for the identification of autism, and who had used the phrase in her proposal, was worried that she was caught in something that was “static.”

On the spot, my answers were weak and unconvincing; certainly they were not convincing to me as I thought about them later.  I had mumbled something about the adjective “static” being possibly an attribute of a research analysis, not of a concept that could be used in any number of ways; that emphasizing “change” is much easier said than done; and that those who claim they do not want to be “static” mostly produce analyses that end up extremely static.  Had I not been interrupted, I probably could have gone on in this defensive/offensive mode without quite answering a very proper concern about the very justification for social science research, particularly in its anthropological version.

McDermott and I devised the phrase (“cultural fact”) to index our roots in Durkheimian sociology (as reinterpreted by Garfinkel) and in American cultural anthropology and pragmatism.  Earlier I had pointed de Wolfe to the pages in Successful failure (1998) where McDermott and I developed the phrase “cultural fact” we had introduced earlier (McDermott & Varenne 1995).  But these passages are not enough.

To stay with de Wolfe’s concern, let’s say that we are interested in children who are having a difficult life, and particularly in those who have, or are caught with, something now labeled “autism,” something that was discovered-as-such in America in the 1940s and fully institutionalized starting in the 1970s.  Autism may be some thing that has always been there in humanity, though until recently this thing may have been labeled something else, or institutionalized differently.  Just putting the issue this way should make it clear that I am taking here the classical cultural anthropological stance (Benedict 1934).  I treat the noticing of autism, as a thing with specific personal, interactional, and political consequences, as a historical event.  In other words, I place autism “in its historical context,” or, more jargonistically, I “historicize” autism.

All this is well and good, but it actually must leave our apprentices in confusion.  What are future anthropologists to do next, after we have historicized autism, or any one of its sub-practices (e.g. the meetings where a child gets officially labeled)?  What is the point of historicizing something?  Actually, how do we know that we have truly historicized “it,” or whether we have conspired in reconstituting something that should never have been constituted in the first place?

I argue that our duty, as anthropologists, is to provide future practitioners (parents, teachers, etc.) with a more systematic account of the constraints which they will not be able to escape.  This, I think, is what Durkheim meant when he wrote of social facts as “imposing themselves,” or what Latour now means when he writes about objects as having “agency.”  What easily disappears in these statements, as they have been taken up for more than a century, is that they are statements about the future rather than the past, or even the present.  As McDermott and I put it, “Culture is not a past cause to a current self.  Culture is the current challenge to possible future selves” (2006:8).  As I would put it today, technically, a cultural fact is a model for the set of (dis-)abling properties of the present that make a difference in some future.  The task of the cultural analyst is to discern these properties and report on them in a way that makes sense to at least some of the practitioners.

Thus the task for de Wolfe, as she starts observing teachers and students in an “autistic classroom,” is to build a model of those matters that make a difference as the people she meets build a life together and, in the process, instruct her as to what actually does make a difference.

This is what I advise her to do because this is what all those who care for the children need from an anthropologist: a different account of their experiences, one that may provide them with new resources for the future they will make with each other.

And we should not worry if this account looks to some like a “synchronic” account.  The account, if it is well done, will of course be synchronic in the Saussurian sense.  Others can write about the history of autism and trace its diachronic evolution.  But history, however interesting, is not quite useful, because human evolution, including its cultural (linguistic) evolution, is not a rational process in the narrow sense.