“Self-harm has become internal and non-communicative. It has become self-regulatory and not interpersonal. It is difficult not to see this change as relating to the much larger economic and political shifts of the late 1970s and early 1980s.”

What is self-harm, and where does it come from? These are the two questions that I am trying to answer in my new, open access book A History of Self-Harm in Britain: A Genealogy of Cutting and Overdosing (2015).

The answer really depends upon when and where you ask. In Britain during the 1950s and 1960s, the terms ‘self-harm’ and ‘self-damage’ largely signify taking an overdose of medication. It is also called ‘attempted suicide’, ‘self-poisoning’, ‘pseudocide’ and ‘propetia’ (from the Greek for ‘rashness’). The studies from which such terminology emerged were rooted in hospital Accident and Emergency departments (A&E). At this point, the overdose is generally understood as a disordered communication – a ‘cry for help’ – and is assessed by psychiatrists attached to hospitals, alongside another particular group of professionals: psychiatric social workers (PSWs).

But the idea that ‘self-harm’ essentially indicates ‘overdosing as a cry for help’ changes during the 1980s. In particular, the practice of self-cutting as a form of tension release or emotional regulation gains more prominence. Initially, studies of self-cutting emerge from inpatient units in North America and Britain. Despite being called things like ‘delicate cutting’ or ‘wrist-slashing’, these studies actually document a wide range of behaviours, including self-burning, skin-picking, smashing windows, and swallowing objects such as pins or dominoes. However, self-cutting is repeatedly emphasized as being archetypal in some way (a topic I discuss in much more detail in another paper).

Despite this emphasis on self-cutting, the behaviour presenting at hospitals doesn’t really change: between 80 and 95 per cent of the cases under the label ‘self-harm’ in hospital statistics remain self-poisoning. However, there are now huge numbers of studies from psychotherapists, counselors and psychiatrists documenting ‘self-cutters’.

The behavioural stereotypes inaugurated during the 1980s remain substantially intact today. ‘Self-cutting as emotional self-regulation’ is still largely presumed to be the behaviour and motivation indicated by the term ‘self-harm’. The key questions are: why did things change, and why at that point? The answers are still murky, even after 250+ pages of the book. I am pretty clear on why self-poisoning emerges as a national concern in the 1960s: changes in mental health law in 1959 mean that more psychiatric assessment and therapy can take place at general hospitals (rather than the remote Victorian-era asylums). As a result, growing numbers of people are assessed psychologically after arriving at A&E having harmed themselves. Thus, what is normally quite a small amount of damage, physiologically speaking, can be assessed in terms of a person’s mental state, home life and romantic attachments. We thus have more scope for a ‘cry for help’.

PSWs take the lead here, following up patients by visiting them at home and bringing to bear assessments of various life stresses (such as an alcoholic spouse, sexual infidelity or a difficult adjustment to married life). Suicide law also changes in 1961, meaning that people who take overdoses are no longer breaking the law. This allows the government to recommend that all ‘attempted suicides’ are psychologically assessed. Previously it would have been difficult to do this given the technical illegality of such actions (even if it is rarely prosecuted post-1945).

The reason why self-poisoning, rather than self-cutting, predominates at hospital A&E departments is rather mundane. If somebody is discovered having taken an overdose, even of only ten tablets, how many laypeople would be comfortable leaving it to chance and advising the person to ‘sleep it off’? These cases, even if thought by doctors to be ‘trivial’, are much more likely to end up at A&E. However, self-cutting of the forearms (the archetypal site) seems much less physically lethal, and more people are comfortable dealing with this physical damage with their own first-aid skills. Thus it is more likely to emerge through counseling, rather than at A&E.

Reasons for the shift in the 1980s from overdosing to cutting in popular usage of the term ‘self-harm’ are much less clear. Partially it has to do with the delegation of self-poisoning assessment away from hospital psychiatrists in the 1980s. Fewer research psychiatrists coming into contact with the behaviour on such a regular basis leads to a drop-off in studies. But there is another order of explanation that interests me.

In broad terms, British society in the 1950s and 1960s is that of consensus politics, of active welfarism and social support, the post-War settlement, the NHS, social work, and a commitment to social housing. By the 1980s, exemplified by the ascent of Ronald Reagan in the USA and Margaret Thatcher in Britain, there is a sense of something new: rolling back the state, championing the logic of the market and the virtues of competition and self-regulation. Whereas social setting and social intervention are treated as obvious and necessary in post-1945 Britain, after 1980 we are told that there is ‘no such thing as society’.

It seems to me that we have a form of self-harm in the 1960s that is socially-embedded, accessed by social workers, and fundamentally understood as interpersonal behaviour. It is a very ‘social’ form of self-harm. In the 1980s, the kind of self-harm that resonates is one that focuses upon individual emotional states, and the practice of self-regulation. The very idea of ‘crying for help’ is recast as negative and manipulative; it is never totally free of those implications in the 1960s, but there is a much more prominent understanding of humans as social, communicative beings. This sense of communication is entirely recast in a negative light in the 1980s. Clinicians who want this behaviour taken seriously as a clinical problem (rather than something to be ignored as manipulative) therefore stress the internal emotions, the overwhelming tension, and dismiss or downplay any communicative aspect.

Self-harm has become internal and non-communicative. It has become self-regulatory and not interpersonal. It is difficult not to see this change as relating to the much larger economic and political shifts of the late 1970s and early 1980s. The history of psychiatry can help us to understand that the categories that we use to understand human behaviour are unavoidably tied up with broader political and social circumstances. When we cast mental health or mental distress as internal or external, as social or biological, we are lining up (like it or not) with much broader political questions about the nature of humans. I should end on one of my favourite quotations from Michel Foucault, words uttered in the course of his famous debate with Noam Chomsky on human nature:

“The real political task in a society such as ours is to criticize the workings of institutions that appear to be both neutral and independent, to criticize and attack them in such a manner that the political violence that has always exercised itself obscurely through them will be unmasked, so that one can fight against them.”

Even as we attend to self-harm, the ways in which we understand the behaviour, and the ways in which the behaviour is experienced at a deep level, resonate with dominant constellations of power.

In the huge clutter of concepts and shorthands and commonsense with which we make sense of the world, visions of human nature lurk. Before we can contest them, before we can agree with them, we must see that they are there at all.

Chris Millard is Wellcome Trust Medical Humanities Research Fellow in the School of History, at Queen Mary University of London. ‘A History of Self-Harm in Britain: A Genealogy of Cutting and Overdosing’ is published now by Palgrave Macmillan. It is available, open access (thanks to the Wellcome Trust), from the following link: A History of Self-Harm in Britain: A Genealogy of Cutting and Overdosing.


“We see the contingency and uncertainty that underlies the term ‘human sciences’ not as a source of anxiety but as the grounds for celebration.” – New Editors’ Introduction

The central problem of the human sciences remains unresolved. Despite the new claims championed within molecular biology, evolutionary psychology, artificial intelligence and the cognitive neurosciences, one of the central organising categories of each of those disciplines – the human – has resisted definition. This resistance has a long history. When Kant asked the last of the four key philosophical questions posed in his Logic of 1800 – ‘Was ist der Mensch?’ – he likely knew that nineteenth-century theory would fail to provide a definitive answer. The category that came to define both the humanities and the human sciences in the German-speaking territories – that of Geist, the inherently un-measurable, unstable and speculative prefix to the Geisteswissenschaften – served only to produce provisional answers that would in turn only give rise to further questions.

Towards the close of the nineteenth century, Wilhelm Dilthey concluded that this resistance to definition was inevitable because the human being is an ineluctably historical being whose attempts at self-understanding are always contingent upon a particular historical perspective and therefore always subject to variation (Dilthey 1991 [1883]). Within the German tradition of philosophical anthropology advanced by Max Scheler (1928) and Helmuth Plessner (1928), among others, and recently taken up in the writings of Hans Blumenberg (2006) and Peter Sloterdijk (2004), the human being is held up as a ‘cultural being’ that is able to survive only because of its non-biological adaptations and technologies. Human nature, these writers insist, is human culture, and the human sciences would thus require a methodology quite different from those of the natural sciences.

This recognition that human nature is, in the last analysis, historical has been foundational to the post-structural turn in the human sciences. The acknowledgment of the radical problem that the question of the human posed underwrote the work of Michel Foucault and Jacques Derrida. Foucault famously argued in The Order of Things that ‘Man, in the analytic of finitude, is a strange empirico-transcendental doublet, since he is a being such that knowledge will be attained in him of what renders all knowledge possible’ (Foucault, 1970: 318). Derrida, in his essay ‘Structure, Sign and Play in the Discourse of the Human Sciences’, described the knotted field of those sciences, one constituted by two ‘absolutely irreconcilable’ modes of interpreting: one that ‘seeks to decipher, dreams of deciphering a truth or an origin which escapes play and the order of the sign’, and the other that ‘affirms play and tries to pass beyond man and humanism, the name of man being the name of that being who, throughout the history of metaphysics or of ontotheology – in other words, throughout his entire history – has dreamed of full presence, the reassuring foundation, the origin and the end of play’ (Derrida, 2001 [1967]: 370–71). Those two opposing desires – one a flight to an imagined ‘before’, and the other a leapfrog over, and hence beyond, the shoulders of ‘man’ – are with the human sciences, still.

The writings of Foucault and Derrida acted as a lightning rod for many others to interrogate the old organising categories that had served as the basis for human scientific research in the first half of the twentieth century. Language, reason, history, evidence, testimony, sexual difference, biology and culture: all were subject to profound deformations that fundamentally reshaped the terrain of the human sciences. It was in response to this radical ferment that our predecessors, Arthur Still and Irving Velody, founded History of the Human Sciences in 1988. Through their work, alongside that of James Good, who took over as editor in 1999, and Roger Smith, who has served as an associate editor since the journal began, History of the Human Sciences emerged as one of the central forums in the English-speaking world for reflection upon the constitution and demarcation of this contested field, as well as the wider institutional and political implications of these epistemological deformations. From its inception, the journal recognised that the French and German terms – sciences humaines and die Geisteswissenschaften – had no simple equivalent in the English language, and that the term ‘human sciences’ was already being deployed within the biological sciences to describe attempts to bring together genetics, ethology, communications studies and the neurosciences into an overarching synthesis. In their opening editorial, the editors acknowledged collegial uneasiness around the term, but insisted that the phrase, unlike ‘social sciences’, ‘suggests a critical and historical approach that transcends these specialisms and links their interests with those of philosophy, literary criticism, history, aesthetics, law, and politics’ (Still and Velody, 1988: 1).

In many ways the terminological challenges faced by our predecessors have been superseded. This has occurred for two related reasons. On the one hand, the journal’s success over the last 28 years has established the human sciences as a field, and made clear its intrinsically historical basis. In the last quarter century, the long-standing neglect, on the part of historians and philosophers of science, of the human sciences in comparison with the natural sciences has given way to an investigation of their often intertwined (as well as at times opposed) epistemic projects, practices and commitments. On the other hand, the porous boundary between the natural scientific approach pursued in many of the life sciences and the historical approach promoted by this journal has largely dissolved. In recent years, there has been growing acknowledgement, for instance, of the ways that new biological approaches and technologies have helped to reshape our understanding of life and the human (e.g. Landecker, 2010); of the role of material culture in shaping historical practice; and of the close relationship between the sociological and biological projects in the first half of the twentieth century (e.g. Renwick, 2012). In addition, grand – and contestable – claims are now being made for the potential inclusion of psychological, evolutionary, and cognitive neuroscientific perspectives within historical analyses (for example, within the field of neuro-history). Whatever one may think of such demands, and certainly they are complicated by the necessarily historical character of those disciplines, it is clear that our working concepts of subjectivity, history, life, emotion, and culture cannot be insulated from developments in psychopharmacology, the neurosciences, bioinformatics, and all those fields gathered under the neologism ‘omics’ (including genomics, proteomics, metabolomics and transcriptomics).
Thus while we remain committed to the claim made by the journal’s founding editors, that ‘All reflections in the human sciences seems embedded in history, forming a categorical framework difficult if not impossible to escape from’ (Editorial, 1992: 1), we also recognise that the character of history and the shape of the historical imagination are uncertain. What it might mean to be ‘embedded in history’, then, is subject to on-going reformulation.

We see the contingency and uncertainty that underlies the term ‘human sciences’ not as a source of anxiety but as the grounds for celebration. It provides new points of departure for critical reflection and opens up new opportunities for interdisciplinary collaboration. Certainly, this is reflected by our own disciplinary orientations: an historical geographer of twentieth- and twenty-first century psychiatry, psychology and cognitive neuroscience, who draws upon social and critical theory (Callard); a historian of psychology and psychiatry and their connections to broader cultural history (Hayward); and a Germanist and literary scholar with interests in the history of anthropology, critical theory and psychoanalysis (Nicholls). As incoming editors, we are joined by book reviews editor Chris Millard (a historian of twentieth-century psychiatry), and we have created the new role of web and social media editor (which is filled by Des Fitzgerald, a sociologist with a particular interest in the past and present intersections of the social and life sciences). This year, we launch a new website for the journal (www.histhum.com). Given our interest in how genres, media and technologies are entangled with the kinds of knowledge that the human sciences are able to produce, we are keen to see how the website might help found new connections – between scholars, ideas, methods, practices – in this heterogeneous, interdisciplinary terrain.

We invite all readers both to engage with our website, and offer contributions and ideas about where we might take it. We have also invited a number of academics on to the journal’s advisory editorial board, with the aim of bringing into the journal’s fold a greater proportion of early- and mid-career scholars (many of whose publications are already shifting premises, epistemological starting points and objects of inquiry in the history of the human sciences). We are deeply indebted to the meticulous work of both James Good (as editor) and Sarah Thompson (as editorial assistant) in relation to the journal’s curatorial and substantive contributions. The shape of the history of the human sciences over the last 15 or so years bears the imprint of their visible and invisible labours. We are delighted that James remains a member of the advisory editorial board, and Sarah continues in her editorial role.

As incoming editors, we have been thinking together about how best to articulate our own rules of thumb for the kinds of submissions to the journal that we hope to encourage. We are resolutely committed to continuing the support that the journal has always shown to arguments that might appear risky to the received ideas that underpin particular communities of thought and practice. More prosaically, we welcome manuscripts that address at least one of the modern human sciences, broadly conceived (including psychology, psychiatry, psychoanalysis, history, philosophy, medicine, sociology, geography, anthropology, archaeology, economics, political economy, human biology, physiology, science and technology studies, sexology, the neurosciences, critical theory, literary and cultural theory, linguistics). And of course we welcome engagements with all those domains of knowledge that have a more precarious relationship to, or have been discredited by, current epistemological norms (for example, parapsychology, the racial sciences). By using the qualifying adjective ‘modern’, we register the journal’s tendency to focus on the post-Cartesian period, though we emphasize that we welcome submissions on the pre-modern human sciences (such as Ancient Greek ‘psychology’, medieval medicine etc.) if the approach taken addresses the question of ‘the human’ of the human sciences and/or establishes a dialogue between those sciences and more recent human sciences in terms of particular ontological, epistemological or methodological problematics. Additionally, our hope is that submissions take an interdisciplinary approach – but that authors put pressure on what they believe ‘the interdisciplinary’ connotes by dint of their methods, modes of reading, and, indeed, their assumptions about ‘discipline’. 
We warmly welcome manuscripts that dwell on questions of method and methodology (rather than, say, simply use a method less common to the core concerns of the field with the assumption that this strangeness will be itself revelatory).

We are convinced of the continued utility of the themes that the first editors enumerated when they considered the particular problems they wanted the journal to address, namely: (i) the history of individual disciplines and their shifting boundaries within the human sciences; (ii) the dependence on theoretical and cognitive presuppositions in the human sciences; (iii) the infusion of literary and aesthetic forms in the human sciences; (iv) the character of substantive findings in the human sciences and their institutional implications; (v) the deployment of historical resources in the human sciences (Still and Velody, 1988: 2–3). But alongside this editorial continuity, we want also to record our own sense of how submissions in 2016 (and beyond) might look a little different from those received in 1988. We anticipate a growing number of submissions from authors reflecting on, and embedded within, the history of more recent fields in the human sciences (such as the medical and digital humanities, disability studies and queer studies, as well as the inter-disciplines prefixed with neuro-); from those interrogating the shape and the historiography of ‘the interdisciplinary’ itself; and from authors (or co-authors) who are simultaneously practitioners in the field(s) under historical investigation. For the boundaries between those external to and internal to many epistemological domains are under pressure, not least when many of those domains are themselves interdisciplinary. We are particularly keen to expand the journal’s attention to the space and constitution of the global and the local – and to the tangled histories of the colonial and the post-colonial – in the making and remaking of the human sciences. 
And we predict that the efflorescence of ‘animal studies’ – as well as wider attentions to questions of materiality, animality, vegetality, and, indeed, the inorganic – will continue to press on the edges of the central category, the human, with which we started this editorial.

The capacities and limits of non-human animals – as well as those of those cyborg entities that ghost, with ever greater density, our figure of the human – are undoubtedly being both rethought and remade. This in turn opens new questions about how to conceptualize the environments – physical, political, geological and social – in which those entities, both human and non-human, are embedded. That human experience – which has, in the time that has elapsed since the founding of this journal, been provincialized in a number of sciences – opens up, we suggest, exciting and difficult questions for all of us interested in the past and future of that sprawling field called the history of the human sciences. We welcome submissions, therefore, as much from those working on the ‘non-human sciences’ as on the human – so as to adumbrate more carefully the contours of this distinction. We want, nonetheless, to hold fast to the fact that insofar as the human animal is an animal that has history, narrative, the capacity for self-reflection, and the imaginative ability to project itself in the future, the human sciences remain in the last analysis interpretative and hermeneutical sciences.

Felicity Callard (Durham University), Rhodri Hayward (Queen Mary University of London) and Angus Nicholls (Queen Mary University of London) are the editors of History of the Human Sciences.

The final version of this article, as published in the Journal (Vol. 29, No. 3), is available here: http://hhs.sagepub.com/content/29/3/3.full.pdf+html

References

Blumenberg, H. (2006). Beschreibung des Menschen [Description of Man], ed. M. Sommer. Frankfurt am Main: Suhrkamp.

Derrida, J. (2001 [1967]). ‘Structure, Sign and Play in the Discourse of the Human Sciences’, in Writing and Difference, trans. A. Bass. London: Routledge Classics, pp. 351–70.

Dilthey, W. (1991 [1883]). Introduction to the Human Sciences, trans. R. A. Makkreel and F. Rodi. Princeton, NJ: Princeton University Press.

‘Editorial’. (1992). History of the Human Sciences, 5(2), 1–2.

Foucault, M. (1970). The Order of Things: An Archaeology of the Human Sciences. London: Tavistock Publications.

Landecker, H. (2010). Culturing Life: How Cells Became Technologies. Cambridge, MA: Harvard University Press.

Plessner, H. (1975 [1928]). Die Stufen des Organischen und der Mensch. Einleitung in die philosophische Anthropologie [The Stages of the Organic and the Human Being. Introduction to Philosophical Anthropology], 3rd ed. Berlin: De Gruyter.

Renwick, C. (2012). British Sociology’s Lost Biological Roots: A History of Futures Past. London: Palgrave Macmillan.

Scheler, M. (1928). Die Stellung des Menschen im Kosmos [The Position of the Human Being within the Cosmos] in Gesammelte Werke, ed. M. Scheler and M. S. Frings, 15 vols. Basel: Francke; Bonn: Bouvier, 1971–1997, vol. 9.

Sloterdijk, P. (2004). Sphären III, Schäume [Spheres III, Foams]. Frankfurt am Main: Suhrkamp.

Still, A. and Velody, I. (1988). ‘Editorial’, History of the Human Sciences, 1(1): 1–4.

“We might have a brighter future if we stopped conceiving of ourselves as an epistemic counterculture” – An Interview with Nicolas Langlitz

Nicolas Langlitz is Associate Professor of Anthropology at the New School for Social Research in New York City. His work lies at the intersection of anthropology and the history of science, where he has been especially engaged with the epistemic cultures of the neurobiological and psychopharmacological sciences. His most recent monograph, ‘Neuropsychedelia: The Revival of Hallucinogen Research Since the Decade of the Brain’, is available from the University of California Press. At the beginning of March, Des Fitzgerald, HHS Web Editor, caught up with Nicolas about his recent article in History of the Human Sciences, ‘On a not so chance encounter of neurophilosophy and science studies in a sleep laboratory.’

Des Fitzgerald: We’ve had a lot of reflection lately on how disciplines like anthropology and sociology intersect with the natural sciences (and especially the life sciences); one of the things I found especially valuable about your article was its attention to a very different set of interdisciplinary relations – those between social scientists and philosophers. Why do you think there has been relatively little attention to these interactions? And where do you see their future?

Nicolas Langlitz: That’s true. Social studies of science, including anthropology and sociology, have not paid much attention to philosophy. I think there are political reasons why the humanities and the social sciences attracted less interest. In his article “What Happened in the Sixties?”, Jon Agar located the birth of science studies in the long 1960s and the countercultural upheaval against technocratic government. Philosophically, one of the goals of science studies was to show that there was no clear demarcation of science from society, that scientists were human beings like you and me, and that their claims to objectivity were unfounded. Expert knowledge was put in its place and subordinated to a democratic process. When science studies were established as a field in the 1980s, we were certainly not ruled by philosopher kings and nobody felt the need to show how Derrida and Rorty had fabricated their truth claims – not least because these philosophers didn’t make any. But technoscientists did assert their expertise and transformed our world in powerful ways. So we started the Science Wars.

“On a Not So Chance Encounter” has a non-identical twin titled “Vatted Dreams,” in which I point to a second reason for the neglect of philosophers. They are really hard to study. Life scientists meet in a laboratory where they conduct experiments or they go to the field where they observe things. If they trust you, you can hang out with them and watch what they are doing. By contrast, philosophers sit at their desks in the library, their office, or – in the worst case – at home. You can’t install an observation post in their study. And, even if you did, watching them think and write wouldn’t provide much insight anyway. This is primarily a problem for the ethnography, not the history of the human sciences, I think. I was lucky in that my interest in the exchange between neurophilosophers and neuroscientists took me to a sleep lab in Finland. So my point of departure was a classical and more manageable laboratory-ethnography setting, not so different from the work I had done on neuropsychopharmacologists studying psychedelic drugs. Nevertheless the neurophilosophy project never flourished ethnographically. It mostly facilitated conceptual reorientations on my part that I document in the two articles mentioned and a third one that will soon be coming out in Common Knowledge.

DF: One of the things you seem to be negotiating in the article is your stance as an ethnographer, on the one hand, and your role as a collaborator in an interdisciplinary team, on the other: as you say in the article, your interest is not only in differences, but in common ground. What has it been like to inhabit the sleep laboratory both as ethnographer and collaborator? And where do you locate yourself in the sometimes vexed debates about anthropological inhabitations of the life sciences?

NL: As an ethnographer I felt relatively comfortable in this project because the group of philosophers, psychologists, and neuroscientists I worked with had only been brought together by the Volkswagen Foundation’s European Platform for Life Sciences, Mind Sciences, and Humanities. So I was a member from the start and not the awkward newcomer trying to find his place at the margins of a community, which is the usual role of the ethnographer. I got along very well with the other members of the group, especially Jennifer Windt and Valdas Noreika.

However, I really failed as a collaborator. Originally, I saw my job as conducting what Niklas Luhmann and Paul Rabinow called second-order observations: observations of how other observers were observing the world. Ideally, this perspective allows you to identify the other observers’ blind spots, the contingency of their observations. But it’s no basis for collaboration. The philosophers were trying to develop a theory of dreaming based on the empirical findings of the sleep researchers – so philosophers and neuroscientists were looking at the same thing, dreams, while I was looking at something else, namely them. At one point, we had a series of, on my part, rather agonizing Skype conferences about an article we wanted to write together on dreaming as a model of consciousness. I had written a piece on using another altered state of consciousness, namely the inebriation with hallucinogenic drugs, as a model of psychosis. So I actually had things to say about this kind of modeling and yet I did not know how to contribute. Eventually Jenny and Valdas went ahead without me and I really couldn’t blame them (see Windt & Noreika 2011).

It eventually dawned on me that collaboration required that I would open up to first-order observations, that we had to look at the same thing. I had already been making this move in the context of psychedelic research. But there it was much easier. The effects of psychedelics depend on set and setting – they are shaped by the social and cultural milieu. It’s not difficult to insert yourself into this research as an ethnographer. In a publication in this very journal, I had argued for a rapprochement of psychopharmacology and the human sciences. But dreaming is a very different state of mind. The neural thresholds for sensory input and motor output are significantly elevated. Dreamers are cut off from their environments. The setting of the sleep laboratory doesn’t affect dream content a whole lot. It took me years to realize what a beautiful provocation this was. In anthropology and science studies, we implicitly or explicitly subscribe to externalist philosophies of mind, emphasizing how human experience is a product of the subject’s relations with the outer world, and we always criticize brain researchers and neurophilosophers for reducing mind to brain. It turns out that the neuroscience of dreaming provides some of the strongest support for internalist philosophies of mind. This led me to rethink the biases I had inherited from my own field of scholarship. That’s what I lay out in “Vatted Dreams.”

For me, doing ethnographic fieldwork is about learning to think differently. So I have no interest in mobilizing anthropological critiques against my interlocutors. There are enough people who regularly, although not very successfully, remind neuroscientists that they have left out the social and the cultural. I focus on mining other fields for things that we anthropologists ignore or habitually dismiss. My plea for positivism at the end of “On a Not So Chance Encounter” is another example of that.

DF: One of the many fascinating historical stories threaded through your article is the failure – if I can put it like this – of science studies to be the mode in which the conceptual would get sutured to the empirical within a naturalized epistemology (a role, indeed, for which it was a serious candidate). There’s an interesting counterfactual history at stake here: what happened? And how do you think things might have played out differently for what today calls itself STS?

NL: I actually do think that the social studies of science are based on a naturalist epistemology. My plea to make neurophilosophy more materialist urges philosophers like the Churchlands to expand their naturalist approach from the mind to the sciences, which they continue to regard in a rather idealistic fashion. They slept through the social, practical, and material turns in the history of science. Of course, that also protected them against the constructionist excesses that came with these turns.

If you want me to make up a counterfactual history of the two fields, I would imagine a much earlier encounter of science studies and neurophilosophy in a neuroscience lab. Maybe between Latour and Churchland at the Salk Institute, where they conducted research in the late 1970s and the 1980s, respectively. It might have made both fields more attentive to the fact that some natural phenomena are more affected by humans than others, and that this should be more of an empirical than a metaphysical question.

DF: Your paper is in one sense a genealogy of neurophilosophy – and (if I read you correctly) one of your claims is that neurophilosophy has been (or at least has become) a more orthodox intellectual space than some have seen it, or than it might otherwise have been. Is this the case? And what would your dream for a more heterodox neurophilosophy look like?

NL: Since my intellectual life doesn’t depend much on how dynamic or sclerotic neurophilosophy is these days, I’m personally a lot more concerned about the orthodoxy of my own field. That’s what I’m primarily writing against.

But I do think that neurophilosophy could profit from catching up with a history of science that, in the past 30 years, has shifted its attention from scientific ideas to material practices. The Churchlands’ prediction that psychological understandings of the human mind will either become reducible to neuroscientific conceptions or be eliminated went far beyond the philosophy of mind. It drew from positivist and postpositivist philosophies of science, which also gave rise to science studies, historical epistemology, etc. What philosophy of mind would we arrive at if it took these later developments in how we think about science into consideration?

Regarding the orthodoxies of science studies, we should turn the theory-ladenness of observation and the constructedness of all phenomena from articles of faith back into objects of empirical inquiry. We might also be able to learn something from the seemingly old-fashioned histories of scientific ideas that the Churchlands continue to favor. That would be in line with John Tresch’s recent plea for reintegrating a materialist history of science with intellectual history.

DF: You end by saying that science studies scholars, among others, should perhaps not peremptorily dismiss a positivist attitude to objects like the dreaming brain. Can you expand on this – are you calling for a more nuanced ethnographic attention to positivism, or actually for something like a more positivistic STS? What is the content of your ‘materialist dream,’ as you put it?

NL: The ontological turn in anthropology and science studies has relegitimated metaphysical speculation. In principle, that seems fine to me. We need metaphysical frameworks for empirical research and materialism is one such framework. But these frameworks can and should be continuous with what we know about the world. In the case of dreaming, this knowledge is quite limited. Like the neurophilosophers, I’m confident that we will ultimately arrive at a materialist account. That’s my materialist dream, but at present it’s speculative. Positivism rejects all metaphysical speculation, which also sets it up in opposition to metaphysical materialism. It was not my intention to commit myself to materialism or positivism in general. It’s a situational epistemology in the face of a particular phenomenon. And in the current situation we don’t have enough experimental knowledge to simply dismiss Norman Malcolm’s dream positivism. So it’s not a plea for a more positivistic STS. If there is anything programmatic about it, then it would be to pay more attention to the peculiarities of phenomena instead of plastering everything with one and the same theory.

DF: One of the things that has captured my own attention about neuroscience is how, when you get close up, it is sometimes strikingly unnaturalistic, at least in the stereotyped sense of that term. This is in one sense banal (all intellectual practices are weird, close up) – but it does seem to call for more nuance in how anthropologists and sociologists have understood the neurosciences. And indeed one of the lessons I take from your article is that both the neurophilosophers and the anthropologists have potentially failed to grasp the subtleties that structure the material culture of neuroscience. What are your thoughts on this?

NL: There is so much to be said about this, but it all depends on what you mean by naturalism. A prominent definition in anthropology, for example, is Philippe Descola’s. For him, naturalists assume continuity between humans and other animals on the physical level while postulating radical discontinuity on the mental or spiritual level. Descartes is the prototype of this kind of naturalism. Of course, that’s not what most neuroscientists and neurophilosophers mean by the term. They are Descartes bashers like everybody else these days. And yet some of their practices – most prominently animal experiments – are informed by this dualist conception of naturalism: it wouldn’t make epistemological sense to develop an animal model of a neuropsychiatric disorder if you didn’t believe in physical continuity, but ethically it’s only permissible to experiment on these animals because their minds are regarded as qualitatively very different from our own. I examined this closely in my book Neuropsychedelia. I also noticed that the psychopharmacologists I worked with only talked about themselves in neurochemical terms when they were joking. As soon as things got serious they reverted to vocabularies informed by the psy disciplines. Ian Hacking might well be right that Cartesianism continues to run strong. Maybe that even tells us something about universal forms of human cognition, as Descola suggests, but I don’t think he provides enough evidence for that.

There are probably other reasons why the Churchlands are materialists in their philosophy of mind and idealists in their philosophy of science. In Science in Action, Latour argued that, in scientific discourse, statements qualify other statements in either a positive or a negative modality. In the positive modality, a statement leads away from the first statement’s production to its consequences. By contrast, negative modality statements direct the reader’s attention to the conditions under which the first statement was produced. It’s not taken as a fact on which we can build but is opened up to further scrutiny. By and large, historical and ethnographic laboratory studies have adopted this critical perspective, revealing the social and material practices generating scientific truth claims. Neurophilosophers prefer to draw philosophical conclusions from neuroscientific facts – they operate in the positive modality. That might explain why they are not so keen on the kind of epistemological naturalism that characterizes science studies. We can dismiss this as being uncritical, but we should also note that our own obsession with the negative modality is a very serious obstacle to any meaningful collaboration with neuroscientists and empirically oriented philosophers of mind. Although science studies originated in the long 1960s, we might have a brighter future if we stopped conceiving of ourselves as an epistemic counterculture.

‘On a not so chance encounter of neurophilosophy and science studies in a sleep laboratory,’ by Nicolas Langlitz, is available in the October 2015 issue of History of the Human Sciences: http://hhs.sagepub.com/content/28/4/3.abstract