Wednesday, November 30, 2016

DeLanda on historical ontology


A primary reason for thinking that assemblage theory is important is the fact that it offers new ways of thinking about social ontology. Instead of thinking of the social world as consisting of fixed entities and properties, we are invited to think of it as consisting of fluid agglomerations of diverse and heterogeneous processes. Manuel DeLanda's recent book Assemblage Theory sheds new light on some of the complexities of this theory.

Particularly important is the question of how to think about the reality of large historical structures and conditions. What is "capitalism" or "the modern state" or "the corporation"? Are these temporally extended but unified things? Or should they be understood in different terms altogether? Assemblage theory suggests a very different approach. Here is an astute description by DeLanda of historical ontology with respect to the historical imagination of Fernand Braudel:
Braudel's is a multi-scaled social reality in which each level of scale has its own relative autonomy and, hence, its own history. Historical narratives cease to be constituted by a single temporal flow -- the short timescale at which personal agency operates or the longer timescales at which social structure changes -- and becomes a multiplicity of flows, each with its own variable rates of change, its own accelerations and decelerations. (14)
DeLanda extends this idea by suggesting that the theory of assemblage is an antidote to essentialism and reification of social concepts:
Thus, both 'the Market' and 'the State' can be eliminated from a realist ontology by a nested set of individual emergent wholes operating at different scales. (16)
I understand this to mean that "Market" is a high-level reification; it does not exist in and of itself. Rather, the things we want to encompass within the rubric of market activity and institutions are an agglomeration of lower-level concrete practices and structures which are contingent in their operation and variable across social space. And this is true of other high-level concepts -- capitalism, IBM, or the modern state.

DeLanda's reconsideration of Foucault's ideas about prisons is illustrative of this approach. After noting that institutions of discipline can be represented as assemblages, he asks the further question: what are the components that make up these assemblages?
The components of these assemblages ... must be specified more clearly. In particular, in addition to the people that are confined -- the prisoners processed by prisons, the students processed by schools, the patients processed by hospitals, the workers processed by factories -- the people that staff those organizations must also be considered part of the assemblage: not just guards, teachers, doctors, nurses, but the entire administrative staff. These other persons are also subject to discipline and surveillance, even if to a lesser degree. (39)
So how do assemblages come into being? And what mechanisms and forces serve to stabilize them over time? This is a topic where DeLanda's approach shares a fair amount with historical institutionalists like Kathleen Thelen (link, link): the insight that institutions and social entities are created and maintained by the individuals who interface with them, and that both parts of this observation need explanation. It is not necessarily the case that the same incentives or circumstances that led to the establishment of an institution also serve to elicit the forms of coherent behavior that sustain it. So creation and maintenance need to be treated independently. Here is how DeLanda puts this point:
So we need to include in a realist ontology not only the processes that produce the identity of a given social whole when it is born, but also the processes that maintain its identity through time. And we must also include the downward causal influence that wholes, once constituted, can exert on their parts. (18)
Here DeLanda links the compositional causal point (what we might call the microfoundational point) with the additional idea that higher-level social entities exert downward causal influence on lower-level structures and individuals. This is part of his advocacy of emergence; but it is controversial, because it might be maintained that the causal powers of the higher-level structure are simultaneously real and derivative upon the actions and powers of the components of the structure (link). (This is the reason I prefer to use the concept of relative explanatory autonomy rather than emergence; link.)

DeLanda summarizes several fundamental ideas about assemblages in these terms:
  1. "Assemblages have a fully contingent historical identity, and each of them is therefore an individual entity: an individual person, an individual community, an individual organization, an individual city." 
  2. "Assemblages are always composed of heterogeneous components." 
  3. "Assemblages can become component parts of larger assemblages. Communities can form alliances or coalitions to become a larger assemblage."
  4. "Assemblages emerge from the interactions between their parts, but once an assemblage is in place it immediately starts acting as a source of limitations and opportunities for its components (downward causality)." (19-21)
There is also the suggestion that persons themselves should be construed as assemblages:
Personal identity ... has not only a private aspect but also a public one, the public persona that we present to others when interacting with them in a variety of social encounters. Some of these social encounters, like ordinary conversations, are sufficiently ritualized that they themselves may be treated as assemblages. (27)
Here DeLanda cites the writings of Erving Goffman, who focuses on the public scripts that serve to constitute many kinds of social interaction (link); equally one might refer to Andrew Abbott's processual and relational view of the social world and individual actors (link).

The most compelling example that DeLanda offers here and elsewhere of complex social entities construed as assemblages is perhaps the most complex and heterogeneous product of the modern world -- cities.
Cities possess a variety of material and expressive components. On the material side, we must list for each neighbourhood the different buildings in which the daily activities and rituals of the residents are performed and staged (the pub and the church, the shops, the houses, and the local square) as well as the streets connecting these places. In the nineteenth century new material components were added, water and sewage pipes, conduits for the gas that powered early street lighting, and later on electricity and telephone wires. Some of these components simply add up to a larger whole, but citywide systems of mechanical transportation and communication can form very complex networks with properties of their own, some of which affect the material form of an urban centre and its surroundings. (33)
(William Cronon's social and material history of Chicago in Nature's Metropolis: Chicago and the Great West is a very compelling illustration of this additive, compositional character of the modern city; link. Contingency and conjunctural causation play a very large role in Cronon's analysis. Here is a post that draws out some of the consequences of the lack of systematicity associated with this approach, titled "What parts of the social world admit of explanation?"; link.)



Sunday, November 27, 2016

What is the role of character in action?


I've been seriously interested in the question of character since being invited to contribute to a volume on the subject a few years ago. That volume, Questions of Character, has now appeared in print, and it is an excellent and engaging contribution. Iskra Fileva was the director of the project and is the editor of the volume, and she did a superb job in selecting topics and authors. She also wrote an introduction to the volume and introductions to all five parts of the collection. Taken together, Fileva's introductions could be read as a very short book on character in their own right.

So what is "character"? To start, it is a concept of the actor that draws our attention to enduring characteristics of moral and practical propensities, rather than focusing on the moment of choice and the criteria recommended by the ethicist on the basis of which to make choices. Second, it is an idea largely associated with the "virtue" ethics of Aristotle. The other large traditions in the history of ethics -- utilitarianism and Kantian ethics, or consequentialist and deontological theories -- have relatively little to say about character, focusing instead on action, rules, and moral reasoning. And third, it is distinguished from other moral ideas by its close affinity to psychology as well as philosophy. It has to do with the explanation of the behavior of ordinary people, not just philosophical ideas about how people ought to behave.  

This is a fundamentally important question for anyone interested in formulating a theory of the actor. To hold that human beings sometimes have "character" is to say that they have enduring features of agency that sometimes drive their actions in ways that override the immediate calculation of costs and benefits, or the immediate satisfaction of preferences. For example, a person might have the virtues of honesty, courage, or fidelity -- leading him or her to tell the truth, resist adversity, or keep commitments and promises, even when there is an advantage to be gained by doing the contrary. Or conceivably a person might have vices -- dishonesty, cruelty, egotism -- that lead him or her to act accordingly -- sometimes against personal advantage. 

Questions of Character is organized into five major sets of topics: ethical considerations, moral psychology, empirical psychology, social and historical considerations, and art and taste. Fileva has done an excellent job of soliciting provocative essays and situating them within a broader context. Part I includes innovative discussions of how the concept of character plays out in Aristotle, Hume, Kant, and Nietzsche. Part II considers different aspects of the problem of self-control and autonomy. Part III examines the experimental literature on behavior in challenging situations (for example, the Milgram experiment), and whether these results demonstrate that human actors are not guided by enduring virtues. Part IV examines the intersection between character and large social settings, including history, the market, and the justice system. And Part V considers the role of character in literature and the arts, including the interesting notion that characters in novels become emblems of the character traits they display.

The most fundamental question raised in this volume is this: what is the role of character in human action? How, if at all, do embodied traits, virtues and vices, or personal commitments influence the actions that we take in ordinary and extraordinary circumstances? And the most intriguing challenge raised here is one that casts doubt on the very notion of character: "there are no enduring behavioral dispositions inside a person that warrant the label 'character'." Instead, all action is opportunistic and in the moment. Action is "situational" (John Doris, Lack of Character: Personality and Moral Behavior; Ross and Nisbett, The Person and the Situation). On this approach, what we call "character" and "virtue" is epiphenomenal; action is guided by factors more fundamental than these.

My own contribution focuses on the ways in which character may be shaped by historical circumstances. Fundamentally I argue that growing up during the Great Depression, the Jim Crow South, or the Chinese Revolution potentially cultivates fairly specific features of mentality in the people who had these formative experiences. The cohort itself has a common (though not universal) character that differs from that of people in other historical periods. As a consequence, people in those cohorts commonly behave differently from people in other cohorts when confronted with roughly similar action situations. So character is both historically shaped and historically important. Much of my argument was worked out in a series of posts here in Understanding Society.

This project is successful in its own terms; the contributors have created a body of very interesting discussion and commentary on an important element of human conduct. The volume is distinctly different from other collections in moral psychology or the field of morality and action. But the project is successful in another way as well. Fileva and her colleagues succeeded in drawing together a novel intellectual configuration of scholars from numerous disciplines to engage in a genuinely trans-disciplinary research collaboration. Through several academic conferences (one of which I participated in), through excellent curatorial and editorial work by Fileva herself, and through the openness of all the collaborators to listen with understanding to the perspectives of researchers in other disciplines, the project succeeded in demonstrating the power of interdisciplinary collaboration in shedding light on an important topic. I believe we understand better the intriguing complexities of actors and action as a result of the work presented in Questions of Character.

(Here is a series of posts on the topic of character; link.)

Thursday, November 24, 2016

Coarse-graining of complex systems


The question of the relationship between micro-level and macro-level is just as important in physics as it is in sociology. Is it possible to derive the macro-states of a system from information about the micro-states of the system? It turns out that there are some surprising aspects of the relationship between micro and macro that physical systems display. The mathematical technique of "coarse-graining" represents an interesting wrinkle on this question. So what is coarse-graining? Fundamentally it is the idea that we can replace micro-level specifics with local averages, without reducing our ability to calculate the macro-level dynamics of the system.

A 2004 article by Israeli and Goldenfeld, "Coarse-graining of cellular automata, emergence, and the predictability of complex systems" (link) provides a brief description of the method of coarse-graining. (Here is a Wolfram demonstration of the way that coarse graining works in the field of cellular automata; link.) Israeli and Goldenfeld also provide physical examples of phenomena with what they refer to as emergent characteristics. Let's see what this approach adds to the topic of emergence and reduction. Here is the abstract of their paper:
We study the predictability of emergent phenomena in complex systems. Using nearest neighbor, one-dimensional Cellular Automata (CA) as an example, we show how to construct local coarse-grained descriptions of CA in all classes of Wolfram's classification. The resulting coarse-grained CA that we construct are capable of emulating the large-scale behavior of the original systems without accounting for small-scale details. Several CA that can be coarse-grained by this construction are known to be universal Turing machines; they can emulate any CA or other computing devices and are therefore undecidable. We thus show that because in practice one only seeks coarse-grained information, complex physical systems can be predictable and even decidable at some level of description. The renormalization group flows that we construct induce a hierarchy of CA rules. This hierarchy agrees well with apparent rule complexity and is therefore a good candidate for a complexity measure and a classification method. Finally we argue that the large scale dynamics of CA can be very simple, at least when measured by the Kolmogorov complexity of the large scale update rule, and moreover exhibits a novel scaling law. We show that because of this large-scale simplicity, the probability of finding a coarse-grained description of CA approaches unity as one goes to increasingly coarser scales. We interpret this large scale simplicity as a pattern formation mechanism in which large scale patterns are forced upon the system by the simplicity of the rules that govern the large scale dynamics.
This paragraph involves several interesting ideas. One is that the micro-level details do not matter to the macro outcome (the claim that the coarse-grained CA emulate large-scale behavior "without accounting for small-scale details"). Another related idea is that macro-level patterns are (sometimes) forced by the "rules that govern the large scale dynamics" -- rather than by the micro-level states.

Coarse-graining methodology is a family of computational techniques that permits "averaging" of values (intensities) from the micro-level to a higher level of organization. The computational models developed here were primarily applied to the properties of heterogeneous materials, large molecules, and other physical systems. For example, consider a two-dimensional array of iron atoms as a grid with randomly distributed magnetic orientations (up, down). A coarse-grained description of this system would be constructed by taking each 3x3 square of the grid and assigning it the up-down value corresponding to the majority of atoms in that square. The information about nine atoms has now been reduced to a single piece of information for the 3x3 block. Analogously, we might consider a city of Democrats and Republicans. Suppose we know the affiliation of each household on every street. We might "coarse-grain" this information by replacing the household-level data with the majority affiliation of 3x3 blocks of households. We might take another step of aggregation by considering 3x3 blocks of blocks, and representing the larger composite by the majority value of the component blocks.
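To make the averaging step concrete, here is a minimal sketch in Python of the block-majority idea; the grid size, the block size, and the tie-breaking rule are illustrative assumptions of mine, not details drawn from Israeli and Goldenfeld's construction.

```python
import numpy as np

def coarse_grain(grid, block=3):
    """Replace each block x block patch of +1/-1 values with its majority value."""
    n_rows, n_cols = grid.shape
    # Trim so the grid divides evenly into blocks.
    n_rows -= n_rows % block
    n_cols -= n_cols % block
    trimmed = grid[:n_rows, :n_cols]
    # Group the grid into (block x block) patches and sum each patch.
    patches = trimmed.reshape(n_rows // block, block, n_cols // block, block)
    patch_sums = patches.sum(axis=(1, 3))
    # Majority vote: the sign of the patch sum (ties broken toward +1, an arbitrary choice).
    return np.where(patch_sums >= 0, 1, -1)

# A 9x9 grid of random "up" (+1) / "down" (-1) orientations...
rng = np.random.default_rng(0)
micro = rng.choice([1, -1], size=(9, 9))
# ...becomes a 3x3 grid of block majorities: 81 micro values reduced to 9.
macro = coarse_grain(micro, block=3)
print(micro)
print(macro)
```

The same routine applies to the Democrats-and-Republicans example: code one party as +1 and the other as -1 and coarse-grain the household grid block by block.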

How does the methodology of coarse-graining interact with other inter-level questions we have considered elsewhere in Understanding Society (emergence, generativity, supervenience)? Israeli and Goldenfeld connect their work to the idea of emergence in complex systems. Here is how they describe emergence:
Emergent properties are those which arise spontaneously from the collective dynamics of a large assemblage of interacting parts. A basic question one asks in this context is how to derive and predict the emergent properties from the behavior of the individual parts. In other words, the central issue is how to extract large-scale, global properties from the underlying or microscopic degrees of freedom. (1)
Note that this is the weak form of emergence (link); Israeli and Goldenfeld explicitly postulate that the higher-level properties can be derived ("extracted") from the micro level properties of the system. So the calculations associated with coarse-graining do not imply that there are system-level properties that are non-derivable from the micro-level of the system; or in other words, the success of coarse-graining methods does not support the idea that physical systems possess strongly emergent properties.

Does the success of coarse-graining for some systems have implications for supervenience? If the states of S can be derived from a coarse-grained description C of M (the underlying micro-level), does this imply that S does not supervene upon M? It does not. A coarse-grained description corresponds to multiple distinct micro-states, so there is a many-one relationship between M and C. But this is consistent with the fundamental requirement of supervenience: no difference at the higher level without some difference at the micro level. So supervenience is consistent with the facts of successful coarse-graining of complex systems.
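The many-one point can be illustrated directly with the block-majority sketch above: distinct micro-states collapse onto the same coarse-grained description. This is a toy illustration, not a formal argument.

```python
import numpy as np

# Two distinct micro-states of a 3x3 block of spins...
micro_a = np.array([[ 1,  1,  1],
                    [ 1, -1, -1],
                    [ 1, -1,  1]])
micro_b = np.array([[ 1, -1,  1],
                    [-1,  1,  1],
                    [ 1,  1, -1]])

# ...both have a +1 majority, so both map to the same coarse-grained value.
coarse_a = 1 if micro_a.sum() >= 0 else -1
coarse_b = 1 if micro_b.sum() >= 0 else -1
assert coarse_a == coarse_b == 1

# Difference at the micro level without difference at the coarse level is exactly
# what supervenience allows; what it forbids is a coarse-level difference with no
# micro-level difference anywhere.
```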

What coarse-graining is inconsistent with is the idea that we need exact information about M in order to explain or predict S. Instead, we can eliminate a lot of information about M by replacing M with C, and still do a perfectly satisfactory job of explaining and predicting S.

There is an intellectual wrinkle in the Israeli and Goldenfeld article that I haven't yet addressed here. This is their connection between complex physical systems and cellular automata. A cellular automaton is a grid of cells, each of which is updated at discrete time steps by a simple rule that depends on the states of its neighboring cells. The Game of Life is an example of a cellular automaton (link). Here is what they say about the connection between physical systems and their simulations as a system of algorithms:
The problem of predicting emergent properties is most severe in systems which are modelled or described by undecidable mathematical algorithms[1, 2]. For such systems there exists no computationally efficient way of predicting their long time evolution. In order to know the system’s state after (e.g.) one million time steps one must evolve the system a million time steps or perform a computation of equivalent complexity. Wolfram has termed such systems computationally irreducible and suggested that their existence in nature is at the root of our apparent inability to model and understand complex systems [1, 3, 4, 5]. (1)
Suppose we are interested in simulating the physical process through which a pot of boiling water undergoes sudden turbulence shortly before 100 degrees C (the transition point between water and steam). There seem to be two large alternatives raised by Israeli and Goldenfeld: there may be a set of thermodynamic processes that permit derivation of the turbulence directly from the physical parameters present during the short interval of time; or it may be that the only way of deriving the turbulence phenomenon is to provide a molecule-level simulation based on the fundamental laws (algorithms) that govern the molecules. If the latter is the case, the process is computationally irreducible: there is no shortcut, and prediction requires stepping through the molecular dynamics itself.
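To be concrete about what a micro-level CA trajectory involves, here is a minimal sketch of a nearest-neighbor, one-dimensional cellular automaton of the kind Israeli and Goldenfeld study. The choice of rule 110 and the periodic boundary are illustrative assumptions of mine, and the sketch does not implement their coarse-graining projection, which has to be consistent with the underlying dynamics.

```python
import numpy as np

def step(cells, rule=110):
    """Advance a one-dimensional binary CA one time step using a Wolfram rule number."""
    # Left and right neighbors with periodic (wrap-around) boundaries.
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    # Each (left, center, right) neighborhood is a 3-bit number from 0 to 7...
    neighborhood = 4 * left + 2 * cells + right
    # ...which selects the corresponding bit of the rule number as the new state.
    rule_bits = (rule >> np.arange(8)) & 1
    return rule_bits[neighborhood]

# Evolve 64 cells for 32 time steps from a single seeded cell.
cells = np.zeros(64, dtype=int)
cells[32] = 1
history = [cells]
for _ in range(32):
    cells = step(cells)
    history.append(cells)

print(np.array(history))  # rows are time steps: the full micro-level trajectory
```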

Here is an extension of this approach in an article by Krzysztof Magiera and Witold Dzwinel, "Novel Algorithm for Coarse-Graining of Cellular Automata" (link). They describe "coarse-graining" in their abstract in these terms:
The coarse-graining is an approximation procedure widely used for simplification of mathematical and numerical models of multiscale systems. It reduces superfluous – microscopic – degrees of freedom. Israeli and Goldenfeld demonstrated in [1,2] that the coarse-graining can be employed for elementary cellular automata (CA), producing interesting interdependences between them. However, extending their investigation on more complex CA rules appeared to be impossible due to the high computational complexity of the coarse-graining algorithm. We demonstrate here that this complexity can be substantially decreased. It allows for scrutinizing much broader class of cellular automata in terms of their coarse graining. By using our algorithm we found out that the ratio of the numbers of elementary CAs having coarse grained representation to “degenerate” – irreducible – cellular automata, strongly increases with increasing the “grain” size of the approximation procedure. This rises principal questions about the formal limits in modeling of realistic multiscale systems.
Here K&D seem to be expressing the view that the approach to coarse-graining as a technique for simplifying the expected behavior of a complex system offered by Israeli and Goldenfeld will fail in the case of more extensive and complex systems (perhaps including the pre-boil turbulence example mentioned above).

I am not sure whether these debates have relevance for the modeling of social phenomena. Recall my earlier discussion of the modeling of rebellion using agent-based modeling simulations (link, link, link). These models work from the unit level -- the level of the individuals who interact with each other. A coarse-graining approach would perhaps replace the individual-level description with a set of groups with homogeneous properties, and then attempt to model the likelihood of an outbreak of rebellion based on the coarse-grained level of description. Would this be feasible?
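Here is a minimal sketch of what that replacement might look like; the grievance variable, the neighborhood blocks, and the threshold rule are hypothetical illustrations of mine, not features of the models discussed in those earlier posts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Micro level: a 30x30 grid of individuals, each with a grievance level in [0, 1].
grievance = rng.random((30, 30))

# Coarse-grained level: the average grievance of each 10x10 neighborhood block.
block = 10
neighborhoods = grievance.reshape(3, block, 3, block).mean(axis=(1, 3))

# A toy group-level rule: a neighborhood is "at risk" of an outbreak if its average
# grievance exceeds a threshold (0.55 is an arbitrary illustrative value).
at_risk = neighborhoods > 0.55

print(neighborhoods.round(2))
print(at_risk)
```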

Saturday, November 19, 2016

SSHA 2016

Image: Palmer House lobby

The 41st annual meeting of the Social Science History Association is underway in Chicago this weekend. I've been a member since 1998, approaching half the lifetime of the association, and I continue to find it the most satisfying and stimulating of my professional associations. 

The association was founded to create an alternative voice within the history profession, and to serve as a venue for multi-disciplinary approaches to research and explanation in history. It is very interesting that many of the earliest advocates for this new intellectual configuration -- including some of the founders of the association -- are still heavily involved. Bill Sewell, Andrew Abbott, Myron Gutmann (the current president), and Julia Adams all illustrate the importance of interdisciplinary work in their own research and writing, and these social researchers have all brought important innovations into the evolving task of understanding the social world. 

SSHA programs have highlighted a diversity of ideas and approaches over the past four decades -- historical demography, bringing culture into politics, the value of the social-causal mechanisms approach, using spatial techniques (GIS) to help further historical understanding, and the role played by identities of race, gender, and nation in historical process. A renewed interest in Eurasian history and global history is also noteworthy in recent years, offsetting the tendency towards eurocentrism in the history profession more broadly. The approaches associated with comparative historical sociology have almost always had a prominent role on the program. Memorable sessions in previous years by Chuck Tilly, George Steinmetz, Liz Clemens, and Sid Tarrow stand out in my mind. And in recent years there has been a healthy interest in issues of philosophy of history and historiography expressed in the program.

Here is the SSHA's mission statement:

The Social Science History Association is an interdisciplinary group of scholars that shares interests in social life and theory; historiography, and historical and social-scientific methodologies. SSHA might be best seen as a coalition of distinctive scholarly communities. Our substantive intellectual work ranges from everyday life in the medieval world – and sometimes earlier -- to contemporary global politics, but we are united in our historicized approach to understanding human events, explaining social processes, and developing innovative theory.

The term “social science history” has meant different things to different academic generations. In the 1970s, when the SSHA’s first meetings were held, the founding generation of scholars took it to reflect their concern to address pressing questions by combining social-science method and new forms of historical evidence. Quantitative approaches were especially favored by the association’s historical demographers, as well as some of the economic, social and women’s historians of the time. By the 1980s and 1990s, other waves of scholars – including culturally-oriented historians and anthropologists, geographers, political theorists, and comparative-historical social scientists -- had joined the conversation.

SSHA is a self-organizing configuration of scholars, and the annual program reflects the interests and initiative of its members. It is organized around 19 networks, each of which is managed by one or more network representatives. The program consists of panels proposed by the networks, along with a handful of presidential sessions organized by the program committee to carry out the year's theme. (This year's theme is "Beyond social science history: Knowledge in an interdisciplinary world".) Here are the networks and the number of sessions associated with each:


There are several things I especially appreciate about sessions at SSHA. First, the papers and discussions are almost always of high quality -- well developed and stimulating. Second, there is a good diversity of participants across rank, gender, and discipline. And finally, there is very little posturing by participants. People are here because they care about the subjects and want stimulation, not because they are looking to make a statement about their own centrality in the field. The SSHA offers a healthy environment based on an interest in learning and discussion, rather than the pervasive careerism of more discipline-based associations.

Sunday, November 13, 2016

DeLanda on concepts, knobs, and phase transitions

image: Carnap's notes on Frege's Begriffsschrift seminar

Part of Manuel DeLanda's work in Assemblage Theory is his hope to clarify and extend the way that we understand the ontological ideas associated with assemblage. He introduces a puzzling wrinkle into his discussion in this book -- the idea that a concept is "equipped with a variable parameter, the setting of which determines whether the ensemble is coded or decoded" (3). He thinks this is useful because it helps to resolve the impulse towards essentialism in social theory while preserving the validity of the idea of assemblage:
A different problem is that distinguishing between different kinds of wholes involves ontological commitments that go beyond individual entities. In particular, with the exception of conventionally defined types (like the types of pieces in a chess game), natural kinds are equivalent to essences. As we have already suggested, avoiding this danger involves using a single term, 'assemblage', but building into it parameters that can have different settings at different times: for some settings the social whole will be a stratum, for other settings an assemblage (in the original sense). (18)
So "assemblage" does not refer to a natural kind or a social essence, but rather characterizes a wide range of social things, from the sub-individual to the level of global trading relationships. The social entities found at all scales are "assemblages" -- ensembles of components, some of which are themselves ensembles of other components. But assemblages do not have an essential nature; rather there are important degrees of differentiation and variation across assemblages.

By contrast, we might think of the physical concepts of "metal" and "crystal" as functioning as something like natural kinds. A metal is an unchanging material configuration. Everything that we classify as a metal has a core set of physical-material properties that determine that it will be an electrical conductor, ductile, and solid over a wide range of terrestrial temperatures.

A particular conception of an assemblage (the idea of a city, for example) does not have this fixed essential character. DeLanda introduces the idea that the concept of a particular assemblage involves a parameter or knob that can be adjusted to yield different materializations of the given assemblage. An assemblage may take different forms depending on one or more important parameters.

What are those important degrees of variation that DeLanda seeks to represent with "knobs" and parameters? There are two that come in for extensive treatment: the idea of territorialization and the idea of coding. Territorialization is a measure of homogeneity, and coding is a measure of the degree to which a social outcome is generated by a grammar or algorithm. And DeLanda suggests that these ideas function as something like a set of dimensions along which particular assemblages may be plotted.

Here is how DeLanda attempts to frame this idea in terms of "a concept with knobs" (3):
The coding parameter is one of the knobs we must build into the concept, the other being territorialisation, a parameter measuring the degree to which the components of the assemblage have been subjected to a process of homogenisation, and the extent to which its defining boundaries have been delineated and made impermeable. (3)
This is confusing. We normally think of a concept as identifying a range of phenomena; the phenomena are assumed to have characteristics that can be observed, hypothesized, and measured. So it seems peculiar to suppose that the forms of variation that may be found among the phenomena need to somehow be represented within the concept itself.

Consider an example -- a nucleated human settlement (hamlet, village, market town, city, global city). These urban agglomerations are assemblages in DeLanda's sense: they are composed out of the juxtaposition of human and artifactual practices that constitute and support the forms of activity that occur within the defined space. But DeLanda would say that settlements can have higher or lower levels of territorialization, and they can have higher or lower levels of coding; and the various combinations of these "parameters" leads to substantially different properties in the ensemble.

If we take this idea seriously, it implies that compositions (assemblages) sometimes undergo abrupt and important changes in their material properties at critical points for the value of a given variable or parameter.

DeLanda thinks that these ideas can be understood in terms of an analogy with the idea of a phase transition in physics:
Parameters are normally kept constant in a laboratory to study an object under repeatable circumstances, but they can also be allowed to vary, causing drastic changes in the phenomenon under study: while for many values of a parameter like temperature only a quantitative change will be produced, at critical points a body of water will spontaneously change qualitatively, abruptly transforming from a liquid to a solid, or from a liquid to a gas. By analogy, we can add parameters to concepts. Adding these control knobs to the concept of assemblage would allow us to eliminate their opposition to strata, with the result that strata and assemblages (in the original sense) would become phases, like the solid and fluid phases of matter. (19)
These ideas about "knobs", parameters, and codes might be sorted out along these lines. Deleuze introduces two high-level variables along which social arrangements differ -- the degree to which the social ensemble is "territorialized" and the degree to which it is "coded". Ensembles with high territorialization have some characteristics in common; likewise ensembles with low coding; and so forth. Both factors admit of variable states; so we could represent a territorialization measurement as a value between 0 and 1, and likewise a coding measurement.

When we combine this view with DeLanda's suggestion that social ensembles undergo "phase transitions," we get the idea that there are critical points for both variables at which the characteristics of the ensemble change in some important and abrupt way.


W, X, Y, and Z represent the four extreme possibilities of "low coding, low territorialization", "high coding, low territorialization", "high coding, high territorialization", and "low coding, high territorialization". And the suggestion from DeLanda's treatment is that assemblages in these four extreme locations will have importantly different characteristics -- much as solid, liquid, gas, and plasma states of water have different characteristics. (He asserts that assemblages in the "high-high" quadrant are "strata", while ensembles at lower values of the two parameters are "assemblages"; 39.)
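Read this way, the quadrants can be represented very simply. Here is an illustrative sketch in Python; the 0.5 threshold, the numerical placements, and the example labels are hypothetical choices of mine, not values DeLanda provides.

```python
def quadrant(coding, territorialization, threshold=0.5):
    """Classify an assemblage by its (C, T) values, each taken to lie in [0, 1]."""
    high_c = coding >= threshold
    high_t = territorialization >= threshold
    if not high_c and not high_t:
        return "W: low coding, low territorialization"
    if high_c and not high_t:
        return "X: high coding, low territorialization"
    if high_c and high_t:
        return "Y: high coding, high territorialization (a 'stratum' in DeLanda's terms)"
    return "Z: low coding, high territorialization"

# Hypothetical placements, for illustration only.
examples = [("informal frontier settlement", 0.2, 0.3),
            ("modern bureaucratic state agency", 0.9, 0.9)]
for name, c, t in examples:
    print(name, "->", quadrant(c, t))
```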

Here is a phase diagram for water:


There are five material states represented here (solid, liquid, compressible liquid, gaseous, and supercritical fluid), along with the critical values of pressure and temperature at which H2O passes from one phase to another. (There is a nice discussion of critical points and phase transitions in Wikipedia (link).)

What is most confusing in the theory offered in Assemblage Theory is that DeLanda appears to want to incorporate the ideas of coding (C) and territorialization (T) into the notation itself, as a "knob" or a variable parameter. But this seems like the wrong way of proceeding. Better would be to conceive of the social entity as an ensemble; and the ensemble is postulated to have different properties as C and T increase. This extends the analogy with phase spaces that DeLanda seems to want to develop. Now we might hypothesize that as a market town decreases in territorialization and coding it moves from the upper right quadrant towards the lower left quadrant of the diagram; and (DeLanda seems to believe) there will be a critical point at which the properties of the ensemble are significantly different. (Again, he seems to say that the phase transition is from "assemblage" to "strata" for high values of C and T.)

I think this explication works as a way of interpreting DeLanda's intentions in his complex assertions about the language of assemblage theory and the idea of a concept with knobs. Whether it is a view that finds empirical or historical confirmation is another matter. Is there any evidence that social ensembles undergo phase transitions as these two important variables increase? Or is the picture entirely metaphorical?

(Gottlob Frege changed logic by introducing a purely formal script intended to be capable of expressing any scientific or mathematical proposition. The concept of proof was intended to reduce to "derivability according to a specified set of formal operations from a set of axioms." Here is a link to an interesting notebook in Rudolf Carnap's hand of his participation in a seminar by Frege; link.)

Sunday, November 6, 2016

Nine years of Understanding Society

image: Anasazi petroglyphs at Newspaper Rock

This week marks the ninth anniversary of Understanding Society -- 1105 posts to date, or over 1.1 million words. According to Blogger, over 7 million pageviews have flowed across screens, tablets, and phones since 2010.

The blog has been an ideal forum for me to continue to develop new ideas about the social sciences, and to reflect upon new contributions by other talented observers and practitioners. It is a material record of the topics that have interested me over time, like points on a map outlining a driving trip through unfamiliar country. (The photo above was such a moment for me in 1996.) Each entry describes a single idea or insight; taken together, they compose a suggestive map of intellectual development and discovery. During the year I've gotten interested in topics as diverse as the early work of John von Neumann on computing (link, link), Reinhart Koselleck's approach to the philosophy of history (link), quantum computing (link, link), China's development policies (link), and cephalopod philosophy (link). I've continued to work on some familiar topics -- generativity, reduction, and emergence; character and plans of life; causal mechanisms; and critical realism.

It is interesting to see what posts have been the most popular over the past six years (the period for which Blogger provides data):


Key topics in the foundations of the social sciences appear on the list -- structure, power, pragmatism, poverty, mobility. But several novel topics make the top ten as well -- supervenience, assemblage theory, and hate. "What is a social structure?" was written during the first month of the blog. The top key words in searches are "social structure" and "social mobility".

Some of the philosophical ideas explored in the blog have crossed over into more traditional forms of academic publishing, including especially the appearance of New Directions in the Philosophy of Social Science earlier this fall. (Here is a site I've created to invite discussion of the book; link.) This book bears out my original hope that Understanding Society could become a "web-based dynamic monograph", with its own cumulative logic over time. In framing New Directions it was possible for me to impose a more linear logic and organization on the key ideas -- for example, actor-based sociology, generativity, causal mechanisms, social ontology. As I conceived of it in the beginning, the blog has proven to be a work of open-source philosophy.


A recurring insight in the blog is the basic fact of heterogeneity and contingency in the social world. One of the difficult challenges for the social sciences is the fact that social change is more rapid and more heterogeneous than we want to think. The founders of sociology, economics, and political science wanted to arrive at theories that would permit us to understand social processes in a fairly simple and uniform way. But the experience of the social world -- whether today in the twenty-first century or in the middle of the nineteenth century -- is that change is heterogeneous, contingent, and diverse. So the social sciences need to approach the study of the social world differently from the neo-positivist paradigm of "theory => explanation => confirmation". We need a meta-theory of social research that is more attentive to granularity, contingency, and heterogeneity -- even as we search for unifying mechanisms and patterns. (The very first post in Understanding Society was on the topic of the plasticity of things in the social world.)

A new theme in the past year is the politics of hate. The emergence of racism, misogyny, and religious bigotry in the presidential campaign has made me want to understand better the social dynamics of hate -- in the United States and in the rest of the world. So an extended series of posts have focused on this topic in the past six months or so (link). This is a place where theory, philosophy, and social reality intersect: it is intellectually important to understand how hate-based movements proliferate, but it is also enormously important for us as a civilization to understand and neutralize these dynamics.

So thanks for reading and visiting Understanding Society! I know that without the blog my intellectual life would be a lot less interesting and a lot less creative. I am very appreciative of the many thoughtful visitors who read and comment on the blog from time to time, and I'm looking forward to discovering what the coming year will bring.

(Mark Carrigan's Social Media for Academics is a very interesting and current discussion of how social media and blogging have made a powerful impact on sociology. Thanks, Mark, for including Understanding Society in your work!)