Alain Desrosières, La Politique des Grands Nombres, Histoire de la raison statistique, Paris 1993, 2000
Review with respect to the Digital Humanities
Numbers and statistics are an extremely important element in the Digital Humanities. After the Digital Humanities Conference 2012 in Hamburg, Germany, the Frankfurter Allgemeine wrote of an empirical turn in the humanities. High time, then, to read Alain Desrosières' classic book again (the same is true, of course, for Theodore M. Porter's Trust in Numbers). It is about the social world as a construct, about the calculation of probabilities, mean values, the law of large numbers, nominalism and taxonomy, the history of statistics in different countries (France, the United Kingdom, Germany, the United States), (sample) surveys, econometrics, and about the prediction of a Republican victory that proved wrong because the survey had been conducted by telephone.
x x x
“Everything that happens in the world, happens in accordance with laws”, Stader declaims in Robert Musil's play Die Schwärmer. Such lawful acts and events can be measured – an advantage owed to empiricism, which the social sciences willingly claim against the humanities. This gave rise to a controversy: the measured objects are supposed to be perceived as incontestable. But this is not the case. The question is: did the fact in question exist before the measurement, is its reality independent of it, or does the definition create the object, which would then, in reality, be a convention?
Emile Durkheim treated social facts as things. This objectification moved sociology into the vicinity of the natural sciences. The statistical procedure is similar: with its “formalized synthetic terms” it operates with formulas that simulate accuracy. But these tools are the product of a development with all its variations and uncertainties, and it is important to know how they came about, because statistics serves both science and politics. Its tools enable the creation of objects that not only describe the world but also have an impact on it. It is “an interaction between the world of knowledge and the world of power”.
Modern statistics provides a “single measurement space” in which things can be compared, an important condition for the formation of a “political space” in which decisions are enforced uniformly. Statistical objectification combines science and administrative practice. On these the “administration of the social world” is based: statistics is an “administrative task of recording data”, as Desrosières summarizes his concept of analysis.
x x x
The social world is a construct. Things described by statistics must be consistent to be valid. Desrosières therefore analyzes what gives things their cohesion. Statistics gets its meaning from the state, which uses the data. The basic act of all statistical work is the creation and maintenance of registers, which guarantee the identity of persons legally and administratively. Important in this context is codification: statistical constructions should acquire “a theoretical independence from the singular and local circumstances” in a country. First of all, then, it is all about “the unification of the national space”.
x x x
Calculating probabilities, “a method by which the rationality of decisions in situations of uncertainty should be justified”, was developed in the mid-17th century. In its beginnings it was still strongly influenced by theology: dice and lots were means of divination, the exploration of the divine will. Later it was about uncertain future profits, investments in maritime trade, insurance, annuities. There was a debate on security and knowledge, the attempt to define an idea of “what simply was probable”. The probabilistic method objectified uncertainty and was in so far a part of the process of secularization. It postulated that similar causes lead to similar consequences. It is therefore possible to link future events with past observations in such a way that the uncertainty of these events can be described.
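This step from past observations to uncertain future events can be made concrete with a small numerical sketch. All figures below are invented for illustration; the book describes the reasoning, not this calculation:

```python
# A minimal sketch, with invented figures, of how past observations were
# turned into a probability for an uncertain future event, as in early
# annuity and insurance calculations.

observed_deaths = 28      # hypothetical: deaths within one year...
observed_persons = 1000   # ...among 1000 persons of the same age

# The postulate "similar causes lead to similar consequences" lets the
# past frequency stand in for the probability of the future event.
p_death = observed_deaths / observed_persons

# A fair one-year premium for a fixed payout then follows directly.
payout = 10_000
premium = p_death * payout

print(f"Estimated probability of dying within a year: {p_death:.3f}")
print(f"Fair annual premium for a payout of {payout}: {premium:.2f}")
```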
x x x
How does the one, the mean value, emerge from the many? This was essentially a question of the old philosophy: that of the reality of aggregates of things or persons. Since Durkheim we work with the concept of social groups that exist outside of individuals. Thus, there is a transition from one level of reality to another – and from one language to another. The statistical construction of the mean value provides the ability to deal with social objects without deforming them, even when they differ greatly in their construction.
The mean value refers to what lies between two extremes. It creates, on the one hand, the best possible approximation to a value from different measurements based on incomplete observations, and, on the other hand, a new reality based on the observation of different objects. Deviations are now considered an “imperfection in the implementation of a model”. In relation to man, on a massive scale, both behaviours and physical characteristics show great regularity because of the “law of large numbers”. The “average person”, the guiding principle for political decisions, is formed by such a “totalization”. It embodies administrative perfection. With it a new entity emerges: society.
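The regularity that the “law of large numbers” produces can be shown in a few lines. This is a minimal sketch with an invented population, not anything taken from the book:

```python
import random

random.seed(42)

TRUE_MEAN = 170.0  # hypothetical average body height in cm
SPREAD = 10.0      # individual variation

def sample_mean(n: int) -> float:
    """Draw n simulated individuals and return their average height."""
    return sum(random.gauss(TRUE_MEAN, SPREAD) for _ in range(n)) / n

# Individual observations vary widely, but the mean of ever larger
# samples stabilizes: the regularity behind the "average person".
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>6}: mean height = {sample_mean(n):.2f} cm")
```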
A good example of measuring the effectiveness of actions in relation to a large number of people is the establishment of social security systems, which ensure statistical coverage of individual risks. Epidemics, for instance, must be considered as a whole. Mortality statistics in relation to social groups provide information on possible preventive measures. Hygienists constantly tried to achieve improvements through collective preventive measures with the help of statisticians.
x x x
The development of statistics can retrospectively be regarded as a product of the philosophy of Karl Pearson. In his famous book, The Grammar of Science, he took a position against causality and in favour of correlation. Pearson's theory of knowledge was close to nominalism, in which terms are stable because they are charged “with history and experience”. For Pearson, scientific laws were “abbreviated formulas for the synthesis of perception routines”. He saw statistical objects as real when they could be passed on to and used by others on a “turnkey” basis. That Pearson, building on Galton and Darwin, also tried “to improve the human species on the basis of heredity” is not further developed here.
In certain cases, Desrosières writes in relation to Pearson, “things exist because others need them”. In other cases he sees them rather as conventions. Such questions were asked particularly in relation to the study of pauperism. In England, one of the values used to measure poverty was the number of people receiving welfare assistance. This object thus exists because of social coding through an administrative process – but that number may be subject to fluctuations and may capture only a part of the “real poverty”.
x x x
A state is a “singular totality of social relations” and statistics its objectification. In this respect, statistics has a strong legitimation: the double guarantee of state and science. This seems to give statistics a high degree of credibility.
In France, the first national surveys were used to collect data on the movement of the population. Then came the study of the living conditions of workers (wages, working hours, unemployment). These questions were constantly expanded and deepened, so that in the 20th century it was finally all about “the analysis of the relationship between capitalism and state”, as Desrosières writes. In the United Kingdom, where the study of poverty was in the foreground, the Royal Statistical Society meanwhile analyzed seven prerequisites for a statistical operation to “produce consistent objects”: definition of units; homogeneity of the populations under study; exhaustiveness; relative stability; comparability; relativity; and accuracy.
x x x
The German word “statistics” in the 18th century referred to a “descriptive and not quantitative knowledge of the state” [sic!]. From the beginning, statistics was closely linked to the German administration. Desrosières interprets this as one of the mechanisms holding together a state “whose consistency has more problems than others”. Closely connected to the beginnings of statistics in Germany is the Verein für Socialpolitik (Association for Social Policy), which built public insurance systems on the idea of mutual self-help among workers. Max Weber led an important inquiry for the association, studying farm workers in East Prussia.
In the United States, the census provides every ten years the basis for the distribution of seats in the House of Representatives and of direct taxes among the states. This repeatedly led to bitter debates: how should slaves, how should foreigners be counted? The statistics also came from a society in which the various authorities always had to be balanced. Statisticians therefore should not only document and describe the state of affairs but also help to find political solutions to conflicts between states. More than elsewhere, their figures were not “a truth that stood over the various camps” but linked to argumentation, decision and influence.
Data were first processed by machine in the United States, in 1890. In the 1930s, new statistical tools “to manage the political, economic and social imbalance”, as Desrosières writes, were developed. Here the definition of unemployment posed a problem: besides the question of the number of people affected, there was the question of whether this number should be put in proportion to the whole population or only to the (potentially) working population. Statistics also played a crucial role with respect to other social facts, in order to make things consistent and thus politically workable. In this context, George Horace Gallup showed that sample surveys are more robust and perform better than conventional exhaustive counts. He became famous when he predicted, in 1936, the re-election of President Roosevelt (see below).
x x x
Systematic “random-based sample surveys” are an example of the fact that the invention and application of a technology is tied to certain conditions of possibility for innovation: before it can be solved, the problem must be “invented” – or perhaps better: “found”. First, one had to know whether the whole, i.e. exhaustive surveys, could be replaced by a part, i.e. partial surveys. Then it was a question of method: deliberate or random selection? Finally came the question of how observations can be generalized in such a way as to allow statements about society as a whole. For a representative sample, statistics uses a design approach according to certain criteria with respect to collective actors. The framework of this “macro-sociology” is based on the “law of large numbers”. It postulates an external reality beyond the individuals. This was the main idea of Emile Durkheim in his famous study on suicide.
From the 1930s, the representative method was increasingly used in market studies on consumer choice and in election predictions. This required nationwide standardizations and equivalences with respect to the tested products and objects. For election forecasts, samples had to be drawn close to the electorate. In 1936, a telephone survey in the United States became famous. It predicted a victory of the Republicans – wrongly, as became clear after the election. The reason for the false prognosis: only individuals who had a telephone, and who were therefore relatively prosperous, were interviewed.
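The mechanics of this sampling error are easy to reproduce. The following sketch uses invented proportions: phone ownership is assumed to correlate with wealth, and wealth with party preference:

```python
import random

# A minimal sketch of a biased sampling frame: surveying only phone
# owners misestimates the whole electorate. All numbers are invented.
random.seed(1936)

population = []
for _ in range(100_000):
    wealthy = random.random() < 0.3                   # 30% relatively prosperous
    has_phone = random.random() < (0.8 if wealthy else 0.1)
    votes_rep = random.random() < (0.65 if wealthy else 0.35)
    population.append((has_phone, votes_rep))

def republican_share(voters):
    return sum(v for _, v in voters) / len(voters)

phone_owners = [p for p in population if p[0]]
random_sample = random.sample(population, 1_000)

print(f"True Republican share:       {republican_share(population):.1%}")
print(f"Telephone-survey prediction: {republican_share(phone_owners):.1%}")
print(f"Random-sample prediction:    {republican_share(random_sample):.1%}")
```

The telephone frame predicts a Republican majority while the true share, recovered by the random sample, lies well below 50%.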
x x x
Statistics establishes cohesion between singular things and thus lends objects a more comprehensive reality and consistency. This includes questions of taxonomy, which lead to very different concepts of social tools in the humanities. They all carry in them “the project to organize things in a comprehensive way”, as Michel Foucault said. In the “archaeology of taxonomy” he lists the classification operations that appear in statistics again and again: “codification as the sacrifice of insignificant perceptions, the choice of the right variables, the technique of the construction of equivalence classes, the realism of the categories and the historicity of discontinuities”. For example: to reveal the order of nature, Linné deliberately limited his range of experience. He dropped hearing, taste, smell and touch in favour of vision, with which he recorded sizes, lines, shapes and relative positions. Linné thus chose some characteristics out of all available features and built his classification on these criteria. This exclusion of certain properties created discontinuities, boundaries between the classes. The resulting categories are considered sufficiently stable to be incorporated into the production of knowledge.
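Linné's operation, selecting a few variables and sacrificing the rest, is exactly the construction of equivalence classes. A minimal sketch with invented specimens:

```python
from collections import defaultdict

# Hypothetical specimens; like Linné, we deliberately keep only the
# visual features (leaf shape, flower colour) and sacrifice the rest.
specimens = [
    {"name": "A", "leaf": "oval",   "flower": "white",  "smell": "sweet"},
    {"name": "B", "leaf": "oval",   "flower": "white",  "smell": "none"},
    {"name": "C", "leaf": "needle", "flower": "yellow", "smell": "resin"},
    {"name": "D", "leaf": "needle", "flower": "yellow", "smell": "none"},
]

selected_features = ("leaf", "flower")  # everything else is "sacrificed"

classes = defaultdict(list)
for s in specimens:
    key = tuple(s[f] for f in selected_features)  # the coding operation
    classes[key].append(s["name"])

for key, members in classes.items():
    print(key, "->", members)
# A and B fall into one class, C and D into another: excluding smell
# creates the discontinuity, the boundary between the classes.
```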
x x x
The history of economic analysis contains three different elements: mathematization, quantification and the calculation of probabilities. Although their combination caused difficulties in the new econometrics, it was clear that econometrics would break away from philosophy and political economy. The questions were now empirical and analytical, such as: what kind of relationship can be established between theoretical laws and the observed data? There were two opposing trends: one from the data to theory (indices, economic analysis, national income) and one from theory to the data (estimation of parameters, verification).
In the 1930s, with the help of probability, it became possible to find a link between empirical observations and theoretical formalisms. In relation to economic cycles, the newly developed statistical tools led to a “taming of chance”. The method allowed oscillations with ever longer periods to be eliminated progressively: first seasonal, then short-term (three to four years) and finally medium-term (eight to nine years) cycles were made visible. The result was a long-term trend. It could be shown that “the energy to maintain the cycles is supplied by random disturbances”. Thus it became clear that the economy is driven in an uneven manner.
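The progressive elimination of ever longer oscillations can be sketched with simple moving averages. The series below is synthetic; the book reports the idea, not this implementation:

```python
import math
import random

random.seed(0)

months = 360  # thirty years of synthetic monthly data
series = [
    0.05 * t                                # long-term trend
    + 5 * math.sin(2 * math.pi * t / 12)    # seasonal cycle (one year)
    + 8 * math.sin(2 * math.pi * t / 42)    # short cycle (~3.5 years)
    + 12 * math.sin(2 * math.pi * t / 102)  # medium cycle (~8.5 years)
    + random.gauss(0, 2)                    # random disturbances
    for t in range(months)
]

def moving_average(x, window):
    """Smooth x; oscillations shorter than `window` are averaged out."""
    return [sum(x[i:i + window]) / window for i in range(len(x) - window + 1)]

step1 = moving_average(series, 12)   # removes the seasonal cycle
step2 = moving_average(step1, 42)    # removes the short cycle
trend = moving_average(step2, 102)   # the long-term trend remains

print(f"Trend runs from {trend[0]:.1f} to {trend[-1]:.1f}")
```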
So it was realized that a government can influence the movement of the economy: on the one hand by intervening selectively, on the other hand by changing the internal relations of the dynamic system. During the economic crisis of the 1930s, the following regulatory decisions were examined to reduce unemployment: charitable works, protectionism, industrial rationalization, reduction of monopoly prices, wage cuts and currency devaluation. Keynes noted that random shocks may only be applied if the effective causes can first be analyzed completely and beyond doubt. He thereby pointed to non-measurable factors, such as psychological, political or social variables.
The lack of a controlled experimental situation and the non-observance of the ceteris paribus clause repeatedly led to debates about the possibility of applying empirical techniques in economics. An alternative to the choice between subjective and objective probability was offered by the term likelihood, the formula that “everything happens as if”. The idea behind it: within a well-defined family of probability distributions, to find the one that gives the greatest plausibility to a set of observations. The Cowles Commission took up this idea. Its program was part of the founding of modern econometrics.
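The likelihood idea can be illustrated in a few lines. This is a minimal sketch with invented data, using the simplest possible family of distributions (Bernoulli):

```python
import math

# Invented observations, e.g. 1 = success, 0 = failure.
observations = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]

def log_likelihood(p: float) -> float:
    """Log-likelihood of the observations under Bernoulli(p)."""
    return sum(math.log(p) if x else math.log(1 - p) for x in observations)

# "Everything happens as if p were...": search the family of candidate
# parameters for the one that makes the data most plausible.
candidates = [i / 100 for i in range(1, 100)]
best_p = max(candidates, key=log_likelihood)

print(f"Maximum-likelihood estimate: p = {best_p:.2f}")
# The closed-form answer is the sample mean, 7/10 = 0.7, which the
# grid search recovers.
```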
Economists now assumed that they could rely on a system if it had worked in the past and if no important elements were changed. So, in the 1940s, there came a synthesis of all these theories, methods and empirical values: old and new economic theory, statistical records, national income, inferential statistics and linear algebra. Tax rates and price controls were considered within structural models in which authorities could make optimal decisions. The most important area of application for these new methods and techniques was then the national income.
x x x
What does it mean for the Digital Humanities to read Alain Desrosières' book again?
First, we learn that statistical figures are simultaneously found and constructed. A social fact does exist before its measurement, but it is formed as a scientific object only by the statistical operation. Desrosières' elaboration of these relations and connections into categories, as Michel Foucault provided them, is very valuable for the human and social sciences.
Second, it is important to realize how closely state, administration and research cooperate in the field of statistics. Modern statistics provides a “single measurement space” in which things can be compared, an important precondition for the formation of a “political space” in which decisions are enforced uniformly. The “objectification”, the creation of “permanent things”, combines science and administrative practices. The “administration of the social world” is based on them. In short: statistics is an “administrative task of recording data”.
Finally, the history of statistics itself is of great interest: its emergence in the context of political philosophy and its development towards econometrics. In this context Desrosières conveys a knowledge of the basic elements of statistics, which could not be considered here in detail (the law of large numbers, averages, probability, sampling, etc.) but which is very important in the practice of the Digital Humanities. In any case: anyone in the social sciences and humanities who deals with the function of statistics and its importance for state and society – in general, but especially in France, the United Kingdom, Germany and the United States – is very well served by this interesting book of Alain Desrosières, written in the tradition of deconstruction in the manner of Michel Foucault.