A brief history of information
Towards the end of the Second World War, the simulation of medium and long-range missile trajectories, and the modelling of fissile reactions inside the atomic nucleus, created a need for more powerful algorithmic and numerical computations. Thanks in part to the theoretical work of John von Neumann, the first computers were born.
At that time, office work was characterized by a standardization and rationalization that were far less advanced than they were in industrial production. The application of the first computers to management tasks immediately resulted in the disappearance of all freedom and flexibility in the implementation of working procedures – in short, in a brutal proletarianization of the class of employees.
In the same years, with a comic belatedness, European literature found itself confronted with a new tool: the typewriter. Indefinite and varied work on the manuscript (with its additions, references and footnotes) disappeared in favour of a more linear and flatter writing; there was a de facto alignment with the standards of American detective novels and journalism (hence the appearance of the myth of the Underwood typewriter – Hemingway’s success). This degradation of the image of literature led many young people with a ‘creative’ temperament to move towards the more rewarding paths of cinema and song (ultimately dead ends; indeed, the American entertainment industry was soon to begin the process of destroying local entertainment industries – a process that is now coming to an end).
The sudden appearance of the microcomputer in the early 1980s may appear to be some sort of historical accident; it did not correspond to any economic necessity, and is in fact inexplicable unless we factor in such elements as advances in the regulation of low currents and the fine etching of silicon. Office workers and middle managers unexpectedly found themselves in possession of a powerful, easy-to-use tool that allowed them to regain control – de facto, if not de jure – over the core elements of their work. A silent and largely unrecognized struggle lasting several years took place between IT departments and ‘basic’ users, sometimes supported by teams of passionate micro-IT specialists. What is most surprising is that, as general management gradually became aware of the high costs and low efficiency of heavy computing – while mass production allowed the emergence of reliable and cheap office-automation hardware and software – it switched over to microcomputers.
For the writer, the microcomputer was an unexpected liberation: it was not really a return to the flexibility and user-friendliness of the manuscript, but it became possible, all the same, to engage in serious work on a text. During the same years, various indicators suggested that literature might regain some of its former prestige – albeit less on its own merits than through the self-effacement of rival activities. Rock music and cinema, subjected to the formidable levelling power of television, gradually lost their magic. The previous distinctions between films, music videos, news, advertising, human testimonies and reporting tended to fade in favour of the notion of a generalized spectacle.
The appearance of optical fibres and the industrial agreement on the TCP/IP protocol at the beginning of the 1990s made possible the emergence of networks within and then between companies. The microcomputer, now reduced yet again to being a simple workstation within reliable client-server systems, lost all its autonomous processing capacity. There was in fact a renormalization of procedures within more mobile, more transversal and more efficient information processing systems.
Microcomputers, though ubiquitous in business, had failed in the domestic market for reasons that have since been clearly analysed (they were still expensive, had little real use, and were difficult to work on when lying down). The late 1990s saw the emergence of the first passive Internet access terminals; in themselves they were devoid of intelligence or memory, so that unit production costs were very low, and they were designed to allow access to the gigantic databases built up by the American entertainment industry. Finally equipped with an at least officially secure electronic payment system, they were attractive and light, and soon established themselves as a standard, replacing the mobile phone, the Minitel and the remote control of conventional television sets alike.
Unexpectedly, the book was to constitute a perennial pole of resistance. Attempts were made to store works on an Internet server; their success remained restricted, limited to encyclopaedias and reference works. After a few years, the industry was forced to agree: the book – more practical, more attractive and more manageable – was still popular with the public. However, any book, once purchased, became a formidable instrument of disconnection. In the intimate chemistry of the brain, literature had often in the past been able to take precedence over the real universe; literature had nothing to fear from virtual universes. This was the beginning of a paradoxical period, which still lasts today, where the globalization of entertainment and exchange – in which articulate language was of little importance – went hand in hand with a strengthening of vernacular languages and local cultures.
The onset of weariness
Politically, opposition to globalist economic liberalism had actually started long before; it became apparent in France in 1992, with the campaign for the ‘No’ vote in the Maastricht referendum. This campaign drew its strength less from reference to a national identity or to republican patriotism – both of which disappeared in the slaughter of Verdun in 1916 – than from a genuine widespread weariness, from a feeling of outright rejection. Like all historicisms before it, liberalism threw its weight around by presenting itself as an inescapable historical change. Like all historicisms before it, liberalism posed itself as the assumption and transcendence of simple ethical sentiment in the name of a long-term vision of the historical future of humanity. Like all historicisms before it, liberalism promised effort and suffering for the present, relegating the arrival of the general good to a generation or two away. This kind of argument had already caused enough damage throughout the twentieth century.
The perversion of the concept of progress regularly wrought by various forms of historicism unfortunately favoured the emergence of comical philosophies, typical of times of disarray. Often inspired by Heraclitus or Nietzsche, well suited to middle and high incomes, with a sometimes amusing aesthetic, they seemed to find their confirmation in the proliferation, among the less privileged layers of the population, of many unpredictable and violent assertions of identity. Certain advances in the mathematical theory of turbulence led, more and more frequently, to human history being depicted as a chaotic system in which futurologists and media thinkers strove to detect one or more ‘strange attractors’. Though it was devoid of any methodological basis, this analogy was to gain ground among educated and semi-educated strata, thus durably preventing the constitution of a new ontology.
The world as supermarket and derision
Arthur Schopenhauer did not believe in history. So he died convinced that the revelation he brought, in which the world existed on the one hand as will (as desire, as vital impetus), and on the other hand as representation (in itself neutral, innocent, purely objective and, as such, susceptible to aesthetic reconstruction), would survive the passing of successive generations. We can now see that he was partly wrong. The concepts he put in place can still be seen in the fabric of our lives; but they have undergone such metamorphoses that one wonders how much validity remains in them.
The word ‘will’ seems to indicate a long-term tension, a continuous effort, conscious or not, but coherent, striving towards a goal. Of course, birds still build nests, male deer still fight for possession of the females; and in the sense of Schopenhauer we can indeed say that it’s the same deer that has been fighting, and the same larva that has been burrowing, ever since the painful day of their first appearance on Earth. It’s quite different for men. The logic of the supermarket necessarily induces a dispersion of desires; the shopper in the supermarket cannot organically be the person of a single will, a single desire. Hence there is a certain depression of will in contemporary human beings: not that individuals desire less – on the contrary, they desire more and more; but their desires have become somewhat garish and screeching: without