I was born and educated in a “modern” family and society. It was modern in the sense we give the word today, based on a grand narrative combining Christian history and national glory. My identity matured in the rather cocooning consensus that Roosevelt, Yalta and, closer to me, De Gaulle managed to impose after the horrors of WW2, and which permitted the economic and social development of the “thirty glorious years”.
In the 1950s, when I started to study philosophy, the success of the great modern narrative could seem rather close at hand. Of course, a lot of problems had to be solved: decolonization, the Cold War, the starvation of billions in Asia… But our hopes were well expressed by Teilhard de Chardin, who foretold a convergence, not only of faith and reason, in the line of Condorcet [15] to put it briefly, but even of Jesus and Marx, and a steady walk towards a fascinating Omega point. Even mathematics was converging, in the glorious synthesis of “modern mathematics”, driven in France mainly by Nicolas Bourbaki along the lines laid down by Hilbert.
Identity, admittedly, raised contradictions and paradoxes. In such a coherent and convergent world, identity was at the same time stressed and called upon to disappear from the global narrative. Identity is central for a Christian believer, since his main duty is to “save his soul”. But, at the same time, the self must recede before God: “it is no longer I who live, but Christ lives in me...” writes Paul (Galatians 2:20b). Still, these difficulties were rather marginal issues in the global thrust towards a bright future. And it was rather easy, in this context, to know who I was, and to build my life along well-marked paths: get diplomas, get a job, get married, have children… in a stability which would take me to a reasonably easy retirement.
Digitalization played its part in the general convergence. Not, of course, the PC as a personal tool of the 1960s-70s. At that time, the computer was the “mainframe”, available only to corporations and public agencies. It accorded well with the realization of the “great narrative”. The buzzwords were MIS (Management Information Systems) for corporations, PPBS (Planning Programming Budgeting Systems) at government level, or Gosplan in the Soviet Union. Globally, all digital devices were deemed to converge into global networks and networked applications. That was called “telematics” in a (then) famed French report [16]. These views were not so new, as I discovered later, since Jules Verne, for instance, had foretold them some 80 years before! [17]
In this global move, the identity of human beings was regularly enhanced. Let us give some examples.
Customer identity. In insurance companies, the management systems were built upon two series of records: the contract (policy) and the claim. The customer as such was not operationally present; even if he had several contracts and some pending claims, he was not dealt with globally by the information system. Then the growth in computer power allowed all these operations to be progressively coordinated and records to be merged into databases. It thus became possible to sell multi-risk policies, and to pay special attention to the global performance of each customer. Similar evolutions have been observed in banks, retail, transportation, etc. The ultimate concept is “one to one” marketing, with pro-active processes. Amazon, typically, builds your profile and suggests related new books to you. In retail shops, you are “profiled” and encouraged to let yourself be tracked through loyalty cards… so acquiring a genuinely augmented identity in the corporation's system.
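To make the shift concrete, here is a minimal Python sketch of the kind of record merging described above: policy and claim records, originally handled as two separate series, are regrouped into one profile per customer. The field names and sample data are illustrative assumptions, not an insurer's actual schema.

from collections import defaultdict

# Hypothetical policy and claim records, originally kept as two separate series.
policies = [
    {"customer": "C001", "policy": "P-auto"},
    {"customer": "C001", "policy": "P-home"},
    {"customer": "C002", "policy": "P-auto"},
]
claims = [
    {"customer": "C001", "claim": "K-2009-17", "amount": 1200.0},
]

# Merge both series into a single profile per customer: the precondition
# for multi-risk policies and "one to one" marketing.
profiles = defaultdict(lambda: {"policies": [], "claims": []})
for p in policies:
    profiles[p["customer"]]["policies"].append(p["policy"])
for c in claims:
    profiles[c["customer"]]["claims"].append((c["claim"], c["amount"]))

for customer, profile in profiles.items():
    print(customer, profile)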
Worker identity. Armies were the first to race forward in the development of digital technologies. This is reflected in the way they deal with soldiers' identity. The medieval soldier was not much more than an anonymous bow or spear bearer. In France, soldiers found an identity in the middle of the 19th century, under the basic form of their “numéro matricule” written in their “livret militaire”. During WWII, a major step was also made under the Vichy government, by René Carmille [18] and his team. He intended to make a census of the men to be mobilized in case a new French army could be built up. For that he made massive use of punched cards, and created what would become the French NIR (Numéro d'inscription au répertoire, also known as the social security number, or Insee number); bearing information in itself, this number was associated with a rather detailed profile of the person concerned. Later, “les trois jours” aimed to build an extensive profile of new recruits, in order to optimize assignments according to the needs of the forces. In other public agencies and in commercial corporations, similar moves have taken place, from basic payroll systems to integrated human resources management applications.
Family identity. This is a personal example, but one that is becoming common today. Digitalization allowed me to considerably extend my genealogical tree. First, in the 1980s, thanks to the Minitel (the basis of French telematics), I could reach a distant cousin, whose branch of the tree had radically parted from mine around 1860, due to business and marriage disagreements. Then, in 2009, with the digitalization of civil archives, this cousin was able to link us to a very ancient noble family, hence extending our descent back to Charlemagne and his Gallic ancestors in the sixth century, with an uninterrupted series of some 45 generations. By contrast, a lot of my genealogical data are irremediably lost, since the Paris city hall was burned in 1871 together with all the civil status files that it stored. Today (at least I hope), the city hall of Paris could be obliterated without the loss of a single bit of genealogical data. And a lot of people will be able to do similar research: practically every French citizen has Charlemagne as one of his ancestors, as well as practically all the inhabitants of “France” or “Europe” of that time.
So, in the modern stage, the DAI makes constant progress, due both to the better integration of global information systems and to the considerable growth of data about individuals, allowing each person to be approached with more and more customized products and services.
Symmetrically, thanks to higher living standards, more leisure time, more computing and networking tools, and richer web content, the individual person is able to augment his or her identity and personality. He can go shopping and choose personally from a wider and wider range of goods, services and activities. Recent developments on the web and on mobile devices push this augmentation even further.
But, as we know only too well, this beautiful digital cathedral is also a threat to identity: privacy intrusion, and inequalities between the two sides of the “digital divide”.
The foundation of the modern “great narrative” had been undermined for a long time, if not from its very laying. But in the 1960s, or in May 1968 for a French mind, the whole construction began to lose its appeal to the general public. The French school of philosophy, the weak compromise of Vatican II, an emerging knowledge of Gödelian critiques, a better perception of relativity in physics, the failures of MIS and PPBS in information systems (PPBS was an emblem for McNamara, associated with the Vietnam rout), even the revolution of PCs against mainframes… all that came to the foreground and progressively marginalized the great narratives of the past and the frame they offered to DAI progress.
The consequences for lay people are best described by Japanese post-modernist thinkers. Far from the theoretical pathos of the French school (Derrida, Deleuze-Guattari, Lyotard…), which they know very well, they speak a rather accessible language and offer pragmatic views, including technological and economic aspects. Strongly Japanese at the beginning, this school of thought was born as a reaction not against “modernism” generally speaking, but rather against Americanism, in the peculiar socio-psychological context of the post-WW2 decades. But it has today reached a level of generality which makes it interesting in every country. Building upon “otaku” culture (manga, anime and computer-game addicts), the most important of these thinkers, Hiroki Azuma [27], draws the figure below:
The top time line shows the transition from modernity (近代, the shaded section) to the post-modern (ポストモダン, the unshaded white section). The bottom time line shows the decline of grand narratives (大きな物語, shaded section). The period from 1945 to 1970 is labeled the “era of ideals”, and fits with what we call here “modern”. In this period, even when the grand narratives are criticized, or have even dramatically ended (like the Meiji period in Japan ending with the general capitulation), they are still considered as true history and valuable ideals. The period from 1970 to 1995 is labeled the “era of fiction”. Here, the grand narratives are no longer considered as true, but they still offer a frame for life and identity as accepted fictions. The last period, beginning in 2000, is labeled the “era of animals” (a concept taken from Kojève).
Recently, Lagrandie, a professor of philosophy for the final year of French high school, said (December 18, 2009, on France Culture): “The word that makes my students smile, each time I say it, is ‘doesitmatterism’ (aquoibonisme). They understand immediately and freely what I have in mind. They are not future-oriented; they see no reason to design projects. For them, their death will be the world's end. They don't seek perpetuation in a World, a Civilization.”
In this phase, identity is not built in reference to global mankind and universal Kantian values, but within smaller groups where identity develops itself in comparatively short mirroring circuits. Michel Maffesoli (in his preface to the French translation of Azuma) coins the expression “tribal narcissism”. Continuing the same line of thought, communitarianism can be seen as a variant of post-modernism, including a denial of the possibility of criticising a shared narrative, sometimes ultra-modern and even post-humanist, more frequently a fundamentalist version of the classical religions, mythologies or “national histories”.
The post-modern digital era really started with the personal computer and the Internet, both operational at the end of the 1970s, but taking off with the general public after 1990. A new stage was reached in the 2000s with the rapid worldwide expansion of the cellular telephone and its extension to a wide range of mobile devices. The consequences for identity are vividly expressed by Azuma in his “database consumption” model.
The traditional mode of cultural consumption as seen by Azuma.
The traditional, modern way of cultural consumption is based on the “great narrative” (on the left of the picture), at which the “I” (私) looks through a “superficial layer” of “small narratives” (小). Comment: “I am determined by the narratives”.
From Creation to the Last Supper and Doomsday, the finger (Latin “digitus”) of God, according to Michelangelo and Poussin.
In Christian cultural terms, the Bible is the great narrative, the Last Supper is a small narrative taken out of it or inspired by it (iconography), and Christian destiny is framed, if not determined, by the all-encompassing biblical narrative, from Creation to Doomsday.
In the post-modern scheme, the user (at right) says: “I read into the narratives”. These are short stories, but now built by combinations of components drawn out of the database (the “deep layer”, at left). The term “database” used by Azuma may be misleading, since the components in the base are not properly “data” as in a classical database, but graphical and semantic, if not material (body and clothing accessories), components. To translate this into Christian cultural practice, we see today a lot of young, pious and active Catholic people going to mass and taking part in the great papal world meetings, while at the same time using birth control and living together without being married.
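As a toy illustration of this “database consumption”, the following Python sketch assembles “small narratives” by drawing components from a component base; the categories and component names are invented for the example and are not taken from Azuma.

import random

# A hypothetical "deep layer": a base of reusable narrative components.
component_database = {
    "character": ["maid", "android", "cat-eared girl"],
    "setting":   ["near-future Tokyo", "a boarding school", "a space colony"],
    "plot":      ["a lost memory", "a forbidden friendship", "saving the world"],
}

def small_narrative(seed: int) -> str:
    """Compose one 'small narrative' by picking a component from each category."""
    rng = random.Random(seed)
    picks = {cat: rng.choice(items) for cat, items in component_database.items()}
    return f"A {picks['character']} in {picks['setting']} deals with {picks['plot']}."

# Each consumer "reads into" the same base differently:
for seed in range(3):
    print(small_narrative(seed))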
Identity may be strongly enhanced in this model, since it is no longer determined by the grand narrative. But, eventually, identity will be sought mainly within small groups sharing the same kind of “small narratives”. Far from plunging everybody into McLuhan's “global village”, satellite TV antennas, the Internet and mobile phones make the birth and life of small groups easy. Distance is no longer an obstacle to a worldwide existence, and at low cost. That may open a new way for the strong to dominate the weak within families, gangs or ideological groupings. It may be a plague (terrorism, fundamentalism) as well as a blessing (democratic action under authoritarian regimes).
Violence aside, all would be fine, modern or post-modern, if the Earth were limitless, or if outer space were accessible to the masses and so vast that any otaku anywhere in the world could expand his own world without constraint. Unfortunately, recent events, meetings and general trends have brought us back to the “reality principle”, be it security checks or ecological “walls”.
Can we, as computer scientists and artists sharing in Laval the responsibilities of cutting-edge technologies, do anything about that? Well, we can at least try to offer some new models for thinking about and developing the world to come. What we need is a sort of soft integration. Borrowing from the vocabulary of computer graphics, we can talk of “compositing”, with its double connotation of deliberate design and of realistic use of available resources; we can try to “composite” several recent pertinent concepts.
At the individual level, a lot of authors consider identity as a construction, a sort of composition: the sociologist Jean-Claude Kaufmann, one of whose books is titled “L'invention de soi” [9]; the neuroscientist [30], who sees consciousness as the expression of competing neuronal coalitions…
At the social and political level, [Rosanvallon] pleads for soft combinations of independent institutions at various levels; last but not least, at a very general level, the long chapters of Peter Sloterdijk's Sphären trilogy [32] draw the lines of a soft architecture of social relations.
Technically speaking, we must take into account not only the technologies of today, but also several innovations that will soon emerge from R&D and fundamentally change our lives ([33]).
Brain-to-computer-to-brain (Adam Wilson) and muscle-to-computer-to-muscle communication.
Our skin will no longer offer an “information-tight” barrier (as it is “watertight”). Implants are, and will increasingly be, inserted: at first, in theory, to compensate for disabilities or for defense purposes; then, unless legally prohibited, to enhance performance of any kind for whoever asks for it. But even if implant grafting is strictly regulated, the skin will become more and more permeable, due to two-way brain-computer communication and to more specific links (for instance, muscle-computer links). Deprived of this bodily limit, identity will have to compensate through new means of coordination and integration.
Vision (McMullen, OpenCV [34], UMass Amherst). It will really change our everyday life.
Even more “softly”, vision software running on powerful GPUs will make computer vision a constantly used communication channel. Future GUIs, I think, will integrate the kinds of functions which at present are only offered on cameras. Even dynamic remote controls (those of the Wii, for instance) will no longer be necessary. Here also, the constant presence of these “voyeurs” may be a problem, not only for privacy but for day-to-day self-perception. But we can also appreciate the good side of it: our personal computers, as well as public systems, will recognize us, treat us according to our DAI… and spare us the need to carry a wallet with cash, credit cards and identity documents.
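As a hint of what such an always-on vision channel could look like with today's tools, here is a minimal Python sketch using OpenCV [34] to detect faces in a webcam stream. Recognizing a particular person, and mapping a face to a DAI, would require an additional model not shown here; the webcam index and the display window are assumptions about the running environment.

import cv2

# Haar cascade shipped with OpenCV for frontal face detection.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)  # assumed default webcam
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("vision channel", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break
capture.release()
cv2.destroyAllWindows()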
Friendly robots Astro Boy and WALL-E, and the cyborg integration advocated by the feminist philosopher Donna Haraway.
At the same time, more and more “AIs” will “live” more or less independently of us, not only in games but in the real world. Some of them will be of the “humanoid” robot type (expensive, bulky and uncanny), but many more will develop along different lines, offering help as well as creating new threats. These AIs, at least the most powerful of them, will have and develop a real identity of their own.
There are fears that these AIs, including our avatars and clones, will multiply instantly and get out of control, as for example in the Matrix and I, Robot films. We can argue, on the contrary, that complexity and communication will give high-level artifacts identities of their own. For as soon as a “coalition” begins to live, be it human or not, it feels and feeds differently, for at least three categories of reasons, summarized in this slide presented at Siggraph Asia 2009.
The three components of a social robot and its identity: hardware, software and content (according to Japan's MITI).
Materially, from the hardware standpoint, as soon as they are used and transported from one place to another, sophisticated objects change and are no longer subject to the same physical influences. OK, that is rather marginal at the start. But experience here is conclusive: if you have several computers at home or at your office, are they all the same? On the contrary, in any organization it is a permanent concern to ensure that the computer inventory stays within manageable diversity.
From a software standpoint, the limits are even more evident. I would bet that no two of your computers have exactly the same version of the operating system, and a fortiori not exactly the same set of installed software. That also applies to “virtual humans” [35]. Last but not least, contents are eminently variable between agents, and all the more so as they reach out to the external world through high-resolution sensors and high-bandwidth communication channels.
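As a playful illustration of this divergence, the following Python sketch derives a machine “identity fingerprint” from the three layers discussed here (hardware, software, content). The particular sources hashed are illustrative assumptions, not a standard.

import hashlib
import platform
import sys

def identity_fingerprint(content_sample: bytes = b"") -> str:
    """Hash a hardware, software and content description into one identifier."""
    hardware = platform.machine() + platform.processor() + platform.node()
    software = platform.platform() + sys.version
    digest = hashlib.sha256()
    digest.update(hardware.encode("utf-8"))
    digest.update(software.encode("utf-8"))
    digest.update(content_sample)  # content layer: any sample of local data
    return digest.hexdigest()

# No two machines, and a fortiori no two embodied agents, are likely to share it.
print(identity_fingerprint(b"my local files, mail, photos..."))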
In short, from a technical standpoint, if we look for a sustainable and enjoyable world, we must design hardware, software and content architectures that are sufficiently safe and soft to preserve and augment our DAIs as compositions of multiple elements (natural as well as artificial), inside as well as outside of us. Jun Rekimoto, a human-computer interaction specialist, says for instance that HCI should now be read “human-computer integration” rather than “human-computer interaction”. Few authors will readily accept such a motto. A more positive but equally radical view has been given by the American feminist Donna Haraway, in her “Cyborg Manifesto” [36], which concludes: “It is not just that science and technology are possible means of great human satisfaction… It means both building and destroying machines, identities, categories, relationships, space stories. Though both are bound in the spiral dance, I would rather be a cyborg than a goddess.”