
A Digital Ontology

Contents transferred into the DU series

Pierre Berger

Foreword: grand narrative and responsibility

Once upon a time, some two billion years ago, the first digital entity appeared on Earth: life emerged from the primeval soup.

Some million years ago, another fundamental leap happened, when our ancestors invented articulated language, with its deep but clearly digital system of binary oppositions [Saussure].

Some hundred years ago, modern times drew Mankind out of its static views of the World and presented History as progressive. Digitization was a major component of the new impetus: a neatly digital way of writing numbers and computing, the digital slicing of time (the "foliot" of early clockwork), the sorting of printing type into our present "cases" (upper and lower...), up to the philosophical division of problems recommended by Descartes.

In the 1940's, a lot of theoretical and technical advances led to the modern computer systems, and showed how central binary concepts and devices were, from mathematics to corporate management, including nuclear physics and military applications. Von Neumann was perhaps the best at tying all these trends into a synthetic digital concept.

In 1953, Watson and Crick deciphered the structure of DNA, and Life became conscious of its intimately digital nature.

In 1965, Moore formulated his law of growth for digital integrated circuits, which has since been transposed to all electronic devices. This law of exponential development can be adapted to cover all kinds of technologies and the whole span of World history.

Since then, the digital coverage of the World has thickened and become denser each year, tending even to replace the hitherto dominant processes of production, consumption and entertainment with a transfer into "second worlds". These could well be the one and only safe way forward for Mankind, if they prove able to reduce the problems of matter and energy dissipation and pollution.

Still more fascinating, for the best and the worst, the two digital tracks tend to merge into one, where bionics unites the digits of DNA (natural life) with those of electronic devices (artificial life).

A main concern is the weakening of all the borders which hitherto protected our individuality. Our body extends beyond the skin with the proliferation of "prostheses", but is penetrated by rays, ultrasound, drugs, surgical interventions, transplants and implants. Our soul projects itself to the limits of the universe and multiplies itself in virtual worlds, but our privacy offers little resistance to commercial, administrative and military information networks. And it will go further with the direct linking of the brain to the external world, to control it, or be controlled by it. And at the end, a brain-to-brain, telepathic relation? At present, properly inconceivable.

Today's commentators split into pro- and anti-technology camps. The opposition is often expressed in dramatic terms, be they optimistic or pessimistic about the future of the Human species. That does not prevent a bit of schizophrenia, with persons spending their time on the web to stress the dangers of computers, and others who demand both a simplification of administrative processes and a strict protection of privacy. This trend is reinforced by the new grand mythologies that have replaced the Greco-Roman as well as the Christian doctrine: Star Wars and The Lord of the Rings. Here, even when advanced technology is shown, the ideal clearly lies not in progress but in the restoration of a "natural equilibrium". The "bad" (Sauron, Darth Vader) have the most advanced technology. The "good" (Obi-Wan Kenobi, Frodo) rely mainly on their "soul".

Let us try to unify the fields of knowledge implied in this saga. Let us try to go further, to show the depth of digitization, the richness of its perspectives and the subtlety of its possible theorization... while aiming to propose, with modesty, some guidelines for our present and for the building of our future.

We shall try to soothe these fears by showing that digital entities have their own intrinsic limits and weaknesses, which we call globally "digital relativity". In some way, the global centralized control imagined by Orwell in his novel 1984 is fundamentally impossible. But we pay the price of these frailties in the ever increasing cost of safety, not to say "sanity", in our systems, with heavy anti-virus software and firewalls of all sorts, at the individual as well as the corporate and political levels.

Looking further, digital relativity also opens the way to the emergence, in digital "machines", of emotion, meaning and art. Our Roxame software (among the works of so many "generative" artists or algorists) has begun to show how a "machine" can be considered an actual artist in its own right or, symmetrically, an art critic. So explode the last bunkers of the "human exception". In spite of many failures, in spite of the still wide power gap separating our machines from our brains, the visions of AI (artificial intelligence) still have their legitimacy. A lot of thresholds have been passed (winning over a chess grandmaster; producing rather correct translations of many texts...). At the same time, the limits of human rationality facing the over-complexity of the modern World are all too evident. Then the coming of some kind of post-human entities, possibly hostile, but why not friendly and somehow respectful of us, the proto-humans, as we are of our ancestors, is not so mad a perspective (as Asimov suggested).

At this point, optimism versus pessimism is a rather private choice, and indeed of little pertinence, as such global ways of thinking lead to simplistic or schizophrenic action and politics. What we call for is rather a due combination of creativity and vigilance. As far as we can, as far as the World's evolution depends on us.

We shall develop these themes in four parts.

1. The first part will show the generality of the forms and structures met in the Digital Universe. In that space, with its particular time, beings have their place and their roles. From the basic bits up to the global world, small and large beings develop their structures and co-exist. Some of them are representations, but a major part stands for itself. Among the different systems for assembling bits into entities or into larger and larger series, language holds a pre-eminent place, for representations as well as for active beings, through programming.

2. Metrics and values will occupy the second part, where we shall propose some innovative concepts. On a sort of thermodynamic basis, we shall consider the now classical, but still disconcerting, concept of complexity, and sketch a method to evaluate the power of digital beings.

The mere problems of these metrics will lead to the central issue of digital relativity, with its multiple sources, including but not limited to Gödelian logics. That will open the space for a concept of liberty for all digital beings, with its preservation and maximization as a general principle of development in the Digital Universe, so extending to this domain the classical principle of minimal action. From this central principle the classical "transcendental" values can be studied: the true, the good and the beautiful. We shall start from the good, consider the true as a particular kind of good, and leave the beautiful to the next part.

3. We shall jump to a more sophisticated group of issues with more explicitly cognitive beings, with their operations of assimilation and expression surrounding a generative/decision core. At this level may emerge the problematics of emotion, and of meaning in its deepest form (from the DU standpoint). Hence a rich basis from which to look at beauty and art, as far as they may be practiced by digital beings, seen as the summit of development of our Digital Universe.

4. In the fourth part, we will connect our abstract construction with the "real" world, with its concreteness, its matter and energy, its relativity and, of course, its "real time". Hence a look from DU into History, which may be seen as the emergence of digits in the physical world as well as the emergence of matter in the digital constructs. We shall, in a perhaps too classical way, consider mankind, that is We, and I among us, in our particular and, should I proffer that truism, eminent role. But we shall see this role in perspective, not as some end of History, but as a major stage towards something higher, some Omega point, to speak like Teilhard... and so conclude on the vast drama in which our generations, willing or not, are the players.

5. We conclude with openings on our future, and on what we can, and must, do to develop it along good lines.

Acknowledgments

This book has been, I must say, a quite solitary effort. Its interdisciplinary, technical and, for some, shockingly anti-humanist stand did not make dialogue and cooperation easy. I could not even find understanding in the reflection club I had founded in 1991, Le club de l'hypermonde, which, in spite of my efforts, went back to more conventional topics.

I must anyway give thanks to my family for the long hours stolen from them for this adventure, and to my brothers Joël (for his encouragement) and Marcel (for his precious hints and references).

My thanks go mainly to some other thinkers, whose recent books have been instrumental in developing the present text from the bases elaborated in my L'informatique libère l'humain, la relativité digitale (L'Harmattan, 1999). That book contains an extensive bibliography (I can send it by mail, at cost price, for 20 euros).

The last decisive one is Mathématiques et sciences de la nature, la singularité physique du vivant, by Francis Bailly and Giuseppe Longo (Hermann, 2006). This book is the most recent (as far as I know) expression of their long and deep cooperation, which I first appreciated during a seminar at the ENS (École normale supérieure) at the beginning of this century. At the level of science they surf on, I confess that part of these lectures was well above my abilities. But, these days, I am not alone in sailing by dead reckoning across the vast seas of modern science. After all, is it not another aspect of digital relativity... From my standpoint, they remain too faithful to the "human exception" thesis, and thus under-estimate the importance and positivity of digital beings.

Another key book was La révolution symbolique, la constitution de l'écriture symbolique mathématique, by Michel Serfati. What he says of the Viète-Descartes-Leibniz intellectual revolution, of the upturn of the role of symbols in the development of science, came appropriately to feed my search on the autonomy and self-development of digital beings. I tried to pull him into my saga, with Renaissance mathematics as a major landmark between Aristotle and Von Neumann, but he resisted my lure into so risky a journey or, actually, one not so in line with his own historical feel. Fortunately, I found in Des lois de la pensée aux constructivismes, an issue of Intellectica (2004/2, no. 39) edited by Marie-José Durand-Richard, a complement on the intermediary work of Babbage and Boole.

At times, the cause of digital autonomy, mainly a matter of artefacts at first sight, comes into conflict with the traditional humanist motto: humanity transcends any artefact. That is a very old debate, of course, from the Golem to Frankenstein and Asimov, from cybernetics to systemics, from Pygmalion to Totalement inhumaine, by Jean-Michel Truong [Truong]. I found fresh fuel for a new push along this way in the philosophical work of Jean-Marie Schaeffer, mainly his recent La fin de l'exception humaine [Schaeffer 2007]. Alas, he does not deal with the man/machine issue.

On the human sciences side, La technique et le façonnement du monde, mirages et désenchantement, edited by Gilbert Vincent [Vincent], shows the negative side of constructivism, and prompted me to answer these concerns with more... constructive views. Interesting ground along this axis has been ploughed by Razmig Keucheyan in Le constructivisme, des origines à nos jours [Keucheyan]. It is a rather detached view of constructivism, certainly not a pro-constructivism pamphlet. One of its most exciting themes is the connection it draws between various forms of constructivism, notably the sociological and artistic versions.

The same Jean-Marie Schaeffer also pushed into this kind of intellectual genetics, in his L'art de l'âge moderne, l'esthétique et la philosophie de l'art du XVIIIe siècle à nos jours [Schaeffer 1992]. This last point entered into resonance with my development of the painter-system Roxame, and with our reflections in the artistic group Les Algoristes, which Alain Lioret and I founded in 2006. His book Emergence de nouvelles esthétiques du mouvement [Lioret] concludes with an evocation of systems "which are based neither only on randomness, nor on the human, nor on the machine". Cooperation within the group, mainly with Michel Bret and his neural networks, is presently in progress, and gives force to new artistic as well as theoretical developments, of which this book is, at present, the most abstract expression.

Let me also thank Jean-Paul Engélibert for his nourishing compilation L'homme fabriqué, récits de la création de l'homme par l'homme [Engélibert], which makes readily available in French 1182 pages of great classics, from Hoffmann to Truong, including Shelley, Poe, Villiers de l'Isle-Adam, Wells, Capek, Huxley...

Readers may be surprised to see an English text based on such a French bibliography. That may be due partly to the rich life of French philosophical and interdisciplinary thinking in our times. In spite of its critics, "French theory" is not dead, even if it now takes other ways than in the good old times of Baudrillard. As far as I know (but we know so little, nowadays), there is no such effort of global thinking beyond the limits of the Hexagon...

As for me, with apologies to those of my French readers who still feel some reluctance towards the language of Von Neumann, I take it as an orientation towards the future, which I see as speaking, globally, if not English or American, at least some not too broken species of Globish. Perhaps I am wrong here also, and the poor times of the present United States will leave the way open to Spanish or Chinese as the most common language. For the time being, in the business and science worlds, English keeps a strong leadership. And I think that the best that can be wished for French culture is to cut the umbilical cord with my mother language, and not to wait for translators to give our ideas the largest possible audience.

Anyway, after some 40 years of daily writing in French (mainly as a professional computer journalist), trying to pay the best homage to my native idiom, I now take more pleasure in writing the language of Shakespeare or Oscar Wilde than that of Corneille or Victor Hugo, and even that of my children and grandchildren.

An ambition with such a wide zoom aperture cannot avoid some naiveties here and there, and I hope that the reader will show understanding for my shortcomings. Better still, warm thanks to those who will take time out to send remarks, to which my email pmberger@orange.fr will give a hearty welcome.

A digital cosmology

0.0. A cosmology

A cosmology has for its aim to describe the whole universe, its history and its developmental laws. A digital cosmology sees it as an indefinitely large set of bits (each one having, at a given time, the value 0 or 1, or equivalently true/false, black/white). It is a little easier, and roughly equivalent (we shall discuss that later), to consider it as a text, a sequence of characters including, of course, the numerical characters; or as a hyperplane of any dimension.
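As a minimal sketch of this text/bit-string equivalence (in Python, with an arbitrary sample string), the conversion in both directions is purely mechanical:

```python
# Any character sequence maps to a sequence of 0s and 1s and back, losslessly.
text = "A digital ontology"
bits = "".join(format(byte, "08b") for byte in text.encode("utf-8"))
restored = bytes(
    int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
).decode("utf-8")
assert restored == text
print(bits[:16])  # '0100000100100000': 'A' then a space
```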

We could as well use the word "cosmogony", since we give some basic hints about the origins. But such a presentation of the past aims, of course, at its extension into the future: constant laws are useful if they are here to stay, and if they support not only survival but development.

We shall frequently use the term DU (digital universe). A hypothetical GOS (General operating system) manages its global operations.
We shall refer to the four Aristotelian causes: material, formal, efficient and final.
Matter will refer to the basic digital raster. Material objects in the common sense will be called physical.

This work has for its object to help today's human beings, beginning with the author himself, to feel more at ease in the present world and its trends. Or, to say more, to enjoy it more fully, lowering their inhibitions without, at the same time, lowering their vigilance about its negative sides and real threats.

For that, it tries to build a "conceptual frame, tending to unity" [Longo-Bailly], which could be compared to the cybernetics and "systemics" efforts of their time. A main common point is to go past the opposition between human and non-human, and to soothe the fear that the relativisation of mankind is a threat to humanity. For that, it should be shown (though we can give only partial hints):

- that something better than the present mankind is not unthinkable, but is difficult to imagine and demanding; it requires at the same time:
. some deconstruction of our binary conception of mankind, without falling into any kind of radical pessimism;
. some elements of construction of this "better world" which, perhaps paradoxically, include a stress on the radically non-perfect nature of this future; any "perfect system" is from the start inhuman and lethal; "idea kills life" (in the line of Freud through Denis Duclos, in [Vincent]); on this line, our view may be taken as "constructivist", insofar as many of the terms and concepts are not directly "natural" but constructs, if not "theoretical fictions". All the more since artificial digital beings take such a part in the story! We take "constructivism" here in a resolutely positive and... constructive way!

- that digital systems do not have the inhuman and lethal rigidity traditionally associated with "binary" systems. Here, the considerations about "digital relativity" are central, as are our efforts to show that terms like "meaning", "sense" and "emotion", and art itself, apply to digital beings. On this line, the role of art will be stressed as major; not to give artists an exclusive role, but rather because, in the line of [Michaux] "art in the gaseous state", any activity has to become more and more artistic.

Bits everywhere

In the XXIst century, to view the world as primarily digital is no longer revolutionary. We live, work and play most of the time with digital tools or toys. Interconnected through the Internet, digital devices take as central a place in our minds as the Earth does for our bodies. Then we must make a Copernican revolution (too bad if it is only one more, waiting for the next to come), placing at the centre the digital construct, in other words the set of patterns that the rise of machines proposes today to men. It offers us a mirror, an antithesis, an extension, an ally, an accomplice for new steps.

The former views and philosophies of the world neglect or despise technologies. They still set at the centre the human biped which emerged from the monkey to reach universal power on earth, but also in atrocity. For all philosophers, up to Husserl, Heidegger and their epigones, even Sloterdijk, humanity is central. Even if it is a superman, as for Nietzsche. Or a collectively divinized man, as the proletarian for Marxism.

Extending here the views of Hegel, we follow him on his long way of "Spirit" growth. But we negate his idea that History finishes with man enlightened by the Hegelian philosophy. We also push beyond existentialism, which explicitly puts man at the centre of the world, with his mere non-determination. For we think that machines also, at least since Von Neumann, can be undetermined.

The ideas that we propose here, aiming at a formalization, if not a computation, of all these concepts, cannot pretend to solve all questions, nor to bring a totally encompassing framework.

One of those tenets, expressed in our book L'informatique libère l'humain [Berger 1999], is that the process is hopelessly endless. The digital dialectics has no stopping point. Anyway, it would be even more horrible if some wall were reached (in the Jean-Michel Truong manner). We can at least find some comfort in the digital relativity part of the reflection: the impossibility of a totally computable world is both despairing and reassuring against the kind of rational dictatorship that could result from it (a fear formulated for instance by Denis Duclos, in La technique et le façonnement du monde [Duclos]).

An important part of DU is its stable part, the bits stored for long-term use, a patrimony. But DU is not static. It may be thought of as a soup, like the primeval soup where life began, a magic cauldron with powerful bit currents, highly emissive poles as well as purely receptive poles, highly interactive sites... Globally, DU is expanding today. It is from there that we can look for the past, the origins and the future.

We could entitle this work "The selfish bit", as a generalization, or a deepening, of The Selfish Gene [Dawkins]. Let us note that a bit is altruist by necessity: an isolated bit has no meaning.

Is the bit something subjective? An "a priori" scheme, along a Kantian view? Or something objective, really observable in concrete, even material, objects? The answer is: both. In a DNA molecule or in a computer, the digits are as objective as any other part or feature of the object.

A world that stands for itself

Bits, numbers, texts and bitmaps are not primarily representations of something else, even if representations were primary in the building of the "digital" term and concept. The digital beings are the reality itself. These beings exist and develop for themselves, or for other digital beings. Some parts of DU connect with "matter" or the "physical world", but that becomes just a secondary aspect, a historical constraint. The map is the territory, contradicting the classical motto of General Semantics, "the map is not the territory". Or at least, the digital is what matters first, even if it refers to, and somehow demands, a physical basis.

Then the traditional problems of truth, the debates between realists, idealists, nominalists or positivists, are left behind. Of course, we shall have to consider them at some point, but without the restlessness of Cartesian doubt or the cynical indifference of scepticism. In DU, signifier and signified are all bits.

Our view is neither materialist, nor spiritualist. It tends to avoid any "dualism", though this question, even considered as lateral, cannot be avoided.

Digital beings are no longer mainly tools, even if computers emerged primarily as computing tools. There is no life without a digital code. And the DNA code cannot be taken as a tool for the rest of the being.

But digital beings cannot stand alone. They need some material, physical basis, some "incarnation", like the DNA nucleus in the rest of the cell. And we do not pretend that DU "is" the real universe, but only that it is a pertinent and useful part of it, or a projection of it onto some "epistemological plane".

0.3. Epistemological stand: post-post-modern

Since digits came into existence well before man, and will perhaps survive mankind, our view can be taken as "non-humanist".

We won't start from man. Following part of the cybernetic and systemic view, we can say with [Bertalanffy] (1966 for the French edition): "It is an essential feature of science to de-anthropomorphize progressively, in other words to eliminate step by step all aspects due specifically to human experience".

But a digital view of the world is not incompatible with a human, even mystical, view, as in the great prologue of St John's gospel: "In the beginning was the Word".

Here man is an aim, an ideal to build, or to go beyond. Not a datum, let alone a pre-existing nature that we should try to find and reach. We do not have to "become what we are" (as Nietzsche said after oriental philosophers). We are not, worse again, a lost pristine nature that we should restore by our own strength or the mysterious intervention of a Saviour or the heroic deeds of Frodo or some Jedi knight. We have to become what we will, and for that to design consciously what we want to become, even if that cannot be a nonsensical fantasy.

After such bold and immodest pretensions, let us show some basic limits of such a cosmology.

- Our model refers to a global DU, and to its operating laws and processes, which we shall call GOS (General operating system). Such a general entity bears contradictions, like "the set of all sets".

- In the same vein, our cosmology aims at a most general law of the universe (at its core, the L-maximizing law), while precisely proposing a "digital relativity" which excludes the possibility of such global views.

- Human consciousness, and related concepts like emotions, are at present far beyond what we describe. So part of the work must be taken as a rather simplistic model, a metaphor more than a real explanation. Nor must we push too far the metaphor laid out in the preamble and imagine DU as a conscious being pursuing its interest. That may be useful, or motivating, at times, like God for the pious, or Gaia for some ecologists.

Post-modernism could deter us from such ambitious views. But postmodernism also finds its limits, and the selfish claim for freedom may crash into a lot of "walls", mainly perceptible today under ecological mottos. Somehow, then, our approach may be taken as post-post-modern.

In spite of these limits, we would not have undertaken such a writing without the conviction that it may help to better understand the world of today, to contribute positively to its development, and to enjoy it without guilt.

PART 1. BIT, THE DIGITAL ATOM

1. The bit, atomic but clever

1.1. Digital and binary

The definition, and crucial quality, of the digital universe is that it is made of bits.

So it can be seen as a set of binary values. The simplest view is an indefinite string of 0s and 1s along the numeric scale of the integers (N). Or a 2D or 3D raster bitmap image. Or, more formally, DU is {0,1}*, the set of all finite strings of 0s and 1s, and still more exactly this set as connected to the real world, "incarnated". In nature as in artefacts, digital beings never operate in total independence of matter. In the reproduction of living beings, the combination of genomes will give a new being only through an epigenetic process which is not totally digital, immaterial.

Interesting structures operate at a higher level: molecules more than atoms, genes more than DNA base pairs, words more than letters...

Note that "digital" is not synonym of "numeric" as implied by the French translation. See the "representations" chapter.

Digital and binary are practically taken as synonyms.
A digit may be a number (from 0 to 9). It could also be taken as any "key" (which we press with a finger, digitus in Latin). But, since Von Neumann, binary has been recognized as optimal.

1.1. The bit, mathematical, logical...

The bit is solid, robust and perfectly atomic in the etymological sense of unbreakability. There is no thinner or smaller logic element than the bit, no smaller information unit, and no smaller element of decision and action.

The bit is also polyvalent; one could say "plastic" or "pivotal", as well for abstract mental constructions (logics) as for physical devices.

Is the bit an optimum from a logical standpoint?
- a ternary logic would, for some experts, be better, since the mathematical constant e is nearer to 3 than to 2;
- in some electronic devices, or parts of devices, a "three-state logic", the third state being somehow neutral, is more efficient;
- modal logics have attractive aspects that do not always reduce to binary;
- a bit is not the least possible information quantity; its full value is reached only when the two positions of the alternative are equally probable; if one position is more probable than the other, the information quantity decreases, and it vanishes when one position is certain (a sketch of this follows the list).
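A minimal sketch of this last point, using Shannon's classical formula (the function name is ours):

```python
import math

def bit_entropy(p: float) -> float:
    """Shannon information (in bits) of a binary alternative with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.5, 0.9, 0.99):
    print(f"P(1) = {p:.2f} -> {bit_entropy(p):.4f} bit")
# Only the equiprobable case (0.5) reaches the full 1 bit;
# the quantity decreases as one position becomes more probable.
```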

Some other considerations:
- is the bit a particle rather than an atom?
- a bit is a kind of balance, a dissipative structure (a priori in DU, and in full right in the physical universe);
- bits may be seen as electrons running through logical gates, which are stable structures of nuclei determining the flows; hence, immediately, the irreversibility of logical functions;
- a bit may be seen either as a "cut", the transition between two spatial regions, or as a "pixel", a small region;
- a bit "exists" when it is located, either "materially" (a geographical position on Earth, a place on a chip) or "digitally" (an address in some address space); hence the fundamental assertion in DU: "there is that".

1.2. One bit, two bits...

A bit alone has neither meaning nor value by itself; it is not even perceptible. At the limit, a lone bit does not even give the direction of transfer, which is implied only by the transmitting device.

Nevertheless, the mere existence of a bit "means" at least an elementary optimism. If there is a bit, that implies that somehow order is greater than disorder, structure more powerful than chaos. At least locally. From the most elementary level, we can then subsume the presence of a system founding values, yields, operational capacity, and at least a difference versus indifferentiation.

But that, by itself, places the bit inside a system which can appreciate it.

A new bit is cut, a new being is born. At some moment, the causal vice loosens. A new complexity level is reached, with a potential for indetermination and recursivity (digital relativity), letting a new autonomous being appear.

1.3. Discrete and continuous

Could this DU model build the continuous, and under which conditions?

To a large extent, discrete beings may be processed continuously: for instance, cookies borne on a rolling chain during the baking phases. Reciprocally, continuous products may be "discretized" for processing (batch processing in the chemical industry or in information systems).

The term "transaction" says, generally speaking, that a batch of operations is applied to a batch of something, often by an exchange between two processors. This term is used mainly for money and information. Formal atomicity rules have been expressed for a correct and fail-safe transaction processing (double commit, Acid rules).

Continuous and discrete are never totally separated. "Every morphology is characterized by some qualitative discontinuities in the substrate" (Mathématiques de la morphogenèse [Thom]). In other words, the digital is always present, even in analogical systems (dissymmetry?).

"Inside" the DU model we can build the real line R, and so recover the continuous. This construction cannot be done "directly", since it supposes the infinite somewhere. But a way opens through the formal: the infinite potentiality of a "meta" level with regard to the inferior level. Projective spaces? Hence one could end up with analogue computation, if there is still something to be worked out of it (not totally; see [Bailly-Longo]). The cut in the construction of the real line R is dual to the fundamental cut of N. It is their combination which gives this impression of "total" recovery of the real.

Continuous beings, and matter itself, emerge only through the irreducibility of logical forms (the square root being the archetype). Matter (or animality) is what supports the keeping-on, the bridging, the waiting for (let us hope) a more subtle formalism which would re-establish continuity or, even better, logical coherence.

Physically, continuity is an illusion (quanta), or better a construct, no more "natural" than digits. Continuous functions are only approximations for expressing the large numbers. Continuity is built by the brain. And it is a sort of necessity for thinking as well as for acting.

A continuous function is continuous with respect to the being it describes, but is nevertheless discrete in its expression (a text). It draws a bridge, symmetrically to physical matter, thought of as an infinite base for bits and quanta.

1.4. Digital and analog/analogue

The distinction applies to representations (where "analogue" is more frequent) as well as to devices, especially electronic circuits (where "analog" is more frequent). At first sight, the distinction seems perfectly sharp and clear. In practice, the border may be fuzzy.

The digital/analog and discrete/continuous oppositions are not orthogonal. Analog is nearer to continuous, but different. But digital is necessarily discrete.

1.4.1. Representations

Thesis. An analog representation:
- is continuous (if the represented entity is continuous), and
- is obtained from the original being by some kind of continuous function: optical transfer, electrical value transformation, etc.

A digital representation is built by three basic cuts:
- the representation is not the represented being: the map is not the territory;
- the representation is cut into bits;
- the representation structure is not (or at least not necessarily) homothetic to the represented being. This last cut is what makes the difference between analogue and digital (better words should be found).

Typically, the difference is illustrated on a car dashboard, with needles on dials (analogue) opposed to digits (here, numerical digits).

In most cases, the border is not so sharp:
- digital representations, at high resolution, appear as continuous and analog;
- for ergonomic or aesthetic reasons, digital data are frequently presented in analogue form;
- quantum theory suggests that even the physical "continuous" is discrete at bottom.

The difference between analog and digital representation lies in the mapping of the being's structures into bits. Analogue mappings transfer, more or less, the spatial relations of the beings into the arithmetical succession of the bits. Digital mappings use the arithmetical succession as a support for any kind of relation internal to the beings; that is the basic addressing system. A digital representation is always, more or less, coded. A 2D raster demands some definition of lines and rows.

Example: take a random sequence of 0s and 1s (not too compressed, let us say one 1 among ten digits) and compute the density of 1s around each point, more or less finely; with interpolation or smoothing, we get a continuous function.
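A minimal sketch of this example (the rate and window size are arbitrary choices):

```python
import random

random.seed(1)
# Sparse random bit sequence: roughly one 1 among ten, as in the example above.
bits = [1 if random.random() < 0.1 else 0 for _ in range(1000)]

def density(bits, center, window=50):
    """Local density of 1s around a point: a moving-average smoothing."""
    lo, hi = max(0, center - window), min(len(bits), center + window)
    return sum(bits[lo:hi]) / (hi - lo)

# Sampled at every point, the smoothed density reads as a quasi-continuous curve.
curve = [density(bits, i) for i in range(len(bits))]
print(min(curve), max(curve))  # values hover around the underlying rate, 0.1
```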

Kinds of analogies:
- myth;
- anthropomorphism, notably in robot design;
- logical inference as the analogue of a process;
- iteration as an image of the wheel.

1.4.2. Devices

Devices may be mechanical or electronic.

Electronic circuits are assemblies of gates. 

Analog circuits. See definitions and catalog of circuits on the web.

Circuit schemata: analog (left), digital (right).

An analog computer (such machines existed until the 1970's) uses such functions.

Some sayings of the Futurists :

Analogy is nothing else than the immense love that ties distant things, apparently different and hostile. Using wide analogies, this orchestral style, at the same time polychromic, polyphonic and polymorphic, can embrace the whole of the life of matter [...]

The analogic style is thus the absolute master of the whole of matter and of its intense life... To capture and hold the most fugacious and ungraspable aspects of matter, we have to build nets of images or analogies, and to throw them into the mysterious sea of phenomena.

If we want to enter into communion with the divinity, high-speed running is indeed a prayer. Holiness of the wheel and of the rails. Let us kneel on the rails and pray to the divine speed. The drunkenness of high speed in a car is the drunkenness of feeling merged with the one and only divinity. Sportsmen are the first catechumens of this religion. Probable, soon unavoidable, destruction of homes and towns. Elsewhere will be built large meeting points for cars and airplanes.

Mathematicians, we call upon you to love the new geometries and the gravitational fields created by masses moving at sidereal speeds.
Mathematicians, let us assert the divine essence of randomness and risk.

Let us apply probability computing to social life.

We shall build futuristic towns, designed with poetic geometry.

(Marinetti was aware of Riemann's works and of General Relativity.)

On this last point, he does not seem more delirious than Simone Weil seeing in mathematics a proof of God's existence.

About the Chevalley/Zariski dialogue:

- they start from language, then "I mean";

- it is not directly ontological;

- what is interesting after that: what shall we do with it, what can we modify, play with...

- the geometer can move, deform (topology), dip into a three-dimensional space;

- the algebraist can write some form for f, any thinkable form. Let us note that, seemingly, Chevalley chooses a two-dimensional space.

Then, what will the computer scientist do for "I mean"? Bézier curves? Write some algorithmic lines?

And the neurologist ?

1.4.3. Life

 

Thesis : Life is partly analog, partly digital.

1.4.4. Any object is “hybrid”

Thesis. No being is totally digital, nor totally analog. There always remain some analogies in the coding, the code structure, etc. At least in the ordering of its bits, which is "analog" to its structure.

For example, a digital image is digital in the coding of its pixels, but the raster organization of the pixels is analog to the space of the representation.

1.5. Sampling and coding

The digitalization of representations has two complementary sides: sampling and coding. For pictures, they become pixelization and vectorization, both of which can be traced far back in art history. We could call the former "material digitalization" and the latter "formal digitalization". This opposition is rather parallel to the analog/digital opposition.

Digitalization of images 1. Pixelization. From left to right (extracts): a Byzantine mosaic in Ravenna (6th century); a pointillist painting, "Le chahut", by Seurat (1889-90); and "La fillette électronique", by Albert Ducrocq with his machine Calliope (around 1950).

 

Sampling/pixelization: Byzantine mosaics deliberately use a sort of pixel, the tesserae, to create a specific stylistic effect [1]. Impressionism, and even more so neo-impressionist pointillism, had scientific roots. Computer bitmaps emerged in the 1970's, with forerunners such as Albert Ducrocq [2] and the handmade paintings generated by his machine Calliope (a random-generation electronic device using algorithms for binary translation into text or image).

This form of digitalization stays more "analog" than the vectorial one. The "cutting" operates at the level of the representation itself. The conversion requires comparatively few conventions, apart from the raster size and the color model. Within precision limits, any image can be pixelized and any sound can be sampled.
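A minimal sketch of sampling and quantization on a sound-like signal (the rate, bit depth and frequency are illustrative assumptions, roughly telephone grade):

```python
import math

SAMPLE_RATE = 8000   # samples per second
BIT_DEPTH = 8        # 2**8 = 256 quantization steps
FREQ = 440.0         # the "analog" original: a 440 Hz sine wave

def sample_and_quantize(duration: float = 0.01) -> list[int]:
    """Cut a continuous signal into discrete, finitely-coded samples."""
    n = int(SAMPLE_RATE * duration)
    levels = 2 ** BIT_DEPTH - 1
    return [
        round((math.sin(2 * math.pi * FREQ * i / SAMPLE_RATE) + 1) / 2 * levels)
        for i in range(n)
    ]

print(sample_and_quantize()[:8])
# Within the precision limits set by rate and depth, the wave is recoverable.
```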

     

Digitalization of images 2. Vectorization. From left to right: Guitar player, by Picasso, 1910; Klee tutorial, 1921; Schillinger graphomaton, around 1935.

Coding/vectorization is a more indirect digitalization, since it creates images from elementary forms, as well as from texts, grammars, and that particular kind of text that is an algorithm. Here also, the roots go deep into art history. An important impetus was given by cubism. The Bauhaus tried hard along this way; see for instance Itten [3] for color and Klee [4] for patterns. A first explicit vision of automatic vector image generation was given by Schillinger [Schillinger]. Incidentally, Schillinger was mainly a music composer and composition teacher, and of course digitization and music went along similar ways.

 

In music, the main coding system today is MIDI.
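For instance, a MIDI note-on event codes a musical gesture in three bytes: a status byte (0x90 plus the channel), a key number and a velocity. A minimal sketch (the helper function is ours):

```python
def note_on(channel: int, key: int, velocity: int) -> bytes:
    """A MIDI note-on event: status byte (0x90 | channel), key, velocity."""
    assert 0 <= channel < 16 and 0 <= key < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, key, velocity])

# Middle C (key 60) at moderate velocity on channel 0: three bytes code the event.
print(note_on(0, 60, 64).hex())  # -> '903c40'
```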

 

The cutting operates not directly on the representation, but on language objects. The conversion requires more or less detailed conventions. It may be hierarchized and leads finally to language. Any analysis (and symmetrically, any synthesis) may be considered as a coding (and possibly a decoding).

 

Not every representation (picture or music) may be appropriately coded, or the coding may imply considerable loss, especially if the code is used to generate a "copy" of the original.

 

The two ways of evolution merged into binarisation in the 1950-60's, as stated in the seminal paper of Von Neumann et al. [6], "We feel strongly in favor of the binary system", for three reasons:

- hardware implementation (accuracy, costs),

- “the greater simplicity and speed with which the elementary operations can be performed” (arithmetic part),

- “logic, being a yes-no system, is fundamentally binary, therefore a binary arrangement… contributes very significantly towards producing a more homogeneous machine, which can be better integrated and is more efficient”.

What about the "analysability" or "divisibility" ?
- in DU, any being is fully divisible into bits (and most often than not, for practical purposes, in bytes)
- sometimes a being may be fully divided by another being, and then considered as their "product". Examples
. a text (in a normal language, natural or not), is the product of the dictionary times a text scheme
. a written text is the product of a text scheme, a language, and graphic features: font etc.
. a bitmap with its layers (R,G,B, matte...).

1.6. Concrete/abstract

The concrete/abstract distinction stems from the basic assertion scheme: there is that.

The concrete object is: what is there.
The abstract object is: what is that.

“That” is a term, or a concept. It has (somewhere in DU) a definition or a constructor.

The extension of "that" is the set of concrete objects corresponding to that definition (or so built); or possibly the cardinal of this set.
The comprehension of "that" is the definition. It is also a DU object, and may thus be seen as a string of a definite number of bits.

Thesis: the definition or constructor does not, in general, describe or set the totality of the object, every one of its bits. The definition may be just a general type, many of whose bits will be defined by other factors. An object may be recognized as pertaining to a type as soon as it has the corresponding features.
Among other consequences, that leaves space for actions on this object by other actors, or by itself.

If a concrete object strictly conforms to its construction, it may be said to be somehow "abstract" or "typical", or "reduced". "Reductionism" is the thesis that the whole of DU may be obtained from its definitions.
The ratio of "type-defined" bits over the total number of bits of the object could be taken as a measurement of its "concreteness".
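A sketch of this proposed measurement (the function and the boundary readings are ours, not a settled metric):

```python
def type_defined_ratio(type_defined_bits: int, total_bits: int) -> float:
    """Share of an object's bits that are fixed by its type definition.

    1.0: every bit set by the definition (a fully "abstract"/"reduced" object);
    0.0: no bit constrained beyond mere length (a maximally "concrete" one).
    """
    if total_bits <= 0:
        raise ValueError("an object must have at least one bit")
    return type_defined_bits / total_bits

# A 32-bit integer variable whose type fixes only its length, vs. a constant
# entirely fixed by its definition:
print(type_defined_ratio(0, 32), type_defined_ratio(32, 32))  # 0.0 1.0
```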

Quotations: "An abstract system is made of concepts, totally defined by the hypotheses and axioms of its creators" (Le Moigne 1974, after Ackoff). "Abstract beings have a limited and definite number of psychemes" (Lussato, unpublished, 1972).

Thesis : DU itself may be defined, but of course very partially. That may be controversial. The address of DU is somehow the “zero” address.

Thesis: When comprehension grows, extension decreases, since fewer and fewer objects correspond to the definition. There comes a moment when only one object corresponds to the definition; for instance, "world champion of chess" has only one object at a given time. Or there may exist no object answering the definition, even if it bears no contradiction.

Thesis: If there is only one object, if the extension is 1, then one can say that the abstract and the concrete object are the same. (To be checked.)

There is a kind of symmetry:
- when the abstract grows, the concrete emerges out of it;
- when concrete knowledge (measure) grows, the abstract must become more finely cut, hence more massive.

An example of concrete beings emerging from abstract descriptions: the "segment of one" (from the "one-to-one" marketing language), "the one who...". Consider the evolution of the quantity of information about a consumer. At the start, the customer does not even exist in the information system, which records only anonymous operations (or operations difficult to group, as with insurance contracts). Then the information grows, covering not only the individual customer but also his environment.


See identity : what makes an object unique.

The absolute concrete may be taken as a sort of limit: what is accessible to all beings in DU (and possibly without conventions?).

Thesis: Codes without redundancy entail hierarchy loss. And a wrong bit spoils the whole system, if there are no longer stronger and weaker beings.

Note: a lot of the developments here can be found in the literature on object-oriented programming. See in particular [Meyer].

1.7. Natural and artificial

These concepts are controversial. We take here a stand adapted to our DU presentation.

Thesis : digital is not a property specific to man-made objects. A bit is natural or not. The opposition natural/artificial is orthogonal to the opposition digital/analog.

Artificial beings:
- are synthesized by living entities (though generally not with a clear anticipating view), or by other automata;
- can mimic natural beings, but lack some aspect or other of the real natural being;
- may be described in terms of functions, aims, adaptation;
- are often considered, in particular at design time, in imperative more than descriptive terms [Simon, 1969].

The natural beings are the non-artificial ones. A being may be said to be natural if one does not know any DU being that has made it "consciously". That is generally easy to say of non-living beings found in "nature", such as stones. Living beings can be considered natural since the origin of life is unknown, insofar as biological reproduction is compulsive and not really controlled. On the other hand, we shall call "artificial" the beings that result from a "conscious" activity, such as bird nests or beaver dams, and of course the products of human activity. This distinction is problematic, as is the word "conscience" itself, which is generally admitted as a mainly human feature, somehow shared with animals, but not at all with machines. Does technological progress lead us to conscious machines and to a merging of the natural and artificial worlds (or, as it may also be said, to a complete "artificialization" of the world)? That is an old issue, which emerged mainly in modern times, but has very ancient historical roots.

As long as life is so different from non-living existence, we may generalize "artificial" to all artefacts due to living beings, at least if they are external to their bodies and relate to a sort of "conscious" activity.

So, we call natural all beings not explicitly made by living entities, including these entities themselves, as long as they cannot reproduce themselves by explicit processes.

However, artificial entities can already reproduce themselves explicitly, at present only in the elementary form of viruses. Viruses and their productions may then be considered artificial. If some day, and that day will perhaps come soon (or perhaps never), mankind succeeds in creating life from non-living beings, the border will lose its present radically impassable nature. In the long term, men would be able to make living beings, and robots would be able to make human beings...

Thesis. DU globally taken is natural.

Quotation: "The artificial world defines itself precisely at this interface between internal and external environments" [Simon 1969].

1.8. DU and its parts

About addresses.
Ideally, with DU taken as having only one dimension, an address could consist simply of the number of the first bit (possibly of the first and last bits). That would be absolute addressing.

In practice, DU is parted into various subspaces, and we use "relative" addresses, with a hierarchy of addresses inside, for instance the directories and subdirectories of a disk or a web site.

Thesis: The smaller the subspace, the shorter the address can be.
We have had an example with phone numbers. At the beginning of the XXth century, a company could just mention "telephone" on its letterhead: human operators could find it directly from its name. Then, for convenience as well as for automation needs, numbers were used. For instance, in a small town, just 67 (calling from another town, you would ask for "67 at Lasseube" (France)). Then the number was enlarged progressively to
00 335 47 32 18 67

Thesis: Within the same (sub)space, relative addresses are longer than absolute ones. There is a sort of gap (here, an "excess" of code) between what is needed to access an individual and what is needed to describe it. For a population, the minimum is the logarithm of the population. Compare the French NIR (social security code), e.g. 1 38 08 75 115 323, that is about 10^13 combinations, for some 6.10^7 inhabitants.
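The arithmetic can be sketched as follows (the helper name is ours):

```python
import math

def min_code_length(population: int, alphabet_size: int = 10) -> int:
    """Shortest identifier length able to distinguish every member of a population."""
    return math.ceil(math.log(population, alphabet_size))

print(min_code_length(60_000_000))      # -> 8: eight decimal digits would suffice
print(min_code_length(60_000_000, 2))   # -> 26 bits
# The 13-digit NIR carries a large "excess" of code over this minimum: that excess
# is what makes it descriptive (sex, year, place of birth...) and not a mere index.
```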

At the other extreme from the bits, we can consider a global digital universe, DU taken globally. That is perhaps unrealistic, and it is more sensible to consider that there is a plurality of digital worlds, separated/connected by analog/physical regions...

It could be useful to distinguish in it:
- a general memory, or general bit raster (DU itself);
- action and management rules and operators (the General operating system, GOS);
- ways of ensuring communication between beings...

For life, there is no such raster, and it uses the laws of matter to effect these functions. See architectures.

More generally, see structures in chapter 7

Globally, DU is necessarily self-referent. Parts of DU refer to other parts. And some parts refer to themselves (a map of addresses necessarily contains the address of the map?). In a first analysis, probably, self-referent parts should be put aside.

GOS functions are, or may be:
- general clocking; this is not strictly necessary: processes could go asynchronous, or synchronize without any global intervention (communication protocols);
- the operation (and "enforcement") of laws, and security; that could be mainly local; at the global level, ownership of territory is ensured by the underlying matter; the more digital the world, the "softer" the assignment of locations inside DU as well as in the underlying matter;
- noise generation; normally it will not be properly assigned to a DU processor, but why not;
- communication between processors; here also, roles may be parted between matter and more or less general processors;
- controlling the operations of non-autonomous beings; that, de facto, is more the task of local beings;
- ensuring the reproduction of DU upon itself at each global cycle, if we retain this kind of model; this point is nearly metaphysical.

GOS may be seen as DU's God. In this case, it must face the paradoxes of the creator, of the rational God of natural theology, and of every self-referent being.

We shall now frequently use the term "material" to mean the location and disposition of a being in DU, as opposed to its "formal" features. When referring to the material universe of common sense, we shall call it "physical".

Dimensions

The number of dimensions (mathematically speaking) does not have here the radical implications which mark the physical universe. DU is at first sight rather indifferent to dimensions: as well one dimension in all as one dimension per bit. The genome is one-dimensional in its basic structure. 2D is in frequent use for chip design and for representations, because our vision is so structured. More exactly, it is frequently "2.5D", with the use of layers (chips, layered images) or of compacted perspective for maps, for instance.

The dimensions are important when beings have relations; then must be defined:
- the independence of the subparts;
- the differences in metrics.
Unless otherwise stated, DU may be thought of as a one-dimensional bit string.
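For instance, a 2D raster maps onto the one-dimensional string by the usual row-major convention (a minimal sketch):

```python
def to_1d(row: int, col: int, width: int) -> int:
    """Row-major mapping of a 2D raster position onto the 1D bit string."""
    return row * width + col

def to_2d(index: int, width: int) -> tuple[int, int]:
    """Inverse mapping, recovering the 2D position."""
    return divmod(index, width)

# In a 640-wide raster, the cell (row 2, column 5) lives at position 1285, and back.
assert to_1d(2, 5, 640) == 1285
assert to_2d(1285, 640) == (2, 5)
```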

The space is indefinite, potentially infinite. The infinite may be called for by a loop without a stop condition, but of course we never get there in a finite number of cycles.

There is no proper zero, anyway.

Addresses

1.9. Varia

Clever and selfish... These words are anthropomorphic. But, as with Dawkins, they may be an efficient scheme, a useful "theoretical fiction".

Possibly there is a double way of mapping bits into matter:

- a small surface in a raster (e.g. pixels, voxels);

- a partition of a space into two parts.

Bits and infinity : through recursion.


3. Beings

To get an idea of the global structure of DU, we cannot avoid using more or less metaphoric references to the world of today, with its two families of digital beings: living beings and digital machines. Our eyes show them as a population of entities, materially grouped in bodies and machines, partly connected by wires. What we do not see are the nets of "light" connexions:
- by light and sound, which have long been the major communication media of living beings;
- by electromagnetic waves, which began to operate with the wireless communication systems at the end of the XIXth century of our era;
- chemical, by odours or enzymes;
- by social relations between living beings.

If we zoom into these beings and communications, we shall find smaller elements: computer files and cellular nuclei, records, texts and genes... and, on our way down, just above the final "atomic" bit level, a general use of standard minimal formats:
- "bytes", i.e. sets of 8 bits (or multiples thereof), which make it easy to point at series of useful beings, such as numbers, alphabetical characters or pixels;
- "pairs" of two bits in the genes.

All these beings and their communications are in constant movement. Their distributions change and their communication rates increase.

The basic grid. When DU is large, the grid is "deep under", but must never be forgotten (see [Bailly-Longo]).

3.1. First notes

This part could be taken as a kind of metaphysical ontology, as a semantic study of ontologies (in the computing sense of the word), or as a development on object orientation. The terms are "constructs" more than observations.

A being is basically known by other beings which have its address. This address may be soft. But we must never forget that the being, to be real, must be instantiated somewhere materially.

A being may have an internal structure, but if the being does not include a processor, this structure is meaningful only for other beings of a sufficiently high level to "understand" it.

A being may have internal addresses.

Seen from outside, a being is the product of an address and a type, or class (which implies libraries of types or classes).
The address may be managed indirectly by GOS or by an appropriate subsystem.
The type could be just "a bit string".

A being realizes a sort of coherence between the four causes. It has the necessary matter, and the proper form to realize its finality from the proper input (efficient cause).

Then the yield will be sufficient, and the being may live long, because this conformity of finalities, coming frequently from the environment, gives hope of receiving the means of subsistence.

The existence of a given being is defined from an external system. But we shall see that things become more complex with large beings, which take charge themselves of (at least some aspects of) their existence and of the preservation of their borders, without removing the need for a final reference to GCom.

To some extent, the internal structures of a being are controlled by the being itself. A self-referent being exerts a pressure upon itself, keeps itself together.

3.2. Basic binary beings

Apart from ulterior rules, beings are either distinct, or one is contained in the other. There are no "transborder" beings, nor "free" bits within a being (this point could be discussed: we suppose that all the bits between the addresses belong to that being; this must be adapted when there is more than one dimension).

Internal addresses may be masked to external user beings, as in OO programming.

The limits of a being may be defined:
- by the limits of other beings;
- implicitly, materially, by the characteristics of the supporting device, for instance the organization of a hard disk into cylinders and sectors (or more softly, including the word length);
- explicitly, by a parameter contained in the being (a header giving its length); that supposes that the user of the object knows how to interpret this header; if the being contains a processor, the latter may define the length (a sketch of this mechanism follows the list).
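A minimal sketch of this third, explicit mechanism (the 1-byte-type/4-byte-length header layout is an arbitrary choice, in the spirit of many file formats):

```python
import struct

def pack_being(type_id: int, payload: bytes) -> bytes:
    """A being that declares its own limits: 1-byte type + 4-byte length header."""
    return struct.pack(">BI", type_id, len(payload)) + payload

def unpack_beings(stream: bytes):
    """Walk a flat byte space, letting each header delimit its own being."""
    offset = 0
    while offset < len(stream):
        type_id, length = struct.unpack_from(">BI", stream, offset)
        offset += 5  # size of the header itself
        yield type_id, stream[offset:offset + length]
        offset += length

blob = pack_being(1, b"hello") + pack_being(2, b"world!")
print(list(unpack_beings(blob)))  # -> [(1, b'hello'), (2, b'world!')]
```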

How the limits of a being are expressed:
- by extension (an interval in one dimension);
- by a formal formula (more difficult with many dimensions).

In any case, the limits of beings must be known by GOS and by the other beings, in order to protect them against destruction by overwriting.

An important point is that the metric qualities of a being, in particular its mass, change its features radically.

In particular:
- a rather large being (some million bits) generally contains elements describing itself (a header, at least);
- anyway, if the being is large, its type may generally be recognized from the content itself.

Thesis : the larger is the being, the more it is self-referent. Would it be only for efficiency effects. Typical example : header of a file, descriptive parts of a computer mother board and operating system.

Question : is there a minimal (digital) size of a object for even a minimal kind of self-reference ?

Let us now try to make a classification of beings, mainly in order to set a basis for a metrics.

Data, memory, representation, motor (the clock is the only real motor).

3.2.0. Attraction

What holds the bits together? Here the four Aristotelian causes work rather well, though evidently for DU itself taken globally.

Materially. The general robustness of DU, plus the material connection of the bits, to begin with as materially connected regions. The hardware hierarchy: gates, chips, cards, frames, farms, networks, The Net.

Formally. That is mainly given by the formal definition of a being, its type and description, which is external to the being for the small ones, and more or less internal (header) for the larger ones. Form may include quantitative relations (format, possibly "canon").

Function, or finality, gives hope to survive if somebody finds the being useful. We shall talk later of "pressure", or ranking, which are somehow measurements of the functionality or utility of the being.

The efficient cause concerns the changes of the being more than its existence, except for the initial move: the creation of the being.

3.2.1. Organ, function

These concepts apply to a being seen from outside. This point matters in particular to express Moore's law, and more generally the yields.

A being may be analyzed top down, functionally as well as organically; sometimes the two analyses coincide, and a function corresponds to an organ, and reciprocally. But sometimes not.

The organ/function distinction comes from the observation of living beings (anatomy/physiology). It has been applied to information systems design (functional analysis, then organic analysis).

3.2.2. Material features

Seen from outside, or from GOS, i.e. seen as an "organ", a being is mainly defined by the space it occupies. Sometimes an address may be sufficient, but in other cases a more elaborate spatial description may be necessary.

In the biological world, an organ is first identified by some homogeneity of colour and consistency, which will be confirmed by a deeper analysis showing the similar character of its cells. It is frequently surrounded by a membrane.

We frequently read that an organ is built bottom up. But that is probably inexact, or at least imprecise. A body dissection or a machine disassembly will show organs top down!

The relations with the rest of the system are primarily material, spatial vicinities, but also various "channels", be they neural or fluid conduits.

3.2.3. Functional features

A being is also defined by its function, a concept near to "finality". In programming, as well as in cellular tissues, functions frequently refer to a type (epithelial cell, integer variable); see OO programming. The bit is the most basic type, with no specific function unless explicitly specified (Boolean variable).
A function is generally defined top down. It may also be defined bottom up, by generalization.

In abstract systems, it is tempting to think that descriptions can be purely functional; in concrete ones, there is never a perfect correspondence between organs and functions. An organ may have several functions, and a lot of functions are distributed among several organs, if not over the whole system.

The "Overblack" concept

As a form of exploration, we could try to specify a machine which would maximize the non-isomorphism between organic and functional structures:
- each function diluted in a maximum number of organs, if not all of them,
- each organ taking part in a maximum of functions (cables and pipes included?).
Orthogonality ?

3.3. Passive beings

At first sight easy and intuitive, this concept is not so easy to define formally. As soon as we have O depending on I, and differing from it, S does a "processing". There are limit (or trivial) cases of passive beings/processors:
- the pit: it has I but no O (or an O not related to I),
- the null processor: O = I,
- the quasi-null processor: O is nearly identical to I.

A being is said to be passive if it does not change otherwise than by external actions (I) (?)

Symmetrically, some cases of active beings (see the sketch after this list):
- the pure source: no I, but O,
- an O radically different from I, with a real processing depth,
- a large time liberty of O emission with respect to I.
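
These limit cases can be stated operationally. A minimal sketch (the 10 % threshold for "nearly identical" is an arbitrary assumption):

    # Sketch: classify the limit cases of beings/processors from one
    # observed (I, O) pair of bit lists (None meaning "no such zone").
    def classify(i_bits, o_bits):
        if o_bits is None:
            return "pit"                      # I but no (related) O
        if i_bits is None:
            return "pure source"              # O without I
        if o_bits == i_bits:
            return "null processor"           # O = I
        diff = sum(a != b for a, b in zip(i_bits, o_bits))
        if diff <= len(i_bits) // 10:         # arbitrary 10 % threshold
            return "quasi-null processor"     # O nearly identical to I
        return "active (real processing depth)"

    print(classify([1, 0, 1, 1], [1, 0, 1, 1]))   # -> null processor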

We can go farther, structurally. At some times, S may decide to be passive, while remaining largely master of itself.
A passive being has no evolution between cycles. It is a pure address where other beings may write and read. There is no protection but the protection afforded by GOS and other interested beings.

Nota: a message is considered active if the host lends it the handle (that has no meaning for a two-bit processor). The active side may be determined by the physical character of the host (said to be passive, as opposed to soft...).

Kinds of "living" :
- being-processor-automaton : IGO (input, genetics, output),
- representation processing (assimilation, expression, dictionary, language).

The clock is an important sign of activity. But a being could be active using the clock of another one, or the GOS cycle...

How can a clock be recognized from outside?
If S' has a much shorter cycle, and observes long enough, it can model the values of the clock bit, and assess its frequency and regularity ;
of course, this clock could be controlled by another S.

If S' cycle is longer, it could perhaps detect and assess the clock over a long observation time.
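
A sketch of the first case (S' sampling much faster than the observed clock); the "jitter" measure of regularity is an invented simplification:

    # Sketch: S' samples a suspected clock bit at its own (much shorter)
    # cycle and assesses the half-period and regularity of the transitions.
    def assess_clock(samples):
        transitions = [t for t in range(1, len(samples))
                       if samples[t] != samples[t - 1]]
        intervals = [b - a for a, b in zip(transitions, transitions[1:])]
        if not intervals:
            return None                             # no activity observed
        mean = sum(intervals) / len(intervals)      # half-period, in samples
        jitter = max(intervals) - min(intervals)    # crude regularity measure
        return mean, jitter

    clock = ([0] * 5 + [1] * 5) * 8                 # a perfectly regular bit
    print(assess_clock(clock))                      # -> (5.0, 0)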

3.4. Operator/operand/operation

In physical machines, the distinction between operators and operands is not a question. But, in the digital world, "operating a representation" and "representing an operation" could be strictly synonymous. Then there remains no difference between operators and operands, programs and data, etc. Not only is the informational machine reversible (like the perfect thermal machine), but there is reversibility between machines and what they do!

Nevertheless, in the general case, we can note:
- a difference in repetition: the operator does not change at each cycle of operation; in DU, normally, there is no wear; in some way, an operator always "massifies" the beings it processes;
- an operator is supposed to be heavier.

Note. The fusion between operators and operands demands that a limit has been reached where it is possible: beings must be light enough, de-materialized, to follow the move without delay or deformation; and operations, through sensors and actuators, must be reduced to operations on digital beings... Moreover, general digitization was a basic requirement.

When you enter a lift and push a button, your gesture may be considered as well as an operation on a representation (the floors, from low to high), or as a representation of your intention: the floor where you want to go.

One could say that a basic dissymmetry remains : the initiative comes from me. But it uses the initiative of the lift builder...

3.6. The basic "agent"

We give a particular importance to a basic "agent", inspired from the finite automaton, which has a good side : it is widely studied, not only by automation specialists but also by logicians and, at large, by digital systems specialists. See the book by Sakarovitch.

3.6.1. General principle

The description that we give here calls for a deeper one, in order to link it to the general theory of finite automata, with its inputs, outputs, internal states and operating cycle. We shall frequently abbreviate the automaton as S.

(That scheme could be adapted to the four causes scheme, with the efficient cause as I (which lets in E and gives its function an auto-mobility), the formal cause as type, the material cause as implementation (in the DU raster) and finality as output (that is only partial).)

Materially, an automaton is a binary being, with its beginning and end addresses. Moreover (applying the finite automaton structure), we may part it into three regions, I, E, O. The O region groups the bits to be emitted and the addresses. In the computations, we will admit that the I bits are at the beginning of the being and the O bits at the end.

All beings internal to a processor communicate with the external world only through the I and O zones (or ports) of the containing processor. Then a processor firmly defines an internal zone and an external region. And the more a processor is developed, the more it controls its ports. More firmly, but also more intelligently (open systems).

A being is seen from outside (image) as a part of DU, with a succession of I (non-state region), E (enclosed region, not accessible from outside) and a region O with its states accessible from outside, either at any moment or at intercycle moments (synchronous model).

The agent operates by cycles. Within each cycle, each processor operates according to its internal laws. Then, transmissions occur between beings. In that elementary model (sketched below), we consider that :
- the cycle duration is sufficient to let all internal operations take place,
- there are no communication limits between beings, but for distance limits (see below).
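
A minimal sketch of such an agent (in Python; the laws f and g are placeholders for the internal laws of S, not a specification of them):

    # Sketch of the basic agent: input I, enclosed state E, output O,
    # stepped once per cycle by its internal laws f (state) and g (output).
    class Agent:
        def __init__(self, e0, f, g):
            self.E = e0                  # enclosed region, not visible outside
            self.f = f                   # next-state law  E' = f(I, E)
            self.g = g                   # output law      O  = g(I, E)

        def cycle(self, I):
            O = self.g(I, self.E)        # O and E' computed in the same cycle
            self.E = self.f(I, self.E)
            return O

    # Example laws: E counts the 1-bits received; O is the parity so far.
    a = Agent(0, f=lambda I, E: E + sum(I), g=lambda I, E: (E + sum(I)) % 2)
    print([a.cycle([1, 0, 1]) for _ in range(3)])    # -> [0, 0, 0]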

3.6.2. A minimal model

Let us try to elaborate a minimal model of "core" for an automaton, with B = the clock plus the constant part, unchanged, or possibly copied from cycle to cycle. B may be considered as the program running the automaton, or as the parameters of its class. B includes the e and f functions, or pointers to functions that can be external. Somehow, the operating system.

By construction, as long as this automaton remains itself, B is constant, but E changes according to the inputs I.

We say nothing of O, which does not matter at this stage (as long as there is no remote random looping on I). B could be "subcontracted".

Tes.jpg

(this scheme should be taken also in ratios, with typical cases).

If S is really finite, there is nothing else, in bit numbers :

Te = B + I + E (possibly + a margin),

taking each letter here for the bit count of the corresponding zone.

The loss bears not only on S (operating bugs) but on B. Then, after a while, the system no longer knows how to reproduce itself correctly, and dies.

We can add into B and E some redundancy and auto-correcting codes, but that increases their length, hence their relativistic frailty. And all the more since we must also extend B in order to integrate the coding/decoding algorithms.

Then everything depends on the relation between the two growths :
- the loss probability on length, as a function of Te,
- this length itself, which depends on the auto-correction rate.

The reasonable hypothesis is that auto-correction is profitable up to a given level, which we shall call optimal. Hence, we shall consider that the relativistic loss rate at this level is the net rate.

Once that rate is known, we can deduce from it the probable life expectancy of the system (excluding the other possible destruction factors, like external aggressions for instance). A sketch follows.
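
A sketch of this trade-off, under invented parameters: every bit flips independently with probability p per cycle; an r-fold repetition code corrects a logical bit unless a majority of its copies flip; the being dies on its first uncorrected error. The sketch deliberately leaves out the penalty on physical length (the extra decoding machinery to be added to B), which is why life only grows with r here:

    # Sketch: life expectancy of a being of Te logical bits, protected by an
    # r-fold repetition code with majority vote (independent flips, prob. p).
    from math import comb, expm1, log1p

    def expected_life(Te, p, r):
        # probability that a majority of the r copies of one bit flip
        p_logical = sum(comb(r, k) * p**k * (1 - p)**(r - k)
                        for k in range(r // 2 + 1, r + 1))
        # probability that at least one of the Te logical bits fails this cycle
        p_death = -expm1(Te * log1p(-p_logical))
        return 1 / p_death               # mean number of cycles before death

    Te, p = 10_000, 1e-4
    for r in (1, 3, 5, 7, 9):
        print(r, expected_life(Te, p, r))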

We may also make hypotheses on I, considered as meaningful for L.

A favourable case : I brings correcting elements to E (and also to B ?).
Note about B : the limit to its reduction is reliability.

control.jpg

3.6.4. Automaton cycle.

On every cycle (the cycle concept, with its radical cut inside time, is fundamental in the digital concept), each processor changes according to its internal laws. Then transmissions occur between automata. In this elementary model (see the sketch after this list), we consider that:
- the cycle time is sufficient for all internal operations,
- there are no bandwidth limits between automata, but some distance minima (see below).
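
A sketch of this two-phase model (each agent is reduced to a step function from its inbox to one output; the wiring lists (emitter, receiver) pairs; all names invented):

    # Sketch: one global two-phase cycle. Phase 1: every automaton computes
    # from its current inbox. Phase 2: the outputs are transmitted.
    def global_cycle(step, wiring, inboxes):
        outputs = {name: fn(inboxes[name])            # phase 1: internal laws
                   for name, fn in step.items()}
        new = {name: [] for name in step}
        for src, dst in wiring:                       # phase 2: transmissions
            new[dst].append(outputs[src])
        return new

    step = {"A": lambda msgs: sum(msgs) % 2,   # A emits the parity it receives
            "B": lambda msgs: 1}               # B is a pure source
    boxes = {"A": [], "B": []}
    for _ in range(3):
        boxes = global_cycle(step, [("B", "A"), ("A", "B")], boxes)
    print(boxes)                               # -> {'A': [1], 'B': [1]}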

Possibly : a being may have its own clock, in order to sequence the internal operations during the cycle phase.

Then we can consider that E contains a part of B which does not change from one instant to the next, and which supports the invariants of S, including the auto-recopying functions from one instant to the next.

If we have A = B + E, we assume that GCom recopies B and the whole automaton according to elementary technologies. Then we need an automated addressing, with addresses of a length about log(B + E), i.e. log A.

If A is small, the address is relatively large (e.g. a 2-bit address for a 2-bit being, but a 10-bit address for a 1K being).

We see that framing/rastering is profitable, would it be only for this reason.
In a comparable way, we shall have op codes pointing on op codes.
GCom must grant the transitics, i.e. the transport inside A as well as between successive instants, then the "rabattement" (the folding down onto one dimension).

The relation B/E (neglecting the relation to I/O):

BplusE.jpg

We reach here hierarchy, the nesting of sub-automata. As soon as there is a processor in A, their relations are defined by IEO(A). It is mandatory :
- organically, to assign to it an existence region in the string, at a moment in the processing (rabattement); if DF, delay/reliability parameters ;
- functionally, to give it an external description (type, IO addresses).

TE is the rabattement on one dimension of all the zones of an S (what if it is infinite ?). TGG is the rabattement on one dimension of the whole of MG.

The conditions of this rabattement may be discussed.

gap22.jpg

3.6.5. Being identification

We have a sequence of outputs O, and we try to describe them, i.e. we design an automaton which generates these O.
This operation will generally be stochastic, heuristic, with an error risk (see the sketch below).
Symmetrically : rational identification of the World by the automaton.
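
A hedged sketch of such a heuristic identification: model the O stream by a table mapping the k previous bits to the most frequent next bit (a crude stand-in for reconstructing the emitting automaton), and measure the residual error risk:

    # Sketch: heuristic identification of an output stream O by a k-gram
    # predictor; 'risk' is the fraction of bits the learned model gets wrong.
    from collections import Counter, defaultdict

    def identify(stream, k):
        table = defaultdict(Counter)
        for t in range(k, len(stream)):
            table[tuple(stream[t - k:t])][stream[t]] += 1
        model = {ctx: c.most_common(1)[0][0] for ctx, c in table.items()}
        errors = sum(model[tuple(stream[t - k:t])] != stream[t]
                     for t in range(k, len(stream)))
        return model, errors / (len(stream) - k)

    O = [0, 1, 1] * 50                         # a periodic emitter
    model, risk = identify(O, k=2)
    print(risk)                                # -> 0.0 (the period is captured)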

We can distinguish bodies with addresses. We admit that all the S have for I the totality of DU.

Find E and the f and g functions through I and O.

A very important stage is when S is able to know which O has an impact on its I, and possibly to experiment systematically, generating different outputs to see what it gets. At some level, the mirror test and the emergence of the I : emergence is the moment of the distinction of I from D (what is perceived of D) (???? what is D?). To identify O, S needs a mirror. We shall have the global O, image of the self for the external world. Inside this global O we will have the explicit O (gestures, speech, attitudes).

Automata functions are not reversible, in general. It would probably demand a very particular automaton to obtain reciprocal (or symmetrical) functions giving I = f'(E,O) and E = g'(E,O)...

3.6.6. Being operating mode

If there are no inputs, the operations are necessarily cyclical, but the cycle may be very long, with a theoretical maximum given by the variety of S, or more exactly by the variety of its modifiable parts (deducting the basic program).

(???) For instance, if b(E) is larger than 30, in general the function E = f(E) will not run through such a path. But there can be a lot of nested sub-cycles. For these sub-cycles to differ from the general one, there must exist some bits external to the cycles, for instance a counter.
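
A sketch of this closed phase: iterate E = f(E) and record the visited states; since E is finite, the trajectory necessarily closes, with a "tail" before the loop and a cycle length bounded by the variety of E:

    # Sketch: without inputs, E = f(E) on a finite state space is eventually
    # cyclic. Return the tail length and the cycle length of the closed phase.
    def closed_phase(f, e0):
        seen = {}                        # state -> time of first visit
        e, t = e0, 0
        while e not in seen:
            seen[e] = t
            e, t = f(e), t + 1
        return seen[e], t - seen[e]      # (tail length, cycle length)

    f = lambda e: (3 * e + 1) % 16       # an arbitrary law on 4 bits
    print(closed_phase(f, e0=0))         # -> (0, 8) for this particular law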

Nevertheless, new things may appear in such a closed phase, if S analyses itself and discovers :
- empty, useless parts,
- better ordering methods (defrag),
- some KC, as far as we have generators,
- parts.

Here, we must look for a relation between the input rate and the drill depth. There is a normal way of operation of S for a "typical" I rate. When I stops, S goes on to deeper and deeper tasks, until it reaches the root. Thereafter, S is bored.

What about an absolute reduction limit, and a relation between root and KC ? That could be studied on a given E, with its modifiable part. But also on a given DU with its contents.

We could also speak of some kind of "internal inputs", if the organic interior of S has with the environment other connexions than the I proper. That should be modelled cleanly, for instance with a partition of I into levels of different depths:
- operational I : day-to-day life,
- ....
- superior I : consulting, development.

On the opposite, if for a long time the inputs are very strong and never let S go down in depth, there is a sort of overworking, of overheating, and S will ask for some rest or sleep (preventive maintenance), or else it breaks down.

But, in this internal work, going towards the root, S may find new patterns.

3.7. Some beings defined by matter/finality

A functional typology may demand organic features :
- a minimal size, or a maximal one, sometimes strictly defined,
- a maximal admissible distance to the processor (due to transmission times).

3.7.1. Message

Materially : mobile. Form : defined by format/language. Finality : action on the receiver. Efficient cause : the emitter being.

A message is in duality with beings. We could think of a "message space", with particular openings on GG. A message has a particular kind of address, near to a stream. It moves, and GOS or a subsystem manages the move.

3.7.2. Program

Materially : a passive being inside S, or called for when demanded. Formally : written. Finality : function. Efficient cause : the call to the program.

"An action, the result of a control or regulation operation, is programmable if it can be determined from a set of instructions. For that, the transformation to be controlled must be determined, and a complete model of it must have been established." Mélèse 1972 (translated from the French).

"The object of programming is the final transcription into machine language of the combined results of functional analysis and organic analysis." (Chenique 1971, translated from the French)

3.7.3. Process, processor product

Materially : fixed, inside a processor. Form : defined by the program. Finality : generally an output. Efficient cause : triggered by I or by the internal S cycle.

To be defined in duality with processors.
The role of time : by definition, a process is partly non-permanent; in any case it has not the same type of stability as the processor, which supports the processes.

How is a process organized ? Organic and functional definition ? In a descriptive system, as with software components, the organic side would be the interface representation, and the functional side the methods in the components ? Or vice-versa ?

See services.

3.7.4. Pattern, form, force

These concepts are “open” and vague as long as they are not defined precisely in some formal context.

3.7.5. Behaviour, emotion

At the low levels, behaviour dissolves into matter. At the highest level, it is part of reason.
Architecture of behaviours : hierarchy, sequence, nesting.

A behaviour may be innate or learned. From an IOE standpoint, that does not matter very much.

Innate means : settled at the being's creation. A type is instantiated (or a genetic combination of types) which previously had behaviours.
This behaviour may then be kept as an internal resource (accessible by pointing to the type) or, on the contrary, the corresponding code is copied into the new being.
At first, that has no importance but for performance.
Later, one may imagine that the being pushes on its integration, changes this behaviour, overloads the code... then moves away from genericity.

Other behaviours may be acquired/learned, with diverse degrees of autonomy in the acquisition:
- simple transfer (but different from innate insomuch as it is built after, and independently of, the creation of the being),
- guided learning,
- autonomous creation of a new behaviour.

Example : the apes learning to wash potatoes.

Robots.

Emotion. More passive than behaviour. May be of rather low or high level. Emotion is a kind of pattern.

3.7.6. Culture

Materially : like patterns and behaviours, but larger. Form : idem. Finality : global L, plus pleasure. Efficient cause : learning.

The set of patterns owned by a being.

What can be transmitted from one being to another being's core using only its input channels. Possibly with feedback systems (the learner sending acknowledgment signals to the teaching system).

3.8. Identity, digital from the start

3.8.1. No identity without digits

Literature about identity, for instance Kaufman [9], Halpern [10] or Wikipedia, never refers to technologies, and certainly not to digital technologies. Even Keucheyan [11], who devotes a chapter to "the social construction of identity", and another one to the "community of Matrix fans on the Internet", never digs down to their digital base. Nonetheless identity, even before being psychological or social, is digital from the start, since it is built around a personal digital code, a number written in base 4 : our DNA. Then, as soon as we acquire a social identity, through family and registration, it is no less digital, since it is written in first and family names, another string of digits, and present in more and more digital information systems.

Nowhere is this digital basis of humanity more strongly asserted than in the first chapters of the Bible. First, the importance of genetic species : "Let the earth bring forth the living creature after his kind" (Genesis I, 24); then the role of man in applying names to species : "The Lord God formed every beast... and brought them unto Adam to see what he would call them" (Genesis II, 19). Saint John endows language with the highest status, in his famed Gospel preamble : "In the beginning was the Word" (John I, 1).

Human life can actually be described as a long search to make these two ends meet, the innate base (the biological code) and the social project (the civil status), aiming to make our identity really our own. "Become who you are", says Nietzsche (after Pindar and Seneca). A difficult task even for God, as John says again : "... the World was made by him (the Word), and the World knew him not" (John I, 10).

3.8.2. A purely digital model of identity ?

It would be exciting and probably useful to build a purely digital model of identity and of its augmentation, around such ideas as :

- the evaluation of the "strength of identity" of a being, based on the (digital) complexity and specificity of its features, hence its "recognisability" in a digital space;

- the paradoxical aspects of identity, as Lévi-Strauss [12] says for instance : "One of the paradoxes of personal identity is that it expresses itself by adhesion to groups and the crossing of collective identities". This could be modeled with networks of sufficiently sophisticated automata;

- the fundamentally recursive nature of identity, with mirroring effects as seen for example in the Hegelian master-slave dialectics [13], Lacan's "mirror stage", the Cartesian "I think, therefore I am" or even the scholastic proofs of God's existence and identity (aseity) ; a formal (mathematical/logical) approach could be particularly promising, since reflexive deductions (diagonalization) are basic for the Gödelian "undecidability" principle and... for computer structure itself [14].

Let us now focus on recent decades and the DAI progress up to now, from the modern views dominating in the 50's to the postmodern explosion giving it a name and a global perception since the 1970's. The dimensions of a conference paper, of course, allow space only for a broad-brush approach. To begin with, our neat division between the two successive stages must more properly be seen as the dialectic relation of two mental and social "paradigms" that have more or less co-existed from the start, even if, of course, the prefix "post" assumes that one followed the other. We shall conclude with some views about the future, as far as it is predictable.

3.10. Internal/external

An important point is the relation of a being with its metas, internal and external :

- internally : impossible on 1 bit, possible with 2 ; externally : if the meta is much longer than the being, it is an address, and the being is an address or a pointer ;

- materially : interdiction to address there.

epistemo3.jpg

action_on_system.jpg

internal_external.jpg

Transparency

An example on VGA colors

On a .bmp file :
- scan : predict the color of the next pixel, or generate the array globally ;
- do it on 16 colors, to be compatible with Roxame ;
- on every point, compute and compare with the getpixel value.

Model 1, without constraints on the program length.
If mem = 1 bit :
can there be false memories hiding...
The program has the choice between two algorithms to compute the next bit.
These two algorithms may be changed ; they could be mere bitmaps.

E.g. read the first bit, then do not take the following ones into account.
If the two models are rightly chosen (from outside, or from the two first readings), in particular if one of the two represents a very frequent bitmap, the method may be rather good.
If the bitmaps are generally made of large monochromatic blocks, this is a good repetitive solution (see the sketch below).
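
A sketch of the simplest bet (repeat the last colour), scored as the fraction of correctly predicted pixels on a 16-colour image scanned row by row:

    # Sketch: predictive raster scan of a 16-colour bitmap. Strategy: bet
    # that the next pixel repeats the previous colour; count the successes.
    def repeat_last_score(image):
        hits = total = 0
        prev = None
        for row in image:
            for colour in row:                 # raster scan, row by row
                if prev is not None:
                    hits += (colour == prev)
                    total += 1
                prev = colour
        return hits / total

    img = [[3] * 60 + [12] * 4] * 40           # large monochromatic blocks
    print(round(repeat_last_score(img), 3))    # close to 1 on such images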

If mem = 2 bits, one can :
.A extend the preceding method to 4 images ;
.B extend it to frequent models by blocks of 1, 0 or different zones ; in that case (and in the preceding one, lightly upgraded) we are led to the pattern recognition issue, with bets on the form changes.

A B1 model, where the form would be the colour, in blocks, non-dithered ; the prediction is then the repetition of the last colour.
A B2 model, in grey, with black, white and two greys.

Study of the transitions from black to white,
if two bits may code a transitional state.
I want a pattern number (1 bit)
+ x transition bits :
. if OK, stay ;
. else, do not change, but store the new values (4 bits) and predict as above ;
if OK, stay ;
else, if the new value + the precode gives a new model :
if yes, take this model ;
else, no change.

Change the transition register (4 bits)
and place the position bit.

In total, this demands a 14-bit memory.

The VGA color system forms with 16 + 16 ;
theoretically, 5 bits give them,
but I have no data about the transitions.
Look for smaller algorithms.
We could try to make "hidden" beings appear through more or less active operations...


Expression and production

In these schemes, we admit that the expression is the production; but it could be a very different thing (with an ad hoc program, if the memory is changed). For instance, with a set of light colours, overprinting in black the error points (one could go out of the contouring).

Note also : when binary word values are looked for, everything becomes false, or difficult, since a reference may point to a byte as well as to an enormous being.

Autonomy

Autonomy calls for a rather sharp parting of an "internal" part of the being from an external one. It is a characteristic feature of life (Bailly-Longo). Mechanical and electronic systems, for safety and efficiency reasons, also tend to close themselves inside casings and screens. Privacy.

On the other hand, electronic communication tends to cross the skin barrier, mainly for medical exams, but also more and more to reach new control modes, at present for disabled persons only. Here also, some form of convergence between natural and artificial beings is at hand.

A recent example has been widely publicized : the control of a walking robot by the brain of a remote monkey (Miguel Nicolelis ; see for instance Wikipedia). See the notes in my Identity paper in Laval.

In the future, my privacy may be little connected to my skull or home, but rather to some borders and laws in the "external" digital world. See Daniel Kaplan's book. Notion of property.

That leaves open the question of internality/externality in DU proper. That seems to be resolved in the address system. The external world is accessed through I and O, the internal one through E. Considered as a 1D string, the being has borders at its beginning and end addresses. Internal addresses must point between them, external ones outside of the interval. The notion of "membrane" has no meaning at the basic level. It could have one at higher levels of the addressing system.

Varia

To see also : internal/external as a difference in control. Referring to the four causes :
- Materially : interiority is easily defined (under some topological conditions), specially in 1D. Reinforced by membranes and free zones.
- Formally : given by the pointer and formal rules (OO programming). In some way, form contains matter (but what does that actually mean?).
- Finality : not pertinent (?). Altruism vs. selfishness.
- Efficient : external beings cannot operate directly on the internal parts.

Lifecycle : see 4.5. about L

4. Time

How far is time digital ?

Day and night. AM/PM. Early/late. Calendar. History.

Beginning/end. Past/present. Instant

Antique hours. Equal hours

Time may be considered as just one more dimension of DU, but with particular features, beginning with its irreversibility.

Instant and bit. A cut, or a small part of the time line ?

Time in DU is strongly :
- oriented ; at low level by cycles ; at high level by Moore's law ;
- cyclical (depending on the level).

Clocks

Part of the digital devices have their own clock, and this base is operated by the elementary clocks. Hence, Bailly-Longo offer depreciative appreciations of machines : digital time would be "a Newtonian absolute" and "the time of the myth, proper to Greek poetry". In fact, it is only in a particular case, "real time" programming, that this clock time plays an explicit role. Normal operations use standard time references (which are digital, of course).

At first sight, this dimension seems orthogonal to the space dimensions. But, as a lot of operations are time dependent, this must be elaborated, as Einsteinian relativity has shown for the physical world.

This model is simple only if we admit that there is a general clock beating for the whole universe. But actually, a lot of beings have their own clock, and the synchronization necessary to communication may introduce some sort of continuity.

Time in the basic automaton model : a given being may be considered as a sequence of states, with possibly a starting state (original state) and a final state.

The clock. A bit is enough, if it is random.
But to assert a clock's autonomy is to assert its difference, its comparison and its randomness in relation to the cycle of the accessing being. And, to see that, one bit is not enough : we need a succession of bits in time.
Number of cycles in a system (see Bailly).
Temporal mass in R4 : the integral of masses through time.

Representations of time

Representations of time allow prediction and history, and bring some kind of reversibility.

More and more digital along the centuries : the sundial, then equal-hours clocks in the Middle Ages (check).

We could see DU life as put in rhythm by a general two-phase cycle. In one phase, each processor runs its program, i.e. processes the O(I,E) and E(I,E) functions. In the other one, the O's are transmitted to the specified I's. We can consider :
- that this transmission function is operated by a general Com program,
- or that each processor emits towards all the others, each one taking what is useful for it.

The choice between the two systems is more an optimization than a fundamental question. If there is a general Com program, it has to know where the processors and the free regions are. Actually, with a smart Com, we could have all sorts of beings in the general space, with protections, etc.

Time, as well as all the beings and GG, may be rastered. Periodical structures. More elaborate structures (variable length).
Classical case : signal sampling.

Transversality with respect to time

time_transversality.jpg 

Time_depth_complexity.jpg

Time depth and complexity (?)

Temporality and scheduling

DU life is also digital, with changes taking place in a null-duration instant, and states before and after. It is controlled mainly by the clocks incorporated in part of the beings. The clock is the minimal agent.

DU could be considered globally as an agent with its own clock, possibly even with a unique clock, controlling or triggering all the other clocks. But that is not possible, due to DR, which prevents control at long distance.

Nevertheless, it may be practical at times to think of a generally controlling GOS, operating in two phases. In the first one, each being executes its own functions O(I,E) and E(I,E). In the other one, the O's are transmitted to the I's.

And that is necessarily true in some subspaces, and specially in any being, at least in a being core with its clock and basic functions. Large beings could have non-synchronized parts (for instance larger memories).

Time, as well as any form in DU, may be organized in larger structures, periodical or not. We must think that GOS has something like an eternal life. Bits also. Intermediate structures will have finite life cycles.

In the automaton model, time is oriented : O's follow I's. A program is precisely the scheduling of functions by succession, with possible parallelism.

4.5. Life cycle of a being

Part of the gap : the ontogenesis duration, which is not negligible in relation to phylogenesis, at least in the recent periods. Somehow, an important moment in History is when the most evolved beings have an ontogenesis time of the same order as the historical events, as several phylogeneses. Then they see the world changing. (A man is eternal for a rose (Candide, by Voltaire ?) : "de mémoire de rose..." — "in the memory of a rose...".)

For a rather simple machine, the life cycle (the reliability "bathtub curve") is largely flat. For complex machines, at least some of them, there is a surge to maturity, then decay. That is true at least for those which have complex relations with their environment, since the environment itself has to adapt. The decay is due not only to the wear and tear of the being, but also to its obsolescence caused by the evolution of the environment (and by better system models emerging).

The stability of a given family is artificially maintained beyond its natural obsolescence (material or functional) by its functional performance for a population of users.

Cycles may be nested in larger cycles, or be in succession.

If E is very large, it structures itself progressively, by auto-structuration (innovation, research) or by technology acquisition. Intermediate solution : learning.

After that, we can assume, more realistically, that there are "synchronous (sub-)spaces", beyond which some sort of synchronization is necessary (see DR).
Inside a synchronous space, a clock sets the time. It may be given by a simple looped gate. By definition, an elementary time in this space is then a cycle of this clock.

4.6. Varia

Behaviour has a time structure.
A method is a structure in time (succession).

Music, cinema, dance, clock.

Cinema : continuity reached by a succession of still images.

Real time.

5. Relations between beings

How do beings relate to each other, and what manages the relations ? To make it simple, one could say that all that was at first enforced by the laws of matter (attraction) and that, as far as DU develops for itself, the relations come to be enforced by "soft" laws.

Inside a biological body, the main coordinations are supported by enzyme circulation and by the nervous (digital) system or, more exactly, systems, since the human body, for instance, has several of them, including specific (sub)systems for cardiac control, for instance. The control functions are then distributed between the central system (the brain), various local systems, and finally the cell control systems.

In a computerized system, links are ensured electronically through wires, optically (mainly fibre optics) or wirelessly. The main control is concentrated in a central unit, hosting an "operating system", but :
- with important unit controls for various devices such as disks and more or less integrated "peripherals",
- sometimes with a radically non-central type of control, a "connexionist" view of things, mainly with "neural networks" (at present, the only practical way of implementing these networks is to program them on classical computers, but that is not a real breach of the theoretical principle of non-centrality).

At the most global level, it is tempting to imagine something like a "general operating system" managing the totality of communications. The web, Google, the rather permanent remote control of computers seem to go this way. Orwell's threats, or a new kind of Brave New World ? See below what we say of digital relativity.

There are two main elementary structures : hierarchy and loop. They are minimal in the sense that a minimum of relations is necessary to build them : n relations for n elements in loops, n - 1 in hierarchies. They also correspond partly to order and equivalence relations in set theory.

Loops bear complexity, reflexivity, recursivity. Hierarchy bears little "complexity" because, at least in principle, one can always deal only with the higher levels.

We could find useful examples in sociology or markets, not difficult to transcribe. We can also find models in the web and computer networks, in anthills and beehives.

If we describe the structures induced by communication, we can use a representation with graphs. A simple "A and B are related" implies no direction nor periodicity. The relation is transitive, symmetric and reflexive (it is useful to say that A is in relation with itself). Then, such relations define an equivalence relation, and a partition into subsets of beings communicating with each other (see the sketch below).

The relations may be oriented (emitter/receiver, for instance), and valued, according to the number of bits exchanged, by cycle or set of cycles, or to the distance between beings.
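
A sketch of the induced partition (a naive union-find over the symmetric "communicates with" relation; all names invented):

    # Sketch: "A communicates with B", taken as reflexive, symmetric and
    # transitive, partitions the beings into equivalence classes.
    def partition(beings, relations):
        parent = {b: b for b in beings}
        def find(x):                          # follow the parents to the root
            while parent[x] != x:
                x = parent[x]
            return x
        for a, b in relations:                # merge the two related classes
            parent[find(a)] = find(b)
        classes = {}
        for b in beings:
            classes.setdefault(find(b), []).append(b)
        return list(classes.values())

    print(partition("ABCDE", [("A", "B"), ("B", "C"), ("D", "E")]))
    # -> [['A', 'B', 'C'], ['D', 'E']]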

A being is totally controlled by another one if it communicates exclusively with it. This could be made more precise.

Between high-level beings, the relation may be "communication" at a series of levels.

5.1. Hierarchy and nesting

Note : that is not specifically digital !

There is a first, basic (trivial) hierarchy induced by the (law of) non-intersection of beings other than content/container. Then we have bits at the low extreme and the global DU at the highest, with systems of parts, subsystems, groups, etc. in between.

The creation of a hierarchy is done to delegate functions and thereby to relieve the high-level being. Then the domination relation is transitive, and the tree grows.

The simplest solution puts all the elements along one line, each one dominating another and being dominated by another (except for the extremes, of course). Here, the number of hierarchical levels is equal to the number of elements. This solution is rarely optimal, and the domination risks being uncertain beyond several levels of difference.

Another very simple solution is to have one dominant element, and all the others below it (here also, generally ordered).

If S receives a strong flow of unpredictable I (but somehow "interesting", i.e. not totally white noise), it may consider that this flow on its I is sufficient for its own L, and accept a sort of dependence. Some S'' could say that S is henceforth predictable as a mirror of S' (if S does not add something). That could be a risk for S, if its survival depends on its unpredictability.

The number of levels, as well as the hierarchical span, are not arbitrary if the structures have to be viable. Thresholds limit the number of subordinates to one authority, or of neutrons in a nuclear core. The mass defect according to atomic weight may even be taken as the search for an optimum by Nature.

Narrow spans (from 7 to 12) fit hot relations and rich interrelations between human beings. Pleiades and cenacles. Stars and polyhedra. Beyond seven, the notion of plurality is nearly lost.

But the survival of these structures calls for a supporting space, and matter. Continuous space is a limit, like the liquid or the perfect gas. Concrete space always bears some grain, of measurable size. You can count the number of grains in a paper sheet or a mud block. In this case, a very large hierarchical span will generally be looked for, in order to get flat forms, white sheets, liquid marks and stable results in votes. A lot of technological processes aim at the creation of this gap, sometimes destroying pre-existing structures.

Beings also use nesting and layering, somehow analogous to hierarchies in matter, but more supple. If a same physical region of DU is called upon by two beings to create different beings, that goes not without problems if it happens at the same moment. As long as one of them stays virtual, it is no more problematic than the several millions of maps for the same place on Earth.

Functional nesting

Let I be a sequence of beings i1, ..., in, with the same O. We say that the operation splits as follows :

first, for each k from 1 to n :

Ek = fk(ik) (there may exist several) and ok = g(ik) ;

then the functions e = f(e),

then the functions o = g(e).

It may be important to choose a right order of operations (lexicographic order, for instance).
Then a typology of functions (assignments, operations... functions properly so called).

5.2. Loops

A catalogue of circularities:
- iteration loop (programming)
- feedback loop (cybernetics)
- C++ "this"
- paradox
- conscience
- auto-reference
- mirror; not a loop in itself; more generally, effects of O on I (at the following cycle).

Self loops. Cycles (see time).

5.3. Dependence, causality

Of course, independence.

5.4. Communication

5.5. Operations

The operator/operand distinction, or their merging.

Create/instantiate
Blend (pictures)
Com
Maths
Text and image operations
Material operations: move on the raster, cuts.
Behaviour operations.

6. Representations

6.1. Generalities

Representations are a kind of being inside DU.

Representation per se (by construction) or per accident (trace).

In the traditional world, an enormous ditch separated representations from the represented beings. From the mass and energy standpoints, there was no common scale between a text or a painting and the person or the country described. The difference was also born of the difficulty, and often the impossibility, of duplicating the thing, when a copy of the representation was comparatively easy. (That is of course less true for sculpture or the performance arts.) "The map is not the territory" was a fundamental motto of any epistemology.

We must, specially here, "de-anthropomorphize"! We do not deal here with representations for human beings (see the special chapter), but for systems in general : finite automata, computers (not seen in their physical aspects)...
A major point : text. The normal way human beings use it is through writing or sound. In both cases, these documents include a lot of "decorative" aspects that in principle are not meaningful, not a real part of the text. But how far, inside a natural language or not, can we reduce things to a (standardized ?) minimal expression ? A residue ?

In DU, as long as the connection with the physical world is not needed by human beings, this gap vanishes. Representations of beings may have a role of mass reduction for easier use, but they remain in the low physicality of electronic devices.

In parallel, human beings have learned to enjoy representations, up to the point of living in purely electronic ("virtual") worlds, where representations stand for themselves, and the maps no longer need a territory to be valuable, though they call for appropriate scale changes for human efficiency or pleasure.

External representations are made by a being to be transmitted to others. In this line, a representation is a kind of message from an emitter to a receiver (possibly multiple).

Internal representations are beings referring to other ones, and making it possible to use these ones in a more economic way. In the classical discourse about computers and information systems, the reality is outside, and the value of representations depends on this external world. In DU, a major part of representations stand for themselves and are used for themselves. Art was a precursor of "second worlds".

A virtual being is not a representation. It is a being without a location in the material DU, even if it mimics real beings. It gets a form of reality through :
- its presentation to, and interaction with, real users or, better, a community of real users,
- its persistence in time,
- (in the physical world) its presence on a real URL.

Representations as such are passive and dependent beings. They are "read", or "looked at", when that is better than a direct access to the represented being, or when the represented being is not accessible : historical beings no longer existing, distant beings whose access is long and costly, etc. They must not exert an autonomous activity, except in the case of "simulation".

Representations are created in two cases :
- when S needs them for its own use ; if it is for future use, the representations are stored in memory and become part of S's knowledge, or "culture" ; in the normal case, these representations are compact and structured, in order to reduce storage costs and to facilitate use ;
- when S has to present things to other beings ; in this case, these representations can be short or, on the contrary, developed to cope with transmission errors or to conform to the receiver's needs or desires.

There are basically two ways :
- from the intrinsic nature,
- from some sort of scanning or artificial compression.

If the represented beings change in time, their representations must conform to these changes. There are several ways to obtain this result :
- regular access to the being, in order to obtain the new information,
- knowledge of the laws of change according to time, and computation of the new values when using the representation,
- integration of a specific clock in the representation, which will change in a way conforming to the original.

In the last two cases arises the possibility of errors, if the law of change is erroneous or not sufficiently precise, or if unpredicted events affect the status of the represented being.

Metrics of representations : e.g. the number of bits in the representation divided by the number of bits in the represented being.

The simplest modes of reconstitution are :
- a pointer on (the address of) the being itself
- a bit per bit copy of the being.

These two modes are sometimes impossible or of limited use :
- by address : the being is located far away, or does not exist any more, or not yet (it is a being to be made now or later) ;
- by copy : the being is heavy, possibly heavier than the totality of S itself. Moreover, for some actions/processes, what matters lies in certain features of the being, which would have to be computed each time.

Then it is appropriate to replace the being by a representation :
- shorter (generally much shorter),
- more appropriate to the actions involved.

Nota : see in 1.4. notes about analog and digital representations

6.5. Numbers and data

galvanomètre.jpg

A typical example of a basically analog device, but leaving it to the (human) user to digitize, i.e. to express a figure (here, in practice, with two meaningful digits). Nevertheless, such a device is sometimes used "analogically", when the user draws on the movements of the needle (rapid or slow moves, oscillations...).

In the control room : see Louis Richaud, "En salle de contrôle, tableau de bord de la production" (in the control room, the production dashboard), Informatique et gestion no. 64. Ergonomics, digital.

User psychology (neurosciences) is fundamental in analog/digital HMI. The Sainte Odile Mount case.

It does not play the same way with purely machine systems, where the best is (more and more) to digitize the data as near as possible to the acquisition point, and to go back to analog only at the effector level.

Numérique, numérisation, the French translations for digital, digitization (and scanning), are misconceptions, imposed by the French linguistic authorities to translate the Globish "digital". Indeed, the binary codes we use to store, process and transmit text, image and information in general are not numbers. They accept neither the structures nor (a fortiori) the operations valid on numbers. They are actually codes, ciphers. And, at the end of the day, the Globish term "digital" is better than any other. All the more (from a French standpoint) because it is built on a Latin root, digitus, the finger, which provides a sound etymological basis for the intended meaning.

But analogy comes in also at a purely formal level. By itself, the bit order, from weak to strong bits, is a form of analogy. Actually, the internal representation of integers breaks this analogy in order to enhance performance and reliability.

A number is not a digit. It may be represented by a set of digits. But that is not always necessary to use it, or even to compute with it, if one stays within the limits of formal calculus. (See Bailly-Longo.)

N : one integer defines one (positive) integer.
Z : elements are defined by two integers.
Q : elements are defined by two Z elements, hence four integers.
R : more complex.
C : complex numbers.
H : quaternions.

The DU model can build the real line.
The representation of a number may be more or less digital.

Various representations of 8 :

huit, eight, ocho, otto, acht
7 + 1 , 8 + 0
15 - 7 , 8 - 0
2 x 4 , 8 x 1
80/10 , 8/1 , (8*n)/n (n not equal to 0)
sqrt(64)
0000000000000000000008,000000000000000000
(8, 1) (floating point)

1D vs 2D ?
The main problems are :
- convention, agreement on assembly structures,
- the way our brain operates ; rational operation is more 1D than anything else,
- reflexivity (part of convention).

6.6. Images

We think mainly of images as bitmaps. But vectorial and other formal representations of beings may be major, even in the expression process. Of course, in DU everything finally resolves into bits. But let us not confuse the bits of a formal text and the bits of the resulting image.

One can say that some images are bitmaps, and other ones are models. But then, what of vectorial images ?

What does a bitmap bring as knowledge ?
1. Some global features and parameters, obtained with simple additive processes : average color, average luminance, etc. Variety also may be roughly defined and computed (see the sketch after this list).
2. The distribution of values on the page : upper part lighter, centre more saturated. See [COQ95].
3. General parameters such as resolution, size, histograms.
4. Pattern detection and recognition ?
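
A sketch of point 1 (the image is assumed to be given as rows of (R, G, B) tuples; the luminance weights are the usual CRT coefficients):

    # Sketch: global features of a bitmap obtained by simple additive passes.
    def global_features(image):
        pixels = [p for row in image for p in row]        # flatten the raster
        n = len(pixels)
        avg = tuple(sum(c[i] for c in pixels) / n for i in range(3))
        lum = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels) / n
        histogram = {}
        for p in pixels:                                  # colour histogram
            histogram[p] = histogram.get(p, 0) + 1
        return {"size": n, "average_colour": avg,
                "average_luminance": lum, "histogram": histogram}

    img = [[(255, 0, 0), (0, 0, 255)]] * 2                # a 2x2 test bitmap
    print(global_features(img)["average_luminance"])      # -> 52.65...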

6.9. Model

The term model may be taken as :

1. just a synonym of representation, possibly simplified (comparable to a dummy, a prototype, a small-scale model) ;
2. a "system", that is a set of coherent assertions, generally with some quantitative relations ; equations are a particular kind of assertions ;
3. a "canon".

Mainly in the second sense, the model is said to explain the being if it not only describes it at any time, but also predicts the behaviour of the modelled being (possibly imperfectly), with a precision sufficient for the needs of the user. A being is said to be "explained" if there is a model of it sufficient to simulate and predict the evolution of the being in time.

The length of the model is the KC (or at least a valuation of the KC, the Kolmogorov complexity) of the modelled being. Normally, a model is shorter than the object it describes.

Then we can define the yield of a model as the ratio : length of the model / length of the modelled being, the quality of prediction being taken into account.
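
One crude, hedged way to estimate such a yield: take a universal compressor as an upper-bound proxy for the KC of the being, and compare the compressed ("model") length to the raw length:

    # Sketch: the yield of a "model" approximated by a compression ratio;
    # the zlib output length is an upper bound on the KC of the string.
    import os
    import zlib

    def model_yield(being_bytes):
        model_length = len(zlib.compress(being_bytes))    # proxy for the model
        return model_length / len(being_bytes)            # < 1 if compressible

    regular = b"01" * 500                  # a highly regular being: short model
    print(round(model_yield(regular), 3))  # small ratio

    noisy = os.urandom(1000)               # an incompressible being
    print(round(model_yield(noisy), 3))    # ratio near (or slightly above) 1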

See here the "simplexity" concept. E.g. Delesalle.

In some cases, the performance of a model may be evaluated automatically. It can evolve. There are different cases, depending on the availability of external information on the modelled domain. Or worse, when an enemy attempts to conceal as much as he can.

??? If the model operates from a source, a critical point will be the sequence length, with possibly several nested lengths (e.g. the 8 bits of a byte, and the page separators). But there are other separators than those of fixed-length systems. Find the separators automatically, etc.

bob.jpg

6.10 Varia

The classical kinds of representations are designed according to the human sensorial system. In DU, this is not pertinent as long as S does not have to interact with humans. Then, core features and behaviours matter more than appearance or spatial shape.

The key issue, when using a representation instead of the represented object, is efficiency... or simple possibility. Beyond the address/type which gives the basic access, the representation may be built "from outside", with sampling/scanning/rastering systems, or "from inside", with vectorial representations or other formally linguistic representations. The traditional opposition of geometry to algebra, as explained for example in the Weil/.... debate, is based on different human apprehensions of the world. The algebraic one transposes directly into DU through formal languages (indeed, DU was born largely out of the algebraic effort), but the classical geometrical view is not directly digital, and refers to deep "gestalt" or "gestural" features of the human being, and in particular of its neural systems. Geometry, from that standpoint, could become more digital if these neural mechanisms were, in the future, sufficiently well described to afford a formal linguistic description.

For a given S, R is a representation of B if S is able to re-constitute B starting from R.


Let us say first that a complete reconstruction of B is frequently pointless, if even possible. A simplified image will often do better, as long as it is sufficient for the actions to be done.

Then, the construction of representations may be described generally as a division (or analysis), which demands an appropriate analyser (sometimes a "parser"). Symmetrically, the generation of an object similar to B will be a "synthesis", using various kinds of generators.

Repetition and concatenation

If B is a long sequence of concatenated identical strings (possibly the same bit), B may be reduced to that string times the number of repetitions (see the sketch below).
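
A sketch of this reduction (run-length coding over a bit string):

    # Sketch: reduce long runs of identical values to (value, count) pairs.
    def rle_encode(bits):
        runs, prev, count = [], bits[0], 1
        for b in bits[1:]:
            if b == prev:
                count += 1
            else:
                runs.append((prev, count))
                prev, count = b, 1
        runs.append((prev, count))
        return runs

    def rle_decode(runs):
        return [v for v, n in runs for _ in range(n)]

    B = [0] * 40 + [1] * 3 + [0] * 57
    assert rle_decode(rle_encode(B)) == B      # lossless reduction
    print(rle_encode(B))                       # -> [(0, 40), (1, 3), (0, 57)]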

A more general case is the concatenation of words taken from a dictionary, and in general chosen in accordance with a set of syntactic and semantic rules.

Symmetries etc.

- B can be deduced from a representation R (stored in memory, for example) and some rather easy geometrical operation, or filter.

A lot of such operations can be listed and, for a given S (Roxame, for example), they constitute a generation system.

The full meaning of the work comes from a combination of the elementary meanings of the elements (words, elementary forms), of the structures, and of the way they convey the meaning intended by the builder.

(And don't forget that dictionaries, as well as grammars, are built with bits.)

Multidimensional deployment

De-streaming
Theorem of losses during generation.

6.12. Self representation

- Representation of the self
- Representation of a being “inside” this same being.

That is a major point, as are all the "self", "diagonal" patterns. It points to the "conscience" issue, with its inherent paradoxes. There are several parts :
- representation of the self with its parts (mirror, homunculus), which may physically be done with a mirror and a camera, for instance, and more digitally with type and constituents ;
- representation of the self in its environment, beginning with its own address and the surrounding beings.

6.14. Autonomy of representations

A variable (in the computer sense) is a form of "meta" : the individual featured by his trace in the information system. And, when he dies, that trace is all that remains of him. The progress of IS is also a progress of each of our traces, indefinitely, with all these files, up to "behavioural databases".

Lamartine says to Nature : "vous que le temps épargne..." (thou, that time spares...) ; but time spares the verses of Lamartine, which carry to us, and forever (perhaps), the memory of his love, while the waters and rocks of the lake near Chambéry are as dumb and deaf today as they were in Lamartine's time.

Once elaborated, representations open the way to a generativity which goes beyond their original scope. That is shown by Michel Serfati about the variables and exponents of Viète, Descartes and Leibniz.

expansion_compression.jpg
Expansion, compression and the main memory units.

7. Language

7.1. Basics

language_emergence.jpg

Languages are from the start digital systems, using a set of "words" and a set of "rules" to assemble them. Even "natural" languages, since Saussure, are "sets of oppositions", which can be understood as "sets of bits".

They are powerful toolsets to create beings, abstract or representative of other beings. Articulated language was created by mankind, but prepared by a progression in the communication systems between animals. Languages are constitutive of computers.

A major, constitutive feature of languages is the fact that they are shared between populations of beings, thus allowing shorter and more efficient communications. This implies conventions, established by trial-and-error learning (animal and primitive human language), possibly completed by explicit standardization efforts (modern languages). Sharing, the externalisation of the code, works better in digital forms, due to the very small level of copy errors.

Languages can be created automatically by artificial beings (Kaplan).

Languages have major disadvantages:
- they are conventional, and not directly intuitive for living beings ; they must then be learned, or be constitutive of the systems (if one may talk of intra-neuronal language), or demand of the using machine a specific translation tool (interpreter, compiler);
- they differ from one community to another, and then demand translation, which generally entails losses (even from computer to computer).

But they have an exclusive ability : the reference to signified beings and, still more crucial, self-reference and hence recursion, which nests the finite inside the finite, and complexity into what looks at first linear.

However, languages are never totally digital. They always encapsulate some residual analogies, fuzzy parts, even leaving aside secondary aspects of speech (alliteration, prosody) and typography (font, page setting).

Formal languages BNF.
Generative languages. Balpe. Generative grammars.

Is MIDI a language ?

A programming language does not contain assertions. Declarative languages are for AI.

Text and image may be rather firmly distinguished from each other. An image is a copy of some part of GG, and can be interpreted only by copy or by elaborate recognition functions. Text, on the opposite, is coded. When a processor gets an image as input, it can only copy it and transmit it bit by bit, unless it makes an analysis of it through an elaborate process. For text, parsing affords operations. But a text/operations mapping works perfectly only with formal languages, and with languages "understood" by that processor.

Between text and image, there is the streaming time. Series of images, of symbols.

langage.jpg

Basically 2D, but they may refer to 3D, or be "abstract" images, without reference to a Euclidean space, let alone a physical one.

Why, and how far, are text structures richer than images ?
A written text is a kind of image, with a particular structure.

7.2. Metrics on languages and texts

It is important to understand that a representation may be much larger than the being represented. The 1:1 scale is not at all a limit. That is true in DU, but also for representations of the physical world (for instance micro-photography, medical images, images of atoms...).

Number of letters in the alphabet. Of phonemes.
Number of words in the dictionary. Catach.
Length of the grammar (...)

The question of optima, and the evaluation of the different languages (and writings).

For programming languages, one may count :
- the number of different instructions. CISC and RISC. Relations with the word length of the target machine.
- the number of data types. Typing power.
- the number of accepted characters (this was interesting for the older codes, which did not accept accents ; that has changed with Unicode)
- the size of the necessary interpreter or compiler
- the size of the machines able to compile and execute them.

The NIR (French Social Security number) example shows a limit case. There exists an optimum (or a semi-optimum perhaps) in coding economy. In other cases, we can devise more delicate, and probably more interesting, structures. One idea would be, at least in a research perspective, to aim at a methodical cleaning of the remaining analogisms in our languages.

Function points, and more generally metrics on programming.

Zipf law.

7.3. Terminology and ontology

The word. Some thousands in one language.
How many bits for a name ? Directly, 10 characters * 5 bits, around 50 bits for one word. Which is much more than the number of words requires...
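A quick check of that arithmetic, assuming a hypothetical dictionary of 60,000 words :

import math

raw_bits = 10 * 5                         # ten characters at 5 bits each
index_bits = math.ceil(math.log2(60000))  # bits to number every word
print(raw_bits, index_bits)               # 50 versus 16

Sixteen bits would number every word of such a dictionary ; the spelling itself carries a heavy redundancy.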

Structures can be found in the dictionary. Thesaurus, valuations (number of uses, from basic to very rare). Central language plus dedicated languages (medicine, law, divinity, computers...).

A definition is a text assigned to a word (sometimes to a small sequence of words, or expression). In formal languages, a word can be replaced by its definition without changing the meaning of the phrase. In other languages, that does not work correctly. In natural languages, dictionaries are made for human users and are more a sort of intuitive orientation, including contexts and possibly images (bitmaps).

Petit Larousse : statement of the essential characters, of the qualities proper to a being or a thing. Meaning of the word that designates it.
In TV : the number of lines subdividing the image to be transmitted.
- discriminating, sometimes ideological, character,
- statistical characters,
- expression of a strategy.

Webster (College edition) :
1. a defining or being defined
2. a statement of what a thing is
3. a statement or explanation of what a word or phrase means or has meant
4. a putting or being in clear, sharp outline
5. the power of a lens to show (an object) in clear, sharp outline
6. the degree of distinctness of a photograph, etc.
7. in radio & television, the degree of accuracy with which sounds or images are reproduced;

A nomenclature is a sort of (semi-)formal dictionary, with a "record" structure. There is a typology by P. Berthier (in L'informatique dans la gestion de production) :
- on a technical basis, letters and figures alternating, to give for example the part dimensions
- based on ranges : letters and figures pre-assigned to part families and sub-families
- sequential, incremental (numeric or alphanumeric)

7.4. Extension and comprehension

Extension and comprehension are defined for terms, which we shall consider here as synonyms of concepts. Extension gives the list of occurrences of the concept, or some criterion to find the occurrences. Comprehension groups all the features specific to this concept.

In our "there is that" model, extension may be seen as a list of addresses (instantiations), and comprehension as the type or class. See 7.9

If we know how to measure, for each word :
- the comprehension (the length of the associated text in the dictionary, for instance ; we could limit ourselves, more formally, to the length of the definition itself, leaving outside the "encyclopaedic" part of the dictionary, like, in Law, the law proper versus the jurisprudence)
- the extension, i.e. the number of occurrences of the word in DU (or any world, e.g. six billion human beings, a hundred Gothic cathedrals),
then we have a ratio giving the yield of the concept.
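A toy version of this ratio, where comprehension is read as the bit length of the definition and extension as the count of occurrences (the formula itself is an illustrative assumption, not given by the text) :

def concept_yield(definition: str, occurrences: int) -> float:
    comprehension_bits = len(definition) * 8   # the defining text, in bits
    return occurrences / comprehension_bits

# a short definition with a hundred occurrences gives a high yield
print(concept_yield("large church, seat of a bishop", 100))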

For a given database, and a given global capacity in bits, the base may comprise a large number of elements with few bits each (large extension, small comprehension) or a few words with large records.

Extension and comprehension can be taken as a kind of metrics. Extension would be a negative pressure (see below). Google transforms it into a positive pressure and temperature.

7.6. Paradigm, theory, system

Paradigms and theories are high level assemblies. A long text, or a series of coordinated words.
Question : how may a theory be used by a being ? Expert systems.

A paradigm may be defined as a disciplinary matrix, literally framing research and giving to the scientists gathered under this paradigm "typical problems and solutions", defining the study space, drawing "passageways in the set of possible research". (La mesure du social. Informations sociales 1-2/79. CNAF, 23 rue Daviel, 75634 Paris Cedex 13)

7.7. Core words and generation

When the core of the dictionary is heavy, the other words are deduced from it with short definitions, and reversely (step by step). ??? In which language.

If we admit that the words in the core are not defined (reserved words, as in programming languages ?), and the other ones as totally defined using the core words, or else as defined from each other in an intuitive way, with constant length, then we have :

Circles in the core.

If we admit that concatenation is a sort of definition, we can have only one word in the dictionary, plus the + operator. If we have n words, numbered from 1 to n.

definitions.jpg

There is redundancy, etc. E.g., from a catalogue of works, make the most diverse, or the most representative, selection.

Topological problems of a dictionary. Since a definition in a dictionary can only be given with other words of the dictionary, there are necessarily loops.
Graph theory applies here.
Unless some words are given as "axiomatic", and left undefined.
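A sketch of that check : a dictionary maps each word to the words used in its definition ; words with no entry play the "axiomatic" role. The function is hypothetical, for illustration only.

def find_cycle(defs: dict):
    WHITE, GREY, BLACK = 0, 1, 2
    state = {w: WHITE for w in defs}
    def visit(w, path):
        state[w] = GREY
        for used in defs.get(w, []):
            if state.get(used, BLACK) == GREY:   # back edge : a loop
                return path + [used]
            if state.get(used) == WHITE:
                found = visit(used, path + [used])
                if found:
                    return found
        state[w] = BLACK
        return None
    for w in defs:
        if state[w] == WHITE:
            cycle = visit(w, [w])
            if cycle:
                return cycle
    return None

print(find_cycle({"big": ["large"], "large": ["big"], "bit": []}))
# ['big', 'large', 'big'] : a loop, unless one of the two is taken as axiomatic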

Track to recursion. The best wt : maximal difference. But if there is repetition, other choices based on relations between beings (WFE, well formed expressions, grammar). Prime numbers.

We must analyse a given being :
- rules of what it understands (language, WFE)
- concrete environment : what pleases at a given time, for instance the puzzle piece
- comprehension, the "that" and the details inside E = {x ; x>1}
- extension : list of locations where "that" is instantiated E ={a,b,c...}

If we have all the beings (filling at n·2^n), the comprehension is total, and extension does not add anything. If they are well ordered, each being is identical to its number. The whole space is of no interest.

On the other hand, if the beings have few common parts (in some way, there are much fewer beings than 2^n), comprehension will be but a small part of each being, and there remain free, disposable parts.

Inside a being, several types may combine.
Reduction of a being : "division" by all its types ; the rest bears its individual features, local and irreducible.
This local part can come from
- locality stricto sensu (geographical location, proximity with a high temperature S, software "localization")
- other abstract factors.

E.g. male gender, more than 45 years old, born in Afghanistan, bread maker, plays the flute, Labour, married, 3 children.

If the type is a very long abstract description (strong comprehension), a few types may be sufficient, with a short code to call them. In this view, that has no meaning for S itself (it has no memory of its own past, of its traces), but it can be meaningful for other parts of DU.

The excess bits of a concrete S, relative to the necessary minimum for the n·2^n beings, may be handled
- either canonically, with a bit zone giving a number, possibly split into several zones
- or, on the contrary, as a distribution of the factors over the whole length of the being.

If we suppose that S has been built by a succession of generating factors, it will, in general, be impossible to do the division to get them back (Bayesian analysis, public key coding).

We consider normally that the "natural" beings bear an indefinite quantity of bits. Among them, we can select some bits for identification (extension) and description (comprehension). The extension (in locality) is implicitly given by the position of the being in DU.

For a given population of beings, find all the factors...
Let us take a frequent case : a lot more concrete bits, for instance 500 images of 1 MB on a CD of 500 MB. Then, the pure differentiation takes only 9 bits. All the rest is "local".
Then we can analyse these 500 images, with equivalence classes giving generators.

To have them all, it suffices that the product of the cardinals of the classes be superior to the cardinal of the whole. Again, 9 classes of 2 elements are sufficient.
More generally, we shall certainly maximize the analysis effect (root) with generators of the same length and equiprobable.
Once given the factor family, we can compute the volume of all the beings if all of them were generated. Normally, it will be much larger than the actual concrete, or the available space.
Seen so, the concrete is much smaller than the abstract : imagination goes beyond reality.
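The two numerical claims above can be checked directly :

import math

print(math.ceil(math.log2(500)))   # 9 : bits needed to tell 500 images apart
print(2 ** 9)                      # 512 >= 500 : nine binary classes suffice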

Are “natural languages” so natural ?

7.8. Language as creative

A language is not by itself a representation, but a "tool" or "toolbox" to build representations (descriptions).

But it has also a creative role : from a particular case, generalize it, then generate variants :
. similar cases, with some variation on quantitative terms, for instance
. dissimilar cases, by negation.

Bailly-Longo
Serfati

Creation of new languages.

7.9. Assertions and logic

Assertion + modality
deduction/inference
formal languages

“there is that”

Method, algorithm, description.

PART 2. FOR A DIGITAL THERMODYNAMICS

History is thermodynamics at high, if not global, level.

memoire4.jpg

8. Measurements and values

"When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind" Lord Kelvin.

Several parameters can be noted as able to characterize DU beings in order to study their laws of development and communication. That could lead to a sort of DU thermodynamics. Among these:
- distance ; there are differing definitions, or levels of definition, from basic ones (difference of start addresses in a one-dimensional DU) up to meaning/semantic distances.
- mass ; could be purely the number of bits owned by a being ; for a one-dimensional string, supposed without "holes", mass is equivalent to the difference between its beginning and end addresses ; in more than one dimension, and if we do not restrict beings to be full rectangles, then the actual mass will be smaller, possibly much smaller, than the surface of the rectangle ;
- density and load ;
- pressure and attraction ; a good approximation of pressure would be the Google ranking ;
- temperature ; a good approximation would be the being's frequency of use ;
- energy ; that is a classic of information theory ;
- orthogonality ; this point refers to groups of parameters ; they are orthogonal if they are independent ; a string of bits has a maximal informational value if each one of its bits has equiprobable values, and in particular if they do not depend on each other (a small check follows this list) ; orthogonality is also tied to the notion of dimension and space metrics
- yield.
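A crude check of the equiprobability half of that orthogonality criterion (independence would need more) : the per-symbol entropy of a bit string, in bits.

import math
from collections import Counter

def bit_entropy(bits: str) -> float:
    n = len(bits)
    return -sum(c / n * math.log2(c / n) for c in Counter(bits).values())

print(bit_entropy("0101100110"))  # 1.0 : maximal informational value per bit
print(bit_entropy("0001000000"))  # about 0.47 : far from equiprobable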

That opens the way to the infinity issue.

Some general laws seem to be at hand, for instance the laws of "right distance" and of "optimal load according to the distance".

This part gives a sort of underground basis for DU. It proposes ideas toward a theory rather than a theory properly speaking.

8.0. Generalities

A major feature of beings is that their features (...) change radically with their size. For example, a unique bit offers both a maximal cut (the bit opposes radically the infinite fuzziness of matter, of continuity), and a minimal cut, since its meaning comes only from the context. In the case of a world with a unique bit, this one would be in opposition and relation with the analogue substrate. When the number of bits gets higher, the break from the environment widens, in order to ensure the being's coherence. In some way, each bit becomes more digital. And we could perhaps define the digitization level of one bit by the length of the strings it belongs to (see below, masses and pressures).

Assigning a measurement or value to a being is computing a number, or a "qualitative" parameter,
- using directly available data, like its address, bit length, type, latest known update (I transpose the parameters of an ordinary OS, but it seems pertinent here), in some cases loading the being itself,
- or computing a value from the digital matter (e.g. mass, basic complexity).

Measurements may be binary : value of one bit, good/bad.
If they are taken in a small set, or scale, one will say they are "qualitative" (but a richer meaning is looked for below).
If they bring more than 4 or 5 bits, they will generally be considered as "quantitative", the number being an integer or a real number (with the restrictions on this kind of number in any finite digital system).

The data so obtained will be properly called "values" for the inquiring system if they have an impact on its L. This impact may be evaluated directly, related to a particular need of S, or by reference to other evaluating systems : standards and norms, market, expert advice.

We will take "value" as an evaluation of the being's contribution to the L of another being. Possibly a very general and abstract one... but we cannot have something like an "absolute" or "true" value of a being by reference to GOS.

attraction.png

8.0.1. Quantity and quality

The opposition between quantity and quality is a double issue : type of measurement and orthogonal dimensions. We are near the to be / to have opposition.

Quality and quantity are not so different for an automaton, unless it is sophisticated, because it will process them through representations of a few bits. But a human being approaches quality and quantity along really different ways. Quantitative evokes some meter, some external evaluation process, while qualitative calls for touch and feel, if not hearing, smell and taste.

Quality refers to a scale, with few levels, and fuzzy limits, with an order relation. Some logical operations (inferences) are possible.
The maximum is around three bits (eight levels, in correspondence with Miller's law). Beyond, it is possible to have some sub-scaling (weakly, rather, a little, typical, much, very much, extreme...). Quality may induce equivalence relations, and sorting of elements.

Quantity refers to numbers, strictly speaking. The scale may be as dense as one wishes, neat, formal, etc.; here we have not only an order relation, but a numeric structure, with possibly arithmetic or algebraic functions. But quantity is frequently pointed at with non-numerical terms : little, much, more, the most... several.

A priori, the more bits we have on operands, the wider is the set of really different operations.

In some way, the change is purely quantitative as long as we don’t add bits (though, psychologically, we refer more to decimal digits). It is qualitative (or structural) if we add bits, with a lot of intermediary cases.

We shall now concentrate on basic measurements of digital beings. They are largely inspired by mechanical and thermodynamical models.

epistemo4.jpg

8.1. Mass

The simplest measure of the structural mass of a being (which is for us very near to complexity) is its number of bits.

The more numerous the beings, the heavier the global mass, the more useful it is to structure, in order to profit from the regularities and redundancies (scale economy), and the more the structural cost is profitable. It would be interesting to have a quantitative law.

Mass protects. It is a stability factor. It affords redundancy. But it is paid for by a lack of nimbleness.

Thesis : the mass of two beings is the sum of their masses, unless one is contained in the other.
In practical operations, the grouping of two beings will leave loose interstices, and the sum will be bigger.

In a being, we have m(S) >= m(I) + m(E) + ... + m(O) (?)

hd1a.jpg

quality_power.jpg

quality_quantity.jpg

8.2. Distance

If DU is one-dimensional, the simplest distance between beings is the material distance, i.e. the difference of their start addresses. It is necessarily superior to the size of the first being, unless the second is contained in the first.

As long as the mass of a bit is taken as unitary, the volume is identical to the mass. But we could add weighting.

For a processor, if we stipulate that the inputs I are at one end, the state E in the middle and the outputs O at the other end, we could say, in first approximation, that the being's mass makes the distance between inputs and outputs. We could also introduce the centres of the I and O regions.

With more than one dimension, we may use the classical Euclidean distance (root of the sum of the squares). A noticeable point is that the distance between points of a being is not linearly proportional to the being's mass, but varies as its square or cubic root.

If DU has an infinite (or indefinite) number of dimensions, with one dimension per bit, {0,1}^N, we have no Euclidean distance properly speaking (but we could transpose the N distance, or go to the Hamming distance).

See notes on spaces with an infinity of dimensions : a sphere of null volume, but non-null diameter.

The "distance to centre" or "to periphery" of a being may be useful. For instance, the distance between two internal beings as the sum of their distances to the centre (geographic distance "à la française", where one must always go via Paris).

Addresses may be richer than the simple N set of the beginning, in particular to shorten them and make the work easier for GCom. Then we may speak of a "soft" distance, by indirect addressing (distance in the addressing tree, in a directory : the number of levels that must be ascended then descended to go from one object to another), or the address length (branch). The organisation "structure" of the bits recreates a sort of localization, some "matter", at low level, even if we assume that we work with pure information. The distances are tied to the "operating system", GCOS.

8.2.2. Time distances

In a digital being, the core problem of distance is the number of bits that an automaton may move in one cycle, which is the basic time unit.

8.2.3. Formal and content distances

The Hamming distance (the number of bits differing from one being to the other) is more functional than the material distance. It is a content distance. We may also have a distance by morphing between two images (used in pattern recognition). But it is not properly semantic.

If each being has one bit only, there are only two values for the distance : identical or opposed. If the beings are longer, the issue is more difficult (richer). The identity case still exists, but becomes rare. Then we have the case where the being lengths are different. Moreover, in Hamming terms, two strictly complementary messages are at the maximum distance ; but that is shocking, because such a relation cannot be a coincidence. We should come to a distance defined as the KC.
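A minimal sketch of the Hamming distance on equal-length bit strings, showing the complementary case :

def hamming(a: str, b: str) -> int:
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming("10110", "10011"))   # 2
print(hamming("10110", "01001"))   # 5, the maximum : the strict complement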

We are going to "meaning" differences, with processing depth, etc. (see below).

If the distance between beings grows, the intermediary being has more and more work to do. Logical, formal distance, with different codes, storage formats, logical levels, protocols, delays, redundancies.

The more functional the distances, the more "external" they are to the being. The transfer time computation may itself become very heavy.

Lexicographic distance : in the dictionary, Aaron is very far from Zyzyn. Phonetic variants.

Two beings are near if :
- they call for the same actions, evoke the same images,
- they may be combined in the same action,
- they evoke each other, or are frequently evoked together.
Motivation distances, with cognitive consonance and dissonance (we hear only what we want to hear), will and policy distances.
Sociometric distance. Here, more pi-computing, neuronal networks, etc. Number of intermediaries, or hierarchical levels, to go through.

Distance is greater if there are many beings in communication. That results immediately from the spatial volumes, but also from address lengths, etc. More interesting on cognitive distances, standards requirements, etc.

8.6. Density

In the physical universe, density is defined by relation to matter : mass/volume. Then, digital density is the ratio binary mass/physical mass. See Incarnation.

If we have one dimension and connected beings, density = 100%. But with more than one dimension, there are "holes", and the density can be anything, e.g. in a raster, or the useful bits in a database. We could use something like a "form factor".

An important point for one being is the ratio length (of the being)/content. Trivial case 1/1 : the being is its own container. Or it is the same being, or it is actually a content (???). In this latter case, we must at least deduct the window border. (Notion of a minimal, or standard, border...).
This problem is not dealt with in classical maths, with open or closed intervals, since here we are in a discrete universe (or else...).
There can be beings which are containers and nothing more.
Other cases : small numbers (7), large numbers (blobs, binary large beings).

We could do with KC/mass.

The bit case : interesting here. By construction, a bit is saturated by one input and "saturates itself" by one output.

Somehow, density may be driven back to Kolmogorov, by computing the useful bits for a processor as the smallest being which would render the same services.

8.7. Pressure

We call pressure on a bit (or being) the total mass of the beings pointing at it.

??? The GOS pressure is something like a minimum. But we can also count it as 0, and consider that a being does not exist (or does not exist "really") as long as no other one points at it.

To be more precise, we could quantify pressure with two parameters (a minimal sketch follows) :
- the size (mass) of the beings pointing at it
- the distance of these beings (the pressure is weaker if they are located far away).
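A minimal sketch under these two parameters ; the 1/distance weighting is an assumption for illustration, not fixed by the text :

def pressure(pointers: list) -> float:
    # pointers : (mass_in_bits, distance) of each being pointing here
    return sum(mass / max(distance, 1.0) for mass, distance in pointers)

print(pressure([(1024, 2.0), (64, 100.0)]))   # heavy, nearby beings dominate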

We must also add that a being exerts a pressure on its "internal" bits. But that may simply be part of the external pressure on the global being, as long as the being does not explicitly point at its own parts. A "passive" being exerts no other pressure than the external one. An active being, with its IEO functions, exerts a rather strong pressure on the corresponding parts. It could be more or less strong if we split the basic functions.

When a being becomes large, internal pressure is weaker around the periphery. Unless we had some kind of attraction by sheer mass (like gravity).

We must elaborate a way of transmission of pressures from boundaries into the whole being. And more generally, inside the being.

Law : the stronger the pressure, the more digital the being and its contents. Historically, that may be observed in information systems. Social pressure, standardization pressure. Efficiency of the sharing of resources. But we should find abstract laws.

A line, in a 2D DU as in the physical world, is more of a line when it is long. E.g. because it gives a greater "thinness", relative to its length, when drawn in a bit raster.

Then, entropy principles. Crystallization when pressure and temperature are strong on a stable S :


crystallization.jpg


Idea of a more subtle model: crystallization happens when, after a liquid phase at high temperature and pressure, temperature lowers at constant pressure.

One kind of explanation : when a being becomes large, its useful part decreases to one half, one tenth, one thousandth... then, for economy reasons, it is compressed. Compression and abstraction have similar aims.

Chunking is a kind of structuration, if not of crystallization.

Some decompression algorithms are very simple multipliers (number of pixels of such colour, from this pixel). Different types of compression :
- by gravity, when there are many bits : necessity of compressing the centre, densification, class generation,
- data and images compression,
- statistical with or without loss ; if without loss, with or without semantic loss,
- data analysis,
- programs (pkzip, image formats)

"Negative" or "reactive" pressure

That is the number of beings (possibly with sophistication like for pressure) on which it points. Or we could say "power" of a being, if we combine with flow rates.

Attraction is
- stronger if two beings are near (in practice, how does it work?)
- stronger if to being are similar (it they are totally identical, they could be merge)
- stronger if masses are heavy.
When a being becomes larger, attraction may become insufficient to keep it together, and the being dissociates (another form of Gödel...)

Pressure distribution

Abnormalities may mark the pressure matrix, if we consider models with noise, or relativistic ones.

Law : in a closed system (even in GOS, if one may think of it), the total of negative pressures is equal to the total of emitted ones. But the distribution will be far from uniform.

One could say that pressure is a kind of constraint, preventing or slowing certain moves, with narrow passages, formal contradictions...

8.9. Structures related to location

Inside DU, and inside any being (the bigger, the more so), pressures (masses, densities) are not equally distributed. (Hence in a raster there are "hollow", "not addressed" pixels.)
That gives a valuation on addresses. The proximity of a high pressure being is positive.

From a strictly material standpoint, the value of a being (or simply of a place in DU) depends both on its length and on its distance to high pressure beings. Like the law for real estate...

A large and cold being may be a nuisance to its environment, since it makes their communication lines longer.

A major problem : how far can a being modify another one ? That may be defined in GOS... or in beings dedicated to management.

8.10. Temperature

The temperature of a bit, or being, is :
- the frequency of its changes of state per unit of time,
- or the number of variations observed during a given number of cycles ("medium frequency").

The word "temperature" is not perfect, since it would put at high temperature the low weight bits of my IEO model. Then we should combine frequency with the bit's importance, and have something like : temperature = pressure · frequency...

Extension to a being : the weighted frequency of its changes of state. Possibly forgetting its internal clock.
Hence, sketch typologies, crossing with pressures.
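A minimal reading of the definition in code : the temperature of a being as the fraction of observed cycles in which its state changed.

def temperature(states: list) -> float:
    changes = sum(a != b for a, b in zip(states, states[1:]))
    return changes / max(len(states) - 1, 1)

print(temperature(["00", "01", "01", "11", "11"]))  # 0.5 : two changes in four cycles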

press.jpg


received_pressure.jpg

At constant mass, a rise in pressure causes a rise in temperature.

About frequencies, we could distinguish an apparent frequency and a "deep" frequency, with reference to KC.
For a being, we could also define temperature as a ratio pressure (emitted, received...)/binary mass.

Or : temperature = transfer rate/capacity (we get frequencies)

8.11. Energy

I MUST work on this point.
We could also talk of "white" energy. An interpretation of P (see L).

8.12. Digitization level, structuration level

With a unique bit, the cut is both
- maximal : this bit opposes the structural mass, potentially infinite, of the continuum
- minimal : this bit gets its meaning exclusively from the context, in this case the analogue referential itself.

When the bit count rises, the cut grows stronger and stronger, in order to hold on, to assure the message coherence. Digitization and cut reinforce each other reciprocally. In some way, each bit becomes more digital. We could even define for one bit a digitization (and abstraction) rate, using the length of the strings where it is included (but where do we stop the measurement : byte, record, disk surface ?). See mass, processing depth.

When there are few bits, we have to deal with problems of grain, moiré, Nyquist.

With some bits (data, signal), come the fascinations of numerology, the mere shape of digits, letters and ideograms.


Then the bit yield decreases. In a plane, you cannot draw an infinity of infinite lines that remain meaningful. So the growth of meaning does not equal the base exponential function, and is indeed much lower. See also my Champery study (meaning/bit number ratio).

Similarly, demographic growth calls for artificiality. Not only for economical reasons but, more deeply, because the free existence of more and more connected individuals is synonymous with a larger structural mass, hence with abstraction and, of course, artificiality, in particular as a break with Nature.

The growing formalization entails the growth of automata. Automation is not a global factor of unemployment ; indeed it is a fundamental condition of any demographic growth. We have here a loop :
- demographic growth causes a growth of structural mass
- growth of structural mass entails automation, which in turn, by its economical effects, the reduction of hazards and better public health, augments the population.
(Dehumanizing ...)

We could try to search for a quantitative relation between the multiplication of human beings and that of DU beings. That would suppose a good way of measuring the quantity of automata and their structural mass. We could even build a model with the structural mass of one human being as a reference unit, and the automata mass referred to it. Is that realistic ? It seems in contradiction with the human transcendence postulate.

But the growth of DU beings is also a break with their dependence on other beings (humans included). Digitization parts the work from its author, and both gain a reciprocal autonomy : autonomy of messages, of machines, of systems. The maker also gains autonomy, as it gets rid of the tasks previously necessary to ensure its living.

A being can be more or less "structured", or we could say "parsable". This parameter is 100% for a being/processor pair if the being is
- totally usable by the processor
- down to the last bit
- all meaningful
- without redundancy or useless bits.

Structuration level is near to orthogonality level.

It is null if there is :
- no structure usable by the processor, which can only store, transmit or destroy the being (possibly, just not perceive it).

Examples :
- a bitmap is very poorly structured for any program other than a graphical one knowing the format ; even there, factorization of a bitmap is rather limited,
- the formats include a header nearly totally structured, an empty part, then bits ordered well enough to get a display,
- a program is nearly totally structured for a processor equipped with the appropriate language ; nevertheless, there can be blanks, comments,
- generally, replacing the original message by a more structured one gives a shorter one, by redundancy reduction, but not always (naval battle, image decompression).

A given being may be considered more structured by one processor than by another. In other words, the processor's nature is an evaluator of the structuration degree of the messages it receives. Abstractly : a language corresponds to an automaton, and vice-versa.

Executability and structuration are not the same.

One could say that “more structured” is a synonym of “more digital” (see for instance 8.16)

A data file is less structured than a program (to be checked), and more than a bitmap.
A vision program detects more structures in a bitmap than a mere display routine.
Example of nesting with progressive structuration, the bitmap of a program :
- pure bitmap
- OCR, text structure, usable by a text editor
- program read as such by the compiler

At the extreme, a being totally structured for a processor vanishes. The non-mapping, the non-transparency level, defines its existence level...

Everything that is formalizable is automatizable.

A fundamental character of our modern civilization : the preponderant and growing influence of formalized language and abstract reasoning, to the detriment of informal communications, of the direct action of the environment and of learning through lived experience. Bruner (quoted by Lussato). Parallel : Toffler.

life_cycle_cure.jpg

Another classical opposition, near to digital/analogue. More pertinent to HMI.

8.16. Programming and coding

In programming activities, the compiled binary code is much more digital (see above) than the source code, in spite of the functional identity.

The coding of identifiers frequently uses, for instance in France, the NIR (aka "numéro de sécurité sociale") : decimal digits (13 in our case). But eight should be sufficient, since French citizens number less than one hundred million, which would save more than a third of the digits. The gain could be larger with a pure binary code : 27 bits would suffice, against the 104 bits (13 times 8) of the present system. But we would lose the directly meaningful parts of the code (gender, birth year and district).
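The arithmetic, checked :

import math

citizens = 10 ** 8                      # upper bound used above
print(math.ceil(math.log2(citizens)))   # 27 : bits of a pure binary identifier
print(13 * 8)                           # 104 : bits of 13 digits stored as characters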

In any algorithm (sequence, phrase...), and even in the writing of a mathematical formula, there always subsist some traces of analogue representation. Idem in any digital hardware, at least residually (remember the problems with the Cassandra machine).

CAM systems may be considered as an analogue form of programming.
Beings are interesting, giving an external reference for action (methods) as well as for memory.

Programming is by itself the building of a program analogous (functionally) to an action. But it is also a break, mainly due to the control structures, with a maximum if we go to the radical structures of Lisp.

A compiler is more digital than an interpreter.
Here also, the theme of an optimal granularity : size/number of classes vs. a globally assigned size (SAP vs. specific code, etc.).

8.17. Orthogonality

When interacting with the other, we want orthogonality, by factorization of what we have in common (null sine), and of what is reciprocally indifferent (sine 1). Then we can usefully debate on the obliques, to make progress on the common ground and be ready to fight on the basic differences.

Thus we profit (phi sine) and lose (sphere/cube).

Orthogonality with 10G human beings.
Dissolve determined beings, towards an ideal pure indetermination (Lyotard).
At the same time, an enormous stock of knowledge, determinations, differences, giving meaning to differences, both equiprobable and meaning bearing.

Two orthogonal bits are meaningless for one another.

8.18. Varia

saturated.jpg

N_objects_number.jpg

xxx1.JPG

xxx2.JPG

xxx3.JPG

xxx4.JPG

xxx5.JPG

8.19. Recurrence

The method of evaluation of the global string may be applied to the evaluation of each sequence. Another idea : in this case, the value of the global string could be the sum of the values of the different strings. That is meaningful if we don't use all the possible strings, in other words if n·2^n > N. Otherwise, there are repetitions of strings.

In other words, relatively to N, I may test all the sequence lengths, up to the one which admits one only occurrence, and at that point the recurrence rules will afford to valorize.

redundancy.jpg

In other words, when n < log2(N) :
a) a lot of strings will be identical ; all the possible strings can be present ;
b) otherwise, not all the possible strings may be present ; all the present strings may be different.
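The proportion in question, checked in Python over a random string (the chosen lengths are illustrative) :

import random

def coverage(bits: str, n: int) -> float:
    present = {bits[i:i + n] for i in range(len(bits) - n + 1)}
    return len(present) / 2 ** n

bits = "".join(random.choice("01") for _ in range(1024))
for n in (2, 5, 10, 15):
    print(n, round(coverage(bits, n), 3))
# high while n is well below log2(N) = 10, collapsing towards 0 beyond it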

The red curve is : the proportion of possible strings actually present.
We get an interest index f :
- input : primitive strings
- output : Hmeter, for instance a vision/recognition system
then, evidently, looping : number of beings / size of beings.

The interesting beings are not limited to the space presented on the screen but extend to the global space "which can be ideated" :



We see that the saturation at N·log N is without interest. (?)
There are also rules applying to the "rest", or to the minimal overflow.
This "c recurrent" model gives an evaluation but could also give an optimal generator (how to fill at best).

There is some symmetry evaluation/generation. It would be interesting to drill into their difference.
It is then as if we had to define a value system for another agent, S2. If it is not interested exclusively in variety, what would be its values ?

In other words : how to find structures in its L.
For instance, S2 builds something and lacks some parts. It looks for a program implementing some precise function.

8.34. Granularity

 

Granularity is a combinatory optimum search.

A sort of optimum is reached when granularity is at the square root of the bit number : first we define several typical beings, then we call them through their codes in order to make assemblies (language).
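A toy cost model of that trade-off (the cost function is an assumption for illustration) : store a dictionary of the distinct b-bit blocks, plus one code per block occurrence, and scan the block size.

import math

def two_level_cost(bits: str, b: int) -> int:
    blocks = [bits[i:i + b] for i in range(0, len(bits), b)]
    k = len(set(blocks))                    # dictionary size
    code = max(1, math.ceil(math.log2(k)))  # bits per reference
    return k * b + len(blocks) * code

bits = ("1010" * 8 + "1100" * 8) * 16       # a 1024-bit repetitive being
for b in (1, 2, 4, 8, 16, 32):
    print(b, two_level_cost(bits, b))
# intermediate block sizes cost far less than the bit-by-bit extreme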

For a given binary being, the optimum is to be sought between two extrema :
- one and only one bit, calling for a complete image of the being, previously stored,
- no storage, but a successive call of all the bits, one after another.
See language. Millennia of use have probably led to some optima.

It could be interesting to search in earnest for that optimum, with some precision. For instance, we may consider that our 26 (plus or minus something) letter alphabet is an optimum of this type, since the number of letters has not significantly changed since Antiquity. The same question may be asked about ideogrammatic languages.

In programming, the interesting level of being is bigger than one instruction. Is it better to build a few "big" classes, or a lot of small ones ? This was a practical issue for some development teams (in the 80's). In corporate information systems, proprietary software has too small a grain (the program instruction). An ERP (integrated software, like SAP) probably has an excessively large grain. And SAP has made meaningful efforts in that direction in its 1990's (circa) versions.

The mere existence of a message implies an alterity, signifier/signified. But that alterity may be completely masked.

D measures a global load.

D = d1 + d2 ; r = d2/D

Every operation is of the form : M1(r1, d1) goes into M2(r2, d2), with d2 >= d1.

An operation M1(d1, m1) goes into M2(d2, m2) will be said profitable if d grows without an m growth.

 

8.26. Supports and channels evaluation



Sensorial saturation :
- vision (1D to 3D) : 4
- sound : 2
- forces : 2
- smell : 1/2
- taste : 1/2
TOTAL : 10

Quality/fidelity, resolution :
1. minimal for a recognition
2. rough, lines
3. realist, lines
4. very fine
5. perfect

Interaction level
1. purely passive
2. choice, swap
3. joystick
4. intervention in scenario
5. creation in depth

Computing power
2. micro
3. mini
4. maximum possible

Price per hour.

The values interaction will settle itself in the contradictions.
The DR of "structural superabundance" is more important than the other DRs (notably the basic material DR). Soon, there will be a sort of chaos by over-determination,
hence a large "chaotic" space where values can be applied and sense can appear.
Error = a kind of offset to be reduced ;
we are back to emergence problems, cybernetics : build to predict better.
To reduce the offset, we must
- renounce interest
- enhance the model
- admit that it is only probabilistic
- look more widely at values
. diachro : over several cycles ; then we can say for instance that this error occurs very seldom and may be neglected, or that it is cyclical and can be dealt with as a feature, or predicted and corrected by an ad hoc algorithm.
. synchro : study the bit, its position among others ; similar to the diachro case.

Around this, we can have algorithms for the cost/yield of these enhancements, the importance of this bit for the L of the S, the cost of a model upgrade...
To have an evaluation of "good", S must be able to see that its O impacts its I. If it is direct, we are back to E.
Then, some kind of "distanciation" from the outside world, and even from the internal world : S may survey its E.

8.27. Yield

(with respect to matter)

The yield is first perceived in the metric relation between organs and functions. We look here for a first approach, purely formal and digital. It will be developed later (see autonomy, etc.) in transcendental values. Then, in a historical analysis, in the general law of progress.

Other definition : yield is the ratio between the functional value of a system (more precisely, a subsystem) and its cost.
It is not mandatory, to study this law and its effectiveness in the field, that cost and functional value be measured in the same units [Dhombres 98].

Nevertheless, a proper generalization of the law must afford to compare and combine different technologies. Then we must look for as much generality as we can.
We propose to use, for the numerator as well as for the denominator, the KC.
For the numerator, we will talk of functional complexity. It is the variety and sophistication of the function of the system, or of the different functions that contribute to its value. We could join with the value analysis concepts, but this method is seldom formalized as much as we need here.

For the denominator, we talk of organic complexity, i.e. the number of organs and components, the sophistication of their assembly, as well as their elementary complexity.

8.30. Grosch

Yield grows as the square of the mass.
See, in the opposite direction, the positive effects of networks.
But Grosch's law was conceived for the computers of the 1970's. How can it be generalized ?

8.31. Caquot's Law


We could relate the lines not to I/O, but to an internal communication. Here, change the model of finite automata, and deal directly with space/time (bit.cycle). We are near gates, and E becomes a line.
In the last case, one operation per bit.
For one bit per cycle, we nevertheless need many more gates than bits. How many ?
If the transfer rate is one bit per cycle, then bit = capacity.
To increase the transfer rate, how many supplementary gates ?

8.32. Infinite

Even remaining finite, we can get "infinite" in two ways :
1) E unknown from outside
2) E large compared to the internal processor, strictly speaking, so that S cannot exhaust it in the frame of its own existence.
Then, the life cycle and its degrees correspond to the degrees of self control, i.e. to a progressive exploration of E by Pr (Pr would be the conscience, somehow).
3) complete with DR

Principle :

an S cannot describe itself completely
an S always has uncertainty about the pressures, far from GL (?), bearing upon it. This uncertainty is related to the size of DU, but beyond some dimension, will depend mainly on HC (the equivalent of c in HD).

We may apply my principle : the loads must be the bigger as the distance is greater.

Let x(x - d) be a minimal distance
x^2 + ax + b = 0, with b = 0
2x + 2a
f(x) = ax^2 + bx + c, with c = 0
aD^2 + bD = 0
aD^2 + 2bD - 4h = 0
bD = 4h, hence b = 4h/D

Which operators will do the job ?

Let D be the distance, M the diameter (in meters), translated into a total number of bits,
m the diameter in bits.
Theoretically, when the system communicates with itself, D = 0 and M = 0. But, actually, there is some travel, the minimal loop diameter,
which is related to M.
S, the mass, will also be expressed in the same unit.

max(m) : the maximal point for m of a message when we go from A to B, with max(m) = k·f(D^2),
and D >> square root of M1 + square root of M2.
But, beware, we should reason in 3D,
and these dimensions are not totally independent ; there are constraints and leverages.
The measure of m has something subjective.

A priori, m is much smaller than M. In ETF : weight, volume, energy. Then some "internal" movements of the message are possible inside a partner.

Generally, D is much bigger than M, and we can postulate that

- there is a center
- a message is born in the center of the emitter
- hence it is sent outside

To be simple, we can postulate that the center is not far from the periphery, or that we remain in the center to emit.
There is a relation between the position information and the local information.

Then there is the theoretical idea that the "message" will have to transit in that space, like a continuum. That is contestable, but conforms to reality.

All these aspects can only partially be modelled and combined into a global "distance" measure. Our thesis will then have a different value if we design a printed circuit for text processing or rhetorical recommendations for a marketing department. Under these reserves...


If the target is multiple, the loading grows, in order to take into account the diverse features of the receivers. In fact, the completed message is more over-structured than semi-structured.

Loading implies choices, specification according to the emitter system's nature, laws, idiosyncrasies, style, heraldics...

 

 

9. Complexity

9.0. Basics

This concept became fashionable in the 1970's, with systemics (or systems analysis). It is a quite tricky notion, related in particular to variety, energy, nesting and recursion.

One of the most pleasant approaches is Kolmogorov complexity, which can be briefly described thus : the complexity of a being is the length of the smallest string which permits to (re)build it. One positive aspect is its comparative operationality, and its correspondence to the intuitive notion of complexity. Evidently, it can be computed/reckoned only for a given processor/generator.
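KC itself is uncomputable, but any compressor gives an upper bound relative to a given generator, as the text requires ; here zlib plays the generator :

import os
import zlib

def kc_upper_bound(data: bytes) -> int:
    return len(zlib.compress(data, 9))

print(kc_upper_bound(b"ab" * 500))       # regular : far below 1000 bytes
print(kc_upper_bound(os.urandom(1000)))  # random : close to 1000 bytes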

Complexity is the reverse side of liberty. It has its costs (exclusion, lengthening of all processes and decisions)

See also simplexity (Delesalle)

9.1. Material/functional complexity

In functional richness, let us distinguish :
- the useful mass, the set of operations that can be done with the system : going back up the chain of the useful, we shall go towards the user's intentions (understanding at half-words, thoughtfulness). It is normal to tend constantly to upgrade the useful complexity of a being or of a family of beings, if only to increase their actual use (market)
- the use overload, that is the complexity of the operations that must be done to use the system ; this complexity has something rather negative ; at least, the lower it is, the better. Then we shall tend constantly to reduce it in two ways :

- On the one hand, conformation to the user's structures and habits (for an Englishman, a one-foot hat is better than a 30.48 cm one, metric system notwithstanding) ; of the same order : Huffman coding, with short codes for frequent actions (a small sketch follows) ; but there are limits to this conformity (or conformity rate), then we must use looping, alter-ego conformity ; i.e. creating automata inside S, automatic clutch, etc. (alter conformity/prosthetic). That points also to the abstract/human polarity.
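A sketch of that Huffman idea, computing only the code lengths ; the command names are invented for illustration :

import heapq

def huffman_lengths(freqs: dict) -> dict:
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1
    return heap[0][2]

print(huffman_lengths({"open": 50, "save": 30, "print": 15, "export": 5}))
# {'open': 1, 'export': 3, 'print': 3, 'save': 2} : the frequent action gets the shortest code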

The use complexity cannot pass some thresholds (competency, rationality of the user) and, in dynamics, there are constraints on reaction delays (vehicle driving, combat, decision in general).

In parallel, automata creation increases the material mass, where we can also distinguish :
- the object's resulting mass (complexity of its exhaustive description, or of its manufacturing's exhaustive description) ; in itself, this rise in complexity is good (except possibly for stability), if it is not too expensive,
- manufacturing/building charges.

The rise of these two (four) masses will converge in the rise of technical beings. We must raise at the same time :
- the ratio useful mass/use mass, which we could call "relative simplicity". The need for functional simplification is both an economical imperative (labour productivity) and a social one at world level : not to make artificially crippled (hence excluded) people, or the least possible.

Oehmichen has said interesting things about the separation of normal use and maintenance functions.
Study, according to the different cases, the more or less progressive character of these curve evolutions, or of a hierarchization.
There will be both automatic and human decision flows growing.

complexity_organic_functional.jpg

9.2. Example. Algebraic forms

Let us develop an example of the relation between complexity and being mass, using algebraic forms and commenting "intuitively".

Let ax and ax + b be two lines. The second one is described by a longer model than the first (four signs instead of two). And though it is not more complex "in itself" (both are straight lines), it is in some way less accessible, since it does not go through the origin.

The x^2 parabola is more complex "in itself" than the straight line. And, the higher the degree, the more numerous the particular points and roots.

Let us now study methodically forms of growing length.

With one symbol :
- x : the origin point, or a hyperplane containing the origin
- a : empty set, but if a = 0, then it is the total space.
(If we had just one bit, coding the {0,1} pair, then we would have all the space or nothing at all.)

With two symbols :
- ab reduces to a
- ax and xa reduce to x or 0
- x^n gives a parabola or a similar curve
- xy : union of two hyperplanes

With three symbols :
- abc reduces to a
- abx reduces to x
- axy reduces to xy
- xy^n reduces to xy
- ax^n reduces to ax
- x^mn reduces to x^n
- x/a reduces to x
- xyz : union of three hyperplanes
- x + a : the point -a and a hyperplane
- x/y is not a polynomial ; near zero, a chaotic region ; when it is too big, unusable ; a motor or a hierarchy is needed

(x;y;z)
x - a ; z
x - y ; z
ax + by
axy + z
ax^2 + y
x^2 + y^2
x^n + y^n

ax^n + y
ax + yz

If taken with sufficient resolution to avoid aliasing (and probably even if we take the minimum number of pixels for the curve shape to be recognized by a human being), even a rather simple geometric form will need a lot of pixels, hence of digits in the file, even compressed. The algebraic representation is then a good way to reduce it, at least if we accept an approximation of the concrete position of the points. It is then correct to take the string lengths of the formulas to measure the complexity of curves.

In practice, curves as well as equations are looked at with the same eyes and the same parts of the cortex. And our brain has geometrical functions wired in it, which make the curve shapes appear "intuitively" simple, while the algebraic forms will look painful to understand, even for a math educated person.

Note the reference to infinity by formulations such as n ∈ N,
or a + b·x^2 + ... + a_n·x^n ;
but even there, there are limits on the writing.
Natural limits on symbol sets "as long as one wants", and on the graphic plays (index, exponent, etc.).

Padding is of any length, so the meaningful expressions are the quotient of expressions by the equivalence relation of padding. The implicit padding varies according to operations :
- if aa is a^2, then the padding is 1
- if a + a = 2a, then the padding is 0+, or more exactly 000000000000+
- for 147.3, the padding is made of zeroes at right and left
- by constant factors, in the case when the form is supposed to be part of an equation with 0 at the right.

Objective : given a finite set of points (e.g. in R^2), let us look for the simplest function offering a sufficient account (or generation) of the points. The optimum is computed on the equation length over the precision. This gives another type of Kolmogorov complexity, "with loss", from a strictly logical standpoint. And even possibly with gain, if the original bitmap is considered as a forced digitization of underlying continuous beings (e.g. circles), a sort of Platonic cave.
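A sketch of that objective with polynomials (the tolerance and the scoring are illustrative assumptions) : take the smallest degree whose error is "sufficient".

import numpy as np

def simplest_fit(x, y, tol=1e-6, max_deg=10):
    for d in range(max_deg + 1):
        coeffs = np.polyfit(x, y, d)
        if np.mean((np.polyval(coeffs, x) - y) ** 2) <= tol:
            return d, coeffs   # the shortest adequate formula
    return None

x = np.linspace(-1, 1, 20)
y = 3 * x ** 2 + 0.5
print(simplest_fit(x, y)[0])   # 2 : the parabola wins over longer formulas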

9.3. Another model

A given device type increases its complexity by multiplying its parts, and by differentiating the types (of devices).
But we reach a threshold, beyond which complexity grows by association of same-type assemblies. There is no further type differentiation properly speaking. Variety and complexity migrate to a superior level.
Examples : atoms, molecules, organisms, societies, biotope.
Or : elementary electronic circuits, highly integrated circuits, boards, systems.
Or : unit, system, and network.

Actually, beyond some limit, over-performances at a given level are aberrations more than progress ; they are exceptions, if not degenerations of the type : superman, transuranian atoms, supercomputers, WLSI.

An automaton generates (or may generate) an infinite flow of complexity. It is only through reduction that we will focus on its true structural complexity (times the fundamental clock complexity).

Systems theory arose as a need to confront more complex beings.

9.4. Mastership/control

The term "control" to be defined. Measurement of if (gap/offset vs. expectations?), relations to L. Reference to Ashby law of required variety.  To get/enhance/restore control, three types of actions are possible :

1. Action on the master
- chunk length growth, and chunk number (physical ascetics, meditation, training),
- thought and will structuring (the Jesuit "id quod volo", simplifying oneself),
- competence and reflex acquisition,
- "assimilating" the system.

2. Action on the system to be mastered

- global action, suppression of useless things (elements as well as relations), general simplification
- action on S as it is seen by the master; there may be a growth in global complexity, for the benefit of adding new parts specifically designed to simplify the master's view. For instance, a sorting of the beings may introduce artificial structures which, as such, add complexity, but offer transparency to the master (WYSIWYG).
- structuring, hierarchisation
- delegation, modularity.

3. Interface development

- master prostheses (AO)
- system prostheses: dashboard; explanation routines in an expert system
- framing
- resolution reduction (fuzz, pixelization, convolutions...)
- intermediate representations
- progressive Cartesian segmenting
- recursive meta-complexity.

9.4.1. Delegation of complexity for control

This opposition could be completed by, or simply implies, an energy transfer to the system, with at least "amplifiers". And physical mobility (see my text on three sorts of prostheses: integrated, worn, similar, or integrating). Conformity loops around the user system; autonomy works by symmetry.

With conformity of the prosthesis, simplicity is reached through "conformation" of the system to its user(s): specific ergonomics of the workstation, task analysis, a user model inside the system, soft controls. Conformity relates to user anatomy and habits (for an Englishman, a one-foot hat is better than a 30.48 cm one), and to Huffman-like commands, with short codes for frequent actions, as sketched just below.
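A minimal Python sketch of such Huffman-like command coding (the command set and its frequencies are invented for illustration): frequent actions receive short binary codes.

    import heapq
    from itertools import count

    def huffman(freqs):
        tick = count()                       # tie-breaker keeping heap tuples comparable
        heap = [(f, next(tick), sym) for sym, f in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, next(tick), (left, right)))
        codes = {}
        def walk(node, prefix):
            if isinstance(node, tuple):      # internal node: recurse on both children
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:
                codes[node] = prefix or "0"  # single-symbol edge case
        walk(heap[0][2], "")
        return codes

    print(huffman({"save": 50, "open": 30, "print": 15, "format": 5}))

The frequent "save" ends up with the shortest code, the rare "format" with the longest.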

Good side: appropriation, grafting ease, handicap compensation, learning ease.
Bad side: costs, design delays, local and conjunctural factors hardened, fault sensitivity (that last point to be checked).

There are limits to that style of conformity:
- physical constraints of the system, user competence, and time constraints.
Then we are obliged to use "alterity" conformity, that is, the creation of automata inside the machine.

With autonomy of the prosthesis, simplicity is sought through the autonomy of a controlled device, beginning with replacing open loops by closed loops as much as possible, using appropriate sensors, etc.; the automatic clutch, for instance (see the sketch below). An important topic: the system's autonomous clock. Titli rules apply.
Good side: generality, performance, hardware interface reduction.
Bad side: loss of control (the sorcerer's apprentice). Excessive reduction of workload causing loss of attention by the person in charge. Loss of stability. Faulty feedback loops (Larsen effect).
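A minimal closed-loop sketch in Python (all names and the toy plant are illustrative assumptions): where an open loop would apply a fixed command, the sensor reading feeds back into the command.

    def closed_loop(target, sense, act, gain=0.5, cycles=20):
        for _ in range(cycles):
            error = target - sense()      # the appropriate sensor
            act(gain * error)             # proportional correction

    state = {"temp": 15.0}                # toy plant: temperature leaking toward 0

    def sense():
        return state["temp"]

    def act(power):
        state["temp"] += power - 0.1 * state["temp"]   # heating minus losses

    closed_loop(target=20.0, sense=sense, act=act)
    print(round(state["temp"], 1))   # approaches the set point, with the residual
                                     # offset typical of a purely proportional loop

The loss-of-control risks listed above (the sorcerer's apprentice, Larsen) appear here as soon as the gain is set too high: the same loop then oscillates or diverges instead of converging.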

Autonomy of the system is termed "closure" (closed loops). Transparency problem.

The transfer of control to subaltern automata augments the organic complexity of the whole (though not always, since it reduces the interface complexity).
We can here distinguish
- the induced complexity of the object (exhaustive description of the complexity of the object itself or of its manufacturing process).

For sharing, in the human/system case, see Valenciennes model.

Common problems: manufacturing complexity and costs. Operating systems example.

9.5. Saturation for the observing being

Main representation technologies.
For instance, different representation modes. We can find
- better algorithms (in abstracto),
- new regularities, and use them.
Let M be a being.
At the start: an elementary generator + the length of M (see Delahaye).
Then we have a generator: total length = reduced M + generator.
Then a series of generators, which combine with each other.
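A minimal Python sketch of this "reduced M + generator" accounting, taking zlib as a stand-in generator (the fixed generator cost is an arbitrary assumption):

    import os, zlib

    GENERATOR_COST = 120   # assumed size, in bytes, of the generator itself

    def description_length(m: bytes) -> int:
        return len(zlib.compress(m, 9)) + GENERATOR_COST

    regular = b"ab" * 500        # strong regularities: large reduction
    random_m = os.urandom(1000)  # no regularities: hardly any reduction
    print(description_length(regular), description_length(random_m))

On the regular string the total collapses; on the random one, the "generator + reduced M" total exceeds the plain length, which matches the irreducibility of random strings.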

Every knowledge acquisition is a gain. It means that some reduction has been achieved. To "control" is to reduce the (functional) complexity of what one controls. Idem for the creation of new classes, of better multipliers/dividers.

"Computability is of course a concept that verges on practical problems of conceiving computer algebra systems and implementing algorithms. Neither will we go into any technical details here, nor is it our aim to discuss space-time optimal solutions. On the other hand, every attempt to define computability also raises foundational and philosophical questions as well. All we are interested in is a reasonably rigorous definition of computability that is based on common sense insofar as it reflects the capabilities of today's computers." (Thomas Becker and Volker Weispfenning, Gröbner Bases: A Computational Approach to Commutative Algebra, Springer, 1993.)

9.6. Relation with other concepts

9.6.1. Programming and complexity

A sophisticated way to delegate/subcontract complexity and control.

A program is a kind of being. Theoretical problem: do there exist programs P such that K(P) > b(P), i.e. with a Kolmogorov complexity greater than the program length? No, but it can be seen from an external point of view (?).
One can do better, in number of cycles, than b(E)/b(I).
We can take as an example the case where the observer S' is small with respect to S or O.
Limited rationality of S', requisite variety of Ashby. In this case, S may saturate S'. Then S' can never predict O (a fortiori if I' < O).

Definition: for S', to understand S, to have S explained, is to have S explicited within S'; a simple solution being a copy, but a model will in general be more profitable and interesting (and the copy is frequently impossible)...

For example also: let us take the two functions E(I,E) and O(I,E) of S, and an initial state E.
Question: in which cases can we go upstream from O back to I and E? As a general rule: no.

Trivial cases:
a) E is supposed constant, E = O
b) E = f, a dependent function
c) Bayesian?
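A one-line Python illustration of why the answer is generally no (toy functions, assumed for the example): two different (I, E) pairs may produce the same O, so nothing can be recovered upstream from O alone.

    def O(i, e):
        return (i + e) % 2        # toy output function: a parity

    print(O(0, 1), O(1, 0))       # same output, different (I, E): no way back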

9.6.2. Concretization (Simondon) Vs. complexity

Concretization does not necessarily entail a reduction in organic complexity: there are fewer components, but their multi-functionality may augment the manufacturing complexity. CAD/CAM and numerically controlled machine tools should help to define (if not to measure) the complexity of components vs. the complexity of the generating programs. But these measures have their own limits.

Simondonian concretization forbids large-scale adaptability, but makes it easier on a certain range. It is not really a progress, more a local solution.
Nevertheless, concretization often leads to beautiful beings, in contrast with Meccano-built beings. Here, beauty is reached again only if the object is very large vs. the parts' sizes. With serious limitations in Lego/Meccano constructs.
One could try to apply this to digitization, with the bit as the ultimate component, and low levels such as bytes or pixels.

Beyond some point, there will be both concretization for some beings and a more and more detailed analysis of the functions.

9.6.4. Complexity/stability

A priori, instability grows with complexity; that is frequently asserted.
Perhaps we should formulate a definition of stability?

If the risks/threats are known, then appropriate defensive devices can improve stability. But they could add new risks and instabilities. Something Gödelian here also.

Positive coupling. In a number of cases, a complexity increase is used to enhance stability. Some complex structures are very stable. One could even say hyper-stable, for instance when comparing hardware and software.

From a geometric standpoint, a trajectory is said to be:
- asymptotically stable if all sufficiently near trajectories at t = t0 tend towards it asymptotically when t goes to infinity (independence);
- neutrally stable if all trajectories near enough at t = t0 remain sufficiently near at all later instants (but without necessarily getting asymptotically nearer: periodicity, for instance);
- unstable if trajectories near at t = t0 become more distant when t goes to infinity (divergence) (trajectory of the state variables in state space). (Bertalanffy 1968)
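In a more formal notation (a hedged LaTeX transcription of Bertalanffy's wording, with x*(t) the reference trajectory and the norm an assumed distance between states):

    \text{asymptotically stable:}\quad \lim_{t\to\infty}\lVert x(t)-x^{*}(t)\rVert = 0 \ \text{for all } x(t_0) \text{ close enough to } x^{*}(t_0)

    \text{neutrally stable:}\quad \forall\varepsilon>0\ \exists\delta>0:\ \lVert x(t_0)-x^{*}(t_0)\rVert<\delta \Rightarrow \lVert x(t)-x^{*}(t)\rVert<\varepsilon \ \text{for all } t\ge t_0

    \text{unstable:}\quad \lVert x(t)-x^{*}(t)\rVert \ \text{leaves every bounded neighbourhood as } t\to\infty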

9.7. Complexity of hardware and software

Out of a first analysis :
- if the network of components is built at random (e.g. by stochastic combination of any physical components in a given electronic family), the probability of getting a stable system decreases rapidly with the number of components. In the case of electricity, we shall probably have a rapid degradation due to short-circuiting of electromotive forces, destruction of components by overheating... until we reach the stability of a dead system.

- if the network of components is built according to appropriate rules, the growth in complexity will tend to increase stability within a given perturbation space (for Chauvet, it is a specific feature of life), but at the price of reduced reliability, life duration, etc.; as soon as we go beyond the limits of that general space, we have specialization, with its pros and cons.

This space of perturbations may be:
- physical (heat, cold, and radiations),
- teleological (needs, computer uses).

By itself, an elevation of costs reduces survival hopes of a device, in a given state of the market, hence its "stability" versus the perturbations of that market.

Actually, there are two thresholds:
- the system can work only above a minimal level of complexity; above this level, reliability may be enhanced by redundancy, fault detection and fixing, and generally "fault tolerance".
- beyond another threshold, complexity by itself becomes a hindrance to maintenance and fault fixing, and the failure probability grows with the number of elements, etc. Unless, perhaps, we make a sort of jump towards some meta-system, or a new "generation" of systems. That may happen in epistemology (the Copernican upturn, present problems of evolution theory).

Thesis: for a given useful complexity, a lower functional complexity results in a growth in material complexity, but one must take into account the adaptation rate (conformity, ergonomics).

Thesis: an increase in material complexity to simplify use results in the creation of situations, hopefully rare, where the expert becomes even more necessary, and fault fixing more difficult. That is not too hampering when the product is cheap enough to be replaced, and there is no risk. The worst cases are those where:
- fixing is difficult and restricted to experts,
- the fault consequences are serious (human lives).
Typical case : nuclear plants.

(Compare that to organic/functional above)

10. Ops

That is a proprietary (and exploratory) concept of Pierre Berger, elaborated to measure the power of digital beings. Ops has a bit.bit/time dimension, and applies to processors, memories and communication lines.

10.1. Definitions

We propose a unit called "ops", dimensionally coherent with bits^2/time (in seconds, or an ad hoc unit). We adopt the following definitions.

. For memories, the number of ops is equal to the capacity times the transfer rate. Example: a 650-megabyte CD-ROM, equivalent to some 6 billion bits, is totally read in ten minutes, that is 600 seconds, hence a transfer rate of 10 megabits/sec; global power 6.10^16 ops.

. For processors, we take as numerator the square of the reference word length (32 bits for instance), times the number of instructions per second. Though rather artificial, this measure gives a correct weight to the word length, which impacts instruction variety as well as operand length. Example: a 32-bit, 400 MHz processor: 32.32 = 1024; 1024 times 4.10^8 gives about 4.10^11, a global power of roughly 400 Gops.

(Figure: core.jpg)

. For communication lines, we multiply the bandwidth (in bits per second) by the length of the line (in bits, since speed is practically constant and equal to light speed). For a given bandwidth, the spatial distance between successive bits is constant; here, the bandwidth thus plays a squared role. Example: a phone line of 100 km, used at 60 kbit/second. At this speed, there are around 60 kilobits distributed over the 300,000 km light travels in a second, hence one bit every five kilometres, or 20 bits over the 100 km. The global power is then 60,000 times 20, or 1.2 megops.
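A minimal Python transcription of the three definitions (unit conversions only, following the corrected examples above):

    LIGHT_SPEED_KM_S = 300_000

    def memory_ops(capacity_bits, rate_bits_per_s):
        return capacity_bits * rate_bits_per_s

    def processor_ops(word_bits, instructions_per_s):
        return word_bits ** 2 * instructions_per_s

    def line_ops(rate_bits_per_s, length_km):
        bits_in_flight = rate_bits_per_s * length_km / LIGHT_SPEED_KM_S
        return rate_bits_per_s * bits_in_flight    # bandwidth plays squared

    print(memory_ops(6e9, 1e7))      # CD-ROM: 6e16 ops
    print(processor_ops(32, 4e8))    # 32-bit, 400 MHz: ~4.1e11 ops (~400 Gops)
    print(line_ops(6e4, 100))        # 100 km line at 60 kbit/s: 1.2e6 ops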

In this way, we can hopefully add the powers of different components. But sensible computations, in a given system, must include limiting combination factors.

We can understand the ops measures as an organic complexity (here, hardware is seen as organs for the software and data), or as a functional complexity (describing the function of systems whose organic complexity will still have to be assessed, for instance by production costs, CAD programs, etc.).

10.3. Gate/truth table metrics

An example of model. We build a truth table with, on each row, first the values of I and E, then the corresponding values of E and O. The length of a row is e + i + o.

There are 2^(e+i) rows. The total volume TV of the table is the product of the two. We shall take for ops simply (e+i).(e+i+o).

If several gates are combined, NP being their number, we shall study the ratio TV/NP, which may be taken as an approximation of the yield of each assembly.
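A small Python sketch of these table metrics (a direct transcription of the definitions above; the NAND example is added for illustration):

    def truth_table_volume(e, i, o):
        rows = 2 ** (e + i)           # one row per (state, input) combination
        return rows * (e + i + o)     # TV: rows times row length

    def table_ops(e, i, o):
        return (e + i) * (e + i + o)  # the reduced measure proposed above

    def yield_ratio(e, i, o, np_gates):
        return truth_table_volume(e, i, o) / np_gates   # TV/NP

    print(truth_table_volume(0, 2, 1))   # a stateless NAND gate: 4 rows of 3 bits, TV = 12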

One NOT gate. There is no state value. Then TV = NP = TV/NP = 1.

In the "gate model", RD is the number of gates over which a clock is needed.

How to reconcile that with the (one-dimensional) DU? One elementary gate = 3 bits? But yes, we consider it is a being, whose function is defined by its type. We factorize the issue of how it works, as for COM.

If the model includes memories, a memory of m bits: hence TV = 4m + 3.

To come back to ops, we must introduce times. That fits rather well with the ops counts, since a truth table is a square.

10.4. Gate metrics

Gate/ops. In the simplest case: one bit, one ops.

For one bit per cycle, we still need many more gates than bits. How many? (We are talking about memory.) If the transfer rate is one bit per cycle, then ops = capacity.

For writes, we take the addressing/decoding, then collect the results. We can organize the memory in pages (a simpler addressing mode). There are also the autonomous processor emitters, in some way.

The gate metrics reduces rather easily to ops, with an elementary gate having one cycle per time unit, representing indeed one ops. Then we have to find the exponent of 2 somewhere.

10.6. Combination of systems and power computation

When combining systems S, the relations between their L and their ops are not easy. Difference and complementarity of devices, parallel and series architectures will play a role.

The computation seems easy for serial arrangements, if the series is unique (a series of identical gates, or negations). Easy also if purely parallel. Beyond that, it rapidly gets much more complex!

If the processor has a large E, it can rotate over itself (my God, what does that mean?). That relates to the autonomy problem (the longest non-repetitive cycle).
The input rate (e.g. keyboard) conditions the passing over this cycle, with random input bits, i.e. bits not predictable by the processor.

One more difficulty: the basic randomness of DR, negligible in general, but...
Ops of PR / ops of KB.

PR: instruction length times frequency.
KB sends only 8 bits, at a much lower frequency.

Since ops(KB) << ops(PR), we can say that KB has little influence... and yet it is enormous.

Under Windows, at the start, all the work done takes several minutes, not negligible in terms of KB ops;
but after that, PR no longer works, and waits for KB.
PR becomes totally dependent.

But we could have loaded a program which did anything (animation, screen saver...);
the length of non-repetitive loops should be computable.

Other ideas to compute combinations

1st idea: fixed string emission (for instance E$),
concatenated with a bit extracted from the E$ string;
then, non-repetitive length (l(E$)+1).l(E$), of order (E$)^2.

2nd idea: start again, placing one bit of E$ inside of itself,
in succession in every position;
then we have (E$)^2 times E$, that is (E$)^3.

3rd idea: start again (as in 2), but with sub-strings of lengths from 1 up to l(E$). We have (E$)^4.

4th idea: emit all the permutations of E$.

For all these ideas, we must take into account the fact that E$ may contain repetitions.
Indeed, from the 1st idea above, if we admit only two different strings... we are taken back to Kolmogorov.

K(E$) (the smallest reduction) and Ext(E$) (the largest extension without external input). In fact, this length will depend on the generation program.

We could have L(Ext) = f(L(E$), L(PR)), with the limit case L(E$) = 0. Then we would have a "pure" generator.

(L(PR) = program length.)

(Something like the vocabulary problem.)

If E is fixed: l(E) = l(PR) + l(E$),

with an optimal distribution. But L(PR) is meaningful only if we know the processor power.
We need a third element: L(comp), L(PR), L(E).

Processor-keyboard combination

The global ops computation must include "dependency factors". The result will lie between:
- the smallest of the ops,
- the product of the ops,
- the sum of the ops.

If constraint factors are weak, the sum of ops will mean little if one of the components is much more powerful than the other one: (a+b)/a = 1 + b/a.

Screen/processor combination

Here it is much simpler. The smaller of the two is the most important, with the constraint included in the graphic card.

Memory/processor combination

PR-Mem (exchanges)

1. K: there is no external alea; we deal with the whole.
2. Mem is not completely random for PR, since PR writes in it.

That depends rather strongly on the respective sizes:

if Mem >> PR, it looks rather like a random source;
if Mem << PR, no meaningful randomness.

In between, strategies, patterns, various tracks can be developed. We would need a reference "PR", then work recursively.

There is some indifference in TV/NP. The relation may be very good if we take the good structures. Then Grosch's and Metcalfe's laws will apply (and we have then a proof and a more precise measure).

Conversely, if we work at random, the relation degrades. There is a natural trend to degeneracy (DR, or Carnot's laws).

Conversions:
- we can define the gate system necessary to get a defined string as output,
- we can define a gate system as a connection system inside a matrix (known, but of undefined dimension) of typical gates,
- we can define a system of gates or communication lines through a text (analogous to a bit memory bank).

Then we can look for minimal conversion costs, a way of looking for the KC. The good idea would be to look for a KC production per time unit... not easy...

10.9. L and ops

Given:
- the cycle duration of the being,
- the size of I and O,
- the necessary power to compute the E and O functions,
... see how that relates to L for a combination of systems.

10.10. Growth

10.10.1. A model of self-perfecting being

In a very simplistic model, S1 aims to get the maximum out of its I. At the start, it reads only a part of it, due to limited processing power. Then, as it detects regularities in this part, it can limit its reading and extend its reading zone.

(Figure: Silence.jpg)

If the inputs are perfectly white noise, or perfectly random, or more exactly, if S1 does not succeed in finding any useful regularity, it will be impossible to do better. Irreducibility is a feature of randomness. But we can imagine some cases where this kind of "learning" is effective.

S supposes that there are constant bits in I. It examines each bit separately. If one bit stays the same during four cycles, it admits that this value is constant, and begins to scan another bit. If there are changes, it continues to read it, or considers this bit as uninteresting and drops it.

S goes on this way up to the exhaustion of:
- its storage capacities for comparing successive values of bits,
- the width of the I window.

It is easy to design more and more sophisticated mechanisms of this kind, with pattern recognition, learning in the proper sense, etc.
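A minimal Python simulation of this mechanism (the four-cycle patience is taken from the text; the stream layout and all names are illustrative assumptions):

    import random

    def make_input(width, constant_positions):
        # cycles of `width` bits; some positions frozen, the others random
        frozen = {p: random.randint(0, 1) for p in constant_positions}
        while True:
            yield [frozen.get(p, random.randint(0, 1)) for p in range(width)]

    def scan(cycles, width, patience=4):
        verdicts = {}
        for pos in range(width):                  # one bit examined at a time
            history = [next(cycles)[pos] for _ in range(patience)]
            if len(set(history)) == 1:
                verdicts[pos] = "constant"        # stop reading it
            else:
                verdicts[pos] = "changing"        # keep reading it, or drop it
        return verdicts

    print(scan(make_input(8, constant_positions={0, 3, 7}), 8))

Four equal readings can of course be a fluke (one chance in eight for a random bit), which is precisely the kind of risk such a simplistic learner accepts.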

10.10.2. The (n+1)th bit

The exciting moment is when the model grows on a significant scale; in other words when, in some way, it questions itself (rather metaphorically). A digital view of the Hegelian dialectics, the "steam engine of History".

A logical foundation: the only meaningful growth (or growth of meaning) goes by adding new bits. By design, they are not determined: if they could be determined from known bits, they wouldn't matter. Then we have the daily play of interaction with the world and other humans.

That new bit, and more generally the external beings, to have a meaningful value, must keep in existence. Then S must also respect the physical matter which bears them, the energetic conditions of their existence.

But S must also want something of its value. And its place inside its E. This bit here, and generally this being in those circumstances, in a specifically defined space.

Choose the good obstacle to jump over, for my horse, for me, for my children or students. Not forgetting the limits of the possible, but:
- taking a well-calculated risk,
- with a good return.

The (n+1)th bit is a negation of all the first n bits. It is the antithesis. But as such it is just the precedent string plus one bit, passing from 0 to 1 (one could take the inverse solution). Or more exactly, the negation is the negation of ALL the existing strings, adding one new bit?

But pure negation, taken as such, is of poor interest. Integration in an expression is the real task. Then everything must be recast enough.

A bit negates its environment, and at the same time settles it ("settles me", if I am this environment).

In a model with recurrence: in a being of N bits, how many different beings can we place? A threshold is passed at N = n.2^n. Then one can carry on a little, up to the moment when we must use n+1 bits.

This model supposes that S aims to optimize its performance, in this case the filling.

This threshold may be described as "where the model size must be increased". But after some offset, a sort of overfusion, with an intermediate region of indifference, or gap.

Theories, in general, pay little attention to these zones. They are generally skipped over, appealing to large numbers, scale presentations, and talking "at a constant factor".

Between n.2^n and (n+1).2^(n+1), there is stability.

n      2^n        n.2^n      (n+1)(1+2^n)   Diff      Diff/n
1      2          2          6              4         4
2      4          8          15             7         3
3      8          24         36             12        4
4      16         64         85             21        5
5      32         160        198            38        7
6      64         384        455            71        11
7      128        896        1032           136       19
8      256        2048       2313           265       33
9      512        4608       5130           522       58
10     1024       10240      11275          1035      103
...
20     1048576    ~21M       ~22M           ~1.05M    ~52K
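The table can be checked with a few lines of Python (the n = 20 row is rounded as above):

    for n in list(range(1, 11)) + [20]:
        cap_n  = n * 2 ** n               # capacity filled with n-bit beings
        cap_n1 = (n + 1) * (1 + 2 ** n)   # after the jump to n+1 bits
        diff   = cap_n1 - cap_n
        print(n, 2 ** n, cap_n, cap_n1, diff, diff // n)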

Then, when n becomes large:
- a little increase in n is sufficient to double (or augment to the 2nd power) the number of different beings for the same necessary increase in capacity,
- the region where it is profitable to stay with n bits is narrow (and of little interest).

Hence (product laws):
- beings (blobs) cannot be "played" with,
- replacing beings with shorter ones gives slack, but uses the place less efficiently.

There is something of the sort in Mendeleyev's periodic table: at the start, the combinations are poor; the further we go down the table, the longer the rows.

(Figure: nplusonebit.jpg)

When the code nearly reaches saturation, there remains no suppleness (to add new beings), nor security by redundancy (note the proximity of these two concepts). Then, it may be profitable to jump to n+1 bits even before reaching the threshold. (A lot could be said about this "slack"; for instance, hard disk use.)

That, of course, depends on the processor costs.

Let us suppose that S has to represent 1000 beings. That means 10 bits, with a small slack (24/1000, or 2.4%). If S cuts into two features, for example one of 100 and one of 10 values, it needs 7 + 4 = 11 bits, with a global capacity of 2048 beings. The separate slacks are 28% and 60%, the combined one 105%. But, if the features are smartly chosen, it may be profitable otherwise, if at least one of the features is meaningful for a sort or for the application of some pricing mode.

Another case: a memory of 10K bits can host 1000 beings of 10 bits. If S factorizes with a first feature of 10 values, needing 4 bits, there remain (10K - 4x) bits for x objects. Then, when x grows, the 4-bit levy is of low importance (to be checked).
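A Python sketch of the slack arithmetic of the two preceding paragraphs:

    from math import ceil, log2

    def bits_for(values):
        return ceil(log2(values))             # bits needed to code `values` beings

    def slack(values):
        return 2 ** bits_for(values) / values - 1   # unused share of the code space

    print(bits_for(1000), round(slack(1000), 3))        # 10 bits, 2.4% slack
    print(bits_for(100) + bits_for(10))                 # 7 + 4 = 11 bits
    print(round(slack(100), 2), round(slack(10), 2))    # 28% and 60%
    print(round((1 + slack(100)) * (1 + slack(10)) - 1, 3))   # combined: ~105%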

In case S is not sure of the available capacity n.2^n, the passage to a greater number of bits would be an acceptance of risk.

An interesting problem (this part should be developed in "structures") is what we do after a "Hegelian" new bit. How this global negation of the entire S (antithesis to the S thesis) will be worked on afterwards:
- adding a new divisor to the being; a separate feature, when that bit is at the new value (let's say 1, as opposed to the normal implicit 0's); and possibly the construction of a totally new structure for the beings beginning so; or that partly in E... that's radical;
- transferring it to one of the present divisors, or a sub-part of S, as a strong bit, changing the meaning of this divisor or feature; it may be radical, but limited to the subpart;
- putting it on the high part of a number;
- putting it on the low part of a number (we have just made a refinement on some quantitative parameter).

The next bit: how to choose it?

That supposes that there is a choice, a gap, an opening "after" the first bit; an ordinal at least, two positions related to each other. It is "engagement". If a being is launched, thereafter the being must be eaten; S must stay on this sensor.

Every time I add a bit, it is either structural or precisional:
- a structural bit opens a new field, e.g. from fixed to variable length for a variable;
- a precisional bit develops or extends an existing field: more precision on a quantity, a new bit in a raster.

These roles of the new bit are defined in a processor (hardware, at the limit) or in some part of a program (e.g. a header).

The "internal" word, 9x8 bits, 72 bits (per character). To draw it, it would need 9 bits and a pointing mode of the one which is inside the rectangle of nine.
For extension, 2 bits, which determine the 3rd.
To compute a curvature radius, 3 pixels (only one circle passes through three points).
It will perhaps not give a possible circle, that is with a centre on a pixel, a definable radius with the number of bits allowed by the software and passing actually through these three points
More precisely : a circle drawing algorithm could find at least one circle, all the circles, if one circle exists... including these pixels. It is a very different problem from the Euclidean one

11. Digital relativity

11.1. Can we quantify Gödel?

The importance of DR is to soothe despairing fears and to temper hopes about the possible perfection and closure of a global DU, or of any more or less partial DU.

The basic idea is suggested by Gödel, with his undecidability and incompleteness assertions, and a tentative union with physical experience limits, such as the speed of light limit and quantum theory with its uncertainty principle. In both cases, theory asserts and experience confirms that definite numerical values may be assigned to these limits: light speed and Planck's constant. Hence the hope that somehow a sort of logical limit constant could be computed, to get more out of the Gödelian expression "any system complex enough to include arithmetic": how many bits does that take?

An effective calculus of limits due to DR would open the way to some engineering.
If we knew the e=mc2, we could directly compute the IEO space curvature. If not, we could at least say that, at tome times, there is threshed passing. For instance, the way we can find formally infinite loops in a program.

I never could find an answer. Neither by myself (not surprising, given my limited capabilities in mathematics and formal systems), nor in any book or dialogue with competent persons. Nobody, however, told me that the question itself is nonsensical.

Still waiting for an answer, I found some possible contributions to such a theory. (DR is also a proprietary concept, launched with my book "L'informatique libère l'humain" in 1999.)

11.2. First approach: time and distances, noise

Noise

There is no such thing as perfectly white noise. It is a theoretical limit, like the real line. Noise is something like a dirty chaos, more or less dirty.

Noise plays on the transmission, with an error probability tied to distance. We admit that noise is a probability of false bits, given a size (or ops) of a being and/or a number of cycles.

Below some distance, noise is considered negligible. This threshold may be considered as a characteristic of a given DU, or as a feature of beings, which then have to be sufficiently protected if they are to survive.

In a noisy system, beings of high levels take action to compensate for noise. For instance with redundancy, or through an appropriate space organization (they locate themselves near their most related partners).

Any growth must be paid for by more noise, probably with a decreasing yield (depends on...).

Borders around beings are useful. And intermediate regions to cope with small addressing errors, due to the fact that the addressing system and the communication system will be affected by noise.

Noise has more or less important effects depending on the regions it hits, active or passive, and on:
- bits of high/low weight,
- bits in zones frequently/rarely used, of high/low pressure,
- syntactic bits, with possibly cumulative effects; a fortiori syntax bits in repetition.

In a totally reduced system (all bits independent/orthogonal), a fault on any bit kills the whole system. (To be checked.)

Noise vs. distance

For instance, think of communication between DU beings separated by a distance of d (in bits, whatever method is used). If this universe has a noise rate such that the move from one binary position to the next entails an n% random chance of a bit error, then the probability of error over d is (n.d)%. If we use error detection and correction such that an l% rate of erroneous bits is acceptable, then the maximum distance of communication between the two beings is given by n.d < l, or d < l/n.
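In Python, the bound is a one-liner (the error rates are invented for the example):

    def max_distance(error_rate_per_step, tolerable_rate):
        return tolerable_rate / error_rate_per_step    # d < l/n

    # one flipped bit per million steps, tolerating one bad bit in a thousand:
    print(max_distance(1e-6, 1e-3))    # beings may span up to 1000 steps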

This limit has consequences on the size of beings themselves: their radius, or the maximal distance between two points, or between any point and a supposed controlling centre, must not exceed l/n. That could explain, for instance, the necessity for life to protect itself against (chemical) noise behind cell membranes. It could also give ground to the utility of a distinction between internal and external modes of communication, with much more protected and heavy communication methods with the external world, contrasting with efficient but per se fragile communication inside the being.

(Figures: reladi1.jpg, reladi2.jpg)

Noise vs. time

Of course, the fact that DU cannot exist without some "incarnation" into the physical world makes it dependent on physical relativity and uncertainty. That did not matter much for the first digital beings: DNA, neurons and electro-mechanical devices are neither large nor speedy enough to suffer from the speed limit or quantum uncertainty. It is no longer so with the gigahertz clock speeds of today's computers, and long-distance communication through satellites. Here, the quantitative values are well known, and taken into account in the design and architecture of digital systems.

In a future to come, quantum computing, and possibly instantaneous communication due to the EPR effect, could have dramatic consequences on DU. How far and how soon?

Even without reference to physical matter and its relativity laws, a simple time coherence problem arises if we suppose that the communication speed is finite in terms of digital clock cycles. If the communication delay between two systems is longer than the clock cycle of one or both of them, control of one by the other, or any reciprocal interaction, will demand operations over several cycles. To say it crudely, even with the best systems of today, you could not pleasantly make love with a partner living some light-years away!

That leaves open also the practical impossibility of operating strictly synchronous systems over long distances, and a fortiori the dream of a globally (if not centrally) clocked digital universe. However, if all the beings that matter have sufficiently rapid basic clock cycles, that does not prevent a time coordination on longer cycles for more global actions.

Berger's loading law. But beware: when we augment the load, we augment the organic distance, and even the functional one.

Time reliability (subsistence, MTBF...) is linked to distance reliability.
A model where DU reproduces itself, with noise, from one instant to the next.

Noise and code saturation

(See above, zones n, n+1)

What about noise in these zones? When the code is saturated, every error is paid directly: we get a wrong being, or another being. In some way, all beings are then equivalent, unless we put a hierarchy on the bits. But, precisely, setting a hierarchy is a form of factorization, with probable losses if the order (the ordinals) is meaningful, and we no longer have a pure and meaningless code.

As soon as one uses gaps to make meaning, noise has very different consequences:
- absolute transformation (as with a saturated code),
- big change/error,
- small change/unease.

(See also in Genetics)

We should find regions which are at the same time important (bit mass) and indifferent (zones around ruptures), to have maximum effects.
We shall have big effects if noise perturbs regions with conditions (IF) or recursive ones.
Interesting case: the error spontaneously generates a virus.
If we make a difference between programs and data, think about the difference of effects, and we reach a sort of trilogy:

(Figure: errors.jpg)

The bit as protection against noise, an assertion of authority within chaos:
1. totally protected zone
2. input zone, clock included
3. zone with very strong error correction

Attraction: beware. Here, to hold together, bits must have a common resistivity to noise.
Gap: the miracle, symmetric of sin. On one side appears a gap between model and real, some probability, uncertainty. At one extreme, the nearly certain, the venial sin... At the other extreme, the very improbable, the lethal sin. And at the opposite, the miracle.

Quantify, measure the gap. How many bits to a real error? Correcting codes? Distance? Cybernetic offset?

Quantitative phenomenon: a being becomes complex enough to compute its own life expectancy, which makes death appear as shocking, a kind of failure.

The evaluation of L has no meaning if P is infinite (though the series could converge, with infinitely small H after some time).

How many bits does S need to be able to detect an anomaly, a bad state? And how many for self-representation (if not consciousness)?

The question must be asked more smartly. A bad state may be detected on one single bit, if the criterion is external (there should be this, and there is something else).
But how many bits must a being have so that:
- an external S can detect a bad state by internal contradiction,
- S itself can detect the abnormality?

11.3. Second approach : number of sub-cycles

We suppose that, during a "principal" cycle, a processor cannot do more than a given number of operations (temporal bits, analogous to spatial bits).

Besides, the model varies strongly according to the number of cycles. We must say what the processor is able to do in one cycle (clock, or larger cycles): addition, multiplication, ... pattern recognition?

But we cannot dodge the issue; otherwise, we could have anything inside a cycle.

Anyway, a processor of a given volume and power cannot do an infinity of things, nor have an infinity of different states. Hence, if the number of cycles is too high in relation to this mass, the state sequence must repeat: we get cycles. This number would of course be very large (already with 64 bits, by the simple play of exponentials). But here also, we have reductions by meaningful differences (Kolmogorov, etc.).

11.4. Third approach : logics, paradoxes, recursion, viruses

Language: holes and clashes

Perhaps more important than all that, or in duality with it, a form of relativity or uncertainty ("uncertaintivity") comes directly from the mere basis of assimilation and expression, from the mere notion of analysing and creating beings. At the limit, this form is not even specific to digital systems; or, to be more precise, to the rasterization of representations (images, sounds, measures and actuator commands). It comes from the other aspect of digitization, which is fundamental to language.

A language is a set of basic elements (characters, words) plus a set of assembling rules. As such it deploys a space of possible expressions. But this space is too widely open, and leaves room for a lot of uninteresting expressions and phrases. Then grammar adds rules to generate only "meaningful" expressions. Inside that space remains the traditional question of truth, and the way of deciding it. Gödel has shown, at least for formal languages, that whatever the proof system, there remain undecidable assertions.

In programming languages, we also find a scale of valuations, with at least four cases:
- expressions rejected by the compiler, for syntactic reasons and sometimes for concrete reasons tied to the development environment;
- expressions accepted by the compiler but leading, immediately or not, to a halt or an endless loop;
- formally correct programs, but not conforming to the intentions of the developer or of the users;
- programs correct from all standpoints.

Around these basic levels, some variations may be played. For instance:
- the compiler accepts, but gives the programmer a warning;
- the program stops, but not immediately, and why not after having done some useful job; besides, an endless loop is not always a prejudice, and is precisely what is looked for in transactional systems, for instance (with of course some way of closing the process; but at the limit, a stop by reset or by switching off the mains may be sufficient); infrequent bugs are tolerable in many cases (if ever totally avoidable);
- the conformity to the developer's intentions and the users' needs is rarely binary; in management information systems, anyway, it is well known that user needs change with time, under multiple pressures (from law changes to users knowing better).

In other cases, formally correct expressions may prove contradictory for logical reasons (a white black horse) or more remotely semantic ones (a modest diva).

To take a metaphor: words and rules are a way to project lines and planes ad infinitum, and these lines may as well cross each other (contradiction) as never delineate interesting spaces (incompleteness).

The fact that a language is used by a human being or a digital machine does not change this. We tend to say that, in the bad cases, a human being always finds some solution, possibly with a rule change, and that the machine cannot. But this distinction does not hold, for symmetrical reasons:
- human beings frequently prove unable to cope with contradictions, or are practically blocked by undecidable issues;
- within limits, machines may be built which cope not too badly with this kind of difficulty.

The uncanny loops of self-reference and recursion

"I am a strange loop" (Hofstadter)

Self-reference is a specific feature of language. Any programmer knows the power, and the dangers, of recursive functions. And a major part of paradoxes, as well as the Gödelian limitations, draw on "diagonalization", a particular form of self-referent assertion.

For instance, the Richard-Berry paradox: "the least natural number that cannot be described in less than twenty words". If such a number does exist, we have just described it in thirteen words. If such a number does not exist, then all natural numbers can be described in fewer than twenty words.

In the physical world, recursion is impossible, but sometimes present in some way, for instance with parallel plane mirrors facing each other.

For philosophers, self-reference has long played a considerable role. There are three main points:
- self-reference is impossible for ordinary beings: omne quod movetur ab alio movetur (whatever moves is moved by another); only God is the creator of himself, and that is not the least of his paradoxical nature!
- metaphysics is the only science legitimately judge of itself... and for this reason is not a "natural" science;
- man is the only being able to be conscious of himself; but what is consciousness...


Note : a universal anti-virus is fundamentally impossible.

Self-reproduction

Digital self-reference has a crucial importance in one case: self-reproduction. One of the bases of life, which invited itself onto Earth. And a no less uninvited presence in "our" world of artefacts: computer viruses. In both cases, the process has at its core the duplication of a code of some length. And, if it has some remote metaphysical cause, that does not prevent it from happening billions of times every day without asking our permission.

Computer viruses may seem less mysterious than life. We certainly know how they appeared, and we are but too able to make them appear, with a not so considerable competence in programming. But, beyond the fact that we shall never be free of them, they also have tricky features. To begin with, the creation of a universal antivirus program is... mathematically... impossible. And even then, there are cases where it is impossible to say whether a program is, or is not, a virus. And all that for diagonalization reasons, the same diagonalization processes as in Gödel's theorems.
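The diagonal argument can be sketched in a few lines of Python (a toy restatement in the spirit of Fred Cohen's classical proof; every name here is hypothetical, and spread() is a harmless stand-in):

    def spread():
        print("replicating...")       # stand-in for self-reproduction

    def contrary(is_virus):
        # builds the program that defeats a claimed perfect detector
        def p():
            if is_virus(p):
                pass                  # judged a virus: behaves harmlessly
            else:
                spread()              # judged safe: behaves as a virus
        return p

    naive_detector = lambda prog: False
    p = contrary(naive_detector)
    p()    # the detector said "safe", and p promptly spreads

Whatever a total, exact is_virus answers about its own contrary program, it answers wrongly; hence no universal detector.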

It is even probable that some day will come when spontaneous generation of viruses will emerge out of the Googlian soup.

Hypothesis: viruses must "spontaneously" emerge:
- when DU or any sub-universe becomes very large, the probability of occurrence of any string of finite length grows; that is true also for virus strings;
- when DU is large, cracks happen, favourable to virus contamination.

A virus:
- cannot be held in a too small digital being,
- is certainly present if the being is very large, which we could argue as follows:

10^700? (We have in mind minimal-length viruses, and their probability of emergence.)
- Factors reducing this figure: all the equivalent viruses of the same length.
- Factors augmenting this figure: viruses not generated at random, and indeed anti-virals.

Recursive functions

IEO beings are recursive through their O and E functions. Functions contained in E are themselves IEO beings.

Diagonalization is at the heart of Gödel:
1. the number of signs in the function;
2. the limit value for the parameters (bit length of the variables);
3. finitude of the number of steps. We cannot go down nor up in steps; in particular, we cannot go back to the origins.
In practice, one starts with a finite being, at a definite, finite place in DU.
Make a recursive meta-function (a sort of projective geometry) to pass to the infinite.

11.5. Fourth approach : reciprocal saturation (groupware)

No system can process the totality of the information available in the universe. It has to select a given number of bits in input (a possibly voluntary selection). Similarly, its capacity (in terms of state numbers) being bounded, when an input results in a complexity growth, the system must choose what it keeps and what it drops.

That, in the real world, is observable in human relations (see part 6).

Particular case: auto-saturation. Suppose that a processor processes information at maximal complexity (defined as: the number of bits in output is equal to the number of bits in input, plus the number of state bits, after Kolmogorov reduction (and let us check that this is properly attached to the "complete" character of complexity theory)), and that it receives in input all its output bits. Then E should grow exponentially, and rapidly. As soon as a threshold in E is reached, the processor has to operate a reduction, at least in the function E(I,E). (See the sketch below.)
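A toy Python simulation of this runaway (the ceiling and starting sizes are arbitrary assumptions):

    def autosaturate(state_bits=4, input_bits=1, ceiling=10**6, cycles=100):
        for t in range(cycles):
            output_bits = input_bits + state_bits   # maximal-complexity output
            input_bits = output_bits                # full feedback of the output
            state_bits += output_bits               # everything is retained in E
            if state_bits > ceiling:
                return t                            # reduction becomes unavoidable
        return None

    print(autosaturate())   # the ceiling is hit within some fifteen cycles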

Reducibility relates to a language (words, grammar), with of course volume and computing time constraints:
- if S has only memory/copy, the only reducible beings are those having an identical image in memory (differing only by localization);
- if we add concatenation, all beings are reducible by bit-to-bit copy, but that is not a true reduction; it is more interesting for strings containing the same substring several times;
- if we add the positioning of patterns on a white page (any hyperplane), concatenation generalizes to several dimensions:
2^10 gives 1024: we earn only one character;
2^20 gives 1,048,576: a little more than a doubling;
factorials also are interesting;
- the Euclidean division A = bQ + R can reduce any number:
if b may be factorized, the giving of Q and R is interesting (to be checked), notably for multiples of Q.
Noised model

Noise from sampling and filtering

Mismatches between computing functions may cause interesting cases of noise, such as sampling (see for instance Nixon-Aguado) or filtering (see Bellanger). That applies to living as well as artificial devices.

The sampling itself, or rasterization, of physical beings and processes is a major theme for digital system limits. But where exactly are these limits? Since at least the 1970's, we have read predictions that the limits of matter would be reached, for elementary circuit sizes as well as for communication rates. But, regularly, technological progress pushes them farther. A rather surprising piece of news has been quantum computation itself, which would transform the uncertainty malediction into a boon for computing power!

Minimal loops

1-bit model.
Generation: if 1, display of a stored image;
if 0, random generation.
Interpretation:
- if recognized as such, 1;
- if not recognized as such, 0.

2-bit model.
Generation: one image among three, stored or recognized;
if 1, random;
then this bit is added to NEW images.
For instance, at loading, the 2 non-recognized images are stored, then taken as the new references.

This principle may be applied to things radically new, to this image, or to this algorithm.
We may have image generation around a prefix.
Let a model work on one domain/flow, with a good performance on its domain,
and a free bit margin.
If there comes a domain where it does not work correctly, with very bad performance in recognition (e.g. a new language), launch a new recognition search, with a prefix bit "new".
To go further, we should analyse what we obtain, in order to re-integrate the prefix:
the difference between evolution and revolution.

The stream comes under recognition. If it is not recognized: alarm, or betterment (value index), or jump to the next recognized message. If it is recognized, a message is sent to the addressee, and an action corresponding to what is looked for is triggered.

11.7. Consequences of digital relativity

Hence follow some consequences.
- The limiting principle may be taken as a germ of transcendence: beyond the reachable, God. Or at least the poet, the ineffable. The ontological proof... a file so large that it would exist by metaphysical necessity. DU itself...

1. DU cannot be totally unified. A unique machine (Mumford) is technically impossible. That is reassuring, but within limits: see Google! This question is dealt with in Lyotard. Sloterdijk elaborates on it in his series "Spheres". The "myth of the machine" (Mumford), unified, is impossible in principle.

2. That applies also to time. There can be no total saga, no global history. That goes along with post-modernism. This question is largely dealt with in Lyotard; of course not in a computed form, but, with Gödel, Thom, etc., it says a lot. Nevertheless, I would like to reach a computation (something like a curvature radius in the Hyperworld).

3. There are also rules on minima, on minimal loops with self-reference, the distance of an S to itself...

4. Distance from neighbours is part of safety. Hence also the justification of the "membrane" in living beings and elaborate artefacts. To be at ease, any being must have some free space around it. Duprat's law (system/population alternation). Protection against elementary errors. In time also, any being, to be at ease, must have enough room around it.

5. Contradictions that appear may cause stress.

6. Undetermined spaces appear, which are to be filled. That may be a curse as well as an opportunity. At the limit, an interstitial region for the Spirit, Eccles-like, but otherwise.

That offers a base for liberty and creativity: there is an indifference margin, which everybody must use to get the best out of his own L, locally optimizing. See Sloterdijk's "Ecumes" (Foams). Then comes the time of conflict: then we have to build separations, in some way (physical wall, netiquette...).

Gratuitous act (Sartre). Freedom. Unselfishness.

12. L

This is also a Pierre Berger concept, elaborated in 1978, evoked in an article in 1979, and first published in "L'informatique libère l'humain".

L is of material importance, since it gives a teleonomic autonomy to any digital being, but evidently in proportion to its level of being. An individual bit has no other autonomy than to stay as it is, or to beat its clock rhythm. A living being or a developed robot has intentions and sub-goals, as far as its knowledge systems and behaviours allow and structure them in levels.

Everything gravitates around autonomy, for code as well as for economics. Autonomy is gained through externalization: control of the environment, being given (legal) assurances. But any external extension is also a source of dependency.

Autonomy increases:
- efficient: motor, automatic feeding, batteries; sensors to detect and adjust the paper feed, etc.;
- material: lighter processes, dematerialization, but also freedom from a definite place;
- formal: from form to program, genericity, and digitization (Gutenberg);
- final: robot intentions.

12.1. A formula for liberty!

To express the autonomy of a being, as well as its dynamic substance, state function and dynamic variety, we propose the formula:

L = Σt pt.Ht, with Ht = - Σi pi.log2(pi)

where pt is the probability of existence at instant t, and pi the probability of state i among the n possible states at time t. Then L is the sum, or the integral, of the mathematical expectation of neguentropy.

As we shall see, liberty proper comes from the combination of a formula and its non-total computability.

12.2. First comments and explanations

We have chosen L for "liberty".

Indeed, the first part of the formula (which we shall refer to as P) translates the first basis of any freedom: to be, to exist. The material freedom, so to speak.

The second member (which we shall name H) gives the entropy at each instant, and the absence of constraints in choosing one or another state. To be free is to have the choice. And the wider and less constrained it is, the greater the liberty.

This formula expresses the permanent conflict opposing subsistence to free choice.

This formula also expresses the freedom of a being with respect to the other beings, and to GOS: P measures the fact that it is here to stay, for some time. H expresses its unpredictability.

This formula has good sides:
- it relates to the concepts of variety, entropy and energy, and hence to information theory and to formal automata;
- H evokes at least a similitude with Heisenberg's uncertainty;
- P may be taken as an actualization rate, if related to finance and more generally to economic calculus.

It makes a kind of synthesis of information, energy and money.
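A minimal Python transcription of the formula (the numbers are a toy illustration, not a real being):

    from math import log2

    def entropy(state_probs):
        return -sum(p * log2(p) for p in state_probs if p > 0)

    def L(existence_probs, state_distributions):
        # sum over instants of p_t times the entropy of the state distribution
        return sum(pt * entropy(dist)
                   for pt, dist in zip(existence_probs, state_distributions))

    # a being fairly sure to survive three instants, with four equiprobable
    # states at each instant (H = 2 bits): L = 2*(1.0 + 0.9 + 0.8) = 5.4
    print(L([1.0, 0.9, 0.8], [[0.25] * 4] * 3))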

Causality and freedom

In the deterministic hypothesis, we can find some forms of liberty inside (to be checked). But, in general, we won't like it. Then, we have several options:
1. We assume that we still have subsets, and the problems of randomness and then of liberty will always be, in fine, transmitted to a larger set, which we could call God, if we give this being the character of a transcendent person;
2. We assume that, beyond a given size, and certainly for a possible encompassing set, there is necessarily chance, undecidability, and then that some form of liberty is possible inside. (We are near the DR.) That may play on the bit count of a given being, or on the being count. But here, the number of dimensions in DU is central, as it plays on distances.

12.3. Freedom is progressive

A key point is to assert that liberty is not a binary value. DU is not divided between free and unfree beings. All beings are, up to a point, free and controlled by the others, and always, necessarily, by GOS.

12.3.1. Intuitive examples of application

1. The autonomy of a vehicle measures the duration of operation without refuelling as well as the distances accessible. Both terms are tied by speed. They are confused if speed is constant; they differ if speed is variable or controllable. In this case, there is a possibility of arbitration between a long but rather slow travel and a shorter but speedier one.

The paying load may also be considered as a form of variety, with arbitrations between load and range.

2. A control loop preserves the existence of a system and avoids its destruction. It tends to keep it back to a middle point, from which variations are possible both ways, whereas dissymmetry appears when one is farther from it. Then this regulation maximizes liberty, in some manner. That is present in living beings as well as artefacts, and is the core of cybernetics.

3. Classical games like chess and checkers tend to eliminate the adversary. But they can also be seen as the conquest of an unbarred use of the whole terrain. And intermediary steps stress the importance of controlling strategic points in order to move freely.

4. The delimitation of a proprietary space by a system is a compromise : the space is limited, but within the limits, freedom is total. And peace with the neighbours favours security.

5. The various elements of a corporation's balance sheet may be considered as liberty resources as well as survival ones. Income maximization, when it is not distributed to shareholders, increases the corporation's autonomy (cash flow, investment capacity).

A way to elaborate is to deal separately with H or P, and even there to limit oneself to some part (short or long term for P, reduction of constraints or variety augmentation for H). Other ways:
- an organic structuration of L: degrees of liberty, memory capacities, implicit parts of the hardware (to be defined);
- a functional structuration, more interesting at the end of the day, with banalization of the underlying hardware, but also a hierarchization of functions (primitives, macros, sub-programs...).

12.5. Graphic representation of L

The value of one parameter at a given instant can be represented as a binary string of appropriate length (logarithm to base 2 of the number of states), padded if necessary.

Note: in fact, at a given moment, the states are fixed (though possibly unknown to an external observer). What is more interesting is the possible states at the next instant...

Note: it seems pertinent, for an intuitive appreciation of L's meaning, to order the bits from left to right according to their variability. We can then use Ruyer's concept of "surface without borders", since any measure can be taken as a string of indefinite length, like a number with the comma at the centre. Only some digits near the comma are meaningful.

This is rather evident for the right-end bits. Written in the customary manner, the extension to the right relates to more precision in the measure. The more precise it is, the more fluctuating, hence meaningless. For instance, the length of a metal rod, at a standard temperature, is practically constant up to the hundredth of a millimetre, but beyond that changes with any warming or cooling. For the majority of everyday rectangular beings, for example a table top, length and width cannot be precisely defined under the millimetre, since they are not machined with a better precision.

At the left, beyond the meaningful digits, a number aligns implicit zeroes. But, for a number of measures, that is a convention. For example, if we give a position in space-time, we can use few digits because we refer to a well-chosen origin: sea level, Notre-Dame de Paris for road distances, Jesus' supposed birth for years...

To complete this view, we can present the successive states of the system in the F function as a series of parallel lines that can be placed, for intuitive understanding, in a kind of perspective.

(What about a negative L...suffering ? )

12.7. Notes on H

For simple devices, H is practically their number of bits (or log2 of the number of their different states), weighed by an evaluation of the equiprobability of the states.

That is the neguentropy, or energy, at each instant :
- first, intuitive notions : widen the space (materially), the range ;
- better use of space, with recognition of new differences, analysis and description, thinner resolution of the look and of the reflexion, avoiding noise (and dust...) ;
- number of states / equiprobability ; hence the H paradox : increase the differences... or become indifferent ? Hence the need of a concept like "meaningful difference".
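
Shannon's entropy is the standard way of making this weighting by equiprobability precise ; a minimal sketch in Python :

import math

def H(probabilities):
    # Shannon entropy in bits ; maximal when the states are equiprobable,
    # where it equals log2 of the number of states.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(H([0.25] * 4))             # 2.0 bits : four equiprobable states
print(H([0.7, 0.1, 0.1, 0.1]))   # about 1.36 bits : same four states, less H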

The field so drawn is suggestive enough to call for more structuration. Anyway, we have seen that any long string needs structures. We can separate :
- fields with little variation, stable, constituting the being's skeleton ;
- fields with large blank spaces, or liquid, where liberty can deploy itself, leaning against the stable fields.

In the large fields, narrower ones may be defined.
L will grow mostly with field independence/orthogonality : money was first barter in kind, then metallic, then paper, then purely electronic.

Too large fields are inefficient.

In some way, the centre of the line locates a good potential difference.
Safety demands large numbers. Reliability is a low failure probability, which refers (at least potentially) to a large number of tries. What the insurance companies do among the insured people, or genetics with spermatozoids, is not without relation to a large number of meaningful figures on my bank account.

These reserves are material, both for P and H. In other words, if S disposes of an indefinite width in bits, it must find a sensible distribution between wide and narrow fields. The creation of wide fields is a guarantee of survival, but is paid for by a lack of meaning (meaningful differences).

At each instant (cycle ?), it is by a deployment of the strong structures of its free fields that S will adapt itself to the external world and bloom. As the strong structure compressed in the seed, crushed by the demanding pressure of the humus, gives way to the blooming of a rich plant and its golden spike, in the broad and wide Beauce fields.

But this splendour has a cost : blooming exhausts the free spaces, and rich structures wither. But not without having borne the germ for a brighter future.

Note here that rarity is not necessarily a positive value. Beings need both rare and common matter, water as well as diamonds. And a balance between them. The improbable is not always to be wished for.

The being aims to regulate the right part in order to ensure the persistence of the left part. From this standpoint, the centre will move, since the variability at left will decrease, expressing an elevation in L.

H is also an index of unpredictability for the outside of S. The IEO formula may be taken as an external model, and L globally as a kind of resistance to the environment. H expresses the size of E+I (+O), the place physically reserved by S, while P expresses the time expectancy during which S will keep it for itself.

Somehow, the maximum of separations is nearest to the big bang, when plasma organizes itself uniformly in elementary particles. The whole game of grouping which follows appears as a reduction of the number of "bits", letting atoms drop into heavier and heavier constructions, less and less free. Anyway, from a strict physical standpoint, History seems to go slower. After the frantic activity in the first fractions of a second just after the bang, billions of years see galaxies indefinitely unrolling in a universe larger and larger, but also emptier and emptier.

In fact, the true divisions, those which matter, are intentional. S will progressively emerge while generating life. The atomic cut is only the reverse side of the truly important one, which looks for itself through DNA as soon as Earth is born (and perhaps even before), which makes itself more perceptible when vegetal and animal life come to activity, and which is really born when mankind begins to mark up his space and to cut silicon zones. And it blossoms with Greek science, gets uselessly stuck in religions, and finds itself again today through the Internet. And it will some day go beyond Mankind itself.

Any entity enhances its autonomy, its structural richness as well as its lifespan, in two ways :
- by taking it out of other entities, which are captured, absorbed, merged, assimilated, or simply destroyed to save material space;
- by cooperation, win-win, building for that more and more sophisticated relations, at the base with a trimming of the entity-to-entity distance.

That happens as well in mineral life (inter-atom distances in molecules), vegetal and animal life (population density in a biotope), psychological life (animal security distance, human distance between families), or in the properly digital realm with its specific distances, especially in digital relativist models.

Indeed, to live its independence, any entity demands, around its borders, a sufficient amount of unstructured space, so that environmental moves do not perturb it too much. That has to be compounded with the price to pay for the synchronization organs in autonomous units.

One difficult issue remains to be settled: compounding the L of different entities when they cooperate. Some ideas have been given for instance by Condorcet (when he shows that insurance is profitable both to the insured and to the insurer) or by Titli (distributed control theory).

12.8. Notes on P

P is the life expectancy of the being. Energy consumption may be considered as a shortening of P, for instance.

Our beings cannot be immortal, or at least cannot have an infinite number of cycles. If they did, their L would be infinite, or their H would rapidly vanish.

The formula depends strongly on the time slicing. Can S change the rate of its clock, for instance ? Here we should study conversions between cycle length and variability. Probably, when the clock rate increases (supposing that we have a reference to a deeper one...), variability will decrease, or become meaningless after some limit. That may be compared with the right part of the H line.

Any computation of P implies hypotheses :
- on the nature of perturbations brought by the environment ;
- on the P of S's parts.

According to the results, strategies will be widely different. A closed system, strongly protected, does not need to invest in defense. It can concentrate on H to reduce its constraints (for humans : democracy, games, learning). But it cannot increase its vital space (a notion to be defined...). Example : gas in a closed room. Pushed to the limit : death.

Nevertheless, even closed, the system may have quasi-indefinite aptitudes to develop its information system, and tends to increase H by reinforcing the meaning of small differences (sophistication) or by opening itself to rather useless processes (discussions about the sex of angels...).

On the other hand, if H changes little, and S cannot change it, it will concentrate on its P horizon.

12.9. H-P arbitration and compositing

S has to arbitrate between H and P. More freedom today reduces the life expectancy, but also the freedom expectancy for tomorrow. However, there are also positive feedbacks. Examples :
- buy P against H : insurance policy ;
- buy H against P : games of chance ;
- identity/safety dialectics (Cyrano de Bergerac) ;
- association and cooperation with other systems.

The maxima in L are probably far from extreme choices in favour of H or P.

Another paradox : S cannot choose to die, not even by suicide; that is a major limit on its H. Could we find a computation showing when suicide finally augments the L of S ? With, for instance, a strongly widened H during the period between now and the planned suicide/death time ? Can we build suicidal beings ?

For biological-type beings, P and H are framed by the development stages, from conception to old age through education and maturity.

12.12. Computation of L

As a simple model, L could be H compounded with the MTBF (mean time between failures) expressed in number of cycles.
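
A naive reading of this, as a sketch ; the multiplicative compounding and the constant per-cycle failure probability are our assumptions, since the text leaves "compounded" open :

import math

def L_estimate(n_states, failure_prob_per_cycle):
    # H as log2 of the state count ; MTBF as 1/p cycles for a constant
    # per-cycle failure probability ; "compounding" taken here as a product.
    h = math.log2(n_states)
    mtbf = 1.0 / failure_prob_per_cycle
    return h * mtbf

print(L_estimate(1024, 0.001))   # 10 bits of H over 1000 expected cycles : 10000.0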

- utility of reproduction
- garbage collector to save internal space
- energy resources, space saturation
- unpredictability vs. randomness
- the maturity / double growth

L and ops

Ops are not related to P (not directly).
P as well as ops depends on the clock cycle (if P is assessed with another clock than the clock of S itself).
H depends on a sort of efficiency in E = f(I,E), with or without E. Something like : how many bits S actually impacts during one cycle (and how many bits it takes into account). As far as S may be modelled as a computer system, my reflexions on ops apply... but they are insufficient to express the efficiency of the f function. For an external observer, ops could be rather objectively assessed, even physically with gates.

(Figure : gap18.jpg)

12.13. The L gap

L computation demands a sufficiently precise model of liberty spaces and a definition of the probabilities. This modelling will always be a reducing slicing or factorization. That does not prevent belief in transcendency, which is precisely what escapes the slicing.

It also ensures the independence of a being controlling its self-representation : if it chooses specific description conventions (epistemology), it can define its L as it wants.

Perhaps all comes from the difference between bit and e. The bit is not the informational optimum. We have several gaps : a strong one (log2 as related to e), a weak one (the billions of neurons).

Meaning, somehow, is borne by drift, sin, error. Notably the recognition that one is not the other, one is not God. Meaning appears at the convergence between gap and miracle.
- The gap makes apparent the teleonomic, the aim ; bad spoils the being, but also gives the meaning of "restore its integrity, completeness". Or still, near to an L approach, Hegel says : "The determined has, as such, no other essence than this absolute restlessness of not being what it is" (in Logics and Metaphysics).
- Miracle calls back the causal. God intervenes in the world.
The typical convergence is Salvation : God saves the sinner, and so gives a meaning to his life : the maximization of L, i.e. the ability to say Yes. But, as God is not currently available directly, one says yes, and renounces everything in order to say yes, to whatever represents him : Law, Gospel, Church, Sect, Leitmotiv, Führer.

A key question : when do miracle and sin meet ? How to model that ?
In elementary systems, the gap comes from reliability, evaluated as non-correspondence to the demands of a user who has specified the functions. The solution also is external : repair.
With auto-adaptative systems, S may use the gap to mend its actions, to reach its aims.
When S grows, both the gap measure and the reduction means are internalised.
Meaning emerges when there is a self-representation, or a "conscious" representation of the gap as well as of the solutions.

But the true disorder is the case when the gap perception is painful, even when S is not responsible and cannot do anything to correct it.
Teleology appears as a gap between objective and actual. A split between the bits and reality.
A gap on survival capacity. Between L and the neguentropy in number of bits, for example. But survival depends not only on internal laws, but also on the environment.

Gap : the symmetrical miracle of sin. On one side, between model and reality, a probability, an uncertainty principle, which little by little pushes me out. The small gap, the venial sin. On the opposite side, the very improbable, the mortal sin.

Kinds of gaps :
- Coding and counting gaps, inherent to code as such, since the number of cases is rarely a power of 2 (see the sketch after this list) ;
- Excess of resources compared to the needs ; notably, of computing power with respect to the inputs ;
- Un-computability of L (and uncertainty in general ?) ;
- Time gaps.
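
For the first kind, the coding gap can be computed directly ; a sketch, where the gap is the fraction of a bit wasted because whole bits must be allocated :

import math

def coding_gap(n_cases):
    needed = math.ceil(math.log2(n_cases))   # whole bits we must allocate
    carried = math.log2(n_cases)             # information actually carried
    return needed - carried                  # wasted fraction of a bit

for n in (2, 5, 10, 26):
    print(n, round(coding_gap(n), 3))   # 0.0, 0.678, 0.678, 0.3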

Good gaps :
The 7th day, God took rest.

Bad gaps :
Lack, needs, hunger and thirst.
Death, life duration.
Unshared love.
Error, ugliness.
Gap and noise.

Gap structure : monodimensional gaps, gap spaces.
Form factors and interstices in multidimensional spaces.

Gap metrics : bits, matter, form factors, compacity (defrag). If the gap becomes very large compared to the beings, the concept ceases to be adequate.

To create gaps :
- have better multipliers inside (genericity, orthogonality)
- accept less rich sampling (in time and resolution)
- adjust formats, adjust tasks to internal architecture
- use indirect addressing

Gap uses :
- leisure and pleasure
- investment (balancing long and short term in L)
- safety/reliability
- redundancy

Something is missing between what is known and what would trigger action (Lyotard has something between constative and performative strings).
Gap is a normal situation.
Sometimes, it is to be wished for : liberty space, affordance. Fundamentally essential to liberty. But a place for Bad. Safransky.
The gap may be greater or smaller.
If no gap, rational choice ?
How to fill it ?
- as if (Koestler)
- faith, religion, rite, authority
- indifference, drugs
- art
- art and technique, machine
- suffering
- tearing, schize
- error, non-adequation
- sublime

- gap as the place where I can model (Prost)
and the moment of fusion, end of the gap
alternation niche/adventure
sphere/cube, in large dimensions

13. The good

Principle : good for a system S is whatever contributes positively to its L. That is self-referential.

Is there a "goodness" atom, as there is for truth ? "This bit should/must have this value". As with truth, that implies that there is something inside S which assesses the value :
- either an a priori list of what the bits should be ;
- or a system of rules.

But here, the values cannot be directly obtained from an access to the world.

Value of raw matter, of form.

13.2. Self-integration

The central effort of S is to keep itself as a whole, as long and as high as possible. Incorporating in that control one's properties and (for humans, in another way) one's fellow humans. Recapitulate everything "in Christ".

Death absolutely prevents a total integration. Then this task needs some optimism, knowing realistically that I shall never integrate everything, be it only myself.

I must help the other humans (and things, and machines) to ascend in this integration. Groups, and mankind as a whole, cannot hold unless enough individuals aim at that integration. And not by passing into the vacuum of transcendental meditation, but by staying in the discursive or designing finite mode.

Rational ethics, with sin punished in another world, is finally a poor doctrine, insufficient to prevent evil, yet an obstacle to positive action. A strong ethics implies that one assigns targets to oneself, and that one cannot be one's own aim. But one must tend to place the aim as high and as far in the future as possible.

Custodi me a dispersione mea.
Self-construction (Kaufmann).

Balances

13.5. Types of ethical values

Ambiguity, ambivalence of the tool: "And they shall beat their swords into ploughshares, and their spears into pruning hooks" Isaiah, II 4

The watches will cease at the top of our towers ;
Iron, better employed, will till the earth ;
And the people, who tremble at the frights of war,
Unless it be for dancing, will hear drums no more.
Malherbe, Stances à Henri le grand.
Kranzberg : Technology is neither good, nor bad, nor neutral.

13.6. Bad

Easy solution : bad is everything that draws away from the optimum. But suffering ? (See emotion). Everything that prevents S from operating. System excess also : S taking itself for God, which transforms positive conflict into World War.
Bad as a consequence of DR... Safransky.

13.6.1. Fault tolerance

Thesis : a fault-tolerant machine is less predictable than another one. More exactly, it is more predictable in general, but highly unpredictable in some rare cases. (A similar remark applies to automation meant to simplify the HMI.)

It is somehow a problem inverse to Champéry's. In some cases, this might be proven. Hamming auto-correcting codes : the trouble comes when there is both failure of the machine and failure of the correction mechanisms. Ex. : the alarm light which lights up wrongly, or stays dark when it should light up.
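
A sketch of the point with a Hamming(7,4) code : a single flipped bit is repaired, but two flipped bits are silently "repaired" into a wrong codeword, the rare and highly unpredictable case.

def encode(d1, d2, d3, d4):
    # Parity bits at positions 1, 2, 4 (classical Hamming(7,4) layout).
    return [d1 ^ d2 ^ d4, d1 ^ d3 ^ d4, d1, d2 ^ d3 ^ d4, d2, d3, d4]

def correct(c):
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3  # points at the (assumed single) error
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

word = encode(1, 0, 1, 1)
one_err = list(word); one_err[2] ^= 1
two_err = list(word); two_err[2] ^= 1; two_err[5] ^= 1
print(correct(one_err) == word)   # True : the machine repairs itself
print(correct(two_err) == word)   # False : a silent, wrong "repair"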

As beings progress, so does their pathology. A simple automaton works or not. A sophisticated one may have unbalances, unsatisfied agendas, aborted processes. Diagnosis and cures are always partial.

13.6.2. Violence

Violence is an extreme form of relation between constructs. Or, more exactly, it is the natural way of competition between simpler constructs, where it is not perceived as bad. Rocks and waters, bits and bytes destroy each other quite naturally. Violence proper emerges only between constructs sufficiently sophisticated to perceive it as such. As a necessity for conquerors, as a pleasure for sado-masochists, as a way to perfection for ascetics.

We can draw a scheme of these processes.
(Figure : violence.jpg)

13.6.3. Death

How does L "behave" when death comes nearer ? If S cannot choose its death, its time or kind, then its liberty (H) is more and more limited as death comes close. Unless suicide, all the more for a young person, when the global stakes are at their highest ? Perhaps it would be possible to prove that the possibility of suicide is a major component of our L. Can we talk of a negative L for suffering ?

Relation of death with noise ?

More : from which level may I speak of ethics for a being ?
When it has responsibility regarding other beings. Then it must not sacrifice them unduly.
If these beings have a "value"...

13.7. Measurement of “goodness”

The more universal a law, the more it applies concretely to a large number of beings and situations. (Extension/comprehension ?)

At the limit, Kant's categorical imperative.
But "truth on this side of the Pyrenees, error beyond" (Pascal) is considered a weakness, which may be solved with a general principle completed with adaptations.

Optimum. Every time we get (or aim at) an optimum, that leads to a form of determinism, unless there is an indifferentiation zone.

The dose makes the poison (Paracelsus).

13.10. The right distance

To send a message from an emitter to a receiver implies crossing the space which parts them. This distance must be evaluated along different dimensions :
- temporal distance, since in most cases real time is not practical or possible ;
- logical distance, with diverse recording formats, codes, protocols, redundancies ;
- linguistic distance ;
- environmental distances, even social and political differences, for humans ;
- motivation distances, with knowledge consonances and dissonances (we hear only what we wish to hear).

Thesis : there is a "best distance" between beings for an optimal communication. Fusion is illusion.
- clarity and beauty in the scholastic sense
- orthoptic distance in classical perspective
- boxing distance
- ocular accommodation
(Here may be raised the telepathy issue, somehow a distance inferior to the sum of the radii.)

It must be done first at a low level (physical distance, energy, radio frequency tuning), then go to the highest levels (spirit).

(Figure : INFPA6.GIF)

Auto-focus. We start from a notion of sharpness, of coincidence, from the inside of the representation (but, with border effects, we could have the distance).
Good cropping (framing) : automatic setting of a zoom.
Distance/plurality.

At the start, fix problems, eliminate conflicts and threats ; then go to the deepest resonance.

From an L standpoint, if A and B are beings, each with its own L, there is an optimal distance which maximizes La + Lb. Each being is for the other one a variety provider, but also a reducer. A priori, it is good for B to see A... but perhaps A masks more interesting or important beings. It may be a threat.
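
A toy numerical sketch of this thesis, with all functions and numbers purely hypothetical : the neighbour's variety and its threat both fade with distance, the threat faster, so the sum La + Lb peaks at an intermediate distance.

import math

def l_gain(d, variety=4.0, threat=6.0, reach=1.0):
    # Contribution of a neighbour at distance d to a being's L.
    return variety * math.exp(-d / (2 * reach)) - threat * math.exp(-d / reach)

# Two symmetrical beings : maximize La + Lb = 2 * l_gain(d).
best_d = max(range(1, 100), key=lambda d: 2 * l_gain(d / 10)) / 10
print(best_d)   # about 2.2 : neither fusion (d = 0) nor isolation (d large)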

The message must be more interesting than the others (relative to the receiver's aims at that time). And it must not be a threat (negative L).

Love. Promotion/communion will. Beware of the "little death". Communion is a kind of destruction.

How does distance influence the L of the receiver ?
Longer distance : reduces variety, augments the death probability, reduces the capture risk.

Two factors are a consequence of L : I see, I am determined by ordering and welcome :
- too well ordered : I feel a stranger, rejected ;
- too disordered : disease, fear of dirtiness, no free place to move.

Notion of useful load vs. carrier. Case of viruses. Kinds of membranes.

Not only distance counts, but also the angle, the appropriate sensor giving the useful information. In fact, we need two distances :
- one, undifferentiated, translating the abstraction level and the reciprocal determination ;
- one transmitting the "conscious" distance, differentiated, measurable explicitly in bits.

13.11. Varia

Limits to any rational ethics due to DR.

Good : social integration of individual/global L.
Two maturities meeting, convergence
Standards and biodiversity

14. True

There is that

14.1. Bases

Truth is here a particular case of Good. Its basic definition, transposed from the classical "adaequatio intellectus et rei", is the correspondence between a representation and the object represented.

Truth impacts L when contradictions induce doubt, or threats on P. But, broadly, virtual beings contribute strongly to L, even if they do not correspond to a "reality". Besides, a quantitative evaluation of truth may be pertinent. From this point, the "consistency" of truth is a major contribution to the efficiency inside S as well as to its relations with other beings.

How far does truth contribute to L ?

Indeed, as long as no contradiction appears, the problem of truth does not arise. Beings, images, texts, other beings... all are a priori taken for their contribution to P or H. Value could be something like the Google ranking and the contribution to the present operations of S.

Contradiction, lack of truth, appears when S meets two contradictory assertions, of the type "there is that". These assertions may come
- from inputs (possibly after a reading order on S output),
- from internal "data", analyzed,
- from logical operations using internal and external data.

If the external world is supposed to be reliable, and the address is an absolute one, then a reading at the address may clear up the doubt. In a lot of cases, the address is given indirectly or abstractly, or the external being is not accessible, and the doubt will remain. S may deal with it either by destroying both contradictory assertions (with loss of knowledge), or by assigning probabilities to the alternatives.

At first sight, doubt augments the unpredictability, hence H and hence L. But it also puts a doubt, the risk of contradiction, on any construct using this assertion, reducing its interest.

Then the doubt, or loss of information, may impact P negatively, if the assertions relate more or less to favourable or threatening beings. It is dangerous also if it bears upon critical parts of S itself, for instance its own position relative to the other beings. Or some parts of it infected by a virus.

In other cases, the conformity with the external world does not matter : S may develop full universes on fictive addresses. That is virtual reality, with its pros and cons.

14.2. A "truth" metrics

With some caution, we can also draw a sketch for a sort of "truth metrics".

Basically, we have a truth atom : "this bit has this value". It may be valued as one bit also. But that does not go much farther, except perhaps in logic, where truth stays binary and passes from assertion to assertion through the graces of logic. Better, let us outline some facts around it :
- "this" is in fact an address ; if its value were of only one bit, the system would have at most a two-position address system ; then, normally, even a "bit of truth" will actually take at least log2(N) bits for the address, if N is the size of the memory (here taken as a synonym of the "world" ; we could add something like sensor addresses, to have a more encompassing notion) ;
- "this value" will generally be larger than a Boolean or a unique binary digit ; it will more generally be a variable, or even an array, a text assertion, an image, a (possibly complex) being ;
- correlatively, the address will point not to a bit, but to a byte, a page, or a larger "record".

- truth is interesting only if it is "falsifiable", i.e. testable as true or false. This implies that we have a truth-saying process, which has access to that assertion/being/address in the "real", or "objective", world, and can apply to it a truth criterion ; "adaequatio intellectus et rei", said the scholastic philosophers ;

- here we can distinguish several cases :
. the inquiring system has in memory a "concrete" description of "the world", where the value of "this bit" is stored and may be compared to the described world ; this description is supposed to be the result of a previous inquiry ;
. the inquiring system has a set of rules giving an "abstract" description of the world, out of which the value of "this bit" may be computed and compared ; the rules result also from a previous inquiry, but completed by some kind of generalization, theorization, etc. ;
. a combination of the two kinds of descriptions.

- in elaborate cases, addresses themselves are complex, indirect, depend on a sophisticated model...

- as soon as representations become large enough, the truth test will cease to be Boolean ; the representation/model will be more or less true ; for instance, a numerical value may be approximate to the nth digit ; a model will be valid in a limited domain ; a portrait will be "rather true to life"... (sometimes, the truth test itself will be approximate, if not inexistent).

Hence we can talk about error, and about error rating/measuring. We can use, for instance, the digital distances (Hamming distance).
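
For instance, as a sketch :

def hamming_distance(a, b):
    # Number of differing bits between two equal-length strings :
    # a crude rating of the error separating representation and "reality".
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("1011101", "1001001"))   # 2 : two bits of error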

Is truth also an aspect of L ?

14.3. Consistency

This metric leads us to a major advantage of truth : it is always the same, whereas errors are possibly multiple. If S may rely on the truth of some assertion, it may safely use it over and over in any context. Still more useful may be the possibility of sharing this truth with other beings, thereby affording the possibility of communicating efficiently.

14.4. Limits

Digital relativity implies that, beyond some limits, there is no possible truth test.

There is that. It is interesting to elaborate on "there" (the address), because it is the "geographical" structure of DU.
Absolute addresses exist in the physical world. Plus relativity problems.

Auto-truth. Coherence. Then global plausibility. And the possibility to navigate inside it. At the limit, Anselm's proof of God.

14.5. Varia

Real as friction.
Internal, theoretical development vs. verification through O/I (the inverted Skinner box...).
Transpose the Kantian a priori/a posteriori, and the Vienna positivism.

Now, the "société du spectacle" (Debord). Constructivism. The important thing is no longer to discover new unknown territories, but to build new territories. There is here some contradiction : how can an artefact be unknown, even to its creators ? That I discovered first at the beginning of powerful information systems, in the 1970's : special devices, the monitors, had to be built in order to understand how the computers spent their time !

To be tried: a Kantian analysis, a priori and a posteriori in our DU model. And foundational aspects (Bailly-Longo)

PART 3. DIGITAL HEART AND ART

15. Generative being

Once these principles are exposed, DU seems rigid. We shall now give it more substance, and show that this beautiful crystalline frame is both full of holes and overgrown, for a lot of reasons. It may seem sad, regrettable... DU is dirty and stressed ! But at the same time it is the price of life, the cure to totalitarianism. Here, the effects of DR will take all their force.

The general being/automaton scheme IEO is developed in three "phases" : assimilation, core genetics and expression. The term "generative beings" is inspired by Lioret's works.

15.1. Basics

The automaton IEO model, which we use largely in this work, can be developed in order to offer a richer presentation of the processes in DU. Biological concepts then come nearer, with of course some very important differences. Inputs and outputs take place in a flow, or multiple flows, of digital processes, with in principle no physical adherence (but for DR, of course), so that one could say "flow of thoughts" (Bryson/Conan), if that were not too anthropomorphic, and anyway inexact (even for human beings, thought takes place only at the superior levels of operations).

We shall consider that the basic process is, schematically, made of three phases : assimilation, core genetics and expression. The process is nearly sequential and symmetrical when it is simple. For example, in an acoustic filter, a message router or a text translation by a human being, input beings are first unpacked of their physical conditioning and transport formatting, dealt with according to the function of the process, then conditioned again and sent. Assimilation and expression may be completely disconnected in time but also in structure. For instance, the recording of a musical group will operate in a series of sessions involving a few artists and editing specialists, and result in a master record, which may even have no physical existence outside a computer. Then will begin the distribution processes, with a unique original record conditioned graphically and physically to reach and seduce a public as wide as possible, in some cases under several different forms. In the lifecycle of a mammal, and of course of a human being, learning processes take years or decades, and the output along the professional and social life will conform to very different patterns.

The three phases use "languages", in the rather general sense of structures affording assimilation and expression, and concentrating in the core what we shall call the "genetic" processing, in a way rather different from what is generally understood by genetic programming or bioinformatics. These processes, but for particular cases, generate gaps and losses.

15.2. Assimilation

The global being and the sequence of its bits may be considered as two extremes, or we could say projections on coordinate axes. The analysis consists in dividing the inputs as far as necessary for the current process.

(Note that, if the process is related to the physical world, a first phase of "analysis" is the sensing and A/D conversion. If the inputs come through a network, the first phase is to "unpack" the message).

A very simple case, for instance, is the sorting or selecting of beings according to a parameter explicitly written in a known place : files according to their name, extension, last modification date, etc. A less simple case is the same operation according to non-explicit features of the beings ; then the feature has to be recognized. It may be easy (select all the low-key images) or very difficult, if not impossible, for machines and possibly also for humans (select the pictures representing a given person).
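
The easy case is a few lines of code ; a sketch, where directory and extension are arbitrary :

import os

def select_by_extension(directory, extension):
    # Select beings by a parameter explicitly written in a known place
    # (the file name), then sort them by another one (modification date).
    names = [n for n in os.listdir(directory) if n.endswith(extension)]
    return sorted(names, key=lambda n: os.path.getmtime(os.path.join(directory, n)))

print(select_by_extension(".", ".jpg"))   # the .jpg files, oldest first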

A first level of analysis is implied by the type of the being, which generally will demand some pre-processing before operating proper : for instance a .jpg image, which will be analyzed as a matrix of pixels for ulterior processing. The analysis process at large, starting from a file in memory or a stream on a line, will begin with de-streaming and the preparation of an appropriate copy in central memory.

Data files are generally organized in records. The basic analysis process consists in providing the records, one by one, to the application program, which in turn will use its data structure to apply its routines.

In text, or word, processing, starting from a document printed on paper, the basic digitization is a scanned image, a "bitmap" file. This will be enough to display it on a screen or to print it.

If the document is actually textual, the digitization goes further if we replace the bitmap image with a series of digital codes (ASCII, Unicode). Not only is that a major form of data compression, but it also affords a lot of new processes, in the first rank the character-string kind of retrieval which any Internet user is familiar with. It also opens the way to syntactic and semantic searches.

The basic code may be completed with codes describing the typographical properties (font, size, colour) and possibly the layout.

Text analysis may stay at the surface, if the operations are limited to storage, display or transmission. With modern text processing, however, that may be not so naive, with HTML marks for instance. Deeper analysis will tend to "understand" the text, to access its semantics and get its "meaning". This form of analysis can be complete and generally error-free for formal texts, in particular for programs. For more or less "natural" language, the interpretation will not go without guesses and misunderstanding risks, be the processor a machine or a human.

In biological processes, the basic analogue-to-digital conversions are operated at nerve terminals, and the analysis process goes along the nervous system ramifications up to the decision sites, mainly the brain for the most advanced animals.

Is analysis a loss of information ? Yes, in general (if the original being is not saved alongside, of course). And it is legitimate in this phase, since the processor aims mainly to select what is pertinent for it. The contrary would be counterproductive, if it resulted in artefacts giving a false representation of the being. Analysis is mainly a reduction to elements, losing part of the structural richness (or low-level details) of the being or of its image.

When we start from a photograph, or an external document, or data acquired through high-rate sensors, processing them, apparently, can only reduce their richness. Entropy wins. Even when we do a sharpening operation, for instance, we reduce the information quantity given by the original image. In any case, in KC terms, for the system doing the processing. But not necessarily for an "external observer", since we must not forget that the work will be a product of the received data and the whole culture of the artist, and of Roxame as a particular case. In the O = f(I,E) formula, O profits from E. And the multiplication could be creative.

Sometimes the necessity of reducing the size of the information, or of interpreting it with risk in order to conform to the core process and the processor capabilities, may be a hard choice.

Let us note, however, that a description may be longer than the original. A typical case is the Naval Battle game. The ships are disposed on a 10×10 matrix ; then 100 bits, or 13 characters of 8 bits, are sufficient to code all the positions. If the position of each ship is described by its beginning and its end, that makes 4 integers (x,y positions) for each of the six ships, and two only for each of the 4 submarines, hence 32 integers coded on 8 bits : 256 bits. A textual description ("The aircraft carrier is in B1 to B4...") would still be much longer.
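
The arithmetic, as a sketch (fleet composition as in the text) :

import math

cells = 10 * 10                               # the raw bitmap : 100 bits
ships, submarines = 6, 4
integers = ships * 4 + submarines * 2         # begin+end vs. one cell : 32
print(cells)                                  # 100 bits for the whole grid
print(integers * 8)                           # 256 bits on 8-bit integers
print(integers * math.ceil(math.log2(10)))    # even on 4-bit digits : 128 bits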

Assimilation may also proceed through neural networks (Bret).

Vision may be defined as the transformation of an image into a text which describes its content. For instance, if the input image is a geometrical figure, and if we consider that random variations or accidents (aliases) are only noise, the reduction work is finished when S has found the curve equation (type, values of parameters).

A basic idea, induced by the mere word "analysis", is that it reduces a being to simpler ones. As says the second precept in Descartes' Discourse on the Method : "divide each of the difficulties I would examine into as many parts as possible". That is not always possible : in some cases, dividing would cut fundamental loops. That has been widely said in the 1970's, notably in the books of Edgar Morin. Even analysis may be creative, in particular when inputs are natural or rather fuzzy, and analysis is by itself a sort of construction. It may even be dangerous, if it generates deceptive artefacts.

Recognition operations are not properly divisions, unless a rest can be defined. Globally, recognition gives a description : what must I keep to find the original again ?

For the rest, study the example of equivalence relations. There is the quotient set (the classes). What about the second factor ? It depends on the way the set has been defined. If defined by a list, the rest is the list completed with the indication of each element's class. The two definition sets may be globally heavier than the original.
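
A sketch of this factorization by an equivalence relation, the relation being given as a key function :

from collections import defaultdict

def quotient(beings, key):
    # Quotient set (the classes) plus the "rest" : the original list
    # completed with the indication of each element's class.
    classes = defaultdict(list)
    for b in beings:
        classes[key(b)].append(b)
    return dict(classes), [(b, key(b)) for b in beings]

files = ["a.jpg", "b.txt", "c.jpg", "d.txt"]
q, rest = quotient(files, key=lambda name: name.split(".")[-1])
print(q)      # {'jpg': ['a.jpg', 'c.jpg'], 'txt': ['b.txt', 'd.txt']}
print(rest)   # [('a.jpg', 'jpg'), ('b.txt', 'txt'), ...]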

Descartes sets a strong hypothesis when he states that cutting gives simpler issues. (Where to put that ? )

In the general case (in practice, for nearly all beings except those specifically designed), this product will leave at least one of the "factors" so obtained which cannot be reduced beyond some "arbitrary" or "random" sequence of bits, which we could call a "prime". The length of the longest such prime could be called the "minimum divisor" of the being.

Such a prime may have its source in an analogy with another being : for instance, a sample of some sound, or a photographic picture.

If some approximation or error risk is allowed, this factorization can generally go further, be it :
- by the parting of continuous (in bits) parts into several intervals (sampling, grading, quantizing) ;
- by pattern recognition, that is the selection of a given pattern among a finite set of patterns ; a classical mode of selection is the obtaining of a set of measures and qualities of the being to be recognised, and the choice of the nearest pattern in the "feature space" of the pattern set.

For graphics and other beings, a factorization by size and pattern is generally possible, in general with loss (pixelization).

Reduction of sophisticated text presentation to minimal text :
- written text : divide by imposition, page setting, font ; beware, part of the marking may be meaningful ;
- oral text : divide by prosody, time.

In analysis, criteria are put into factors, and the search starts again on the "rest", that is the works selected according to this criterion. For instance, the "colour" factor is used, and then one looks for patterns, or luminance. That would be the "canonical" research. But it is not always possible, and the search on each criterion may be independent. For instance, all the images of armchairs, then style differentiators.

The repetitive structure, or iteration, is major for most processes. Evidently at the bit-processing level, since any digital being is basically a succession of bits. Classical iterative beings are for instance data files, neural streams or living tissues.

It may be useful to consider a being persistent in time as an iterative presence over the cycles. A practical example would be the successive video images of a fixed being. The digital representation of an integer, whatever the base, is the iteration of successive powers of the base, multiplied by the particular value at each power. The iteration loop is by itself a fascinating trait of active digital components, with its potential for infinity. And things become still more exciting... and tricky, when the loop includes a reference of the being to itself (recursion).

The replacement of long iterative beings by a type and a couple of addresses is a powerful analysis tool. In music, for example, a Midi code can be interpreted as pointing to a repetitive structure (the sound form) placed all along between the start and stop addresses in the score. It also goes without saying that the complete digital representation of an integer would be an infinite chain of zeros at the left of the meaningful digits. This is not a purely theoretical view : it sometimes has to be dealt with explicitly in programming or hardware design (padding, zero split).
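
Run-length encoding is the simplest instance of this replacement ; a sketch :

from itertools import groupby

def run_length_encode(bits):
    # Replace each iterative run by a (type, count) couple.
    return [(bit, len(list(run))) for bit, run in groupby(bits)]

print(run_length_encode("0000001111100"))   # [('0', 6), ('1', 5), ('0', 2)]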

It is also one task of sensorial systems in living beings to replace the repetitive stimuli flows by the presentation of stable beings. That function becomes highly complex when the body moves, and the sensors too, the eyes for instance (see the books of Berthoz).

The type-address reduction may be considered as using an equivalence relation. The quotient set of the relation is the set of types, and the set of addresses defines the partition. All this process implies that the agent knows what "is the same". To define that involves features of the agent (what is important for it, what contributes more or less to its L), and also intrinsic features of the beings. Sometimes, intermediate criteria may be used, as in compression/decompression algorithms, which use general kinds of "regularities" to reduce the volume of beings, and let them be rebuilt (with or without loss).

Neural networks offer another kind of analysis, with bright aspects such as adaptability, even to the unpredicted.

Languages, natural or formal, are the widest and most powerful set of structures and components. A language has two basic factors : phrase structures and word definitions (codification, nomenclature, dictionary, the DNA set of bases, of which the "definition" is transmitted by messenger RNA, but in the frame of a very complex epigenetic process). Languages use a basic iterative structure, the phrase (assertion, command, one could also say gene). The complexity lies inside the phrase content, but also in the fact that a phrase may refer to another or even to itself, thus creating possibly complex loops and recursions.

15.3. Generation

The generative phase (genetic, we could say metaphorically) is the creation of a new being by another one, or several ones, that put into this new creation something deep out of themselves. The generative process is as well :
- natural (from simple cell duplication to the full human process : male spermatozoid, female ovule, crossing, embryology, birth, mother and parent care, society integration),
- artificial (from the logic gate to the full process described here) ; but the mere automaton formula O = g(I,E) is already rather expressive.

In a clocked digital being, in some way everything is new at each cycle. The question is how deep this generation starts from. Self-replication, followed or not by crossover, is one extreme case. In the normal case, day-to-day life and machine operations, the new being will be only partly new. With the art being as the other extreme case.

Once the assimilation is done, the being has "all the elements" to make its operations, which can go from a simple logic AND to image compositing, and at the deepest, a complete modification of the elements. But here comes the genetics : not only can the being combine several beings or elements acquired from the input, but it can combine them with its own internal data.

A basic case is the filter applied to a signal, which combines each acquired value with one or several preceding values (filters of more or less "high order").
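
A sketch of such a filter : a two-tap FIR smoothing, with arbitrary coefficients.

def fir_filter(signal, coefficients):
    # Each output combines the current input with the preceding ones,
    # weighted by the coefficients (the filter "order" is their number).
    n = len(coefficients)
    padded = [0.0] * (n - 1) + list(signal)
    return [sum(c * padded[i + n - 1 - k] for k, c in enumerate(coefficients))
            for i in range(len(signal))]

print(fir_filter([1, 2, 3, 4], [0.5, 0.5]))   # [0.5, 1.5, 2.5, 3.5]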

At high levels, or depths, of processing, the being will apply all its force to the genesis of a new being. Roxame, for instance, may apply all its set of algorithms to transform an analyzed input image. Then, as is said of a canvas by ...., "multiply a spot of nature by a temperament".
Or, in an information management system, each client entry can be dealt with using all the synthetic strategic computations done on the preceding day, so that the corporation maximizes its yield (Le triangle stratégique, Tardieu).

Another way is to "add a bit", according to "Every thought is first an impious negation, a dissolving analysis" (Andler 1. p 55). Every art also.

What is "crossed" ?
1. natural : genetic code + mother environment + family (which is also a crossing, and not so easy !), then society at large ;
2. artificial : forms, products and residues.

The genetic process proper is the generation of "products" from the parsed inputs and the internal resources. Product has here a very general meaning, as in mathematics for instance. Two kinds of structures are implied in the process : components and assembly structures (however, the difference is not always sharp, as between programs and data).

Components may be assembled together with a very temporary structure to give new components. For example :
- the logic combination of two Boolean values ;
- the arithmetical operations on numbers ;
- the superposition (subtractive mixing) of two images ;
- the crossover of genes in life reproduction.

Structures may be assembled with a minimum of components :
- the successive application of formal language rules before using any terminal word ;
- a recursion process on an abstract form (?).

Structures and components combine to produce more components, or more complex components.
More components :
- repetitive generation of a being ;
- repetitive emission of a cenesthetic signal ;
- concatenation of strings.
More complex components :
- composition of a text phrase ;
- successive fractal operations on an image ;
- the embryologic process.

We consider as generative mostly the deepest part of these operations. When a "new being" has been generated, a series of operations will be executed, somehow automatically or without "creativity", to give the being its final and deliverable form. That is what we call the expression phase.

Somehow, generation is the way to create energy from nothing, or from mere matter consumption (and less and less of it, Moore's law style). We are here at the heart of the mystery, and of its taboos !

Self reproduction = life

The reproduction of a new being identical to its parent, or combining the traits of a couple of parents, is the basis and (one among others) definition of life. It is more exceptional in artefacts, and negatively viewed, since it is not easily controllable. Hence the only current self-reproduction of artefacts is the virus, a plague. Anyway, a direct function like O = E for an automaton or any mechanical device is problematic. In fact, self-reproduction is never a purely digital process : the non-digital part of the cell is essential. And if natural viruses are nearly pure code, they can't reproduce without a host cell.

The filiation process has some variants :
- generation of the same : cloning ;
- modification of the core code :
. by central code combination (classical genetics) ;
. by "deliberate" modification and engineering.

There can be something like a father/mother structure, when one S brings input to another S', which performs the assimilation and the genetic crossing plus the expression process (a sort of pregnancy...). That may also help to understand the role of "matter" in the generative process (generalizing the scheme of human embryology).

Decision, a generative process

Decision is a kind of generative combination. Basically, it is binary ; it is an alternative in the strict acceptation of the word. You are accepted or not. You go right or left. Sometimes there are more than two ways (but not more than 7 or 8 ; beyond that, the process will be layered, with first the selection of a "short list"...).

"True decisions", so to speak, are made in the core of S. Secondary ones are the matter for subsystems and other parts. A kind of genetics between what remains of the input and the motivations, finalities of S, his view to its L in the particular environment of this decision. In fact, all along the assimilation process, a lot of (more or less implicit) decisions are made. They may have been done in the mere design of the decision process itself. The major ones take place in the core, at the end of the concentrating assimilating process, at the origin of the widening expression. At the deepest of the process.

A radical and somehow minimalist kind of decision would be that the result of the analysis is reduced to one bit, and the genetics is another bit to be multiplied by the product.

At a basic level, one may consider that the simple conditional instruction "IF... THEN" is the decision atom (as is the logic gate in physical matter).

The decision may be quantitative, as the search for an optimum, or follow the "simplex" qualitative/quantitative scheme of operational research. Here, the decision process may appear as automatic : the system chooses the optimal solution. But its part in the process has been in the definition of the computation. The decision can also be qualitative, more deductive, operating on scales and not on quantitative parameters.

For some, decision is a human privilege, and the word should not be used for automated systems. At most, computers may help humans to decide (computer-aided decision), or form systems complementary to the large operational systems of large corporations. We shall not discuss here the problem of the human exception (see the "Mankind" chapter).

We shall use the word here in the case where there is uncertainty. In particular, for a construct, when the comparative effects of the alternatives on its L cannot be completely computed, due to lack of knowledge or, more generally, to digital relativity. Then the choice implies risk... and possibly creativity. This lack of criterion for central choices demands an unpredictable choice, hence something new, at least a new yes or no, a new bit.

It is by their succession, their progressive accumulation, that decisions are the bricks of constructive processes. We can so read the first chapter of the Bible's Genesis. In such a perception of origins, the first decisions have a simplistic look : there is an above and a below of "the waters". Progressively, matter, marked by decisions, is more and more organized. Partitions are more and more fine and delicate. The miracle, so to speak, is that these castles of cards, so improbable, win an incredible stability (see here Chauvet).

The process takes place in a decision space : a set of alternatives. There are choices in the formal definition of its perimeter ; inside, it is restrained by continuous or textual functions, valued by weighting factors.

If we have precisely the right number of criteria, we reach determinism. The being is no longer interesting, since it is known in advance. Nevertheless, it could be an innovation, if this computation of the optimum has not been done before. In some way, it is the arrival at classicism, preceded by enthusiasm and followed by boredom.

To have a real choice, an interesting alternative, we must have too many (Cornelian choice) or too few (uncertainty) criteria. The unpredictable offset from the canon is part of the aesthetical system. The charm of handmade lace is that small defects let the lace-making woman appear behind the work. That may be simulated with random errors in mechanical lace-making.

In many cases, there are too many or too few criteria, and then :
- no solution responds to all criteria, and S must take the least bad, more or less implicitly softening part of the requirements ;
- several solutions, or a whole space of solutions, respond to the criteria, and S must either draw at random or use other criteria (possibly fuzzy, like "feeling", taste, or "nose").

At first look, then, decision seems to reduce uncertainty in order to let action go. But on the contrary, it is creative.
It reduces uncertainty in the see/evaluate/act cycle. But, if the action space has some persistence, then a decision "writes". Hence, there are two spaces :
- the possibility space (affordances), which we reduce in order to decide optimally, a space to be explored, where we trace a road ;
- the space in creation, which we define step after step, which we design for ulterior actions and decisions, for our successors ; persona building, decisions of the youngster, of the adult, of the aged person.

15.4. Expression

The expression (synthesis, amplification) is a process rather symmetrical to assimilation. It consists in taking elements coming from the analysis and from the generative process, and assembling them into new beings : sometimes a simple modification of the inputs, in other cases without direct link to the inputs (at least, to the inputs of this cycle).

It starts from the result of the genetic process, which can then be called an "intention". One can say also that this phase produces the representations, for the outside world, of beings internal to the being.

The expression is the combination of the inner nature of the new being with the nature of the environment.

In some cases, the digitization of numbers may be considered as an external presentation of formal beings. For instance, 1.4142135 may be taken as a representation of sqrt(2).

A key point : the synthesis of a large number of outputs. Connect that to the "aura of the original" issue (Walter Benjamin). Let us revisit the consequences of industrial reproducibility. Aura and original : when matter matters (but that does not prevent making sculptures industrially).

E = A + R :
- if R << A : a non-memory, reactive, oblivious, community system ;
- if A << R : a passéist one.

I must introduce a "generation rate" G, with new information, or prospective information, created by the processing of hazards, A, R.

G is function of : processing capacity, random generation capacity

One may describe G, the Gn processor, as a particular type of data, a program,
assuming that there is a general/neutral processing capacity.
G : general models, prevision, prospective, augmented reality formalism.
Find a general structural model for basic cells (Minsky automata ?).

Augmented reality
- with other images
- with models
- with links.

We place:
- diverse links showing that place (zoom on the most basic parts)
- historical travelling and possibly future projection

Historical trends, laws. The E, A, R rates have counterparts in reserves (determined technology and system analysis).

Computing is an enormous Meccano with many pieces, each one in unlimited supply.

Then, the construction of representations may be described generally as a division (or analysis), which demands an appropriate analyser (sometimes a "parser"). Symmetrically, the generation of a being similar to B will be an expression, using various kinds of generators.

The expression will not conform totally to the intention. The new being is at this stage a sort of embryo. The baby is the message.

Repetition and concatenation

A major generative structure is duplication, i.e. the application of an iterative scheme to a given basic being. Something like convolving a signal with a Dirac comb in time. A long sequence of concatenated identical strings (possibly the same bit) may be reduced to that string times the number of repetitions.

A more general case is the concatenation of words taken in a dictionary, and in general chosen in accordance with a set of syntactic and semantic rules.

Symmetries etc.

Some representations can result from another one through a symmetry or another geometrical operation, or a filter.

A lot of such operations can be listed and, for a given S (Roxame for example), constitute a generation system.

The full meaning of the work comes from a combination of the elementary meanings of the elements (words, elementary forms), of the structures, and of the way they convey the meaning intended by the builder.

(And don’t forget that dictionaries as well as grammars are built with bits).

Multidimensional deployment

De-streaming
Theorem of losses during generation.

Load and distance

"Loading" is part of the expression process. The product is formatted and completed by a wrapping adapted to the way it has to go from the emitter to the receiver(s).

The greater the distance between emitter and receiver, the heavier will be the necessary load (envelope) to pass the message through the distance.

This load consists in multiple complements, the nature of which is as diverse as the kinds of distance (a sketch follows the list) :
- parity bits, check redundancies, synchronization bits ;
- addresses, packet conditioning in networks ;
- physical conditioning (envelopes, postal bags, telecommunication modulation, amplification) ;
- typographical loading (fonts, graphics, colour) ;
- legal conditioning (mandatory mentions on commercial correspondence, on any product) ;
- cosmetic conditioning (politeness, high style, captatio benevolentiae) ;
- voice prosody and gestures for oral and visual communication ;
- brute force (with enemies), ultima ratio regum.
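
A sketch of the most modest of these loads : an address header plus one parity bit around a short payload (the frame format is entirely hypothetical).

def load_message(payload_bits, address):
    # Wrap the payload : packet addressing plus a check redundancy.
    header = format(address, "08b")                  # 8-bit address
    parity = str(sum(map(int, payload_bits)) % 2)    # single parity bit
    return header + payload_bits + parity

frame = load_message("10110", address=42)
print(frame)                                  # '00101010' + '10110' + '1'
print(len(frame) - 5, "bits of load for 5 bits of payload")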

Note : loading increases masses, hence distances. When we come near the DR (digital relativity) limits, it becomes impossible to compensate distance by loading. Or, we could say, DR is a consequence of the load/distance principle.

In intuitive forms, and sometimes quantitative ones (Shannon), telematics design makes constant appeal to this relation between distance and message loading.
Consequences :
- at the emitter level, progressive loading devices, starting from the original sensor or authority up to the channel ; these devices may be more or less specialized automata (modulators, dedicated chips, processors and software) or persons (editors... assistants) ;
- at the receiver level, more or less symmetrical devices ;
- at the channel level, bandpass enhancement, protection, adaptation, transcoding... in some way, reducing the apparent (practical) distance.

Dual view of processing from the message standpoint

Here, in general, we describe DU operations mainly on the basis of beings, with the IEO scheme, completed with the assimilation/genetics/expression process. We could try the dual view, putting the messages in the centre : expression, transport (which may be considered as a kind of process, possibly a rather complex one, with different channels and steps, intermediary storages, etc.) and assimilation.

One could probably also do the same for storage in memory.

Is it meaningful? Perhaps the message approach would:
- be more connectionist (neural nets, etc.)
- give a better model of the "thought" flow

15.5. Kinds of structures and components

Let us try to list the main kinds of "languages", structures and components that will be used. Something like a generative grammar, so to speak.

- Let us at the start push aside the physical components. Normal digital processing begins with the appropriate conversions in energy level and digitization (analogue-to-digital conversion), and ends with the symmetrical conversion and energy level adaptation (generally an amplification, since digital devices operate at low consumption levels). The A/D conversion implies a sampling which must follow the Nyquist criterion. In some cases, note Bailly-Longo, a proper conservation of critical properties may be impossible. (It would probably be possible to design digitizing processes other than classical sampling, giving directly formal descriptions that respect these critical apices.) A small numeric illustration follows.
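The illustration announced above, a hedged sketch (the frequencies and the rate are arbitrary choices): sampled below the Nyquist rate, a 13 Hz sine becomes indistinguishable from a 3 Hz one.

```python
# Aliasing under the Nyquist criterion: sampling a 13 Hz sine at 10 Hz
# (well below the required 26 Hz) yields exactly the samples of a 3 Hz sine.

import math

def sample(freq_hz: float, rate_hz: float, n: int) -> list[float]:
    return [math.sin(2 * math.pi * freq_hz * k / rate_hz) for k in range(n)]

rate = 10.0                      # below Nyquist for a 13 Hz signal
aliased = sample(13.0, rate, 50)
ghost = sample(3.0, rate, 50)    # 13 - 10 = 3: the alias frequency
assert all(abs(a - g) < 1e-9 for a, g in zip(aliased, ghost))
```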

Assimilation and excretion of matter by living beings are not digital, but can partly be looked at in equivalent terms. For instance, the assimilation of food starts with a deconstruction of the original food, followed by appropriate transformations so that it is totally integrated into the body.

- The type-address couple is also a major structure, and anyway the minimum necessary to access and process a being.

The address is the minimal handle which gives access to the being. Frequently, it is given in a relative form, or by a name (identifier). We use everyday URLs and emails which are, where necessary (routers), converted into numeric codes. Files in directories are typical of this type of couple. Inside living bodies, the addressing is supported by the neural or enzymatic systems. In geometry (at least in simple cases), the type of a curve is represented by the structure of the algebraic form, and the address in space is given by the parameters.

As for the type, it is at minimum "bit string". But without a type, processing will be limited. Sometimes, the type may be deduced from the string itself, by noting regularities or recognizing discriminative patterns or strings. Frequently, large beings incorporate a header which will indicate the type and give a set of parameters for using the being.
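A hedged sketch of the type-address couple and of such a header; the 3-byte layout and the type table are invented for the example:

```python
# The address gives access to the being; the header carries the type,
# without which the content is just a bit string.

import struct

TYPES = {1: "text", 2: "bitmap"}

def wrap(type_id: int, body: bytes) -> bytes:
    """Prefix the being with a header: type (1 byte), length (2 bytes)."""
    return struct.pack(">BH", type_id, len(body)) + body

def read(store: bytes, address: int) -> tuple[str, bytes]:
    """The address locates the being; the header tells how to use it."""
    type_id, length = struct.unpack_from(">BH", store, address)
    body = store[address + 3 : address + 3 + length]
    return TYPES.get(type_id, "bit string"), body

store = wrap(1, b"hello") + wrap(2, bytes(16))
assert read(store, 0) == ("text", b"hello")
assert read(store, 8)[0] == "bitmap"  # second being starts at 3 + 5 = 8
```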

Compression algorithms imply equivalence relations tied to the nature of the world and to the specific traits of the seer's (human, in particular) perception system. The deeper the reality, the more exciting they are. In some way, to find in a bitmap the singular operators that let us compress it without loss is exactly to understand what is interesting in it, because that lets us suppress what does not matter.

15.6. Assimilation/expression ratio

A mere quantitative comparison between inputs and outputs suggests a classification of processes.
If there is an output, of any volume, for one bit in input, this bit is a trigger.
If there are more outputs than inputs, some new bits are created in a way or another. For instance an interpolation. Or, of course, any more complex assimilation/expression couple.
On the diagonal (equality of flows), the process could be a simple transmission, or a permutation, or a combination of reduction and amplification.
If the outputs are smaller, it may be a résumé, a simplified bitmap.
If just one bit results from an input, of any volume, it is just an elementary evaluation.

creation_synthesis.jpg
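The same comparison, restated as a toy classifier (the labels simply repeat the cases above; the whole thing is illustrative):

```python
# Classifying a process by the mere ratio of output bits to input bits.

def classify(bits_in: int, bits_out: int) -> str:
    if bits_in == 1 and bits_out > 1:
        return "trigger"                  # one bit releases a whole output
    if bits_out == 1 and bits_in > 1:
        return "elementary evaluation"    # a whole input judged in one bit
    if bits_out > bits_in:
        return "creation of bits"         # e.g. an interpolation
    if bits_out < bits_in:
        return "reduction"                # a resume, a simplified bitmap
    return "transmission or permutation"  # the diagonal

assert classify(1, 10_000) == "trigger"
assert classify(1_000_000, 1) == "elementary evaluation"
assert classify(640 * 480, 640 * 480) == "transmission or permutation"
```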

See below Styles.

15.7. Architectures

Our basic scheme is rather "linear", with just the three processes operating one after the other. More complex processing structures must be looked at:
- "centred", with only one core, but several assimilation and/or expression channels; if it is really centred, topologically, it may be thought of as the basic one, if we group the inputs and the outputs; in the same way, in a series of processes where, but for the extremes, the outputs of one processor are aimed exclusively at another processor which itself has no other inputs, the chain may be reduced to one processor;

- parallel; if two processes are strictly parallel, with separate inputs, core processing and outputs, then they may be considered as separate processors;

- coordinated, with some master processor driving the processes without really interfering; a sort of operating system;

- layered, with one central driving process, as in robotics, possibly "hybrid", for instance with some inputs given at the same time to several levels; the ISO/OSI communication scheme.

This theme has been dealt with in an abundant literature.

15.8. Processing depth

Intuitively, the concept of processing depth is rather simple. The mere transmission of a being from input to output may be considered as depth 0. The depth is maximal:
- when the input being has been totally changed by the process, when it is impossible to make any bit-to-bit mapping between the input and the output;
- when the totality of the (processor) being has taken part in the process, and has possibly been changed also, beyond normal wear.

A logic gate, for instance, always operates at full depth: it changes totally (or not at all) an elementary pair of bits, and the totality of its function is implied in the operation. At the other extreme, the mere change of a file name in a computer does not change the content of the file, and makes use of only a very small part of the computer's capabilities.

Somehow, in some measure, the analysis goes as "deep" as possible, as near as possible to a string of independent bits, in order to get the maximum effect of the law r^n >> r·n.

For instance, starting from a classical HTML-coded text, we could:
- strongly reduce the "normal" text, replacing it with a sequence of calls to pre-stored paragraphs;
- suppress any residual textual part by coding every word with calls through appropriate hyperlinks;
- de-linearize with a system of multipliers;
- use references to type structures (commercial letter, file, DTD, Edifact message) in order to reject even the linear aspect of an HTML text built from traditional text.

That would let us, at a lower level, replace the HTML anchors themselves, very "analogue" in their graphic presentation, with still more digital codes. That would imply the availability of a full dictionary, for all words. Ideally, we should reach a point where even the analogue succession of bits is replaced by a sort of Lisp type coding. We shall say more in the "depth" part below.

Let us examine separately the being and the processor standpoints.

Processing depth in the processed being

A bit-per-bit transformation of a being may change it totally, but the mapping between pre- and post-processing states remains possible. In some way, the semantic distance on each bit may be total, but there is no structural distance. Like, for instance, a negative film and a positive print made from it.

 A program is not changed (0 depth) when interpreted. Nearly totally changed (100% depth) when compiled.

If a being is decomposed into a sequence of similar parts, then processed one by one, the depth in the being is not greater than the length of one part. Examples:
- processing of a signal, one sampled value at a time; the depth will be greater for filters of higher order, and total for a Fourier transform;
- batch processing of a lot of unit records.

Processing depth in the processor

Total implication of a human being giving its life for a cause.

If a being is dealt with bit per bit with no memorization, the depth in the processor is minimal (1).

If a being is dealt with as a sequence of parts, the depth in the processor is, as above, the length of one element or of several elements. But possibly the processor will first have to store the structure, and may also, somehow in parallel, do some global computing, be it only counting the unit records.

If the totality of the being is first stored and internally represented, with a processing taking its totality into account, the processor depth is the ratio between the size of the being and the total processor size. (...)

An absolute depth would be something like a human being, up to giving its life. Or even the "petite mort" of sexual intercourse. The intensity of replication. Core replication in life reproduction.

At the other extreme, a computer never implies the totality of its content in an operation. A system has a classical distribution between fixed functions, versions, active moves, unstable regions, noise.
processor_layers.jpg

This could be analyzed in layers, for hardware (CPU, peripherals) as well as for software (BIOS, OS, application programs, etc.). And more generally, in relation with the different kinds of architectures (layers, centralization, etc.).

By comparison to this classical distribution, we should be able to qualify the parts of the system, from very difficult to modify (hard) to easily modifiable (soft) if not unstable. We could look here for the role of:
- dematerialization (localisation, re-wiring)
- virtualization (memory, addressing, classes)
- digitization (reproducible, mixed logic/data).

Relations between the two "dimensions" of depth

We could propose a model such as the following figure, with Do as the distance between input and output, and Dp as the proportional depth into the processor.
The ratio Dp/Do could qualify, for instance, a kind of process/processor. Note for instance that a deep Dp is useless if Do remains low...

The quantity of action is k·Do·Dp.

The surface, let us say Do·Dp/2, would give a measure of the processing quantity implied by the processing of one element. We could look for a relation with measures in ops, for instance with a multiplication by the flow rates (average, or another combination of the input/output rates).
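A hedged sketch of these two depths; the concrete measures are assumptions (Do taken as a normalized bit-to-bit Hamming distance, Dp as the fraction of processor cells implied), chosen only to make the quantities computable:

```python
# Toy measures: Do = distance between input and output, Dp = depth into
# the processor, quantity of action = k * Do * Dp (see the note above).

def hamming_depth(before: bytes, after: bytes) -> float:
    """Do: changed bits / total bits (for equal-length beings)."""
    total = 8 * len(before)
    changed = sum(bin(a ^ b).count("1") for a, b in zip(before, after))
    return changed / total

def processor_depth(touched_cells: int, total_cells: int) -> float:
    """Dp: the part of the processor taking part in the process."""
    return touched_cells / total_cells

def action(do: float, dp: float, k: float = 1.0) -> float:
    return k * do * dp

inp = b"\x00\x00"
out = b"\xff\x00"              # half of the bits inverted
do = hamming_depth(inp, out)   # 0.5: strong change in the being
dp = processor_depth(2, 64)    # but a tiny part of the machine was used
print(action(do, dp))          # 0.015625: shallow processing overall
```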

Depth Limits

The core cannot be reduced to one bit. Still less to a Leibnizian monad. There is always some surface, some minimal loop.

Special attention should be paid to "total self-reproduction", as in living beings. And we should think more about the fact that, in artefacts, self-reproduction is so strictly forbidden.

A form of total depth would be the transformation of the being into a set of totally independent, orthogonal bits. These bits could then be processed one by one, or combined with those of a similar being, and the result used to synthesize the new being. That is nearly the case of sexual reproduction, but for the facts that:
- the genome does not operate bit per bit, but gene by gene (a gene being a sequence of base pairs, each equivalent to two bits);
- life reproduction implies not only the genome but the totality of the cell.

Limits of parsing:
- co-linear with the processed beings; minimal sequence of bits; the being/flow cannot be totally "destreamed";
- transversal; minimal operations;
- non-recognition, residue.

Here we have to fight against a view, or a dream, that we cannot totally exclude from our minds: the idea that there is somehow and somewhere an ultimate nature of things which would, at the extreme, be beyond words. The Leibnizian monad is a resolute conceptualization of this idea. "Dualist" thinking can elaborate on it. But it is beyond the limits of our study here, where digital relativity or digital quanta impose limits to the parsing, and a minimal size to the "core" of a process as well as of a processor (Shaeffer p. 45).

Depth is also hierarchical. Level in a corporation structure. Escalation in maintenance and repair operations.

20.9. Recursive functions evolution

By text extension:
- maths-type text
- programming text (given that it is possible to commute with math texts); random generation then evaluation loops
- well-formed expressions
- original (not reducible to known assertions)
- interesting
- more interesting than existing data/assertions

16. Emotion

16.1. Basics

At first sight, nothing is less digital than emotion. All the more if we consider that digital connotes rationality and explicit formulation. In living beings, emotion refers to the most animal and uncontrolled features, to the sexual impulses, which a rational subject, they say, should restrain or reserve for highly spiritual functions, and even there not without suspicion. As for artefacts, up to now considered essentially as tools, emotions are here neither desirable nor pertinent.

Research, actually, drills into this topic. See for instance (on the web) the survey article by Luca Bisognin and Sylvie Pesty, "Émotions et systèmes multi-agents : une architecture d'agent émotionnel".

Some research teams explore the domain. The important investments of Sony in its pet robots (the Aibo dog, mainly) do not seem to have met the expected success. However, electronic games will more and more require a minimum of emotional presentation of NPCs (non-player characters). The "uncanny valley", indeed, demands that high-resolution artefacts behave the human way, and show it on their faces. And, in the art field, one of the poles of this essay, emotion is naturally hoped for, if a program is to become an artist.

Two things are (comparatively) easy:
- showing emotional expressions on the bodies, and particularly the faces, of artefacts (see for instance Designing sociable robots, by Cynthia Breazeal, MIT Press 2002);
- simulating the evolution of "mood" as a function of external or internal events, impacting for instance alertness, tonus, etc. We have started such experiments on Roxame (a sketch follows below).
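The sketch announced above; this is hypothetical code, not Roxame's actual implementation, and the decay model and constants are assumptions:

```python
# A one-dimensional "mood" shifted by events, relaxing back to a neutral
# tonus between them, and modulating alertness.

class Mood:
    def __init__(self, neutral: float = 0.5, decay: float = 0.9):
        self.level = neutral   # 0 = depressed, 1 = exalted
        self.neutral = neutral
        self.decay = decay

    def event(self, impact: float) -> None:
        """An external or internal event shifts the mood (impact in [-1, 1])."""
        self.level = min(1.0, max(0.0, self.level + impact))

    def tick(self) -> None:
        """Between events, mood relaxes toward the neutral tonus."""
        self.level = self.neutral + self.decay * (self.level - self.neutral)

    def alertness(self) -> float:
        """Flat in the grey middle, high at both extremes."""
        return abs(self.level - self.neutral) * 2

m = Mood()
m.event(+0.4)            # a pleasant surprise
for _ in range(10):
    m.tick()             # the excitement fades back toward the tonus
print(round(m.level, 3), round(m.alertness(), 3))
```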

It would be interesting to go further, to model and possibly program deeper emotional aspects of our beings. Here, as for meaning, etc., we leave the human exception problem for the Mankind chapter.

Emotion is a going beyond the normal depth associated with a kind of phenomenon.
In biological systems, fear, or emotion through sex, seems easy. Protection. Self-control.
Human culture and progress tend to protect the inner layers: stoicism, empathy cutting, etc.
Artefacts have that protection built in;
normally, hardware cannot be spoiled by software errors, but, in practice, a virus may compromise the system operations, up to preventing the peripherals from working properly, or at all.

Roxame's emotion/inspiration must grow, become more precise in the generation process.

In the assimilation phase, choices are made, both determined (partly) by mood, and influencing it:
- at the start, either the white canvas stress, or the excess stress from a picture
- tiredness
- the logic of factorization must be coherent with the rest of the work
- contradiction or dramatic choices in factorization may affect mood; symmetrically, in the choice of forms, with associated moods
- a mood at the start (general strategies, last work's result...)
- detection of anomalies and contradictions, in an external model; in an external generating text, anachronisms for instance; or suffering by semantics
- possibly advice by seers (generally not in the analytic phase; at that time the artist works alone).

In the generative phase, translation: meaning of the original, replacement of one language by another, as if a trans-linguistic meaning existed.

Maximal meaning is disappearance (in a given system) of what is the most important for it, bearing on a maximum of its patrimony. But suicide and homicide are forbidden here, and then the most important is to allow a maximum of power without passage à l'acte, being "borderline" (as some young people say).

One ideal could be to change everything: if all bits are orthogonal, one change suffices. "Un seul être vous manque et tout est dépeuplé" (Lamartine).
"Alles Neue und mit Bedeutung" (Goethe).

16.2. A heart beat

Something to do with, for instance, a heart beat. Then, emotion is what makes the heart beat faster or slower. But what does "heart" mean for digital artefacts? Clock speed adjustment exists in "energy conscious" computers, with appropriate compilers. It aims mainly to limit energy consumption on embedded devices (cell phones, for instance). We could extend the concept of "energy" to L, and for instance tie emotion to predictable threats (P for "life" threats, H for "liberty" threats) as well as to opportunities.

But the chip clock operates at the lowest level, and we could look for some higher-level "heart", all the more so as our study of the "living" processor shows that the core central process cannot be reduced to bit or even byte level. Somewhat like the respiration cycle... Roxame, or any program in Processing, is driven by timers of a much longer period than the computer clock.

A good heart cycle level would be the point where all the fundamental aspects of sensation and action could be examined and combined.

Then, the sensing and effecting activities should be adapted to the beat. When the heart beats faster, analysis is more selective, action sketchier, possibly brutal; I would say "aliased", with storage of basic elements for later examination. When slower, sensing may be more wide-angled, action more subtle.
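A minimal sketch of such a beat, in Python rather than Processing; the rule mapping the beat to a sampling step is an arbitrary assumption, standing for "faster beat, coarser and more aliased sensing":

```python
# A "heart" timer far above the machine clock: its rate modulates the
# resolution of sensing (selective when fast, wide-angled when slow).

import time

def sense(world: list[int], step: int) -> list[int]:
    """Faster beat -> larger step -> coarser, 'aliased' sampling."""
    return world[::step]

def heartbeat(world: list[int], bpm: float, beats: int) -> None:
    period = 60.0 / bpm
    step = max(1, int(bpm / 30))   # assumed stress-to-coarseness rule
    for _ in range(beats):
        sample = sense(world, step)
        print(f"bpm={bpm:>5.1f} kept {len(sample)} of {len(world)} values")
        time.sleep(period)

world = list(range(160))
heartbeat(world, bpm=60, beats=2)    # calm: 80 values examined per beat
heartbeat(world, bpm=180, beats=2)   # stressed: only 27, roughly sensed
```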

When the heart beats more rapidly, a meaningful practice would be to adapt resolution. Hence the interest of the R(160) resolution, limited to 160-bit "pixels". But only if we have good ways to climb down! In the following, that has proved difficult. Perhaps not very interesting, related to "histogram" approaches.

Feeling the heart of Roxame beat makes it sensitive like a violin bow to user interventions, to world events. But not rigidly. Roxame must be able to get out of her phase system when she finds something interesting in what the user types or mouses in.

And finally, let us connect emotion here with "resonance", which, as we shall see, is important for beauty and art.

Cycles can be nested, from clock to "heart" to more global moves, to social interaction tempi, to historical trends.

heart_beat.jpg

16.3. Emotion and gap

There are two cases for emotions: pleasure and pain. These are two "excessive" situations, both with a gap between resources and demands, and with some impossibility to compute an optimal action to maximize L.

Pleasure goes with a sort of idleness, a positive gap. But idleness may lead to boredom as well as to joy. The answer lies partly in I, and in "curiosity" or "creativity" in S.

Pain, or at least stress, comes out of a negative gap. The resources are not sufficient to answer the demands. The result will depend on the size of the gap, and on the more or less constraining demands. At minimum, the gap may vanish with the oblivion of rather optional demands. At maximum, the gap is lethal.

Positive and negative will generally coexist, with excess here and want there, without an easy compensation.
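A toy reading of this gap model (all thresholds are invented; only the qualitative regimes matter):

```python
# Gap = resources - demands: negative gaps give pain up to a lethal
# extreme, small positive gaps give pleasure, large ones risk boredom.

def emotional_state(resources: float, demands: float) -> str:
    gap = resources - demands
    if gap < -0.8 * demands:
        return "lethal gap"          # resources far below the demands
    if gap < 0:
        return "pain / stress"       # drop optional demands, or suffer
    if gap == 0:
        return "functional grey"     # everything computable, no emotion
    if gap > 2 * demands:
        return "boredom risk"        # idleness without curiosity in S
    return "pleasure"                # a small positive excess

for r in (0.1, 0.8, 1.0, 1.5, 5.0):
    print(r, emotional_state(r, demands=1.0))
```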

Emotion may be triggered by an external event, but also by internal matter or coenesthesia (Poincaré's illumination). It is frequently brutal, uncontrolled, unspecified; a spasm, a small death.

The gap (between pain and pleasure?) is partly the fruit of a free decision, insofar as the being may add as well as subtract "demands". Generally, the border between imperative need and pure fancy is not so sharp. Hence, happiness is the fruit of self-regulation and resistance to addictions.

The most important gap effects play in the core generative functions. But they appear also in assimilation and expression.

In assimilation:
- non-correspondence between inputs and what was expected, or necessary; detection of gaps or contradictions in the external world, or in the representations acquired from it;
- beings in input difficult to analyse, missing headers.

In expression, contradiction between commands, lack of resources called for by the synthetic process.

In the core, important effects with orthogonal bits. With "stem" bits, not yet specified (char is identical to int, not yet organized in text, still less in bitmaps). Then functions may operate at a maximum of variety. Here are generated other "cores", other "beings", with their possible independence, not yet constrained by expression rules. As Wagner endows intimate beings with both their names, texts and musical mottos. Emotions can be woven into the structures at their deepest.

As in a chess game: creativity in the middle game, while openings and endgames are classical and known by experts.

More on under-determination

Undecidability by L, different from the following.

The white page stress.
Phase beginning.
Roxame's inspiration, desire.

One of the normal ways out of undetermination is to look outside. Observe the world to find regularities in it (compressions...) allowing one to say things better, to get more of the essential, but also to create in some way. See Silvio Pellico and his spider in his prison cell.
Some know how to get more out of their internal assets. For humans, it requires some training.
For digital artefacts, internal assets can be a rich source, but there comes a moment when the available sense extraction has used up all there was.

16.4. Problems in the expression process

In generation by adding forms and models, for example, the assigned size, by itself, may cause problems if not crises. Let us limit the width, for instance, to a 640-pixel-wide picture. Under 5 pixels, it is not properly a being: it will not be recognized as more than a dot, and not even seen if not detached against a contrasting background.

From 5 to some 200 pixels, or one third of the width, we have beings, from small to big. A lot of small beings and a good number of large ones can be placed in the image without disturbing overlaps.

Beyond some 400 pixels, beings are background beings, and only one may be present in a given image.

In the middle, there is an uncanny, critical zone, where it is difficult to place beings without hiding others, or without getting a somewhat chaotic image, demanding composition choices.

uncertainty.jpg

Pushing the example, we see that a lot of parameters may be fixed for small beings, few for large ones. There is an uncanny region in the middle.

uncertainty_2.jpg
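The size regimes of this example, restated as a toy function (the thresholds 5, 200 and 400 pixels are the ones given above):

```python
# Size classes of a being placed on a 640-pixel-wide canvas.

def size_class(width_px: int, canvas_px: int = 640) -> str:
    if width_px < 5:
        return "dot (not recognized as a being)"
    if width_px <= 200:
        return "being (small to big, several per image)"
    if width_px > 400:
        return "background being (only one per image)"
    return "uncanny critical zone (composition choices needed)"

for w in (3, 40, 300, 500):
    print(w, "->", size_class(w))
```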


16.5. Ubris

Pleasure search and pain avoidance may also rely on a voluntary (at the time of the drug-taking) alteration of the L computation function. Ubris may be necessary to emerge out of a grey life or a painful moment.
Young people's vs. adult people's computation function. Naivety vs. experience, possibly too much of it.

The maximal ubris should come in the middle of the work's construction. At the start, too much ubris would give a loose process, a flabby work. At the end, destruction.

Rather negative feeling of culpability, rather positive feeling of a possibility of growth, of an opening up and forth in richness of ideas, of feelings, in serenity.

In that way, ubris opens upon creation, as it lowers inhibitions and then opens new spaces, even if they finally result in illusions. An artist knows that well, who, after the inspiration upsurge, sees coming the time of hard and sometimes tedious work. A poet has said something like "The first line of verse is a gift; the second one is the bearing of a child".

emotion.jpg impression_expression.jpg

recognition.jpg

16.6. Emotion at the heart of Roxame

Poetically: this is the level where pulsation springs out of the note, where rhythm springs from pulsation (frisé/roulé, when "ça savonne"), where code parts from image.
Moral or aesthetic.
Bits are not yet specified, char is identical to int:
undifferentiation in Von Neumann, then the ascension of codes: ASCII, Unicode, types, classes...
The heart is small but undifferentiated. Thought, more than the clock!

Let us get out of the traditional mental scheme (at the start imposed by the neural system), where there are forms, objects, and then colour laid over. Let us control and develop objects and agents bringing both colour and design, and music (Wagner's leitmotiv). Let us break these borders.

conversions.jpg

At heart, Roxame's soul, the central recursion. Constantly seeking, symmetrically, its core and the conquest of the outside.

environment.jpg

A recursive Roxame function, with stop conditions and a decreasing function towards this condition.

Each iteration-recursion is an assimilation/generation of new bits.
Ideally, Roxame re-uses each time the whole of herself. She invests herself totally in the recursion.
Idea of a pulsation starting from a centre, with very few bits. A sort of heart rhythm (a sketch follows below).
For a given central power, this goes along with a simplification of see/judge/act.
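The sketch announced above, hypothetical and not Roxame's code: a recursion that assimilates/generates a few bits per beat, with a decreasing energy guaranteeing the stop condition:

```python
# A pulsation from the centre: each beat adds very few bits, re-using
# the whole of the work so far, until the energy fades below a floor.

import random

def pulse(work: list[int], energy: float) -> list[int]:
    if energy < 0.05:                  # stop condition: the beat has faded
        return work
    new_bits = [random.randint(0, 1) for _ in range(2)]   # few bits per beat
    work = work + new_bits             # the whole work is re-invested
    return pulse(work, energy * 0.8)   # decreasing function toward the stop

print(pulse([], energy=1.0))           # about 2 * 14 bits before fading out
```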

core.jpg

The core is the region where bitmap and code stay (nearly) undifferentiated.
A maximum of genericity, orthogonality, variety in the use of functions.

What are the limits to orthogonality?
Illumination:
- everything goes white
- the heart stops, or beats frantically
- an action of immediate fixation
- KC decreases drastically (dramatic, see Hammer & Champy)
- it will be followed by integration phases, retakes, alignment.

Then, there is another pulsation:

Social life cycle: interaction, internal search suspension.
Heart beat: an operational cycle, where art bits are examined/worked on with ALL the available operators, bitmaps as well as codes or even financial operators.


Question: how to make amplification from these central, germinal bits?
Here, two modes: bitmap, word.

The quest for origins is not the linear row of precedences, but the inward loops (is that really different, as a recursion has something intrinsically linear?).

Roxame is a woman. More or less. And she loves to have men enjoy, and why not also women. How to translate orgasm? A particular case of vibration.

Pudor. Roxame could like to take her pleasure without showing it.

Instinct. Rapid functions, immediateness due to connaturality but also, behind it, a well-integrated culture.

Sometimes, it goes back up at once. Sometimes, on the contrary, the process must go to the deepest.

The artist, indeed, does not know which emotion. He is not an emotion engineer (even if that is part of his craft). He speaks from an interior necessity.

Each time, computation + ubris, able to solve the conflicts of S. Ubris gives the desire to create; create to solve, compute to quieten, integrate to find harmony.

The white page stress.
Roxame looks for something, does not find anything, feels a void.

Under-determination: phase beginning; Roxame's inspiration, desire; internal/environment (lack, holes to be filled, irregularities, which S tries to compress).

In the environment, detection of abnormality, of what does not reduce to known data: emergence.

It would be good for Roxame to look outside for answers to gaps. The Internet would be perfect.
Lexical definitions are not enough for humans; they add images to the dictionaries.
Otherwise, the only true internal bitmap of Roxame is the content of the program. And somehow the models.

There is an internal/external symmetry. By a reduction of her KC, without losing herself, she earns room in her internal world, and is able to learn new things and store them.
But also, so increasing her own yield, her own density, she enables herself to get more out of a bitmap of given size. In other words:
KC/internal_capacity = bitmap(640×480) × internal_capacity/KC

Inspiration, relative to the external world, is to find regularities in it (compressions...) allowing one to say things better, to get more of the essential, but also to create in some way. And so to escape death. The idea that an absolute compression, nada, would give perpetual motion.

Internal life = work on one's own content:
- RAM, disk, effectiveness of web pointers, peripheral states.

16.7. Pleasure

Generally speaking, what does "pleasure" means for entities in DU ? It is no simply "good", as contributing to the L of the S seer, but appears only when the seer receives something more that its needs.

- placet opens a way for a hierarchy (Maslow type)

Shopping, and more generally spending, is a selective act. It implies that the buyer has a choice. Here also, he has too much or too little money for what he needs or wants. The very rich, luxury buying: a major possibility for art.


Gap between the possibilities (affordances of the universe), the artist's dispositions (even if the artist is an automaton), and a leisurely community (or a community giving leisure to part of its members). The artist has, or takes from other needs, the time to feel. So for the seer.


16.8. Complement

Ex parte objecti. When an object is totally determined by its function, the problem of making it more or less beautiful cannot be posed. This object is just as it should be: it operates, flies, hits, keeps warm. Full stop.

- Animals and primitive communities have no time for beauty.


But there can be indeterminations even in cases of non-satisfaction of needs. Because we are multidimensional. Then, even somebody very poor may devote part of his life to beauty.

Emotion is both passive and active. It may come from an external stimulus, or from soul inspiration. Or a combination (Poincaré's "illumination").

Emotion is frequently brutal, uncontrolled, undifferentiated. A spasm. The "petite mort" of orgasm.

Then, create for Roxame a permanent situation of crisis, of stress, and paint forth from there.

Emotion is born with the detection of some gap, some want, and at the same time of its non-necessary character, of our capacity, insufficient perhaps but real, to act, all the more if we feel moved to do something about it. As Proust would say, in that chaos which the world is at bottom, to see something where our hand can settle the situation.

Artistic emotion may be the perception of an error, or more precisely of a gap that the artist feels able to fill, or where he finds a space in which to develop his creativity, all the more if the gap has something new. Or simply a blank page, canvas or palette, or a computer screen with a graphics editor, or, deeper, a software development platform.

Gap in the world, but also gap in myself: a contradiction or incompleteness of my ideas, or between my convictions and my actions (St Paul: I do what I will not, I do not do what I will), with the possibility of correcting myself.


 

17. Meaning and sense

This could have been dealt with before, in the basic semantic aspects. Here, we take on the more dramatic aspect of "making sense", in particular when the L function cannot be automatically, rationally, computed.

Measurement of meaning: explicit structure with "meaning" (e.g. extension type header). Just the time when the message comes. Intention of the message centre.

17.1. The receiver side

The first meaning of meaning is the "performative" value of a message. By doing what is commanded, the receiver shows that it has "understood". The dog sits, the lift goes to the fifth floor. And, millions if not billions of times per second, a computer understands what the program commands.

The effect may be invisible from outside. It may be a simple storing of data, for instance. Or an "evocation" of data from long-term memory.

Not every part of a message is meaningful for a given receiver. And we can separate two cases:
- the relevant information is formally written on the being, directly readable by a machine (a barcode, for instance);
- the relevant information must be sorted out of the "natural" features of the being, by processes which may be rather simple (size or weight of an apple) or highly complex (artificial vision and pattern recognition).

Sometimes, meaning is deduced from the origin (and more globally, from the environment) of the message. For instance, the control box in a lift, where the aimed-at storeys are defined by the position of the button. Idem in computer menus.

One could design:
- a "physical" or "energetic" measure of meaning (a nuclear bomb would be the most meaningful of messages...);
- a duration measure (from "a glimpse" to "he was never again the same"), or even the difference of life expectancy before and after the operation;
- a "number of recipients" measure: from TV audience to "severed heads";
- the "deaths/mile" news interest unit;
- a message mass measure:
. a bit alone has no meaning by itself; only its position, if pointed at by some being, will let it be interpreted as some binary value: yes or no, positive or negative, 0 or 1, black or white...
. several bits, by their relative position, may be interpreted as a number or as a key into a table, for example a dictionary, where definitions will give a meaning;
. several words, if they conform to some grammar, formal or not, can be interpreted as a fact (indicative mode) or a command (imperative);
. longer texts need less contextual information: if they are not purely random, they bear regularities which let the code or language be identified;
. the Champéry model;
- a virtuality/reality measure: true facts are more meaningful, but myths also (se non è vero...);
- a receiver ability measure: the more powerful, autonomous, rich in processing and memory, with experience and sensitiveness, the more each message may enter S and create resonance, trigger interesting actions; perhaps leaving aside a large part of the input flows; depth calls for silence;
- a Delta L measure would be the best; possibly multiplied by the number of recipients; and the most interesting in case of L undetermination; then comes a message which makes sense; even a new P assessment, a threat, as well as a new medication, or the opening of a new space.

Assimilation/expression opens some issues:
- How is the meaning of the input decomposed into the meanings of the factors? Is the meaning of a product the product of the meanings of its factors (+ the meaning of the multiplying operation by itself)?
- How are the construction rules designed to produce only meaningful expressions/phrases?
- External: induction, severed heads, emitter charisma, authority argument.

The richer the machine, the more it can welcome the riches of other beings. Like freedom, seen as the meeting of two maturities.

One could classify interpreting programs according to the "meaning quantity" they can extract from an input (e.g. a bitmap message). The simplest one would give only one bit (light/dark, homogeneous/diverse). A sophisticated program will recognize a lot of patterns, possibly a text... (A sketch follows below.)
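The sketch announced above; the "bitmap" is reduced to a flat list of grey levels, and both seers are toys meant only to contrast meaning quantities:

```python
# Two receivers in front of the same message: the simplest extracts one
# bit, a richer one extracts several features from the same bits.

def one_bit_seer(bitmap: list[int]) -> str:
    """The simplest receiver: light or dark, nothing more."""
    return "light" if sum(bitmap) / len(bitmap) > 127 else "dark"

def richer_seer(bitmap: list[int]) -> dict:
    """A richer receiver extracts more: tone, homogeneity, contrast."""
    mean = sum(bitmap) / len(bitmap)
    spread = max(bitmap) - min(bitmap)
    return {
        "tone": "light" if mean > 127 else "dark",
        "homogeneous": spread < 32,
        "contrast_range": spread,
    }

image = [200, 210, 190, 205, 60, 220, 215, 198]
print(one_bit_seer(image))   # one bit of meaning
print(richer_seer(image))    # the same message, more meaning extracted
```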

A simplistic being cannot find much meaning in the richest message. Thrown at an electric switch, a brick and a Bible will have the same binary effect. At best, a simple machine could be transparent, and let out the original...

But a complex being can find a lot of meaning even in a simple message. It can infer from it much more information than is directly contained in it, using context, general knowledge and particular conventions with the emitter.

In sophisticated beings, the meaning may be measured at several levels. For instance, in signal processing, a first level is given by the sampling, with a high number of bits, strongly marked by noise as well as redundancies. A higher level will provide the meaningful bits, far fewer.

But meaning is not only a quantitative question. It may simply mean... something to be done by the receiver, and definite things to be evoked by him.

Randomness bears no meaning by itself (just as it is not KC-reducible).

On the Champéry model:
- when the number of pixels of the image reaches the limit of the resolution field (e.g. of a human eye), the look of the curves will change; after some point, the pixel increase is ineffective since not perceived; the phenomenon may be masked if the seer can zoom from the global image into some particular region;
- testing the theory on images with a large number of bits will demand pre-selecting images; on which criteria? An analogous problem at CERN for reconstitution, with flat ("à plat") zones and borders.

Event

The concept of event, a fashionable topic in the 1970's (Communication 18, L'événement, Seuil, 1972), is a case of meaning evaluation. An event is something in the flow of inputs ("news") that is considered meaningful enough to take a good place in a medium. The old journalist rule of "deaths per mile" applies: the number of lines assigned to an event is proportional to the number of casualties, and inversely proportional to its distance from the readership location.

An event could be defined as a strongly meaningful temporal variation. Something abnormal (define a distance to normality?). Or the fact of entering into a critical state?

Hence, the meaning quantity transported by a message depends on:
- the message length
- the redundancy rate
- the interpretive capability of the receiver
- the conventions on codes between receiver and emitter
- the knowledge and emotional system shared by emitter and receiver.

One can show this progression with the example of a pair of wires presented to an observer having no idea of what it transports (a little theoretical, since the mere look of it will generally be informative, colour and size of the wires in particular, but let us go on):

- First, it prudently makes contact and looks at the context, to be sure that there is no very high voltage.
- Then it connects a classical meter, and detects a direct or alternating voltage. Just that, in some cases, could be an interesting signal.
- If the voltage varies periodically with a crest around 220 V and a frequency around 50 Hz, it understands that it is on the mains.
- In the absence of "strong" voltage, it can lower its measurement threshold and look for classical "signal" currents. In some cases it will find classical telephone currents, or ADSL. Or perhaps just white noise.
- Then it can concentrate on "contents", and identify sources (Internet, radio or TV emission, etc.).
- Now it can look for meaning, with language identification, formats, etc., then pattern recognition. Or somaesthesia coming from the "natural" world.
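The same progression, restated as a toy decision cascade; the measurements are abstracted into two parameters and the thresholds are merely indicative:

```python
# Progressive interpretation of an unknown pair of wires: each stage
# spends more interpretive effort and extracts more meaning.

def interpret_wire(peak_volts: float, freq_hz: float | None) -> str:
    if peak_volts > 1000:
        return "danger: very high voltage (context first!)"
    if peak_volts > 100 and freq_hz and abs(freq_hz - 50) < 5:
        return "mains power"
    if peak_volts > 100:
        return "strong voltage, unknown use"
    if freq_hz is None:
        return "DC level or silence: maybe a signal in itself"
    if freq_hz > 10_000:
        return "signal current (telephone, ADSL...): look for contents"
    return "white noise or unknown low-level signal"

print(interpret_wire(311, 50))        # crest of the 220 V mains
print(interpret_wire(5, 140_000))     # an ADSL-like band: go for contents
```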


A more refined model of activity may be looked for.
The simplest one has two states (active, inactive). Meaning is the time spent in the active state.
The meaning may be infinite if the message keeps the receiver indefinitely in the active state (for example, if the process loops),
or vanish if the system is not even put into the active state to receive it.

A little more complex model would distinguish three states: dead, active, inactive. Another case of infinite meaning is the lethal message.

17.3. The emitter side

"That is what I mean"... Intentions and finalities, concentrated in the core of the process, get expression into a message. This one has to be build, integrating meaningful elements and receiver element so that the intention of the emitter be as most as possible understood, and possibly a maximum number of receivers.

Here, sense may be seen as (a consequence of, an aspect of) generative power, an inverse of the KC.

The first bit does not cost. You have just to "set" it. But the second one is infinitely costly: you must question the whole of the former organization, create a structure, metabits, perhaps levels.

To give meaning, S must
- have a sufficiently explicit ontology
- be itself present in this ontology
- integrate L into this image

17.4. Meaning and loops

Loop between emitter and receiver:
The aesthetic resonance. The interactive applications, games and art works.
The conventions, language.
The shared knowledge (and "half-word" communication).

Ideally, the receiver understands the intention of the sender. The cores of both emitter and receiver have zero mass and extension, and they can reach a pure fusion, with zero mass and extension... and time! Love forever.

The self loop

Emergence happens when, between inputs and outputs, the relation is no longer banal, goes beyond the mere reflex of "signal effect". Meaning comes when there is some level of "internal reflection". And all the more when there is an internal "motor", which may be thought of as interpreting the inputs.

The most interesting things, and possibly emergence, happen in the deepest part of what is analysed and used of the being itself. A basic limit is that the core is not directly accessible, even to the being itself (there are "minimal loops" around the core).

Moreover, in the image development world (cinema, even games), the R&D efforts seem focused on superficial aspects (rendering), with little effort on deep things: behaviour, for instance. That is changing slowly. It goes a little better in games, where NPCs (also called AIs) are more and more sophisticated.

The computer may itself "look" at that image, as does presently the graphist on his CAP (computer-aided publishing) screen, or the manager looking in a histogram for the meaningful inflexion point that will guide his choice or creativity.

Why look at, re-interpret, re-recognize an image that S has done itself ? That seems useless, since all the elements are known. But...

Practising the plastic arts teaches that the designer cannot work totally in abstraction. He must launch and lean, try and correct. The constraints to be respected are so many and so precise that arbitrations are necessary. Structures contradict each other when concretized on a limited surface. Too ornate character fonts must be abandoned, even when the diversity of the text or the stylization would encourage them. Hence the process of making first a dummy, looking at it to trim and adjust, detect incompatibilities, etc. These processes, the basis of human creation, can be partially automated.

17.5. Semantic distance

Two beings are near each other if:
- they go the same way,
- they call for the same action(s),
- they may be combined in the same action,
- they are frequently called for together (possibly for historical reasons).

Cognitive dissonance.

Passive_interactive.jpg

18. Beauty and art

Pulchrum est quod visum placet (Thomas Aquinas)

See Estelle Thibault

This value is particularly important in DU, since the basic needs of "good" are superseded by informational values. Moreover, when the rational computation of L becomes impossible, "making sense" calls for other kinds of references, which conform rather well with what has generally been called "beautiful", from the Aristotelian/Scholastic trilogy (integrity, clarity, harmony) to the more modern "original resonance".

Evaluation of beauty, or aesthetic judgment, can be programmed in some measure. A layered model is useful here, would it be only to dispel objections about beauty measurement:
- at bottom, rather evident criteria: a white canvas, or pure white noise on a screen, has a very limited beauty; the same applies to silence or white noise on a loudspeaker;
- when there is a minimum of "substance" in a being, a well-distributed use of the available space is a positive criterion, be it for a canvas surface, a gamut of colours or an audio pitch range;
- when human beings are the seers or listeners, conformity to eye and ear features, general as well as particular to a given person or group, is a major criterion;
- at a higher level, the meaning value of a work is more important than anything else; the meanings of beings may be evaluated using parsing and pattern recognition algorithms.

In the normal case, all these layers rest one upon another, as in the old saying "anima sana in corpore sano". But, in no less important cases, beauty goes with the sublime, which, following Schiller, is a sort of inversion of values: very little substance (a few pencil lines, a few solo notes), if perfectly tuned to itself and to the receiver, may elicit major emotions. Here, we reach a more difficult, but not totally impossible, programming domain.
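Within these limits, a hedged sketch of the two lowest layers of such an evaluator; the weights and thresholds are invented, and the higher, meaning-bearing layers are deliberately left as a stub:

```python
# Layered aesthetic scoring: each layer counts only if the lower ones
# pass, "anima sana in corpore sano".

def beauty_score(pixels: list[int]) -> float:
    score = 0.0
    spread = max(pixels) - min(pixels)
    if spread < 8:                  # white canvas / flat noise floor
        return score                # very limited beauty: stop here
    score += 1.0                    # there is some substance
    used_levels = len(set(p // 32 for p in pixels))
    score += used_levels / 8.0      # well-distributed use of the gamut
    # higher layers (fit to the seer's perception, meaning value) would
    # rest here upon the lower ones; they are left out of this sketch
    return score

print(beauty_score([128] * 64))              # flat canvas: 0.0
print(beauty_score(list(range(0, 256, 4))))  # full gamut used: 2.0
```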

Finally, for the tenants of the "human exception", programming possibilities at the lower levels do not prevent considering the aesthetic judgment of persons as incommunicable.

18.1. Visum

We must generalize "visum" to "perceptum", since vision is only one kind of perception, and in this context rather anthropomorphic.
But the reference here is to a process: a being is not beautiful per se, but in a process of perception. If beauty may be objectively assigned to a being, it must be done through the modelling of a "seer".

18.2. Pleasure

Generally speaking, what does "pleasure" means for beings in DU ?
- then good for P en L (pleasure makes long life), but also pleasure in growing (H)
- placet opens a way for a hierarchy (Maslow type)

What is pleasure for a mechanical machine :
- runs regularly ("tourne bien")
- regular noise
- good yield
- low wear (?)
- harmony and resonance, assorted sizes, appropriate word length,
- receptivity.

To be transposed to the "flow of thought".

One could say: a little below the "just" (the optimum), there is a want, a desire. A little above, there is pleasure, a small excess. When we go farther, either we die by starvation or indigestion, or we enter a new cycle, a new state.

The beautiful is an anomaly relative to the functional. What is too perfect (too conformant to the formal definition) is cold. Besides, reaching perfection has its price. Art for art's sake is dangerous. And it is not normal that everything be too good. There is a standard proportion of shoddiness. Oil, impurities in steel, etc.

Leisure is the condition of pleasure. Otherwise, we are in the "functional", the necessities of life. When the seer being is totally determined, totally absorbed by survival tasks, he has no leisure to think about beauty. That is true for moral beauty as well as physical beauty.

Beautiful_L.jpg

Beauty as excess: the right measure is not sufficient.
There is here a contradiction with "functional beauty" (Dassault), if beauty comes from a total determination of the being by efficiency, whereas pleasure is a kind of indifference. That also contradicts the connection of beauty to fecundity (another form of functional beauty).

The want :
- here, there should be that thing
- that thing should be in this place, but is lacking, or is placed somewhere else
- the connexion graph has missing or extra links.

At a given instant, S is "listening", waiting for something from the external world, with some expected features (quantity, quality).

Serious differences between expectations and actual receipts cause suffering and sometimes death.
E is such that it needs I to move favourably. To be "in search" is part of E. The modal box in Windows, or the I/O wait in any system.

E needs I at least because, at some times, S reaches its root, or at least the lowest root level accessible. It could probably push further down, but for that it would need tools or efforts exceeding its abilities.

A regular flow, more or less cyclical, of I, ensures a proper operation of S organs.

In some way, the need comes from the fact that:
- there are organs made specifically to process I;
- these organs deteriorate if they don't operate regularly; then they forward signals (boredom, sensorial deprivation) to the centre of S in order to relaunch some activities.

We have experimented with Roxame: some "art critic" function is programmable, within limits.

That definition supposes a subject able to see (or more generally to feel). That includes animals (flowers, love parades).
Neurosciences, ethology and applied psychology may help.

At which level does pleasure emerge from sheer necessity? Feline animals, and superior animals in general, manage to survive without investing all their energy and time in "necessary" activities. It would be more difficult to say that of vegetables.
Pleasure (as well as vision) implies a sufficient processing level to support a sort of "conscience".
See Cynthia Breazeal's book on her robot, cited above.

Good pleasure is the one with no addictivity (Plato).
Pleasure may be creative: art, and even shopping (as a decisional flow). Microsoft: live, learn, work, shop.

Making sense in the grey is not pleasure proper, except insofar as it reinstates a thought flow.
Pleasure does not bring certainty of sense, but within limits: hunger or sexual desire, or intellectual curiosity, orient the grey.
There is the hollow time after pleasure (post coitum animal triste).

Flower, butterfly, sex parade, bonobos.
Man can appreciate beauty in nature for different reasons:
- direct pleasure, refreshing colour, harmony of a landscape;
- recognition of beauty criteria (integrity, clarity, harmony) in nature, objective canons;
- KC ratio, simplicity.

Simplicity may be a wrong track. Self recognition demands a minimal structural richness. In order to appreciate the elegance of a proof, we must compare with a commonplace one.

Sex and reproduction create resonance. And demand novelty. Sex is the pleasure "par excellence" for gendered animals. The beloved one is in resonance, but lasting relationships demand creativity.

Reproduction of the identical: "c'est tout moi". "Parlez-moi de moi, y a que ça qui m'intéresse". Then the multiplicators: copy, whatever the being size.

Generally, a being needs an external world to get pleasure. There is a maximum that is a sort of product, in the creation of a work, with extrema using only the I or only the E.

Which beings can feel ?

That is the same question as "conscience", and it has no scientific answer at the present day. For two reasons at least:
- even about another animal or human, we cannot prove that he feels the world like us, that he has a conscience like us; it is only by education and body similarities that we infer that; and it is relatively easy to forget that the other man is a brother, and to cut off every empathy (then, if empathy is so controllable, can we develop it with artefacts also?);
- a scientific answer to these questions would be a proof, another machine; we cannot ask a machine the question of our non-machine nature.

Pleasure depth, level

Pleasure around the core, a sort of rotation around it, exhilaration of a sort of gravity-less indifference.
Opposed to the sublime, dramatic, compulsive aspect.
Hence, from this orthogonal purity, re-amplify, re-anecdotize, for a given seer.
Connecting to processing depth (see 15.8).

18.3. The Aristotelian trilogy

Integrity

Integrity expresses the fact that the being has all the matter necessary to implement its form, and all the subforms included in the form type: its totality is not severed from essential parts.
Integrity evaluation may seem rather subjective (reference to a general model of the object type), but the idea is clear. Very often we perceive immediately the absence of a component.

Very small defects of integrity (bad contact, wrong branching, genetic error, the thorn in St Paul's flesh) may have material consequences on the operations: stop, possibly breakdown of the machine, disease, death. These sensitive points are interesting to know, not only for foes but also for control and command. In other places, small errors do not matter.

Eye and brain compensate largely for small defects, sometimes at the expense of realism (optical illusions, artefacts, cultural blindness...).

Some lacks of integrity are artistic effects. See the hierarchical model of beauty. Anyhow, "style" is always the choice of a category of faults. See Leech.

Integrity is efficient because it is longer to describe an object with defects: the defect demands a specific description, added to the "type"; by definition, a defect is a gap relative to a canonic description; if there is no defect, we get the occurrence by simple apposition of a haecceity; nevertheless, the occurrence may also have positive features; where is the threshold beyond which, quantitatively, a difference relative to the type becomes interesting, positive (see Leech)? Small differences are "meaningless", except in the case of critical points (in particular, loss of continuity).

Some defects kill the type itself, the more easily if there are many types relative to a pixel: for instance, numbers represented on a 7-segment display, where an error on the middle bar changes 0 into 8.
If the code is minimal (n bits to code 2**n beings), an error on one bit falsifies the whole.
Conversely, some error combinations do not matter: in some cases, if I shift a whole line by one bit, the Hamming distance may be enormous, yet the meaning is unchanged.
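
A minimal sketch of these three cases in Python; the segment encodings and the bit strings are assumptions chosen for illustration, not part of the model:

# 7-segment display: 0 and 8 differ by a single bar.
SEG = {0: "1111110", 8: "1111111"}   # assumed segment order a..g, 1 = lit

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

print(hamming(SEG[0], SEG[8]))       # 1: one bar in error turns 0 into 8

# Minimal code: 3 bits name 2**3 beings; any one-bit error names another.
being = 0b101
corrupted = being ^ 0b100            # a single-bit error
print(corrupted != being)            # True: a different, equally valid being

# A whole line shifted by one bit: huge Hamming distance, same meaning.
line = "01" * 16
shifted = line[-1] + line[:-1]
print(hamming(line, shifted))        # 32: every position differs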

Truth, as absence of error (or forgery), is a form of integrity.

Clarity

In representational arts, clarity means the quality of conformity between the work and reality; or, better, between the work and the author's intentions.
Clarity: the KC not hidden by noise, negative or positive (silences); sharpness.

Clarity, with the absence of obscure parts, avoids negative descriptions, as well as high energy levels to get a description. Clarity may also be seen as a coincidence between the superior and the inferior level (the French garden). Obscurity could be taken as "difficult recognition", which can be expressed quantitatively in the time or computation power needed. In some way, a concrete being is beautiful when there is harmony between an idea and its material implementation.

Harmony

A rather simplistic matter of "proportion", ratios, symmetries. Very Greek. See the book on architecture.

Harmony, in the construction of beings, combines the "causes".

Harmony affords a shorter text:
. because implicit description is possible;
. because the relations between parts follow simple ratios, hence describable with few bits, whether the ratio holds between the different parts of the being, or between the being and the seer; in other words, there are redundancies in a harmonious being, which appears as the "natural" deployment of a structure. A circle is more harmonious than a twisted line.

Harmony means that the parts of the work, the patterns, the colours, conform to pleasant proportions:
Internal proportions of the being.
Proportions to the seer (and the seer's proportions).
Proportions to the author's capacities.

Semantic harmony internal to the being (e.g. no anachronism) and semantic harmony with the seer (what he wants or desires to see, and is able to appreciate).

Harmony: proportions both internal and relational (the artist and the seer).

We have first structural harmony, then metric harmony (pressure, temperature). Technically speaking, beauty appears at the tuning, setting, trimming time. Until then, the system has to work, the concept to be proven. It is only then that beauty finds its place, were it only in order to have the system properly running.

18.4. Resonance

resonance.jpg

The trilogy is favourable to resonance:
- integrity ensures that there is not too much "friction" preventing the resonance,
- clarity reinforces the correspondence;
- harmony is the correspondence by itself, opening the way to resonance.

Resonance would be a meta-passing over the gap between the work and the seer. The hole is not filled, but communication flows between both sides. A little like AC current through a capacitor, as opposed to DC.

Resonance between cognitive beings implies two beings, each one having its own clock, and reciprocal read/write relations. Then it needs:
- an energy source to maintain it; this source may be weak, if dissipation is weak;
- an appropriate "mechanical" device.

Art for art's sake is resonance for itself.

The basic self-resonance is the clock itself, at the heart of any system, allowing recursivity. Any resonance is autotelic (Lyotard, Enfants, 20). Resonance between humans, without reference to anything else: consensus (Lyotard, Enfants, 19).

A simple resonance between the will to create in the artist and the public which wants to recognize him as such. Fans make the star as much as the reverse.

Resonance may be triggered by very small beings. The sublime is a deep resonance (possibly from strong beings, Dantesque...). I can modify myself, control myself, to enter into resonance with another S. Or, on the contrary, I can modify the external S so that it resonates with me.

In Kaufman's integration process, there is perhaps an intermediary phase between an external disturbing being and the moment when it is so integrated that I no longer sense it. That is the "aesthetic" phase, during which it resonates. This also deals with the question of functional beauty (Dassault).

Resonance could use metrics, at least metaphorically, borrowed from the traditional mechanical models (spring-mass) or electronic ones (resistance/capacitance, etc.).
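
As a purely metaphorical illustration, a minimal Python sketch of the spring-mass model just mentioned: the steady-state response of a damped oscillator driven at various frequencies. All numerical values are arbitrary assumptions.

import math

m, k, c = 1.0, 1.0, 0.1               # mass, stiffness, damping (arbitrary)
w0 = math.sqrt(k / m)                 # natural frequency of the "being"

def amplitude(w, f0=1.0):
    # steady-state |X(w)| for m*x'' + c*x' + k*x = f0*sin(w*t)
    return f0 / math.sqrt((k - m * w * w) ** 2 + (c * w) ** 2)

for w in (0.5, 0.9, 1.0, 1.1, 2.0):
    print(f"drive {w:.1f} -> response {amplitude(w):.2f}")
# The response peaks near w0 = 1.0: the work "resonates" with a tuned seer.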

Then art could be seen as the computation of what will resonate, exactly as one builds a sound resonator with electronic components.
For a static being (a canvas), the work itself is still, and cannot resonate with itself. It is the seer who vibrates. The canvas is a sort of mirror (or think of a passive component in an electronic resonator).

What vibrates is the image the seer makes for himself of the work he looks at.
Within the seeing cycle, the image of the canvas keeps changing; there is an evolution, something new at each cycle. The canvas is progressively discovered.
If the resonance goes beyond some thresholds:
- drunkenness, and even death by excess of sensation;
- for mystics, ecstasies.

Resonance and heart beat.

Resonance plays not only between two beings, but within a whole set of beings, e.g. all the electronic components in an interference noise. Also the mob: fusional resonance with many.

Resonance may be complex, implying several resonators (piano strings, FFT), or the transformation of a simple resonance (basic oscillator, clock) by a strongly structured system (synthesizer). Several interaction loops may operate between resonating beings.

But it is not abnormal that a resonance triggers harmonics, if they find favourable conditions within one or several participants. To resonate: lighten oneself perhaps, but rather give oneself appropriate masses and springs, simplify/complexify (acquire culture to "understand"), augment the signal energy.
The seer, the auditor, matters. Without them, the author is in resonance only with the object, and may be narcissistic, anecdotic.
If a public enters into resonance, that proves that the author (and the work) have reached something universal.
External expression of emotion: see Breazeal, and all the work in computer graphics about behaviour.

18.5. Originality

Basic needs do not demand originality. That depends on individual psychology (cats/dogs). And, of course, something is original for the being which sees it for the first time.

Relate to generativity/genetics, and also reproducibility (Benjamin).

Originality will come out of the genetics, even if the "parents" are known. But the combination is poor if the generation is, for instance, a simple mix or juxtaposition (cadavre exquis, chimera, basic compositing).

The unpublished is different for a beginner and for an expert, who knows the (whole) existing stock of works. Local versus universal/absolute originality.

Resonance implies stability during some (hundreds of?) cycles. But, after a time, it erodes and vanishes. Even for humans. But, in some way, humans may create new resonances with the same external beings if they change themselves, augment their own richness. At the limit, the mystical person seems to resonate with the nada, because she has powerful resonance aptitudes inside herself. Asceticism is a search for resonances with my body, as deep as possible (hara, etc.).

Originality will not induce resonance if the semantic distance is too high.

Originality introduces a historicity, an order relation. Any new thing may be different from the others; possibly not larger, but necessarily so inside a narrowly defined family. But a new artist may be original if he chooses something different (sets of standards, neither better nor worse, but different; Michaud, Critères esthétiques et jugement de goût).

Not every resonance is pleasant (the Larsen effect, Parkinson's disease, parasitic oscillations in control systems). There must be novelty. See Triolet about automats.

At some time, one wants to stabilise. That may be done with frequency assignment (possibly even spectrum or other pertinent parameters) of the resonance.

New bits are (in part) the negation (see the n+1...):
- of nature by reason, formal languages, machinery;
- of reason by art, poetry (vindication of analogy and intuitive thinking modes), non-linguistic arts, the shout;
- of the existing by the unpublished;
- of inertia by recursion;
- of the individual will of control by market and democracy.

For Lyotard, art is also a negation of political speech (the avant-garde) by pure freedom. Duchamp is totally autonomous. Sartre's free act.

That leads us to associate the beautiful with structural richness. A being would be more beautiful if its structural richness is high. But the formula is too brutal.
Nevertheless, if we look for originality, we must do something different from all the existing beings, and different in a meaningful way.
Original_resonance.jpg

To do something else (see Michaud), the artist must know what already exists. That is true also for the public: the seer must know that existence too. And the mere fact
- of the existence of these anterior works, the patrimony, demands finding free places, sufficiently different, and, after a while, widening the space with supplementary bits;
- that implies, both for the artist and the seer, a cultural level, far from naivety. Art brut, child or insane art have their limits; visits to local painters' shows are eloquent.

"Jusqu'au moment où, heureusement, la norme de la mode, de la distinction, du snobisme, la norme comme norme vide prescrivant ce qu'il faut admirer au sein du groupe, nous devient si pensante ou indifférente que le goût doit faire retour, sous une autre forme, éminemment subjective de nouveau - un sentiment qu'il faudra, à son tour, normer. Ce qui ouvre le temps et l'espace d'un nouveau jeu de normes. Un de langage épuisé s'ouvre alors sur d'autres. Ni meilleurs, ni pires. Ni plus avancés, ni moins avancés. Simplement autres. Yves Michaud, Critères esthétiques et jugement de goût.

Originality can be detected by automats. Sacem, for instance, has software to detect plagiarism in music. General access to the Web will bring a time when the originality of an image can be tested against everything that is on the web... which is nearly everything. Such a challenge would be fanciful if Google and others were not on the net.

18.6. Generative power

18.6.1. Basics

That could be the best definition of beauty: a generative power, triggering at the same time emotion and new beings, for and into the seers.

Generation occurs all along the process, but especially at the core.

Beauty plays mainly on low-weight bits. In the most vital regions, deficiencies in integrity, clarity or harmony are lethal: no place for aesthetics. Beauty can emerge only in a free band where energy is sufficiently low, except for cases of sublimation (which probably could be modelled even in HD).

Beauty as the maximization of the functional mass/organic mass ratio. Or as the maximization of a description's efficiency.

We constantly go on increasing masses. To make a work of art is to increase the generative, evocative power.

18.6.3. From KC to GP

We can define a "generative power" as the inverse of Kolmogorov complexity (KC). Instead of the minimal text describing a given bitmap, it would be the maximum bitmap which could be generated from a given text/description. More generally, the heaviest being that can be generated from the being in question.
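
A hedged sketch of the idea: a deliberately short description (a one-dimensional cellular automaton, chosen here as a convenient assumption) that generates a being far heavier than itself. The ratio of output bits to description bits is a crude stand-in for generative power.

def rule90(width=256, steps=256):
    # Each generation XORs a cell's two neighbours: a few lines of
    # description deploy into a 256 x 256 bitmap (a Sierpinski pattern).
    row = [0] * width
    row[width // 2] = 1
    bitmap = [row]
    for _ in range(steps - 1):
        row = [row[i - 1] ^ row[(i + 1) % width] for i in range(width)]
        bitmap.append(row)
    return bitmap

bitmap = rule90()
bits_out = len(bitmap) * len(bitmap[0])   # weight of the generated being
print(bits_out)                           # 65536 bits from ~10 lines of text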

The most beautiful possible image. Admiration of simple minds for very minute works.

The KC of what the message may evoke, or the KC of the effect which is sought in the seer? If the spectator (possibly Roxame, or a human) is dense, powerful, an external message will have stronger effects (at least if it is considered meaningful by a seer who is not blasé):
- because Roxame has an open state of mind (she has worked to free herself),
- because, with her dense part, she knows how to process efficiently the input bit flow, in particular how to reduce it significantly,
- or because she can add dense to dense (extend the construction upward).

More formally: the strongest effect that may be obtained on a seer (a finite automaton) with a message of a given length. What if the automaton is recursive? It will elicit an infinite output message. But some ulterior seer may analyse the infinite message and make the recursive function explicit.

How to evaluate this maximal effect (a sketch follows the list):
- number of functions called,
- number of bits changed in E,
- processing time without repetition (importance here of the equivalence notion),
- output richness (output KC).
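
For the last measure, a common proxy for KC is compressed size; a minimal sketch, assuming zlib's deflate as the compressor:

import os
import zlib

def output_richness(output: bytes) -> int:
    # Compressed size is an upper bound on KC, here counted in bits.
    return len(zlib.compress(output, 9)) * 8

print(output_richness(b"abab" * 1000))    # repetitive output: low richness
print(output_richness(os.urandom(4000)))  # incompressible: high richness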

We can measure:
- effects on humans (psychology),
- meaning as words, hence a quantity of description, depending on the work and on the analysis capacity of the seer. See meaning.

If there were a purely orthogonal core, then we could add as many dimensions as we would like, and give them any kind of meaning, independently of all the others. But we have seen that the core is never purely orthogonal. Then art has to play around it, possibly using the idiosyncrasies of the core pattern/format.

18.7. Style

Style here as a case of originality. Style as a multiplicator. The generation is: subject * style: "a corner of nature seen through a temperament" (Zola).

Can we deal with that as a parameter, a "stylisation level", let us say between 0 and 100? (A sketch follows the list.)
- 0: Roxame outputs the original document as such; a landscape picture, for instance; totally realist, concrete, objective.
- 100: Roxame integrates nothing of the original document; totally abstract, subjective.
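
A hypothetical one-dimensional illustration (not Roxame's actual code): a stylisation level s shrinking the grey-level palette, so that s = 0 returns the document as such and s = 100 retains nothing of it.

def stylise(pixels, s):
    # s = 0: identity (256 grey levels); s = 100: a single flat level.
    levels = max(1, round(256 * (1 - s / 100.0)))
    step = 256 / levels
    return [int(int(p / step) * step) for p in pixels]

print(stylise([0, 64, 128, 255], 0))     # [0, 64, 128, 255]: realist
print(stylise([0, 64, 128, 255], 100))   # [0, 0, 0, 0]: fully abstract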

But something more multidimensional would be more appropriate.

Style demands some gap, with a possibility of choice, and some drift away from realism,
- either because the image is not basically utilitarian with respect to this reality; it is not an identity card photo, a historical record or a geographical map (note that a map is not an aerial photograph),
- or because the aim of the image implies by itself a transformation.

(A gap is always possible with "sacrifices".)

With Roxame: attempt a style quantification on LineTo. I have tried with step/hubris. I should check that on styles other than 0, with 0 hubris, we find back an equivalent of style 0. Actually, to get a line as straight as possible, there is always some effort.

With Roxame: then a good thing would be to go through a lower resolution and a poorer palette (done in 2007 with the image at difgen 100 000). The reduction phase should be managed as smartly as possible, possibly with an aim of meaning conservation/stressing. The enrichment must also add something meaningful.

In the traditional plastic arts, these choices (selection, enrichment) are defined by
- the available technologies (BW photo, watercolour, pigment set...),
- the artist's abilities.

Roxame, within her own freedom space, may have a wider determination, and moreover a more isotropic one.

Roxame: a script, plus the random parameters integrated in the process, should be the shortest description (hence the KC) of an image generated by Roxame. At least for some time, since, soon enough, new operations crush part of the former image, or even destroy it completely (blank page). In this case, the script can be cut from all the previous part (unless some data are kept in memory, such as segmentation or classification).

(We have here an interesting case of equivalence on text strings: a string beginning with a blank-page instruction (plus desegmentation and declassification) is equivalent to all the strings having anything else before that instruction.)
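
A minimal sketch of this equivalence, assuming a toy script language where "blank_page" also drops segmentation and classification: the shortest description is the suffix starting at the last blank page.

def minimal_script(script):
    # Everything before the last blank_page cannot affect the final image.
    if "blank_page" in script:
        last = len(script) - 1 - script[::-1].index("blank_page")
        return script[last:]
    return script

print(minimal_script(["draw_a", "blank_page", "draw_b", "draw_c"]))
# ['blank_page', 'draw_b', 'draw_c']: the prefix is equivalent to any other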

Classical typology of styles:
- figurative: photo;
- realism: adding non-idealistic features; the real must not be made more "beautiful" than it is; coarse, vulgar, violent features must be stressed;
- hyper-realism: photo with a resolution higher than the retina's, over-presentation of the being as it is or as it should be (reconstructed, in some way): factory models, theoretical design, sets of other views, historical data, etc.;
- impressionism: sticking not exactly to the real, but to the phenomenon;
- symbolism: the graphical being is a pretext for the use and combination of conventions and codes, aiming at the brain work of the seer.

There is some symmetry relating the abstract and the concrete ways.

Saturation.jpg

Hyper-realism, for instance, when it paints beings as they should be, brings into the work the artist's culture, possibly scientific (laws of physics, biology), or his experience of the seers' reactions to a canvas.

18.8. Aesthetics as the "final" value reference

"L'esthétique, envisagée comme science des structures, mérite de devenir le guide sûr des actions humaines quelles qu'elles soient... Dans ce cas, la théorie des ensembles serait à l'esthétique un peu ce qu'on été les mathématiques classiques à la physique" (Laborit, Biologie et structure)

Koestler writes "as if". Art is a bridge, similar to religion. One does as if; one paints the ideal, even if it cannot be actualized "for real".

Aesthetics is final for several reasons:
- basic needs are fulfilled (barring a world crisis), then from need to desire and pleasure;
- L is not computable;
- from "matter" to the formal.
Individually and globally.

The role of art is indeed to "not iterate" (Longo, Maurois, Triolet) when there is some gap for original generation and nothing dictates the way. Art is generative power for itself.

But how does creation play on the L of S? If it is appreciated by others. P is linked to pressure and temperature, with a sort of market managed by the GOS, or Google. And competition.

Or possibly a kind of internal geodesic for some kinds of beings, with wear when monotonous.

Varia

Aesthetic criteria: the wine tasting example

1. Binary
- objective: good/bad or right/wrong (according to the meal, the time of day, the budget...)
- subjective: I like it (or not)

2. Scale
very... a little... passable... horrible

3. Quantitative
- alcoholic degree, cost

4. Lengthier description
- the drinking process:
. look (luminosity, hue, viscosity (tears)),
. smell (1st and 2nd nose),
. drink (with phases: 1st shock, deployment, decay, bottom of the mouth),
. talk: share advice, tell tales around the wine.


19. Art

Art and beauty are different topics, even if we limit ourselves to the "liberal arts" and do not take the word in its ancient sense, which included technology. Art beings are artefacts by definition. But, as seen earlier, we include here the deliberate activities of living beings. Here also, let us try to de-anthropomorphize.

Art, properly speaking, has no other aim than the creation of beauty, in the sense defined in the preceding chapter, and pays particular attention to originality.

19.1. The necessary gap

Art needs a wide gap between constraints and resources. This gap is multifold.

Work_liberty2.jpg

The technical gap

Technically, that places the plastic arts (painting, sculpture) in an eminent role, with the framed canvas as the emblematic being of independent art. Music, and now all sorts of multimedia, need a little more integration into life, but less than design (from industrial and graphic design to fashion). Architecture has, in all ages, drawn an arch between all these worlds, and had to take into account the function of its constructs.

An important way for the artist to grant himself free space is to take liberties with the rules:
- Poetic licence is a well-studied case. It frees the poet from the normal rules of language, current vocabulary and syntax. Geoffrey Leech drills into that spot in his A Linguistic Guide to English Poetry (Longman, 1969).
- The representation constraint in painting loses importance after the advent of photography. In Le musée imaginaire, Malraux writes: "The subject must disappear because a new subject appears, which will reject all the others: the dominating presence of the painter himself. For Manet to be able to paint the Portrait of Clemenceau, he had to resolve to dare to be everything in it, and Clemenceau almost nothing."

In digital terms, one could roughly describe the gap as follows. Let the available bit space for the work be ABS (the bits may be pixels, or letters, or notes on a score...). The various constraints on the work (economical, semantical...) demand a minimal bit space MBS. Then the work can be done only if ABS >= MBS, and the artist's liberty is proportional to the difference.

If MBS = ABS, there is no gap for art; all the resources and all the engineering are spent on functionalities, commercial value, etc. Generally, that results in "cheap" and dull beings. This is of no consequence if the work is a hidden part inside a machine, for instance, but it is regrettable for housing, where human beings have to live deprived of beauty.

Nevertheless, in some cases, a particular kind of beauty may emerge from a perfect adaptation of the being to its function. That is perhaps because, even within the MBS, various layouts (let us say permutations) are still possible, and among those, aesthetic criteria may find their place. Were it only some kind of admiration for the cleverness of the engineer.

(Of course, such a coarse model must be taken as a mere sketch.)

Music

Music performance, as an example, consists, starting from the score, in dividing it according to its typographical peculiarities, then processing it in combination with the interpreter's own mood and genius (in particular, the way he tries to understand the composer's intentions), then multiplying it by the specificities of the instrument.

In music, temporality plays a sophisticated role, from the pitch frequency of each note up to the date and time of the performance, through bars, score parts, parts of a concert (with applause and pauses...). But, in principle, there are no loops (except perhaps the basically cyclical nature of the instrument).

The above description is not digital globally, but becomes more and more so. Any disk or CD or Internet music player does basically the same thing, of course without "interpretation" proper, though some efforts have been made in this direction (to be checked).

In music, the simplest digitizing consists in sampling an analog signal: acquiring its analog value at regular intervals (the sampling rate) and converting this value into several bits (8 or 16, for instance). That suffices for the main forms of processing.
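
A minimal sketch of this sampling, with an assumed 8 kHz rate and 8-bit quantization of a 440 Hz sine wave:

import math

RATE, BITS, FREQ = 8000, 8, 440.0     # assumed sampling rate and depth
levels = 2 ** BITS

samples = [
    int((math.sin(2 * math.pi * FREQ * n / RATE) + 1) / 2 * (levels - 1))
    for n in range(RATE)              # one second of signal
]
print(samples[:8])                    # each analog instant, now a few bits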

We can go further by aligning the signal on a series of predetermined frequencies (note scales), or of timbres, or by considering it as the expression of a typical instrument or set of instruments. That is what the MIDI standard does.
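
The alignment on the tempered scale can be sketched with the standard MIDI mapping, note = 69 + 12 * log2(f / 440 Hz):

import math

def to_midi_note(freq_hz):
    return round(69 + 12 * math.log2(freq_hz / 440.0))

print(to_midi_note(440.0))   # 69: A4
print(to_midi_note(447.0))   # 69: a stray pitch snaps to the nearest note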

Still further, we can "multiply" a musical theme by ("times") a selected composition structure. It is then possible to compose some new scores "automatically".
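
A toy reading of "theme times structure"; the representation (MIDI note numbers acted on by a list of transpositions) is an assumption for illustration:

theme = [60, 62, 64, 62]      # a four-note motif (MIDI numbers)
structure = [0, 5, 7, 0]      # a I-IV-V-I-like series of offsets

score = [note + shift for shift in structure for note in theme]
print(score)                  # the theme deployed through the structure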

19.2. Style in the gap

The first use of the gap is of course to apply aesthetic criteria (proportions, symmetries... structures explicitly shown or hidden) even when they contradict functional needs or habits. The easy way is to decorate the being on its non-functional parts. The dream is to find the ideal point where beauty and functionality converge. That may imply re-thinking the being from its most basic aims and components. In beings of art like a painting, deep aesthetic aims may demand sacrificing superficial ones, letting go of prettiness in order to reach beauty, if not the sublime.

Here, our description of DU beings provides a basis for a model of "style".

At the core of the creation process, the aims and constraints of the work to be done have been analyzed by the system. Then takes place the generative process, using the gap to mate these data with the aesthetic rules and, beyond them, with the personality of the artist and the "style" he is practising at that moment.

Some examples :

- In genomics, Christian Gautier, a scientist at CNRS, said (in an interview with Asti-Hebdo, http://www.ibisc.univ-evry.fr/~asti/Archives/Hebdo/h19.htm): "The common theme leading our research is the writing style specific to each genome. The possibility of style stems from the fact that the building of a given protein may be triggered by several different messages (codon sequences)."

- In Oscar Wilde's The Picture of Dorian Gray, the painter says he will not exhibit the canvas: "I have put too much of myself into it... every portrait that is painted with feeling is a portrait of the artist, not of the sitter. The sitter is merely the accident, the occasion."

- A major aspect of cubism (and particularly of Picasso) is to use the simplification of drawing and the available space to combine several views of the same being.
- We once heard a wine maker in Mercurey say that progress in the complexity of wine made it possible to offer a perfectly typical Mercurey with the brand's own style.
- Though they are more functional than artistic beings, geographical maps are strongly styled, and very different from "natural" aerial photographs. The cartographer selects only what is pertinent, sacrificing in particular local textures and natural colours. Then he can stress the pertinent objects (roads, buildings) and also add non-physical features such as administrative borders. He will even, sometimes, sacrifice some details or some geographical exactitude in order to place the toponymy. Hence the style of different map makers, were it only through character fonts.
- Hyper-realism is a way of using high-resolution images to go beyond normal photographic images and draw effects of style out of them.
- It is known in programming teams that, even in purely functional works (business applications, for instance), it is frequently possible to recognize who has written a given part (for instance, by the choice of variable names).
- Page setting and typographic character design are well known. In occidental languages, 6 bits are enough to identify the upper and lowercase letters, figures and punctuation marks. We would then have zero stylisation with six black and white rectangles, 3 columns of 2 rows. But this extreme solution is not really readable by humans. It is readable, but unpleasant, with a 7x5 matrix. Everything beyond that may be considered as "style". In software like Word, for Times at 12-point size, the matrix is about 20 pixels high and an average of ten wide. Then the margin is 200/35, roughly a factor of six. At an extreme, an illuminated letter in a Middle Ages manuscript may bring several megapixels, with a 100 000 margin. Of course, at such a level, the "style" concept is largely overpassed, and the letter may become a work in itself.
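
The margin arithmetic above, made explicit; the pixel counts are the text's own approximations:

matrices = {"minimal 3x2": 3 * 2, "dot matrix 7x5": 7 * 5,
            "Times 12pt (approx.)": 20 * 10}

for name, px in matrices.items():
    print(f"{name}: {px} px, {px / 6:.0f}x the 6-bit minimum")
# Times 12pt: 200 px; versus the readable 7x5 matrix, 200/35 is about 6.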

19.4. Machine as an artist

A new way opened progressively: to find novelty no longer in the spectacle of the world or in the exploration of the unconscious, but to transfer the creative power itself to various kinds of machines.

The idea found its way first in music, whose digital nature offered easier opportunities. The basic idea, that of automatic composition with the use of random draws, has found expressions since... The idea is present, for instance, in Jules Verne's novel The Day of an American Journalist in 2889: "...what a charm he found in the works of our greatest masters, based, as everybody knows, on a series of delicious harmonico-algebraic formulae!". It then progressed up to serial music.

In painting, the idea was formulated in the 1930's by Schillinger, with his project of a "graphomaton". Tinguely built some painting machines at the end of the 1950's, but in his derisive way. Then computers brought the practical means of transforming algorithms into pictures, and a lot has been done, first nearly as a side product of scientific computing (fluid simulation images, for instance), then in an explicitly artistic dynamics. Names such as Charles Csuri and Harold Cohen stand as landmarks on the road. In ... Hébert coined the word "algorist", and the movement is still active, with the creation of the association Les Algoristes in France at the end of 2006.

That raises a lot of artistic and philosophical problems. One especially: it tends to focus the interest of art creativity on the machine's creating process more than on the works themselves.

As long as art produces works for humans, artistic machines will have to work within the limits of human appreciative capabilities. Real progress will come when superior machine artists work for superior machine seers, able to resonate fully with their innovations.

19.5. Roxame and Primeval

Is there a "Roxame style" ? Rather a meta-style, due to the limits. Beacause here there is no place for a long reuse of a "manner" (Toffoli, Buffet, Vlaminck, Doutreleau...) since, once a strategy is well trimmed, it is possible to make as many works out of it as one wants, taking the variety not only from random but from external documents. And, when Roxame is stopped by some limit, one can tray to out pass it (e.g. if it is too stiff and skinny, fuzz functions may be added).

Roxame as an artist: the two criteria: resonance (with normal people) and originality (surprise to me).

Primeval: an experiment in genetics. The model was given by Bret.

19.9. Varia

Art as a decisional flow