
Generation
and generative beings

1. Basics

The general being/automaton IEO scheme is developed in three "phases": assimilation, core genetics and expression. The term "generative beings" is inspired by Lioret's works.

The automaton IEO model, which we use extensively in this work, can be developed so as to offer a richer presentation of the processes in DU. Biological concepts then come nearer, with of course some very important differences. Inputs and outputs take place in a flow, or multiple flows, of digital processes, in principle with no physical adherence (but for DR, of course), so that one could speak of a "flow of thoughts" (Bryson/Conan), if that were not too anthropomorphic, and anyway inexact (even for human beings, thought takes place only at the higher levels of operations).

analysis_synthesis.jpg

We shall consider that the basic process is, schematically, made of three phases: assimilation, core genetics and expression. When the process is simple, it is nearly sequential and symmetrical. For example, in an acoustic filter, a message router or a text translation by a human being, input beings are first stripped of their physical conditioning and transport formatting, dealt with according to the function of the process, then conditioned again and sent. Assimilation and expression may be completely disconnected in time but also in structure. For instance, the recording of a musical group will proceed through a series of sessions involving a few artists and editing specialists, and result in a master record, which may even have no physical existence outside a computer. Then begin the distribution processes, with a unique original record conditioned graphically and physically to reach and seduce as wide a public as possible, in some cases under several different forms. In the lifecycle of a mammal, and of course of a human being, learning processes take years or decades, and the output along the professional and social life will conform to very different patterns.

The three phases use "languages", in the rather general sense of structures affording assimilation and expression, and concentrating in the core what we shall call the "genetic" processing, in a way rather different from what is generally understood by genetic programming or bioinformatics. Except in particular cases, these processes generate gaps and losses.

Necessity of laws to build something of interest.

2. Assimilation

The global being and the sequence of its bits may be considered as two extremes, or, we could say, projections on coordinate axes. The analysis consists in dividing the inputs as far as necessary for the current process.

(Note that, if the process is related to the physical world, a first phase of "analysis" is the sensing and A/D (analog to digital) conversion. If the inputs come through a network, the first phase is to "unpack" the message).

A very simple case, for instance, is the sorting or selecting of beings according to a parameter explicitly written in a known place: files according to their name, extension, last modification date, etc. A less simple case is the same operation according to non-explicit features of the beings; then the feature has to be recognized. It may be easy (select all the low-key images) or very difficult, if not impossible, for machines and possibly also for humans (select the pictures representing a given person).
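As an illustration, here is a minimal Python sketch of the simple case, selection by explicit parameters (the directory and extension are of course hypothetical):

```python
# A minimal sketch: select files by an explicit parameter (extension),
# then sort them by another one (last modification date).
from pathlib import Path

def select_and_sort(directory: str, extension: str):
    """Keep only files with the given extension, newest first."""
    files = [p for p in Path(directory).iterdir() if p.suffix == extension]
    return sorted(files, key=lambda p: p.stat().st_mtime, reverse=True)

# Hypothetical usage:
# for f in select_and_sort("./images", ".jpg"):
#     print(f.name)
```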

A first level of analysis is implied by the type of the being, which generally demands some pre-processing before the operation proper; for instance, a .jpg image will be analyzed as a matrix of pixels for later processing. The analysis process at large, starting from a file in memory or a stream on a line, will begin with de-streaming and the preparation of an appropriate copy in central memory.

Data files are generally organized in records. The basic analysis process consists in providing the records, one by one, to the application program, which in turn will use its data structure to apply its routines.
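A minimal sketch of this record-by-record assimilation, assuming a simple CSV stream (the field names are invented for the example):

```python
# A minimal sketch of record-oriented analysis: the file is read as a stream of
# records, each handed in turn to the application routine.
import csv
import io

data = io.StringIO("name,amount\nalpha,10\nbeta,32\n")   # stands in for a real file

def application_routine(record: dict) -> int:
    """The application's own processing of one record."""
    return int(record["amount"]) * 2

for record in csv.DictReader(data):
    print(record["name"], application_routine(record))
```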

In text (or word) processing, starting from a document printed on paper, the basic digitization is a scanned image, a "bitmap" file, which is enough to display it on a screen or to print it.

If the document is actually textual, the digitization goes further if we replace the bitmap image with a series of digital codes (ASCII, Unicode). Not only is that a major form of data compression, but it also affords a lot of new processes, in the first rank the character-string kind of retrieval with which any Internet user is familiar. It also opens the way to syntactic and semantic searches.

The basic code may be completed with codes describing the typographical properties (font, size, colour) and possibly the layout.

Text analysis may stay at the surface, if the operations are limited to storage, display or transmission. With modern text processing, however, that may be not so naive, with HTML marks for instance. Deeper analysis will tend to "understand" the text, to access its semantics and get its "meaning". This form of analysis can be complete and generally error-free for formal texts, in particular for programs. For more or less "natural" language, the interpretation will not go without guesses and risks of misunderstanding, be the processor a machine or a human.

In biological processes, the basic analogue-to-digital conversions are operated at the nerve terminals, and the analysis process goes along the ramifications of the nervous system up to the decision sites, mainly the brain for the most advanced animals.

Is analysis a loss of information? Yes, in general (if the original being is not saved alongside, of course). And that is legitimate in this phase, since the processor aims mainly to select what is pertinent for it; the contrary would be counter-productive, if it resulted in artefacts giving a false representation of the being. Analysis is mainly a reduction to elements, losing part of the structural richness (or low-level details) of the being or of its image.

When we start from a photograph, or an external document, or data acquired through high-rate sensors, processing them, apparently, can only reduce their richness. Entropy wins. Even when we do a sharpening operation, for instance, we reduce the quantity of information given by the original image. In any case, in KC terms, for the system doing the processing. But not necessarily for an "external observer", since we must not forget that the work will be a product of the received data and of the whole culture of the artist, of which Roxame is a particular case. In the O = f(I,E) formula, O profits from E. And the multiplication may be creative.

Sometimes, the necessity of reducing the size of the information, or of interpreting it with some risk in order to conform to the core process and the processor capabilities, may impose a hard choice.

Let us note, however, that a description may be longer than the original. A typical case is the Naval Battle game. The ships are disposed on a 10x10 matrix; then 100 bits, or 13 characters of 8 bits, are sufficient to code all the positions. If the position of each ship is described by its beginning and its end, that makes 4 integers (the x,y of its two ends) for each of the six ships, and only 2 for each of the 4 submarines, i.e. 32 integers; coded on 8 bits each, that is 256 bits. A textual description ("The aircraft carrier is in B1 to B4...") would be longer still.
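The comparison can be sketched in Python; the fleet composition below is an assumption, chosen only to show the two encodings:

```python
# Grid encoding: one bit per cell of the 10x10 board.
# Coordinate encoding: begin/end (x, y) per ship, a single (x, y) per submarine.
def grid_bits(occupied_cells):
    """100 bits: 1 where a cell is occupied, 0 elsewhere."""
    return ["1" if (x, y) in occupied_cells else "0"
            for y in range(10) for x in range(10)]

def coordinate_bits(ships, submarines, bits_per_integer=8):
    """Each ship: begin and end coordinates (4 integers); each submarine: 2 integers."""
    n_integers = 4 * len(ships) + 2 * len(submarines)
    return n_integers * bits_per_integer

# Hypothetical fleet, for illustration only.
ships = [((1, 1), (1, 4)), ((3, 2), (5, 2))]        # begin/end pairs
submarines = [(7, 7), (0, 9), (4, 8), (9, 0)]
print(len(grid_bits({(1, 1), (1, 2)})))             # always 100
print(coordinate_bits(ships, submarines))            # grows with the fleet
```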

Assimilation may also proceed through neural networks (Bret).

Vision may be defined as the transformation of an image into a text which describes its content. For instance, if the input image is a geometrical figure, and if we consider that random variations or accidents (aliases) are only noise, the reduction work is finished when S has found the equation of the curve (type, values of the parameters).
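A minimal sketch of this reduction, assuming NumPy is available and that the curve type is a polynomial of known degree:

```python
# "Reduction to a curve equation": noisy points are reduced to a type
# (here, a polynomial of assumed degree) plus its parameter values.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = 2.0 * x + 0.5 + rng.normal(scale=0.05, size=x.size)   # a noisy straight line

degree = 1                                  # assumed curve type: straight line
params = np.polyfit(x, y, degree)           # least-squares fit of the parameters
print("type: degree", degree, "polynomial; parameters:", params)
# The 50 (x, y) pairs have been reduced to the type label plus two numbers.
```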

A basic idea, induced by the mere word "analysis", is that it reduces a being into simpler ones. As says the second precept of Descartes' Discourse on Method, "divide each of the difficulties I would examine into as many parts as possible". That is not always possible: in some cases, dividing would cut fundamental loops. That has been widely said in the 1970's, notably in the books of Edgar Morin. Even analysis may be creative, in particular when inputs are natural or rather fuzzy, and analysis is by itself a sort of construction. It may even be dangerous, if it generates deceptive artefacts.

Recognition operations are not properly divisions, unless a remainder can be defined. Globally, recognition gives a description: what must I keep in order to find the original again?

For the remainder, study the example of equivalence relations. There is the quotient set (the classes). What about the second factor? It depends on the way the set has been defined. If it is defined by a list, the remainder is the list completed with the indication of the class. The two defining sets may be globally heavier than the original.

Descartes makes a strong hypothesis when he states that cutting gives simpler issues. (Where to put that?)

In the general case (in practice, for nearly all beings except those specifically designed), this product will leave at least one of the "factors" so obtained which cannot be reduced beyond some "arbitrary" or "random" sequence of bits, which we could call a "prime". The length of the longest such prime could be called the "minimum divisor" of the being.

Such a prime may have its source in an analogy with another being, for instance a sample of some sound or a photographic picture.

If some approximation or error risk is allowed, this factorization can generally go further, be it
- by the partition of continuous (in bits) parts into several intervals (sampling, grading, quantizing)
- by pattern recognition, that is, the selection of a given pattern among a finite set of patterns; a classical mode of selection is to obtain a set of measures and qualities of the being to be recognized, and to choose the nearest pattern in the "feature space" of the pattern set (sketched below).
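A minimal sketch of that nearest-pattern selection, with made-up features and reference patterns:

```python
# Pattern recognition as "nearest pattern in feature space".
import math

patterns = {                       # hypothetical reference patterns (feature vectors)
    "dark image":  (0.2, 0.1),
    "light image": (0.8, 0.9),
}

def recognize(features):
    """Choose the pattern whose feature vector is nearest (Euclidean distance)."""
    return min(patterns, key=lambda name: math.dist(features, patterns[name]))

print(recognize((0.25, 0.2)))      # -> "dark image"
```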

For graphics and other beings, a factorization by size and pattern is generally possible, in general with loss (pixelization).

Reduction of a sophisticated text presentation to minimal text:
- written text: divide out imposition, page setting, fonts; beware, part of the marking may be meaningful
- oral text: divide out prosody and timing.

In analysis, criteria are factored out, and the search starts again on the "remainder", that is, on the works selected according to this criterion. For instance, the "colour" factor is used, and then one looks for patterns, or luminance. That would be the "canonical" search. But it is not always possible, and the search on each criterion may be independent: for instance, all the images of armchairs, then style differentiators.

The repetitive structure, or iteration, is major for most processes. Evidently so at the bit-processing level, since any digital being is basically a succession of bits. Classical iterative beings are, for instance, data files, neural streams or living tissues.

It may be useful to consider a being persistent in time as an iterative presence over the cycles. A practical example would be the successive video images of a fixed being. The digital representation of an integer, whatever the base, is an iteration over the successive powers of the base, each multiplied by its particular digit. The iteration loop is by itself a fascinating trait of active digital components, with its potential for infinity. And things become still more exciting... and tricky, when the loop includes a reference of the being to itself (recursion).

The replacement of long iterative beings by a type and a couple of addresses is a powerful analysis tool. In music, for example, a MIDI code can be interpreted as pointing to a repetitive structure (the sound form) placed between the start and stop addresses in the score. It also goes without saying that the complete digital representation of an integer would be an infinite chain of zeros to the left of the meaningful digits. This is not a purely theoretical view; it sometimes has to be dealt with explicitly in programming or hardware design (padding, zero split).
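A minimal sketch of this type-plus-addresses reduction, here as run-length encoding of a bit string:

```python
# Replace a long iterative being by (type, start address, stop address) triples.
def to_runs(bits: str):
    """Reduce '0001111100' to [('0', 0, 3), ('1', 3, 8), ('0', 8, 10)]."""
    runs, start = [], 0
    for i in range(1, len(bits) + 1):
        if i == len(bits) or bits[i] != bits[start]:
            runs.append((bits[start], start, i))
            start = i
    return runs

def from_runs(runs):
    """Rebuild the original iterative being from its type/address description."""
    return "".join(symbol * (stop - start) for symbol, start, stop in runs)

s = "0001111100"
assert from_runs(to_runs(s)) == s
print(to_runs(s))
```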

It is also one task of the sensory systems of living beings to replace the repetitive flows of stimuli by the presentation of stable beings. That function becomes highly complex when the body moves, and also when the sensors themselves move, the eyes for instance (see the books of Berthoz).

The type-address reduction may be considered as using an equivalence relation: the quotient set of the relation is the set of types, and the set of addresses defines the partition. All this implies that the agent knows what "is the same". Defining that involves features of the agent (what is important for it, what contributes more or less to its L), as well as intrinsic features of the beings. Sometimes, intermediate criteria may be used, as in compression/decompression algorithms, which use general kinds of "regularities" to reduce the volume of beings and let them be rebuilt (with or without loss).

Neural networks offer another kind of analysis, with bright aspects such as adaptability, even to the unpredicted.

Languages, natural or formal, are the widest and most powerful set of structures and components. A language has two basic factors: phrase structures and word definitions (codification, nomenclature, dictionary, the DNA set of bases, whose "definition" is transmitted by messenger RNA, but within the frame of a very complex epigenetic process). Languages use a basic iterative structure, the phrase (assertion, command, one could also say gene). The complexity lies inside the phrase content, but also in the fact that a phrase may refer to another or even to itself, thus creating possibly complex loops and recursions.
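A minimal sketch of these two factors, with a made-up grammar (phrase structures) and dictionary (word definitions):

```python
# A tiny phrase-structure grammar plus a dictionary, expanded recursively.
import random

rules = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}
dictionary = {
    "Det": ["the", "a"],
    "N":   ["being", "image", "process"],
    "V":   ["generates", "assimilates"],
}

def expand(symbol):
    """Expand a symbol with the phrase structures, then pick words in the dictionary."""
    if symbol in dictionary:
        return [random.choice(dictionary[symbol])]
    production = random.choice(rules[symbol])
    return [word for part in production for word in expand(part)]

print(" ".join(expand("S")))    # e.g. "the image assimilates a being"
```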

3. Generation

The generative phase (genetic, we could say metaphorically) is the creation of a new being by another one, or several ones, that put into this new creation something deep of themselves. The generative process may be:
- natural (from simple cell duplication to the full human process: male spermatozoon, female ovum, crossing, embryology, birth, mother and parent care, integration into society),
- artificial (from the logic gate to the full process described here); yet the mere automaton formula O = g(I,E) is already rather expressive (see the sketch below).
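Here is that formula as a minimal sketch, with an arbitrary g chosen only for illustration:

```python
# O = g(I, E): the output combines the input with the internal state E,
# and the state itself is reworked at each cycle.
class Automaton:
    def __init__(self, state: int = 0):
        self.E = state                     # internal state

    def step(self, I: int) -> int:
        O = (I + self.E) % 256             # O = g(I, E); g is an arbitrary example
        self.E = (self.E + O) % 256        # the cycle also transforms the state
        return O

a = Automaton(state=7)
print([a.step(i) for i in (1, 2, 3)])      # outputs depend on the inputs and on E
```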

In a clocked digital being, in some way everything is new at each cycle. The question is how deep this generation starts from. Self-replication, followed or not by crossover, is one extreme case. In the normal case, day-to-day life and machine operations, the new being will be only partly new, with the art being as another extreme case.

Once the assimilation is done, the being has "all the elements" to perform its operations, which can go from a simple logical AND to image compositing and, at the deepest, to a complete modification of the elements. But here comes the genetics: not only can the being combine several beings or elements acquired from the input being, but it can combine them with its own internal data.

A basic case is the filter applied to a signal, which combines each acquired value with one or several previous values (filters of more or less "high" order).
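A minimal sketch of such a filter, here a first-order exponential smoother (the coefficient is arbitrary):

```python
# Each acquired value is combined with the previous output, the filter's own
# internal datum: y[n] = alpha * x[n] + (1 - alpha) * y[n-1].
def low_pass(samples, alpha=0.3):
    output, previous = [], 0.0
    for x in samples:
        previous = alpha * x + (1 - alpha) * previous
        output.append(previous)
    return output

print(low_pass([0, 10, 10, 10, 0, 0]))    # the jumps are smoothed out
```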

At high levels, or depths, of processing, the being will apply all its force to the genesis of a new being. Roxame, for instance, may apply its whole set of algorithms to transform an analyzed input image. Then, as was said of a canvas by ...., it will "multiply a spot of nature by a temperament".
Or, in an information management system, each client entry can be dealt with using all the synthetic strategic computations done on the preceding day, so that the corporation maximizes its yield (Le triangle stratégique, Tardieu).

Another way is to "add a bit", along the line of "Every thought is first an impious negation, a dissolving analysis" (Andler 1, p. 55). Every art as well.

What is "crossed" ?
1. nature: genetic code + mother environment + family (which is also a crossing, and not so easy !) then society at large
2. artificial. forms. products and residues.

The genetic process proper is the generation of "products" from the parsed inputs and the internal resources. Product has here a very general meaning, as in mathematics, for instance. Two kinds of structures are implied in the process: components and assembly structures (the difference, however, is not always sharp, any more than between programs and data).

Components may be assembled, with a very temporary structure, to give new components; a sketch of the last example follows the list. For example:
- the logic combination of two Boolean values
- the arithmetical operations on numbers
- the superposition (subtractive mixing) of two images
- the crossover of genes in life reproduction.
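A minimal sketch of that last case, single-point crossover of two "genomes" coded as bit strings:

```python
# Exchange the tails of two equal-length bit strings at a random cut point.
import random

def crossover(parent_a: str, parent_b: str):
    assert len(parent_a) == len(parent_b)
    cut = random.randrange(1, len(parent_a))
    return (parent_a[:cut] + parent_b[cut:],
            parent_b[:cut] + parent_a[cut:])

print(crossover("00000000", "11111111"))   # e.g. ('00011111', '11100000')
```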

Structures may be assembled with a minimum of components:
- the successive application of formal-language rules before using any terminal word
- a recursion process on abstract forms (?)

Structures and components combine to produce more components or more complex components.
More components:
- repetitive generation of a being
- repetitive emission of a cenesthetic signal
- concatenation of strings
More complex components:
- composition of a text phrase
- successive fractal operations on an image
- the embryological process.

We consider as generative mostly the deepest part of these operations. When a "new being" has been generated, a series of operations will be executed, somewhat automatically and without "creativity", to give the being its final and deliverable form. That is what we call the expression phase.

Somehow, generation is a way to create energy from nothing, or from mere matter consumption (and less and less of it, Moore's-law style). We are here at the heart of the mystery, and of taboos for that very reason!

4. Self reproduction = life

The reproduction of a new being identical to its parent, or combining the traits of a couple of parents, is the basis and (one among others) the definition of life. It is more exceptional in artefacts, and negatively viewed, since it is not easily controllable; hence the only current self-reproduction of artefacts is the virus, a plague. In any case, a direct function like O = E for an automaton or any mechanical device is problematic. In fact, self-reproduction is never a purely digital process: the non-digital part of the cell is essential, and if natural viruses are nearly pure code, they cannot reproduce without a host cell.

The filiation process has some variants:
- generation of the same: cloning,
- modification of the core code
. by central code combination (classical genetics)
. by "deliberate" modification and engineering.

There can be something like a father/mother structure, when one S brings input to another S', which performs the assimilation and the genetic crossing plus the expression process (a sort of pregnancy...). That may also help to understand the role of "matter" in the generative process (generalizing the scheme of human embryology).

5. Decision, a generative process

decision.JPG

Decision is a kind of generative combination. Basically, it is binary; it is an alternative in the strict acceptation of the word. You are accepted or not. You go right or left. Sometimes there are more than two ways (but not more than 7 or 8; beyond that, the process will be layered, with first the selection of a "short list")...

"True decisions", so to speak, are made in the core of S; secondary ones are the matter of subsystems and other parts. It is a kind of genetics between what remains of the input and the motivations and finalities of S, its view of its L in the particular environment of this decision. In fact, all along the assimilation process, a lot of (more or less implicit) decisions are made; some may even have been taken in the very design of the decision process itself. The major ones take place in the core, at the end of the concentrating assimilation, at the origin of the widening expression, at the deepest point of the process.

A radical and somehow minimalist kind of decision would be one where the result of the analysis is reduced to one bit, and the genetics is another bit by which this result is multiplied.

At a basic level, one may consider that the simple conditional instruction "IF... THEN" is the decision atom (as is the logic gate in physical matter).

The decision may be quantitative, as the search for an optimum, or follow the qualitative/quantitative "simplex" scheme of operational research. Here, the decision process may appear automatic: the system chooses the optimal solution; but its real part in the process lay in the definition of the computation. It can also be qualitative, more deductive, operating on scales and not on quantitative parameters.

For some, decision is a human privilege, and the word should not be used for automated systems. At most, computers may help humans to decide (computer-aided decision), or form systems complementary to the large operational systems of large corporations. We shall not discuss here the problem of the human exception (see the "Mankind" chapter).

We shall use the word here in the case where there is uncertainty; in particular, for a construct, when the comparative effects of the alternatives on its L cannot be completely computed, due to lack of knowledge or, more generally, to digital relativity. Then the choice implies risk... and possibly creativity. This lack of criteria for central choices demands an unpredictable choice, hence something new, at least a new yes or no, a new bit.

It is by their succession, their progressive accumulation, that decisions are the bricks of constructive processes. We can read the first chapter of the Bible's Genesis in this way: in such a perception of origins, the first decisions have a simplistic look; there is an above and a below of "the waters". Progressively, matter, marked by decisions, is more and more organized; partitions become finer and more delicate. The miracle, so to speak, is that these castles of cards, so improbable, gain an incredible stability (see Chauvet).

The process takes place in a decision space: a set of alternatives. There are choices in the formal definition of its perimeter; inside, it is constrained by continuous or textual functions, valued by weighting factors.
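A minimal sketch of such a space, with made-up alternatives, criteria and weighting factors:

```python
# Choose, in a small decision space, the alternative with the best weighted score.
alternatives = {
    "go right": {"speed": 0.9, "safety": 0.4},
    "go left":  {"speed": 0.5, "safety": 0.8},
}
weights = {"speed": 0.3, "safety": 0.7}     # how much S values each criterion

def decide(alternatives, weights):
    def score(name):
        return sum(weights[c] * v for c, v in alternatives[name].items())
    return max(alternatives, key=score)

print(decide(alternatives, weights))        # -> "go left" with these weights
```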

If we have precisely the right number of criteria, we reach determinism. The being is no longer interesting, since it is known in advance. Nevertheless, it could still be an innovation, if this computation of the optimum had never been done before. In some way, it is the arrival at classicism, preceded by enthusiasm and followed by boredom.

To have a real choice, an interesting alternative, we must have too many criteria (a Cornelian choice) or too few (uncertainty). The unpredictable offset from the canon is part of the aesthetic system. In a handmade lace, small defects let the lace-maker show through behind the work; that may be simulated with random errors in mechanical lace-making.

In many cases, there are too many or too few criteria, and then:
- no solution responds to all the criteria, and S must take the least bad one, more or less implicitly softening part of the requirements;
- several solutions, or a whole space of solutions, respond to the criteria, and S must either draw at random or use other criteria (possibly fuzzy ones, like "feeling", taste, or "nose").

Decision thus seems, at first sight, to reduce uncertainty in order to let action go. But on the contrary, it is creative.
It reduces uncertainty in the see/evaluate/act cycle. But, if the action space has some persistence, then a decision "writes". Hence there are two spaces:
- a possibility space (affordances), which we reduce in order to decide optimally, a space to be explored, where we trace a road,
- a space in creation, which we define step after step, which we design for later actions and decisions, for our successors: persona building, the decisions of the youngster, of the adult, of the aged person.

6. Expression

The expression (synthesis, amplification) is a process rather symmetrical to the assimilation. It consists in taking elements coming from the analysis and from the generative process, and assembling them into new beings, sometimes a simple modification of the inputs, in other cases without a direct link to the inputs (at least, to the inputs of this cycle).

It starts from the result of the genetic process, which can then be called an "intention". One can also say that this phase produces representations, for the outside world, of the beings internal to the being.

The expression is the combination of the inner nature of the new being with the nature of the environment.

In some cases, the digitization of numbers may be considered as an external presentation of formal beings. For instance, 1.4142135 may be taken as a representation of sqrt(2).

A key point: the synthesis of a large number of outputs. Connect that to the "aura of the original" issue (Walter Benjamin). Let us revisit the consequences of industrial reproducibility. Aura and original: when matter matters (but that does not prevent making sculptures industrially).


(the notations in the lines below do not conform to our other notations)

E = A + R
if R << A: a non-memory, reactive, oblivious, community system
if A << R: a passéist one

 

I must introduce a "generation rate" G, with new information, or prospective information, created by the processing of the hazards A, R.

G is a function of: processing capacity, random generation capacity.

One may describe G, the Gn processor, as a particular type of data, a program, assuming that there is a general/neutral processing capacity.
G: general models, forecasting, prospective, augmented reality formalism.
Find a general structural model for basic cells (Markov chains, Minsky automata?).

Augmented reality
- with other images
- with models
- with links.

We place:
- diverse links showing that place (zoom on the most basic parts)
- historical travelling and possibly future projection


Historical trends, laws: the E, A, R rates have counterparts in reserves (determined by technology and system analysis).

Computing is an enormous Meccano with many pieces, each one in unlimited supply.

Then, the construction of representations may be described generally as a division (or analysis), which demands an appropriate analyser (sometimes a "parser"). Symmetrically, the generation of a being similar to B will be an expression, using various kinds of generators.

The expression will not conform totally to the intention. The new being is, at this stage, a sort of embryo. The baby is the message.

7. Repetition and concatenation

A major generative structure is duplication, i.e. the application of an iterative scheme to a given basic being, something like convolving a signal with a Dirac comb in time. A long sequence of concatenated identical strings (possibly the same bit) may be represented by that string times the number of repetitions.

A more general case is the concatenation of words taken from a dictionary, in general chosen in accordance with a set of syntactic and semantic rules.

Possibly : losses in expression : contradiction, destructive actions

Symmetries etc.

Some representations can result from another one through a symmetry or other geometrical operation, or a filter.

A lot of such operations can be listed and, for a given S (Roxame, for example), constitute a generation system.

The full meaning of the work comes from a combination of the elementary meanings of the elements (words, elementary forms), of the structures, and of the way they convey the meaning intended by the builder.

(And don’t forget that dictionaries as well as grammars are built with bits).

Multidimensional deployment

De-streaming
Theorem of losses during generation.

Load and distance

"Loading" is part of the expression process. The product is formatted and completed by a wrapping adapted to the way it has to go from the emitter to the receiver(s).

The greater the distance between emitter and receiver, the heavier the load (envelope) necessary to carry the message across that distance.

This load consists of multiple complements, whose nature is as diverse as the kinds of distance:
- parity bits, check redundancies, synchronization bits (sketched below)
- addresses, packet conditioning in networks
- physical conditioning (envelopes, postal bags, telecommunication modulation, amplification)
- typographical loading (fonts, graphics, colour)
- legal conditioning (mandatory mentions on commercial correspondence, on any product)
- cosmetic conditioning (politeness, high style, captatio benevolentiae)
- voice prosody and gestures for oral and visual communication
- brute force (with enemies), ultima ratio regum.
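A minimal sketch of the first item, an even-parity bit added to each byte so that the receiver can detect a single flipped bit:

```python
# Add a parity bit to a byte, and check it on reception.
def add_parity(byte: int) -> int:
    """Return a 9-bit frame: the byte followed by an even-parity bit."""
    parity = bin(byte).count("1") % 2
    return (byte << 1) | parity

def check_parity(frame: int) -> bool:
    """True if the received 9-bit frame still has even parity."""
    return bin(frame).count("1") % 2 == 0

frame = add_parity(0b10110010)
print(check_parity(frame))           # True: nothing was corrupted
print(check_parity(frame ^ 0b100))   # False: one flipped bit is detected
```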

Note: loading increases masses, hence distances. When we come near the DR (digital relativity) limits, it becomes impossible to compensate distance by loading.

Thesis: or, we could say, DR is a consequence of the load/distance principle.

In intuitive, and sometimes quantitative (Shannon), forms, telematics design makes constant appeal to this relation between distance and message loading.
Consequences:
- at the emitter level, progressive loading devices, from the original sensor or authority up to the channel; these devices may be more or less specialized automata (modulators, dedicated chips, processors and software) or persons (editors... assistants);
- at the receiver level, more or less symmetrical devices;
- at the channel level, band-pass enhancement, protection, adaptation, transcoding... in some way, reducing the apparent (practical) distance.

8. Dual view of processing from the message standpoint

Here, in general, we describe DU operations mainly on the basis of beings, with the IEO scheme, completed with the assimilation/genetics/expression process. We could try the dual view, putting the messages at the centre: expression, transport (which may be considered as a kind of process, possibly rather complex, with different channels and steps, intermediate storage, etc.) and assimilation.

One could probably also do the same for storage in memory.

Is it meaningful? Perhaps the message approach would:
- be more connectionist (neural nets, etc.)
- give a better model of the "thought" flow.

9. Kinds of structures and components

Let us try to list the main kinds of "languages", structures and components that will be used. Look to generative grammar, so to speak.

- Let us at the start push aside the physical components. Normal digital processing begins with the appropriate conversions in energy level and digitization (analogue-to-digital conversion), and ends with the symmetrical conversion and energy-level adaptation (generally an amplification, since digital devices operate at low consumption levels). The A/D conversion implies a sampling which must follow the Nyquist criterion. In some cases, note Bailly and Longo, a proper conservation of critical properties may be impossible. (It would probably be possible to devise digitizing processes other than classical sampling, giving directly formal descriptions that respect these critical apices.)
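A minimal sketch of that A/D conversion, sampling a signal and quantizing each sample on a fixed number of bits (the signal and the rates are chosen arbitrarily):

```python
# Sample signal(t) 'rate' times per second and quantize each sample on 'bits' bits.
import math

def sample_and_quantize(signal, duration, rate, bits):
    levels = 2 ** bits
    samples = []
    for n in range(int(duration * rate)):
        value = signal(n / rate)                       # sampling
        code = round((value + 1) / 2 * (levels - 1))   # quantizing in [-1, 1]
        samples.append(code)
    return samples

# A 5 Hz sine sampled at 50 Hz (well above the 10 Hz Nyquist rate), on 4 bits.
print(sample_and_quantize(lambda t: math.sin(2 * math.pi * 5 * t), 0.2, 50, 4))
```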

The assimilation and excretion of matter by living beings is not digital, but can partly be looked at in equivalent terms. For instance, the assimilation of food starts with a deconstruction of the original aliment, followed by the appropriate transformations for it to be totally integrated into the body.

- The type-address couple is also a major structure, and in any case the minimum necessary to access and process a being: put or send that there.

The address is the minimal handle which gives access to the being. Frequently, it is given in a relative form, or by a name (identifier). We use URLs and email addresses every day, which are, where necessary (in routers), converted into numeric codes. Files in directories are typical of this kind of couple. Inside living bodies, the addressing is supported by the neural or enzymatic systems. In geometry (at least in simple cases), the type of a curve is represented by the structure of the algebraic form, and the address in space is given by the parameters.

As for the type, it is at minimum "bit string". But without a type, processing will be limited. Sometimes, the type may be deduced from the string itself, by noting regularities or by recognizing discriminative patterns or strings. Frequently, large beings incorporate a header which indicates the type and gives a set of parameters for using the being.
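A minimal sketch of deducing the type from the string itself, by recognizing a few well-known header ("magic") bytes:

```python
# Guess a type from the first bytes of a being; fall back to "bit string".
MAGIC = {
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"\xff\xd8\xff":      "JPEG image",
    b"%PDF":              "PDF document",
}

def guess_type(data: bytes) -> str:
    for signature, name in MAGIC.items():
        if data.startswith(signature):
            return name
    return "bit string"

print(guess_type(b"%PDF-1.7 ..."))      # -> "PDF document"
print(guess_type(b"\x00\x01\x02"))      # -> "bit string"
```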

Compression algorithms imply equivalence relations tied to the nature of the world and to the specific traits of the viewer's (human, in particular) perception system. The deeper they reach into reality, the more exciting they are. In some way, finding in a bitmap the singular operators that let it be compressed without loss is exactly understanding what is interesting in it, since that lets us suppress what does not matter.

10. Assimilation/expression ratio

A mere quantitative comparison between inputs and outputs suggests a classification of processes (see the sketch below).
If there is an output of any volume for one bit of input, this bit is a trigger.
If there are more output bits than input bits, some new bits are created one way or another, for instance by interpolation, or of course by any more complex assimilation/expression couple.
On the diagonal (equality of flows), the process could be a simple transmission, or a permutation, or a combination of reduction and amplification.
If the output is smaller, it may be a summary, a simplified bitmap.
If just one bit results from an input of any volume, it is just an elementary evaluation.
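This classification can be sketched directly from the two sizes (the thresholds and labels are, of course, simplifications):

```python
# Classify a process by comparing its input and output volumes in bits.
def classify(input_bits: int, output_bits: int) -> str:
    if input_bits == 1 and output_bits > 1:
        return "trigger"
    if output_bits == 1 and input_bits > 1:
        return "elementary evaluation"
    if output_bits > input_bits:
        return "creation of new bits (e.g. interpolation)"
    if output_bits < input_bits:
        return "summary / reduction"
    return "transmission, permutation, or balanced reduction/amplification"

print(classify(1, 10_000))      # trigger
print(classify(10_000, 1))      # elementary evaluation
print(classify(800, 800))       # on the diagonal
```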

creation_synthesis.jpg

See below Styles.

11. Architectures

Our basic scheme is rather "linear", with just the three processes operating one after the other. More complex processing structures must be looked at:
- "centred", with only one core but several assimilation and/or expression channels; if it is really centred, topologically, it may be thought of as the basic one, if we group the inputs and the outputs; in the same way, in a series of processes where, but for the extremes, the outputs of one processor are aimed exclusively at another processor which itself has no other inputs, the chain may be reduced to one processor;

- parallel; if two processes are strictly parallel, with separate inputs, core processing and outputs, then they may be considered as separate processors;

- coordinated, with some master processor driving the processes without really interfering; a sort of operating system;

- layered, with one central driving process, as in robotics, possibly with "hybrid" variants, for instance some inputs given at the same time to several levels; cf. the ISO/OSI communication scheme.

This theme has been dealt with in an abundant literature.

12. Processing depth

Intuitively, the concept of processing depth is rather simple. The mere transmission of a being from input to output may be considered as depth 0. The depth is maximal:
- when the input being has been totally changed by the process, when it is impossible to make any bit-to-bit mapping between the input and the output;
- when the totality of the (processor) being has taken part in the process, and has possibly been changed as well, beyond normal wear.

A logic gate, for instance, always operates at full depth: it changes totally (or not at all) an elementary pair of bits, and the totality of its function is involved in the operation. At the other extreme, the mere change of a file name in a computer does not change the content of the file, and makes use of only a very small part of the computer's capabilities.

Somehow, in some measure, the analysis goes as "deep", as near as possible to a string of independent bits, in order to get the maximum effect of the law r^n >> r·n (then we are no longer analogue).

For instance, starting from a classical HTML-coded text, we could:
- strongly reduce the "normal" text, replacing it with a sequence of calls to pre-stored paragraphs;
- suppress any residual textual part by coding every word as a call through an appropriate hyperlink;
- de-linearize with a system of multipliers;
- use references to type structures (commercial letter, file, DTD, Edifact message) in order to reject even the linear aspect of an HTML text built from traditional text.

That would let us, at a lower level, replace the HTML anchors themselves, very "analogue" in their graphic presentation, with still more digital codes. That would imply the availability of a full dictionary, for all words. Ideally, we should reach a point where even the analogue succession of bits is replaced by a sort of Lisp-type coding. We shall say more in the "depth" part below.

Let us examine separately the being and the processor standpoints.

Processing depth in the processed being

A bit-per-bit transformation of a being may change it totally, but the mapping between pre- and post-processing states remains possible. In some way, the semantic distance on each bit may be total, but there is no structural distance: like, for instance, a negative film and the positive print obtained from it.

 A program is not changed (0 depth) when interpreted. Nearly totally changed (100% depth) when compiled.

If a being is decomposed into a sequence of similar parts, then processed one by one, the depth in the being is not greater than the length of one part. Examples:
- processing of a signal, one sampled value at a time; the depth will be greater for filters of higher order, and total for a Fourier transform;
- batch processing of a lot of unit records.

Processing depth in the processor

Total implication of a human being giving its life for a cause.

If a being is dealt with bit per bit with no memorization, the depth in the processor is minimal (1).

If a being is dealt with as a sequence of parts, the depth in the processor is, as above, the length of one or several elements. But possibly the processor will first have to store the structure, and may also, somewhat in parallel, do some global computing, even if only counting the unit records.

If the totality of the being is first stored and internally represented, with a processing taking its totality into account, the processor depth is the ratio between the size of the being and the total processor size. (...)

An absolute depth would be something like a human being going as far as giving its life. Or even the "petite mort" of sexual intercourse. The intensity of replication; core replication in life reproduction.

At the other extreme, a computer never commits the totality of its content to an operation. A system has a classical distribution between fixed functions, versions, active moves, unstable regions, noise.
processor_layers.jpg

It could then be analyzed in layers, for hardware (CPU, peripherals) as well as for software (BIOS, OS, application programs, etc.), and more generally in relation with the different kinds of architectures (layers, centralization, etc.).

By comparison with this classical distribution, we should be able to qualify the parts of the system, from very difficult to modify (hard) to easily modifiable (soft), if not unstable. We could look here for the role of:
- dematerialization (localisation, re-wiring)
- virtualization (memory, addressing, classes)
- digitization (reproducibility, logic/data mix).

Relations between the two "dimensions" of depth

We could propose a model such as the following figure, with Do as the distance between input and output, and Dp the proportional depth into the processor.
The ratio H/D could qualify, for instance, a kind of process/processor. Note for instance that a deep Dp is useless if Do remains low...

yield13.jpg

H is the depth.
The quantity of action is k*D*H


The surface, let us say Do·Dp/2, would give a measure of the processing quantity implied by the processing of one element. We could look for a relation with measures in ops, for instance by multiplying by the flow rates (an average, or some other combination of the input/output rates).

Depth Limits

The core cannot be reduced to one bit, still less to a Leibnizian monad. There is always some surface, some minimal loop.

Special attention should be paid to "total self-reproduction", as in living beings, and more thought should be given to the fact that, in artefacts, self-reproduction is so absolutely forbidden.

A form of total depth would be the transformation of the being into a set of totally independent, orthogonal bits. These bits could then be processed one by one, or combined with those of a similar being, and the result used to synthesize the new being. That is nearly the case of sexual reproduction, except for the facts that:
- the genome does not operate bit per bit, but gene by gene (a gene being a sequence of base pairs, each roughly equivalent to two bits);
- life reproduction implies not only the genome but the totality of the cell.

Limits of parsing
- co-linear with the processed beings: minimal sequence of bits; the being/flow cannot be totally "de-streamed"
- transversal: minimal operations
- non-recognition, residue.

Here we have to fight against a view, or a dream, that we cannot totally exclude from our minds: the idea that there is somehow and somewhere an ultimate nature of things which would, at the extreme, be beyond words. The Leibnizian monad is a resolute conceptualization of this idea. "Dualist" thinking can elaborate on it. But it is beyond the limits of our study here, where digital relativity or digital quanta impose limits on the parsing, and a minimal size on the "core" of a process as well as of a processor (Shaeffer, p. 45).

Depth is also hierarchical: levels in a corporate structure, escalation in maintenance and repair operations.


13. Recursive functions evolution

By text extension:
- maths-type text
- programming text (given that it is possible to commute with maths texts); random generation then evaluation loops
- well-formed expressions
- original (not reducible to known assertions)
- interesting
- more interesting than existing data/assertions