
Measurements and values

1. Introduction

"When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind" Lord Kelvin.

Several parameters can be singled out as able to characterize digital beings, in order to study their laws of development and communication. That could lead to a sort of thermodynamics. Among these:
- mass; it could be purely the number of bits owned by a being; for a one-dimensional string, supposed without "holes", mass is equivalent to the difference between its beginning and end addresses; in more than one dimension, and if we do not restrict beings to be full rectangles, then the actual mass will be smaller, possibly much smaller, than the surface of the bounding rectangle;
- distance; there are differing definitions, or levels of definition, from basic ones (difference of start addresses in a one-dimensional digital universe) up to meaning/semantic distances; volumes;
- density and load;
- pressure and attraction; a good approximation of pressure would be the Google ranking;
- temperature; a good approximation would be the being's frequency of use;
- energy; that is a classic in information theory;
- orthogonality; this point refers to groups of parameters; they are orthogonal if they are independent; a string of bits has a maximal informational value if each one of its bits has equiprobable values, and in particular if they do not depend on each other; orthogonality is also tied to the notions of dimension and space metrics;
- yield.

That opens the way to the infinity issue.

Some general laws seem to be at hand, for instance the laws of "right distance" and of "optimal load according to the distance".

This part gives a sort of underground basis for the digital universes. It proposes ideas for a theory rather than a theory properly speaking.

2. Generalities

Assigning a measurement or value to a being means computing a number, or a "qualitative" parameter,
- using directly available data, like its address, bit length, type, latest known update (I transpose the parameters of an ordinary operating system, but it seems pertinent here), in some cases loading the being itself,
- computing a value from the digital matter (e.g. mass, basic complexity).

Measurements may be binary : value of one bit, good/bad.
If they are taken in a small set, or scale, one will say they are "qualitative" (but a richer meaning is looked for below).
If they bring more than 4 or 5 bits, they will generally be considered as "quantitative", the number being an integer or a real number (with the restrictions on this kind of number in any finite digital system).

The data so obtained will be properly called "values" for the inquiring system if they have an impact on its L. This impact may be evaluated directly, related to a particular need of S, or by reference to other evaluating systems : standards and norms, market, expert advice.

We will take "value" as an evaluation of the being contribution to the L of another being. Possibly a very general and abstract one... but we cannot have something like an "absolute" or "true" value of an being by reference to GOS.


2.1. Quantity and quality

The opposition between quantity and quality is a double issue : type of measurement and orthogonal dimensions. We are near the to be / to have opposition.

Quality and quantity are not so different for an automaton, if it is not sophisticated, because it will process both through representations of a few bits. But a human being approaches quality and quantity in really different ways. Quantitative evokes some meter, some external evaluation process, whereas qualitative calls for touch and feel, if not hearing, smell and taste.

Quality refers to a scale, with few levels, fuzzy limits, and an order relation. A set of adverbs.
Quality is evaluated according to a rather specific (if not personal) process, whereas quantity normally refers to a rather direct measurement process or even device.
Classical sensors are binary or one-dimensional. Vision, generalized, will produce more "qualitative" evaluations, the kind of things we have started to explore with the "critics" parts of Roxame.
Some logical operations (inferences) are possible.
The maximum is around three bits (eight levels, in correspondence with Miller's law). Beyond that, some sub-scaling is possible (weakly, rather, a little, typical, much, very much, extreme...). Quality may induce equivalence relations, and a sorting of elements.
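As a sketch of such a qualitative scale, the binning below maps a quantitative measurement onto eight levels (three bits); the level names and the uniform binning are illustrative assumptions, not part of the model.

```python
# Illustrative eight-level (three-bit) qualitative scale; the names
# are assumptions, echoing the sub-scaling adverbs mentioned above.
LEVELS = ["extreme-", "very low", "low", "rather low",
          "rather high", "high", "very high", "extreme+"]

def qualify(value, lo, hi):
    """Bin a quantitative value from [lo, hi] into one of 8 levels."""
    if hi <= lo:
        raise ValueError("empty scale")
    # Clamp into [0, 1], then map to an integer level 0..7.
    ratio = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    return LEVELS[min(int(ratio * 8), 7)]

assert qualify(0.93, 0.0, 1.0) == "extreme+"
assert qualify(0.0, 0.0, 1.0) == "extreme-"
```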

Quantity refers to numbers, strictly speaking. The scale may be as dense as one wishes, neat, formal, etc.; here we have not only an order relation, but a numeric structure, possibly with arithmetic or algebraic functions. But quantity is frequently pointed at with non-numerical terms: little, much, more, the most, several...

A priori, the more bits we have on operands, the wider is the set of really different operations.

In some way, the change is purely quantitative as long as we don’t add bits (though, psychologically, we refer more to decimal digits). It is qualitative (or structural) if we add bits, with a lot of intermediary cases.

We shall now concentrate on basic measurements of digital beings. They are largely inspired by mechanical and thermodynamical models.


3. Mass

The simplest measure of the structural mass of a being (which is for us very near to complexity) is its number of bits.

The more beings there are, the heavier the global mass, the more useful it is to structure it in order to profit from the regularities and redundancies (economies of scale), and the more profitable the structural cost. It would be interesting to have a quantitative law.

Mass protects. It is a stability factor. It affords redundancy. But it is paid for by a lack of nimbleness.

Thesis: the mass of two beings is the sum of their masses, unless one is contained in the other.
In practical operations, the grouping of two beings will leave interstices, and the sum will be bigger.

In a being S, we have m(S) >= m(I) + m(E) + ... + m(O) if inputs and outputs have dedicated spaces.
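The mass definitions above can be sketched for a two-dimensional being, here assumed to be represented as a set of occupied cells (the representation is an assumption of this sketch):

```python
# Mass of a two-dimensional being as its number of set bits, compared
# with the surface of its bounding rectangle: with "holes", the actual
# mass is smaller, possibly much smaller, than the rectangle surface.
def mass(cells):
    """cells: set of (x, y) coordinates occupied by the being."""
    return len(cells)

def bounding_surface(cells):
    xs = [x for x, _ in cells]
    ys = [y for _, y in cells]
    return (max(xs) - min(xs) + 1) * (max(ys) - min(ys) + 1)

being = {(0, 0), (0, 2), (2, 0), (2, 2), (1, 1)}  # 3x3 with holes
assert mass(being) == 5
assert bounding_surface(being) == 9   # mass < rectangle surface
```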




4. Distance

4.1. Basics

If the universe is one-dimensional, the simplest distance between beings is the material distance, i.e. the difference of their start addresses. It is necessarily greater than the size of the first being, unless the second is contained in the first.

As long as the mass of a bit is taken as unitary, and the being is without holes, its volume is identical to its mass. But we could add weighting.

For a processor, if we stipulate that the inputs I are at one end, the state E in the middle and the outputs O at the other end, we could say, to a first approximation, that the being's mass makes the distance between inputs and outputs. We could also introduce the centres of the I and O regions.

With more than one dimension, we may use the classical Euclidean distance (square root of the sum of squared coordinate differences). A noticeable point is that the distance between points of a being is not linearly proportional to the being's mass, but varies with its square or cube root.
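The square/cube-root claim can be checked numerically; the sketch below assumes a full hypercubic being, the simplest case:

```python
# In d dimensions, the greatest distance between points of a full
# hypercubic being of mass m scales with the d-th root of m, not m.
def diameter(m, d):
    """Euclidean diagonal of a full d-dimensional cube of m bits."""
    side = m ** (1.0 / d)
    return side * d ** 0.5

# Doubling the mass in 2D multiplies the diameter by sqrt(2), not 2.
ratio = diameter(200, 2) / diameter(100, 2)
assert abs(ratio - 2 ** 0.5) < 1e-9
```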

If the universe has an infinite (or indefinite) number of dimensions, with one dimension per bit, {0, 1}^N, we have no Euclidean distance properly speaking (but we could transpose the N-dimensional distance, or go to the Hamming distance).

See notes on spaces with an infinity of dimensions: a sphere of null volume, but non-null diameter.

The "distance to centre" or "to periphery" of a being may be useful. For instance the distance between two internal beings as the sum or their distances to the centre (geographic distance "à la française", where one must always go via Paris ).

Addresses may be richer than the simple N set posited at the beginning, in particular to shorten them and make the work easier for GCom. Then we may speak of "soft" distance, by indirect addressing: the distance in the addressing tree (in a directory, the number of levels that must be ascended then descended to go from one object to another), or the address length (branch). The "structure" of the organisation of the bits recreates a sort of localization, some "matter", at low level, even if we assume that we work with pure information. These distances are tied to the "operating system", GCOS.
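The tree distance just mentioned (levels ascended then descended in a directory) can be sketched as:

```python
# "Soft" distance in an addressing tree: the number of levels to
# ascend from one address and then descend to reach the other.
def tree_distance(a, b):
    """a, b: addresses given as tuples of path components."""
    common = 0
    for x, y in zip(a, b):
        if x != y:
            break
        common += 1
    return (len(a) - common) + (len(b) - common)

assert tree_distance(("usr", "lib", "x"), ("usr", "bin", "y")) == 4
assert tree_distance(("a",), ("a",)) == 0
```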

4.2. Time distances

In a digital being, the core problem of distance is the number of bits that an automaton may move in one cycle, which is the basic time unit.

Hypothesis on communication : even if GOS does the job, how does it communicate with beings? That is also an aspect of Digital Relativity.

4.3. Formal and content distances

The Hamming distance (the number of bits differing from one being to the other) is more functional than the material distance. It is a content distance. We may also have a distance by morphing between two images (used in pattern recognition). But it is not properly semantic.

If each being has one bit only, there are only two values for the distance: they are identical or opposed. If the beings are longer, the issue is more difficult (and richer). The identity case still exists, but becomes rare. Then we have the case where the being lengths are different. Moreover, in Hamming terms, two strictly complementary messages have the maximum distance; but that is shocking, because complementarity cannot be a coincidence. We should come to a distance defined from the KC (Kolmogorov complexity).
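A minimal sketch of the Hamming distance and of the complementarity case discussed above:

```python
# Hamming distance between two beings of equal bit length. Two
# strictly complementary strings come out at maximum distance even
# though they carry the same information -- the paradox noted above.
def hamming(a, b):
    if len(a) != len(b):
        raise ValueError("Hamming distance needs equal lengths")
    return sum(x != y for x, y in zip(a, b))

a = "10110010"
b = "".join("1" if c == "0" else "0" for c in a)  # complement of a
assert hamming(a, a) == 0          # identity: distance zero
assert hamming(a, b) == len(a)     # complement: maximum distance
```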

We are going to "meaning" differences, with processing depth, etc. (see below).

Thesis: if the distance between beings grows, the intermediary beings have more and more work to do. Logical, formal distance, with different codes, storing formats, logical levels, protocols, delays, redundancies.

Thesis: the more functional the distances are, the more they are "external" to the being. The transfer time computation may in itself become very heavy.

Lexicographic distance : in the dictionary, Aaron is very far from Zyzyn. Phonetic variants.

Two beings are near if:
- they call for the same actions, evoke the same images,
- they may be combined in the same action,
- they evoke each other, or are frequently evoked together.
Motivation distances, with cognitive consonance and dissonance (we hear only what we want to hear); will and policy distances.
Sociometric distance. Here, more pi-computing, neuronal networks, etc. Number of intermediaries, or hierarchical levels, to go through.

Thesis: distance is greater if there are many beings in communication. That results immediately from the spatial volumes, but also from address lengths, etc. It gets more interesting with cognitive distances, standard requirements, etc.

Let us not forget the semantic distance.

5. Density

In the physical universe, density is defined in relation to matter:
- mass/volume
- mass/space.
Then, digital density is the ratio binary mass/physical mass. See Incarnation.

If we have one dimension and connected beings, density = 100%. But with more than one dimension, there are "holes", and the density can be anything, e.g. in a raster, or the useful bits in a database. We could use something like a "form factor".
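Density in the two-dimensional case can be sketched as the ratio of set bits to the bounding rectangle, again assuming a set-of-cells representation:

```python
# Density of a two-dimensional being: binary mass (set bits) over the
# surface of its bounding rectangle. One connected dimension gives
# 100%; a sparse raster can give almost anything.
def density(cells):
    xs = [x for x, _ in cells]
    ys = [y for _, y in cells]
    surface = (max(xs) - min(xs) + 1) * (max(ys) - min(ys) + 1)
    return len(cells) / surface

line = {(x, 0) for x in range(10)}   # one dimension, connected
sparse = {(0, 0), (9, 9)}            # two far corners, full of holes
assert density(line) == 1.0
assert density(sparse) == 2 / 100
```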

An important point for one being is the ratio length (of the being)/content. Trivial case 1/1: the being is its own container. Or it is the same being, or it is actually a content (???). In this latter case, we must at least deduct the window border. (Notion of a minimal, or standard, border...)
This problem is not dealt with in classical maths, with open or closed intervals, since here we are in a discrete universe (or else...).
There can be beings which are containers and nothing more.
Other cases: small numbers (7), large numbers (blobs, binary large beings).

We could do with Complexity/mass.

The bit case : interesting here. By construction, a bit is saturated by one input and "saturates itself" by one output.

Somehow, density may be brought back to Kolmogorov/Delesalle, by computing the useful bits of a processor as the smallest being which would render the same services.
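As a rough practical proxy for this Kolmogorov-style measure, one can compare a being's raw size with its size once compressed by a general-purpose compressor; this is only an approximation of "the smallest being rendering the same services":

```python
import zlib

# Compressed size / raw size as a stand-in for Kolmogorov-style
# density: a ratio near 1.0 means few regularities to exploit.
def kc_density(data: bytes) -> float:
    return len(zlib.compress(data)) / len(data)

redundant = b"abcabcabc" * 100
# A highly regular being compresses well: low density in this sense.
assert kc_density(redundant) < 0.1
```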

6. Pressure

We propose to call "pressure" on a bit (or being) the total mass of the beings pointing at it. It could be compared to Google's PageRank.

To be more precise, we could quantify pressure with two parameters:
- the size (mass) of the beings pointing at it,
- the distance of these beings (the pressure is weaker if they are located far away); but which kind of distance?
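A minimal sketch of this two-parameter pressure; the attenuation law (simple division by distance) is an assumption, as the text itself leaves the kind of distance open:

```python
# Pressure on a target as the sum of the masses of the beings pointing
# at it, attenuated by their distance (attenuation law assumed here).
def pressure(pointers):
    """pointers: list of (mass, distance) pairs; distance >= 1."""
    return sum(m / d for m, d in pointers)

# A heavy but distant being exerts no more pressure than a near,
# lighter one under this particular attenuation.
assert pressure([(100, 10)]) == pressure([(10, 1)])
```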

We must also add that a being exerts a pressure on its "internal" bits. But that may simply be part of the external pressure on the global being, as long as the being does not explicitly point at its own parts. A "passive" being exerts no other pressure than the external one. An active being, with its IEO functions, exerts a rather strong pressure on the corresponding parts. It could be more or less strong if we split the basic functions.

Thesis: when a being becomes large, internal pressure is weaker around the periphery, unless we had some kind of attraction by sheer mass (like gravity).

We must elaborate a way of transmitting pressures from the boundaries into the whole being. And more generally, inside the being.

Thesis: the stronger the pressure, the more digital the being and its contents. Historically, that may be observed in information systems. Social pressure, standardization pressure. Efficiency of the sharing of resources. But we should find abstract laws.

A line is more "a line" when it is long, e.g. because it gives a greater "thinness", relative to its length, when drawn in a bit raster.

Then, entropy principles. Crystallization when pressure and temperature are strong on a stable S :


(Hypo)thesis : crystallization happens when, after a liquid phase at high temperature and pressure, temperature lowers at constant pressure.

One kind of explanation: when a being becomes large, its useful part decreases to one half, one tenth, one thousandth... then, for economy reasons, it is compressed. Compression and abstraction have similar aims.

Chunking is a kind of structuration, if not of crystallization.

Some decompression algorithms are very simple multipliers (number of pixels of such a colour, starting from this pixel). Different types of compression:
- by gravity, when there are many bits: necessity of compressing the centre, densification, class generation,
- data and image compression,
- statistical, with or without loss; if without loss, with or without semantic loss,
- data analysis,
- programs (pkzip, image formats).
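The "very simple multiplier" case can be made concrete with run-length encoding, whose decompression is indeed a plain repetition:

```python
# Run-length encoding: "number of pixels of such a colour, starting
# from this pixel". Decompression is a simple multiplier.
def rle_encode(pixels):
    runs, i = [], 0
    while i < len(pixels):
        j = i
        while j < len(pixels) and pixels[j] == pixels[i]:
            j += 1
        runs.append((j - i, pixels[i]))
        i = j
    return runs

def rle_decode(runs):
    return [colour for count, colour in runs for _ in range(count)]

row = ["w"] * 6 + ["b"] * 2 + ["w"] * 4
assert rle_encode(row) == [(6, "w"), (2, "b"), (4, "w")]
assert rle_decode(rle_encode(row)) == row
```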

"Negative" or "reactive" pressure

That is the number of beings (possibly with sophistications as for pressure) at which a being points. Or we could speak of the "power" of a being, if we combine this with flow rates.

Attraction is:
- stronger if two beings are near (in practice, how does it work?),
- stronger if two beings are similar (if they are totally identical, they could be merged),
- stronger if masses are heavy.
When a being becomes larger, attraction may become insufficient to keep it together, and the being dissociates (another form of Gödel...).
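The three properties just listed can be combined in a gravity-like sketch; the exact form (product of masses and similarity over squared distance) is an assumption:

```python
# Gravity-like attraction between two beings: proportional to both
# masses and to their similarity, inverse to squared distance.
# The exponents are assumptions, not part of the text.
def attraction(mass_a, mass_b, similarity, distance):
    """similarity in [0, 1]; distance >= 1."""
    return (mass_a * mass_b * similarity) / (distance ** 2)

# Identical near beings attract strongly (candidates for merging);
# dissimilar or distant ones barely attract.
assert attraction(10, 10, 1.0, 1) > attraction(10, 10, 0.1, 1)
assert attraction(10, 10, 1.0, 1) > attraction(10, 10, 1.0, 5)
```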

Pressure distribution

Abnormalities may mark the pressure matrix, if we consider models with noise, or relativistic ones.

Law: in a closed system (even in the DU globally, if one may think of it), the total of negative pressures is equal to the total of emitted ones. But the distribution will be far from uniform.

One could say that pressure is a kind of constraint, preventing or slowing certain moves, with narrow passages, formal contradictions...

7. Structures related to location

Inside the universe, and inside any being (the bigger, the more so), pressures (masses, densities) are not equally distributed. (Thus in a raster there are "hollow" pixels, "not addressed" pixels.)
That gives a valuation on addresses. The proximity of a high-pressure being is positive.

Thesis: from a strictly material standpoint, the value of a being (or simply of a place in the DU) depends both on its length and on its distance to high-pressure beings. Like the law for real estate...

A large and cold being may be a nuisance to its environment, since it makes its neighbours' communication lines longer.

A major problem : how far can a being modify another one ? That may be defined in GOS... or in beings dedicated to management.

8. Temperature

The temperature of a bit, or of a being, is:
- the frequency of its changes of state per unit of time,
- or the number of variations observed during a given number of cycles ("medium frequency").
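The second definition (variations observed during a given number of cycles) can be sketched directly:

```python
# Temperature of a bit as the proportion of cycles in which its
# state changed, over an observed history.
def temperature(history):
    """history: successive observed states of a bit, one per cycle."""
    changes = sum(a != b for a, b in zip(history, history[1:]))
    return changes / (len(history) - 1)

assert temperature([0, 1, 0, 1, 0]) == 1.0   # flips every cycle: hot
assert temperature([1, 1, 1, 1, 1]) == 0.0   # frozen: cold
```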

We could rather talk of "medium frequency". The word "temperature" is not perfect, since it would put the low-weight bits of my IEO model at high temperature. Then we should combine frequency with the bit's importance, and have something like: temperature = pressure × frequency...

Extension to a being: weighted frequency of its changes of state. Possibly forgetting its internal clock.
Hence, sketch typologies, crossing with pressures.
Something like: P·V = k·T.



At constant mass, a rise in pressure causes a rise in temperature.

About frequencies, we could distinguish an apparent frequency and a "deep" frequency, with reference to complexity.
For a being, we could also define temperature as a ratio pressure (emitted, received...)/binary mass.

Or : temperature = transfer rate/capacity (we get frequencies)


9. Energy

I must work on this point.
We could also talk of "white" energy. An interpretation of P (see L).

10. Digitization level, structuration level (to be edited)

A major feature of beings is that their features (...) change radically with their size. For example, a unique bit offers both a maximal cut (the bit opposes radically the infinite fuzziness of matter, of continuity) and a minimal cut, since its meaning comes only from the context. In the case of a world with a unique bit, this bit would be in opposition and relation with the analogue substrate. When the number of bits gets higher, the break from the environment widens, in order to ensure the being's coherence. In some way, each bit becomes more digital. And we could perhaps define the digitization level of one bit by the length of the strings it belongs to (see below, masses and pressures).

With a unique bit, the cut is both:
- maximal: this bit opposes the structural mass, potentially infinite, of the continuum,
- minimal: this bit gets its meaning exclusively from the context, in this case the analogue referential itself.

When the bit count rises, the cut gets stronger and stronger, in order to hold on, to ensure the message's coherence. In some way, each bit becomes more digital. We could even define, for one bit, a digitization (and abstraction) rate, using the length of the strings where it is included (but where do we stop the measurement: byte, record, disk surface?). See mass, processing depth.

When there are few bits, we have to deal with problems of grain, moiré, Nyquist.

With some bits (data, signal), the fascinations of numerology, the mere shape of digit designs, letters and ideograms.


Then the bit yield decreases. In a plane, you cannot draw an infinity of infinite lines that are meaningful. Hence the meaning growth is not equal to the base exponential function, and indeed much lower. See also my Champery study (meaning/bit number ratio).

Similarly, demographic growth calls for the artificial. Not only for economic reasons but, more deeply, because the free existence of more and more connected individuals is synonymous with a larger structural mass, hence with abstraction and, of course, with artificiality, in particular as a break with Nature.

Law: growing formalization entails the growth of automata. Automation is not a global factor of unemployment; indeed it is a fundamental condition of any demographic growth. We have here a loop:
- demographic growth causes a growth of structural mass,
- growth of structural mass entails automation, which in turn, by its economic effects, the reduction of hazards and better public health, augments the population.
(Dehumanizing ...)

We could try to search for a quantitative relation between the multiplication of human beings and that of DU beings. That would suppose a good way of measuring the quantity of automata and their structural mass. We could even build a model with the structural mass of one human being as a reference unit, and the automata masses referred to it. Is that realistic? It seems in contradiction with the human transcendence postulate.

But the growth of DU beings is also a break in their dependence on other beings (humans included). Digitization parts the work from its author, and both gain a reciprocal autonomy: autonomy of messages, of machines, of systems. The maker also gains autonomy as he gets rid of the tasks previously necessary to ensure his living.

A being can be more or less "structured", or we could say "parsable". This parameter is 100% for a being which, relative to a given processor, is:
- totally usable by the processor,
- down to the last bit,
- all meaningful,
- without redundancy or useless bits.

Structuration level is near to orthogonality level.

It is null when there is:
- no structure usable by the processor, which can then only store, transmit or destroy the being (or possibly just not perceive it).

Examples:
- a bitmap is very poorly structured for any program other than a graphical one knowing the format; even there, the factorization of a bitmap is rather limited,
- image formats include a header that is nearly totally structured, an empty part, then bits ordered well enough to get a display,
- a program is nearly totally structured for a processor equipped with the appropriate language; nevertheless, there can be blanks and comments,
- generally, replacing the original message by a more structured one gives a shorter one, by redundancy reduction, but not always (naval battle, image decompression).

A given being may be considered more structured by some processor than by another one. In other words, the processor's nature is an evaluator of the structuration degree of the messages it receives. Abstractly: a language corresponds to an automaton, and vice versa.

Executability and structuration are not the same.

One could say that “more structured” is a synonym of “more digital” (see for instance 16).

A data file is less structured than a program (to be checked), and more than a bitmap.
A vision program detects more structures in a bitmap than a mere display routine does.
Example of nesting with progressive structuration, the bitmap of a program:
- pure bitmap,
- OCR (optical character recognition): text structure, usable by a text editor,
- program read as such by the compiler.

Law: at the extreme, a being totally structured for a processor vanishes. The non-mapping, non-transparency level defines its existence level...

Everything that is formalizable is automatizable.

"A fundamental characteristic of our modern civilization: the preponderant and growing influence of formalized language and abstract reasoning, to the detriment of informal communications, of the direct action of the environment and of learning through lived experience." Bruner (quoted by Lussato). Parallel: Toffler.


Another classical opposition, near to digital/analogue. More pertinent to HMI.

11. Programming and coding

In programming activities, the compiled binary code is much more digital (see above) than the source code, in spite of the functional identity.

The coding of identifiers frequently uses, for instance in France, the NIR (aka "numéro de sécurité sociale"): decimal digits (13 in this case). But eight would be sufficient, since there are fewer than one hundred million French citizens, which would save more than a third of the digits. The gain would be larger still with a purely binary code: 27 bits would suffice, against the 104 bits (13 times 8) of the present character-coded system. But we would lose the directly meaningful parts of the code (gender, birth year and district).
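The arithmetic behind this example, checked in a few lines (the population figure of one hundred million is the text's own round number):

```python
# Digits and bits needed to number up to one hundred million citizens,
# versus the 13 decimal digits of the present NIR stored as characters.
population = 100_000_000

decimal_digits = len(str(population - 1))        # largest number coded
binary_bits = (population - 1).bit_length()

assert decimal_digits == 8
assert binary_bits == 27
assert 13 * 8 == 104   # present character-coded NIR, in bits
```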

In any algorithm (sequence, phrase...) and even in the writing of a mathematical formula, some traces of analogue representation always subsist. The same holds in any digital hardware, at least residually (remember the problems with the Cassandra machine).

Cam systems may be considered as an analogue form of programming.
Beings are interesting in giving an external reference both for action (methods) and for memory.

Programming is by itself the building of a program analogous (functionally) to an action. But it is also a break, mainly due to the control structures, with a maximum if we go to the radical structures of Lisp.

A compiler is more digital than an interpreter.
Here also, the theme of an optimal granularity: size/number of classes vs. a globally assigned size (SAP vs. specific code, etc.).


12. Orthogonality

Is that a measure? It may be measured between two bits, or two variables.
Then: count the variables (identified as such) and their respective orthogonalities, then use some way of adding/combining them.

When interacting with the other, we want orthogonality, by factorization of what we have in common (null sine) and of what is reciprocally indifferent (sine 1). Then we can usefully debate on the obliques to make progress on the common part, and be ready to fight on basic differences.

Thus we profit (phi sine) and lose (sphere/cube).

Orthogonality with 10G human beings.
Dissolve determined beings, towards an ideal pure indetermination (Lyotard).
At the same time, an enormous stock of knowledge, determinations, differences giving meaning to differences, both equiprobable and meaning-bearing.

Two orthogonal bits are meaningless for one another.
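Orthogonality as independence can be sketched by checking whether the joint frequencies of two bit variables factor into their marginals; the 1-minus-deviation score below is an ad hoc assumption:

```python
# Orthogonality of two bit variables from joint observations: they
# are orthogonal (independent) when the joint frequency of (1, 1)
# equals the product of the marginal frequencies. The score is an
# ad hoc sketch: 1.0 for independence, lower for dependence.
def orthogonality(pairs):
    """pairs: observed (a, b) bit pairs."""
    n = len(pairs)
    pa = sum(a for a, _ in pairs) / n
    pb = sum(b for _, b in pairs) / n
    pab = sum(1 for a, b in pairs if a and b) / n
    return 1.0 - abs(pab - pa * pb)

independent = [(0, 0), (0, 1), (1, 0), (1, 1)]   # all combinations
copies = [(x, x) for x in (0, 1, 0, 1)]          # b depends on a
assert orthogonality(independent) == 1.0
assert orthogonality(copies) < 1.0
```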

13. Granularity (to be edited)

- size of the granules (in bits), related to the size of the being,
- kind of free space/free moves related to the shape of the granules.


14. Varia (to be developed and edited)




When the size of objects grows, some function reaches a maximum.


N = total capacity
Nb = number of bits of each object
N/Nb = number of objects in the volume?