Complexity
0. Basics
This concept became fashionable in the 1970s, with systemics (or systems analysis). It is quite a tricky notion, related in particular to variety, energy, nesting and recursion.
One of the most pleasant approaches is Kolmogorov complexity, which can be briefly described as: the complexity of a being is the length of the smallest string which permits one to (re)build it. One positive aspect is its comparative operationality, and its correspondence to the intuitive notion of complexity. Evidently, it can be computed/reckoned only for a given processor/generator.
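Kolmogorov complexity itself is not computable, but for a fixed "processor/generator" a general-purpose compressor gives a rough, computable upper bound. Here is a minimal sketch, not part of the original entry, assuming Python's zlib as the generator; the sample strings and their sizes are arbitrary choices.

```python
import random
import zlib

def complexity_estimate(data: bytes) -> int:
    """Length of a zlib description of `data`: an upper bound, for this 'generator'."""
    return len(zlib.compress(data, level=9))

samples = {
    "constant": b"a" * 1000,                              # highly ordered
    "periodic": b"abcd" * 250,                            # ordered, slightly richer
    "text":     b"complexity is the reverse side of liberty " * 25,
    "random":   random.Random(0).randbytes(1000),         # close to incompressible
}

for name, data in samples.items():
    print(f"{name:8s} raw: {len(data):4d}  compressed: {complexity_estimate(data):4d}")
# Ordered strings compress well (low estimated complexity); random ones barely
# compress, which matches the intuitive ranking described above.
```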
Law. To be proven. Complexity is the reverse side of liberty (as far as we can measure it, for instance with an L function). It has its costs (exclusion, lengthening of all processes and decisions).
- This concept is important for digital art aesthetics, but not sufficient to cover this range of issues. See our Uncanny peak, a tentative review of the topic.
- To make it short : the Shannonian theory of information brought a suggestive basis to research on aesthetics. Complexity may be defined roughly as the quantity of information contained in a message. In programming, and in computing science generally, complexity may be evaluated by the time and memory space necessary to process an algorithm. More deeply, Kolmogorov and others proposed to define the complexity of a document (any bit string) by the shortest program able to generate it. Aesthetic approaches make use of two parameters, let's say a raw complexity (defined as above) and an "effective" complexity, which remains to be clearly defined, but could be considered as a balance between order and chaos.
- The latest advances (as far as we know) are in [McCormack & d'Inverno 2012].
< Think Complexity. Complexity Science and Computational Modeling. by Allen B. Downey. O'Reilly Media 2012.
- See [Berger-Lioret], several pages.
< Simplicity Theory, by Jean-Louis Dessalles.
- Programming : [Aho], [Harel], [Delahaye], [Roxame]...
- In interaction design, see [Janet Murray], notably about procedural complexity.
< Visual Complexity. Mapping Patterns of Information. by Manuel Lima. Princeton Architectural Press 2011. Beautiful images, but no deep reflection, categorization, quantification or theorization. Just a pleasure for the eyes.
< Abstraction and complexity. by Lev Manovich. 16 pages in [Grau].
- A major concept for realism in synthesized images ([Ebert], commenting ; see there an advantage for fractals).
- Example in music : "As your skills improve, your songs will inevitably become more complex and your computer will be made to work harder".
- See shortcut.
- Simplicial complex. A set of points in a space, with topological properties. See Wikipedia or [Zomorodian] p. 23 seq.
Let us now add some complementary views.
1. Material/functional complexity
Within functional richness, let us distinguish :
- useful mass, the set of operations that can be done with the system : going back up from the useful, we move towards the user's intentions (understanding at half a word, anticipation of needs). It is normal to tend constantly to upgrade the useful complexity of a being or of a family of beings, were it only to increase their actual use (market) ;
- use overload, that is, the complexity of the operations that must be done to use the system ; this complexity has something rather negative about it ; at the least, the lower it is, the better. We shall then tend constantly to reduce it in two ways :
- On the one hand, conformation to the user's structures and habits (for an Englishman, a one-foot hat became a 30.48 cm hat with the metric system...) ; of the same order : Huffman coding, with short codes for frequent actions (a minimal sketch follows below) ; but there are limits to this conformity (or conformity rate), so we must use looping, alter-ego conformity ; likewise, creating automata inside S, automatic clutch, etc. (alter conformity/prosthetic). That also points to the abstract/human polarity.
The use complexity cannot pass certain thresholds (competency, rationality of the user), nor, in dynamic situations, those set by constraints on reaction delays (vehicle driving, combat, decision making in general).
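As announced above, here is a minimal sketch, not part of the original text, of the Huffman idea invoked in the conformation remark: frequent "actions" get the shortest codes. The action names and frequencies are invented for the illustration.

```python
import heapq

def huffman_codes(freqs):
    """Return a prefix code (dict symbol -> bit string) for the given frequencies."""
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)                       # tie-breaker so dicts are never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# Hypothetical user actions with their relative frequencies.
actions = {"open": 45, "save": 30, "print": 15, "export": 7, "configure": 3}
for action, code in sorted(huffman_codes(actions).items(), key=lambda kv: len(kv[1])):
    print(f"{action:10s} -> {code}")
# The most frequent actions receive the shortest codes, i.e. the lowest
# "use overload" for the operations the user performs most often.
```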
In parallel, automata creation increases the material mass, where we can also distinguish :
- the object's resulting mass (complexity of its exhaustive description, or of the exhaustive description of its manufacturing) ; in itself, this rise in complexity is good (except possibly for stability), if it is not too expensive,
- manufacturing/building charges.
The rise of these two (four) masses will converge in the rise of technical beings. We must raise at the same time :
- the ratio of useful mass to use mass, which we could call "relative simplicity". The need for functional simplification has both an economic imperative (labour productivity) and a social one at world level : not to make artificially crippled (hence excluded) people, or as few as possible.
Oehmichen has said interesting things about the separation of normal use functions and maintenance functions.
Study, according to the different cases, the more or less progressive character of the evolution of these curves, or of a hierarchization.
Both automatic and human decision flows will grow.
2. Example. Algebraic forms
Let us develop an example of the relation between complexity and being mass, using algebraic forms and commenting "intuitively".
Let ax and ax + b be two lines. The second one is described by a longer model than the first one (four signs instead of two). And though it is not more complex "in itself" (both are straight lines), it is in some way less accessible, since it does not go through the origin.
The x^2 parabola is more complex "in itself" than the straight line. And the higher the degree, the more numerous the particular points and roots.
Let us now methodically study forms of growing length...
With one symbol
- x : the origin point, or a hyperplane containing the origin
- a : the empty set, but if a = 0, then it is the total space.
(If we had just one bit, coding the 0,1 pair, then we would have all the space or nothing at all.)
Two symbols
- ab reduces to a
- ax and xa reduce to x or 0
- x^n gives a parabola or a similar curve (actually, there is a hidden third symbol here, to put n as an exponent)
- xy, union of two hyperplanes
Three symbols
- abc reduces to a
- abx reduces to x
- axy reduces to xy
- xy^n reduces to xy
- ax^n reduces to ax
- xmn reduces to xn
- x/a reduces to x
- xyz : union of three hyperplanes
- x + a : point -a and hyperplane ; here we can speculate on the number of different operation symbols (+, -, *, /...)
- x/y is not a polynomial ; near zero, a chaotic region ; when it is too big, it is unusable, and a motor or a hierarchy is needed.
- (x;y;z)
- x-a;z
- x-y;z
- ax + by
- axy + z
- ax^2 + y
- x^2 + y^2
- x^n + y^n
- ax^n + y
- ax + yz
If taken with sufficient resolution to avoid aliasing (and probably even if we take the minimum number of pixels needed for the curve shape to be recognized by a human being), even a rather simple geometric form will need a lot of pixels, hence of digits in the file, even compressed. The algebraic representation is therefore a good way to compress it, at least if we accept an approximation of the concrete position of the points. It is then correct to take the string lengths of the formulas to measure the complexity of the curves.
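To make the comparison concrete, here is a rough numerical sketch (not from the original text) that sets the character length of a formula against the byte size of a compressed raster of the same curve; the grid resolution and the example formulas are arbitrary choices.

```python
import zlib

def raster_size(formula, expr, width=256, height=256, compressed=True):
    """Rasterize y = expr(x) on a width x height grid and print the byte size
    of the (optionally zlib-compressed) bitmap next to the formula length."""
    grid = bytearray(width * height)              # one byte per pixel, 0 = blank
    for col in range(width):
        x = (col / (width - 1)) * 2 - 1           # map column to x in [-1, 1]
        y = expr(x)
        row = int((y + 1) / 2 * (height - 1))     # map y in [-1, 1] to a row
        if 0 <= row < height:
            grid[row * width + col] = 1           # mark the curve pixel
    data = bytes(grid)
    size = len(zlib.compress(data)) if compressed else len(data)
    print(f"{formula:10s} formula: {len(formula):3d} chars, raster: {size} bytes")

raster_size("x", lambda x: x)                     # straight line through the origin
raster_size("x**2", lambda x: x * x)              # the parabola
raster_size("x**3 - x", lambda x: x**3 - x)       # a slightly richer cubic
```

Even compressed, the raster stays orders of magnitude longer than the formula, which is the point made above.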
In practice, curves as well as equations are looked at with the same eyes and the same parts of the cortex. And our brain has geometrical functions wired into it, which make the curve shapes appear "intuitively" simple, whereas the algebraic forms will look painful to understand, even for a mathematically educated person.
Note the reference to infinity in formulations such as a_n, n ∈ N, or a + bx^2 + ... + a_n x^n ; but even there, there are limits on the writing.
There are natural limits on symbol sets "as long as one wants", and on graphic devices (indices, exponents, etc.).
Padding can be of any length ; the meaningful expressions are then the quotient of expressions by the equivalence relation of padding. The implicit padding varies according to operations :
- if aa is a^2, then the padding is 1,
- if a + a = 2a, then the padding is 0+, or more exactly 000000000000+,
- for 147.3, the padding is made of zeroes at right and at left,
- by constant factors, in the case where the form is supposed to be part of an equation with 0 on the right.
3. Another model
A given device type increases its complexity by multiplying its parts and differentiating the types of parts (or devices).
But we reach a threshold beyond which complexity grows by association of assemblies of the same type. There is no longer type differentiation properly speaking. Variety and complexity migrate to a superior level.
Examples : atoms, molecules, organisms, societies, biotopes.
Or : elementary electronic circuits, highly integrated circuits, boards, systems.
Or : unit, system, and network.
Actually, beyond some limit, over-performances at a given level are aberrations more than progress ; they are exceptions, if not a degeneration of the type : superman, transuranic atoms, supercomputers, WLSI.
An automaton generates (may generate) an infinite flow of complexity. It is only through reduction that we will focus on its true structural complexity (times the fundamental clock complexity).
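As an illustration of that remark, here is a minimal sketch (not from the original text) of an elementary cellular automaton: the output flow can be prolonged indefinitely, while the structural complexity to account for is only the rule table and the update loop. Rule 30, the grid width and the number of steps are arbitrary choices.

```python
RULE = 30
# New cell state for each (left, center, right) neighborhood, Wolfram numbering.
RULE_TABLE = {(a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
              for a in (0, 1) for b in (0, 1) for c in (0, 1)}

def run(width=31, steps=8):
    cells = [0] * width
    cells[width // 2] = 1                         # a single seed cell
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = [RULE_TABLE[(cells[i - 1], cells[i], cells[(i + 1) % width])]
                 for i in range(width)]           # synchronous update, wrap-around

run()
# The stream of rows can be extended without limit; the complexity to be
# "reduced to" is that of RULE_TABLE and the loop, not that of the output.
```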
Systems theory arose from the need to confront more complex beings.
4. Mastership/control
The term "control" has to be defined. Measurement of if (gap/offset vs. expectations?), relations to L. Reference to Ashby law of required variety. To get/enhance/restore control, three types of actions are possible :
4.1. Action on the master
- chunk length growth, and chunk number growth (physical asceticism, meditation, training),
- thought and will structuring (the Jesuit "id quod volo", simplify oneself),
- competence and reflexes acquisition,
- "assimilating" the system.
4.2. Action on the system to be mastered
- global action : suppression of useless things (elements as well as relations), general simplification,
- action on the system as it is seen by the master ; there may be a growth in global complexity, for the benefit of adding new parts specifically designed to simplify the master's view. For instance, a sorting of the beings may introduce artificial structures, which as such add complexity, but offer transparency to the master (wysiwyg),
- structuring, hierarchisation,
- delegation, modularity.
5. Interface development
- master prostheses
- system prostheses: dashboard; explanation routines in an expert system
- framing
- resolution reducing (fuzz, pixelization, convolutions...)
- intermediate representations
- progressive Cartesian segmenting
- recursive meta-complexity.
5.1. Delegation/distribution of complexity for control
This opposition could be completed by, or simply implies, an energy transfer to the system, with at least "amplifiers". And physical mobility (see my text on the three sorts of prostheses : integrated, worn, similar, or integrating). Conformity loops around the user system ; autonomy works by symmetry.
With conformity of the prosthesis, simplicity is reached through "conformation" of the system to its user(s) : specific ergonomics of the workstation, task analysis, user model inside the system, soft controls. Conformity relates to user anatomy and habits (for an Englishman, a one-foot hat is better than a 30.48 cm one).
Good side : appropriation, ease of grafting, handicap compensation, ease of learning.
Bad side : costs, design delays, local and conjunctural factors hardened, fault sensitivity (that last point to be checked).
There are limits to that style of conformity :
- physical constraints of the system, user competence, and time constraints
Then we are obliged to use "alterity" conformity, that is, the creation of automata inside the machine.
With autonomy of the prosthesis, simplicity is sought through the autonomy of a controlled device, beginning with replacing open loops by closed loops as much as possible, using appropriate sensors, etc. (automatic clutch, for instance). An important topic : the system's autonomous clock. Titli rules apply.
Good side : generality, performance, hardware interface reduction.
Bad side : loss of control (the sorcerer's apprentice). Excessive reduction of workload causing loss of attention by the person in charge. Loss of stability. Faulty loops (Larsen effect).
Autonomy of the system is termed "closing" (closed loops). Transparency problem.
The transfer of control to subaltern automata augments the organic complexity of the whole (not always, since it reduces the interface complexity).
We can here distinguish :
- the induced complexity of the object (exhaustive description of the object itself or of its manufacturing process).
For sharing, in the human/system case, see the Valenciennes model.
Common problems: manufacturing complexity and costs. Operating systems example.
5.2. Saturation for the observing being
Main representation technologies. For instance, different representation modes. We can find :
- better algorithms (in abstracto),
- new regularities, and use them.
Let M be a being.
At the start : elementary generator + length of M (see Delahaye).
Then we have a generator : total length = reduced M + generator.
Then a series of generators, which combine with each other.
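A hedged restatement of this two-part description in the usual Kolmogorov / minimum description length notation (the notation below is mine, not Delahaye's):

```latex
% L(.) is a description length, G a generator, M the being to be described.
\[
  L_{\mathrm{total}}(M) \;=\; L(G) \;+\; L(M \mid G),
  \qquad
  K(M) \;\le\; L(G) + L(M \mid G) + O(1).
\]
% Each new regularity captured by a better generator shortens L(M | G);
% a gain in knowledge corresponds to such a reduction.
```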
Every knowledge acquisition is a gain. It means that some reduction has been achieved. "Control" is to reduce the (functional) complexity of what one controls. Idem for the creation of new classes, of better multipliers/dividers.
"Computability is of course a concept that verges on practical problems of conceiving computer algebra systems and implementing algorithms. Neither will we go into any technical details here, nor is it our aim to discuss space-time optimal solutions. On the other hand, every attempt to define computability also raises foundational and philosophical questions. All we are interested in is a reasonably rigorous definition of computability that is based on common sense insofar as it reflects the capabilities of today's computers." Thomas Becker and Volker Weispfenning, Gröbner Bases: A Computational Approach to Commutative Algebra, Springer 1993.
6. Relation with other concepts
6.1. Programming and complexity
Programming is a sophisticated way to delegate/subcontract complexity and control.
A program is a kind of being. Theoretical problem : do there exist programs P such that K(P) > b(P), i.e. Kolmogorov complexity greater than the program's length? No, but it can be seen from an external point of view (?).
One can do better, in number of cycles, than b(E)/b(I).
We can take as an example the case where the observer S' is small with respect to S or O. Limited rationality of S', requisite variety of Ashby. In this case, S may saturate S'. Then S' can never predict O (a fortiori if I' < O).
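As a reminder (my formulation, not the text's), one common entropy form of Ashby's law of requisite variety bounds the residual variety of the essential outcomes E from below by the variety of the disturbances D minus the variety available to the regulator R:

```latex
\[
  H(E) \;\ge\; H(D) \;-\; H(R).
\]
% A small observer S' (low H(R)) facing a large system S (high H(D)) therefore
% cannot keep the outcome variety down: S saturates S'.
```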
Definition : for S', to understand S, to have S explained, is to have made S explicit within S' ; a simple solution is a copy, but a model will in general be more profitable and interesting (and the copy is frequently impossible)...
Another example : let us take the two functions E(I,E) and O(I,E) of S, and an initial state E.
Question : in which cases can we go upstream from O back to I and E ? As a general rule : no.
Trivial cases :
a) E is supposed constant, E = O,
b) E = f, dep. function,
c) Bayesian ?
6.2. Concretization (Simondon) Vs. complexity
Concretization, as seen by [Simondon], does not necessarily entail a reduction in organic complexity : there are fewer components, but their multi-functionality may augment the manufacturing complexity. CAD/CAM and numerically controlled machine tools should help to define (if not to measure) the complexity of components vs. the complexity of the generating programs. But these measures have their own limits.
The Simondon concretization forbids large-scale adaptability, but makes it easier on a certain range. It is not really a progress, more a local solution.
Nevertheless, concretization often leads to beautiful beings, in contrast with Meccano-built beings. Here, beauty is reached again only if the object is very large vs. the parts' sizes. With serious limitations in Lego/Meccano constructs.
One could try to apply this to digitization, with the bit as the ultimate component. And low levels such as bytes or pixels.
Beyond some point, there will be both concretization for some beings and more and more detailed analysis of the functions.
6.3. Complexity/stability
A priori, instability grows with complexity. That is frequently asserted.
Perhaps we should formulate a definition of stability ?
If the risks/threats are known, then appropriate defensive devices can improve stability. But they could add new risks and instabilities. Something Gödelian here also.
Positive coupling. In a number of cases, a complexity increase is used to enhance stability. Some complex structures are very stable. One could even say hyper-stable, for instance when comparing hardware and software.
From a geometric standpoint, a trajectory is said to be :
- asymptotically stable if all sufficiently near trajectories at t = t0 tend towards it asymptotically as t goes to infinity (independence),
- neutrally stable if all trajectories near enough at t = t0 remain sufficiently near at all later instants (but without necessarily getting asymptotically nearer ; periodicity, for instance),
- unstable if trajectories near at t = t0 become more distant as t goes to infinity (divergence) (trajectory of the state variables in the state space). Bertalanffy 1968.
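A hedged formalization of these three cases (my wording, not Bertalanffy's), for a reference trajectory x*(t) and a perturbed trajectory x(t):

```latex
\begin{align*}
  \text{asymptotically stable:}\quad &
    \exists\,\delta>0:\ \|x(t_0)-x^*(t_0)\|<\delta
    \;\Rightarrow\; \|x(t)-x^*(t)\| \to 0 \text{ as } t\to\infty,\\
  \text{neutrally stable:}\quad &
    \forall\,\varepsilon>0\ \exists\,\delta>0:\ \|x(t_0)-x^*(t_0)\|<\delta
    \;\Rightarrow\; \|x(t)-x^*(t)\|<\varepsilon \ \ \forall t\ge t_0,\\
  \text{unstable:}\quad &
    \text{trajectories arbitrarily close at } t_0 \text{ diverge as } t\to\infty.
\end{align*}
```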
7. Complexity of hardware and software
From a first analysis :
- if the network of components is built at random (e.g., by stochastic combination of any physical components of a given electronic family), the probability of getting a stable system decreases rapidly with the number of components (a numerical sketch follows these two cases). In the case of electricity, we shall probably have a rapid degradation due to short-circuiting of electromotive forces, destruction of components by overheating... until we reach the stability of a dead system ;
- if the network of components is built according to appropriate rules, the growth in complexity will tend to increase stability inside a given perturbation space (for Chauvet, this is a specific feature of life), but at the price of reduced reliability, life duration, etc. ; as soon as we go beyond the limits of that general space, we have specialization, with its pros and cons.
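Here is the announced numerical sketch for the first case, not taken from this text: a linear system dx/dt = Ax whose coupling matrix A is filled at random, where we estimate the probability that all eigenvalues have negative real part. The Gaussian distribution, the self-damping diagonal and the coupling strength are arbitrary assumptions, in the spirit of May's 1972 argument about large random systems.

```python
import numpy as np

rng = np.random.default_rng(0)

def stability_rate(n_components, trials=1000, coupling=1.2):
    """Fraction of random linear systems dx/dt = A x that are stable."""
    stable = 0
    for _ in range(trials):
        A = coupling * rng.normal(size=(n_components, n_components)) / np.sqrt(n_components)
        A -= np.eye(n_components)             # each component is self-damped
        if np.all(np.linalg.eigvals(A).real < 0):
            stable += 1
    return stable / trials

for n in (2, 4, 8, 16, 32):
    print(f"{n:3d} components: P(stable) ~ {stability_rate(n):.3f}")
# The estimated probability of stability drops as the number of randomly
# wired components grows, as stated in the first bullet above.
```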
This space of perturbations may be :
- physical (heat, cold, radiations),
- teleological (needs, computer uses).
By itself, an elevation of costs reduces the survival hopes of a device, in a given state of the market, hence its "stability" versus the perturbations of that market.
Actually, there are two thresholds:
- the system can work only above a minimal level of complexity ; above this level, reliability may be enhanced by redundancy, fault detection and fixing, and generally "fault tolerance" ;
- beyond another threshold, complexity by itself becomes a hindrance to maintenance and fault fixing, and the failure probability grows with the number of elements, etc. Unless, perhaps, we can make a sort of jump towards some meta-system, or a new "generation" of systems. That may happen in epistemology (Copernican turn, present problems of evolution theory).
Thesis : for a given useful complexity, a lower functional complexity results in a growth in material complexity, but one must take into account the adaptation rate (conformity, ergonomics).
Thesis. An increase in material complexity to simplify use results in the creation of situations, hopefully rare, where the expert becomes even more necessary and fault fixing more difficult. That is not too hampering when the product is cheap enough to be replaced and there is no risk. The worst cases are those where :
- fixing is difficult and restricted to experts,
- the fault consequences are serious (human lives).
Typical case : nuclear plants.
Varia to be edited
See several articles in the Berger/Complexity-Beauty directory.