In his influential paper
Universality and complexity in cellular automata,
Stephen Wolfram proposed a classification scheme which
divided cellular automaton rules into four categories, according
to the results of evolving the system from a "disordered" initial state.
His proposed categories were as follows:
Class I (homogeneous)
Evolution leads to a homogeneous state (e.g. 1D CA:
0, 4, 16, 32, 36, 48, 54, 60 and 62).
Class I cellular automata evolve after a finite number of
time steps from almost all initial states to a unique
homogeneous state, in which all sites have the same value.
Class II (regular)
Evolution leads to a set of separated simple stable or
periodic structures (e.g. 1D CA: 8, 24, 40, 56 and 58).
Class II cellular automata evolve after a finite number of
time steps from almost all initial states to a series
of fixed configurations or oscillators.
Class III (chaotic)
Evolution leads to a chaotic pattern (e.g. 1D CA: 2, 6, 10,
12, 14, 18, 22, 26, 28, 30, 34, 38, 42, 44, 46 and 50).
Evolution of infinite class III cellular automata from almost
all possible initial states leads to aperiodic ("chaotic")
patterns. After sufficiently many time steps, the
statistical properties of these patterns are typically the
same for almost all initial states. In particular, the
density of non-zero sites typically tends to a fixed non-zero
value.
Class IV (complex)
Evolution leads to complex localized structures, sometimes
long-lived (e.g. 1D CA: 20, 52 and 110).
With class IV cellular automata, complex behaviour emerges: stable
or periodic structures which persist for an infinite time are
formed, together with propagating structures.
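To make the four classes concrete, here is a minimal sketch (plain Python, no external libraries; the rule numbers are taken from the lists above) which evolves an elementary rule from a random initial row and prints the space-time pattern, so the four kinds of behaviour can be compared by eye.

  import random

  def step(row, rule):
      # Apply an elementary (2-state, radius-1) rule once, with periodic boundaries.
      n = len(row)
      return [(rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
              for i in range(n)]

  def run(rule, width=64, steps=32, seed=0):
      # Evolve `rule` from a disordered (random) initial state and print the history.
      random.seed(seed)
      row = [random.randint(0, 1) for _ in range(width)]
      for _ in range(steps):
          print(''.join('#' if c else '.' for c in row))
          row = step(row, rule)

  # One representative from each class listed above:
  # 32 (class I), 8 (class II), 30 (class III), 110 (class IV).
  for r in (32, 8, 30, 110):
      print('--- rule %d ---' % r)
      run(r)
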
Wolfram's classification scheme was influential, and appealed
to common sense.
Unfortunately, it fails to capture the idea of universal
computation. It was originally thought that systems capable of
universal computation would be found among Class IV automata,
since only they exhibited interesting behaviour and signal
propagation mechanisms such as gliders.
However, as Eppstein pointed out, rules in all of the classes
actually support gliders, and some non-class IV rules also look
as though they exhibit universal computation (e.g. see here).
Since these other classes contain gliders, universal computation
may well show up among them as well.
Since universal computation is such a fundamental feature of
cellular automata, it seems unfortunate that Wolfram's
classification scheme doesn't tell you much about it
with any certainty.
What is Eppstein's classification scheme?
Eppstein describes his classification scheme
[here].
Here's a brief summary:
No pattern expands
If no pattern can ever expand, no gliders exist, and the rule is
not universal. A similar phenomenon occurs with rules which
remain within a finite bounding box, though they may compute
functions which only require bounded resources to calculate.
No pattern contracts
If no pattern can ever shrink, no gliders exist. However,
universal computation could still occur in other ways;
for instance the boundary of an expanding pattern could
simulate the behavior of a 1D universal automaton.
Both expansion and contraction possible
Only in the remaining cases can gliders
and universality exist. Our investigations show that a large
fraction of the remaining cases do indeed support gliders;
much more work would be required to show that they are
universal.
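The sketch below is only a rough, finite-time heuristic in the spirit of these criteria, restricted to one-dimensional elementary rules; it is not Eppstein's actual test, and the seed sizes and step counts are arbitrary choices. It drops small seed patterns into a quiescent background and watches whether the populated region is ever seen to grow or shrink.

  def step_on_zero_background(cells, rule):
      # One step of an elementary CA on an infinite zero background.
      # `cells` is the set of positions currently holding a 1.
      candidates = set()
      for p in cells:
          candidates.update((p - 1, p, p + 1))
      new = set()
      for p in candidates:
          nb = (((p - 1) in cells) << 2) | ((p in cells) << 1) | ((p + 1) in cells)
          if (rule >> nb) & 1:
              new.add(p)
      return new

  def expansion_contraction(rule, max_seed_width=6, steps=64):
      # Heuristically report whether any small seed is seen to expand or contract.
      if rule & 1:
          raise ValueError("this sketch assumes a quiescent background (even rule number)")
      expands = contracts = False
      for width in range(1, max_seed_width + 1):
          for bits in range(2 ** (width - 1), 2 ** width):  # high bit set: skips some duplicates
              cells = {i for i in range(width) if (bits >> i) & 1}
              prev = max(cells) - min(cells) + 1
              for _ in range(steps):
                  cells = step_on_zero_background(cells, rule)
                  if not cells:
                      contracts = True
                      break
                  w = max(cells) - min(cells) + 1
                  expands |= w > prev
                  contracts |= w < prev
                  prev = w
      return expands, contracts

  print(expansion_contraction(30))   # for a chaotic rule, expansion should be observed
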
Eppstein's classification scheme seems formulated to
overcome some of the issues with Wolfram's scheme -
and it does successfully identify at least a few
cases which do not exhibit universal computation.
What other classification schemes are there?
Automata may also be classified according to their linearity.
No linear automata are universal.
There's also a class of non-linear rules that have been proven
non-universal.
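For the elementary rules, the "linear" in this classification can be read as additive over GF(2): the new cell value is an XOR of some fixed subset of its three neighbours, so superposition holds and the evolution can be solved algebraically instead of supporting universal computation. A quick enumeration of which of the 256 rules are linear in this strict (non-affine) sense:

  def is_linear(rule):
      # True if rule(l, m, r) == (a*l) ^ (b*m) ^ (c*r) for some fixed bits a, b, c.
      table = [(rule >> n) & 1 for n in range(8)]   # bit n = output for neighbourhood n
      for a in (0, 1):
          for b in (0, 1):
              for c in (0, 1):
                  if all(table[(l << 2) | (m << 1) | r] == (a & l) ^ (b & m) ^ (c & r)
                         for l in (0, 1) for m in (0, 1) for r in (0, 1)):
                      return True
      return False

  print([r for r in range(256) if is_linear(r)])
  # -> [0, 60, 90, 102, 150, 170, 204, 240]
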
Contributions by: Harold V. McIntosh <mcintosh@redvax1.dgsca.unam.mx>
It's generally accepted that the following "elementary" (2 state, radius 1)
rules have typical class IV (or "complex") behaviours, because they feature
emergent coherent propagating structures (analogous to gliders in Life, but
1-D). These gliders move and interact on periodic backgrounds; collisions
between gliders produce new gliders.
Rule 54 (and the equivalent Rule 147), and Rule 193 (and the
equivalent Rules 124, 137 and 110).
Wolfram, who first introduced this particular classification,
began by denying that (2,1) held any class IV rules, although in
a supplement to an article which was published in Scientific
American, some of the above rules may have been mentioned as
worthy of interest.
The fundamental difficulty here is that the "Wolfram
Classes" are not well defined, even by Wolfram himself.
Taken at face value, it has been proven that they
can't be (in articles by Culik II, as I recall). We are
somewhat in the situation of the Supreme Court justice who
declared "I can't define pornography, but I can sure
recognize it when I see it!"
Rule 54, which Prof. Wuensche mentions, has much in common
with Rule 18, which was studied by Peter Grassberger in
considerable detail at one time. The point with these rules
is that although they have fairly well-defined stable
densities, there also seems to be a law of "conservation of
discontinuities." In Rule 54, this manifests itself in the
occurrence of large triangular holes on a background of
mostly small triangles and antitriangles. With Rule 18, the
effect is much more subtle.
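The remark about stable densities is easy to check numerically. The sketch below (lattice size, run length and seed are arbitrary choices) evolves Rules 54 and 18 from random rows on a ring and reports the fraction of live cells after the transients have had time to die down.

  import random

  def step(row, rule):
      n = len(row)
      return [(rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
              for i in range(n)]

  def settled_density(rule, width=4000, steps=1000, seed=1):
      random.seed(seed)
      row = [random.randint(0, 1) for _ in range(width)]
      for _ in range(steps):
          row = step(row, rule)
      return sum(row) / len(row)

  for rule in (54, 18):
      print(rule, settled_density(rule))
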
If we apply the analysis described in the "about xlcau21"
series to Rule 54, the mean field curve, especially the
composite curve for several generations, has a superstable
fixed point a little less than 50%. Self-consistent contours
for pairs show a linear relation between 1 and 11, yielding
about half as many pairs as singlets. The n-block
probabilities are less clear, an indication that the
program's presentation could use some further work.
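For reference, the first-order mean field curve itself is simple to write down: assume each cell is 1 independently with probability p and sum the probabilities of the neighbourhoods that the rule maps to 1. The sketch below computes that curve for any elementary rule and iterates it; the composite several-generation curves of the xlcau21 analysis refine this picture and are not reproduced here.

  def mean_field(rule, p):
      # First-order mean field map: probability of a 1 at time t+1, given density p at time t.
      p_next = 0.0
      for nb in range(8):
          if (rule >> nb) & 1:
              ones = bin(nb).count('1')
              p_next += p ** ones * (1 - p) ** (3 - ones)
      return p_next

  # Iterate the map for Rule 54 towards its nontrivial fixed point.
  p = 0.3
  for _ in range(50):
      p = mean_field(54, p)
  print(p)   # for Rule 54 the first-order curve's nontrivial fixed point is exactly 1/2
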
On
the ancestor side, although the rule is balanced, that holds
only for the individual states. Like pairs have an extra
ancestor; unlike pairs lack one. The discrepancy grows, but
not violently; the eigenvalue of the T-matrix is around
2.23. The T-matrix does have an interesting pair of zero
rows; also a (different) pair of zero columns.
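The ancestor counts referred to here can be checked by brute force; the sketch below is plain enumeration rather than McIntosh's T-matrix machinery, counting for each short block how many configurations one cell wider on each side Rule 54 maps onto it.

  from itertools import product

  RULE = 54

  def image(cells):
      # Apply Rule 54 across a finite string, losing one cell at each end.
      return tuple((RULE >> ((cells[i - 1] << 2) | (cells[i] << 1) | cells[i + 1])) & 1
                   for i in range(1, len(cells) - 1))

  def ancestor_counts(block_len):
      counts = {}
      for cells in product((0, 1), repeat=block_len + 2):
          img = image(cells)
          counts[img] = counts.get(img, 0) + 1
      return counts

  print(ancestor_counts(1))   # balance: 0 and 1 each have four 3-cell ancestors
  print(ancestor_counts(2))   # like pairs versus unlike pairs, as described above
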
The story with Rule 193 is not overly different, except that
the rule is asymmetrical and the field supports right-angled
triangles rather than isosceles triangles. It is also true
that the rule is slightly unbalanced from the beginning, and
that using the minimal equivalent, Rule 110, the final
frequencies of live cells are somewhat larger.
The problem with the Wolfram Classification is that it is
much like the Christmas Story. It seems simple and easily
understood and makes for a kind of common reference when
talking about cellular automata. When subjected to closer
scrutiny, all kinds of questions and inconsistencies emerge.
It does not help that Wolfram seems to have been inspired by
a model from chaotic dynamics and Smale's "strange
attractors" which may not be all that close to the actual
behavior of cellular automata.
Suffice it to say that it is easy enough, especially at
first sight, to find structures such as those Wolfram
described: some evolution IS chaotic, other fields seem to
freeze up rather quickly. So far, there does not seem to
have been much progress beyond the recognition of "gliders"
in the mysterious realm called Class IV; for instance,
although Wolfram seems to have expected their imminent
discovery, not too many "glider guns" seem to have been
found, and his whole idea of finding a cellular-automaton
computer has fallen by the wayside.
Wolfram's most important contribution to cellular automata
seems to have been his way of looking at automata - first by
setting random fields loose to see how they evolve, and
second, systematically contrasting all the different rules
with one another. He may just have been lucky, to have tried
this at a time when suitable computer facilities had just
become available. The work of von Neumann centered on the
behavior of a specific automaton constructed for a specific
purpose. Although other automata were considered, all were
discarded which did not further the goal of self-
reproduction.
Conway's work with Life represented a different approach, in
that "marginal stability" seems to have been his goal, with
reference to the possibility of proving theorems about the
evolution, but once a rule meeting that requirement was
found, further development has mostly been oriented toward
exploiting that one single automaton.
Granted that even for one dimensional (2,1) automata it is
possible to find some with behavior which seems
tantalizingly regular, it would be nice to know that we are
not seeing "cloud castles" and "dragons" and all the other
paraphernalia of a small boy's imagination. So rather than
worrying too much about what to call everything, attention
ought to be directed to making more accurate predictions, on
the one hand, and characterizing what we observe, on the
other.
However, all these poetical ideas could quickly become
theological. Instead, we should realize that Prof. Wuensche
has shown us that there are still some nice complications
among the seemingly tame "elementary" (2,1) rules. Not
entirely forgetting its resemblance to Rule 18, we need to
notice that the spatial sequence (1000)*, under Rule 54, also
has a temporal period of 4, and that it is an extremely common
feature of well-aged configurations.
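The period-4 claim is easy to verify mechanically: run Rule 54 on a small ring covered by the repeated block and see when the configuration first recurs. A minimal check:

  def step(row, rule=54):
      n = len(row)
      return [(rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
              for i in range(n)]

  start = [1, 0, 0, 0] * 8    # a ring carrying the (1000)* background
  row, t = step(start), 1
  while row != start:
      row, t = step(row), t + 1
  print(t)                    # prints 4: the background returns after four steps
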
Besides the regular regions, there are slip lines, where two
microcells are joined out of phase. Sometimes the
interstices are larger; rarely are they extremely large save
as the consequence of an initial configuration. Holes tend
to fill by regular crystallization at the boundaries, but
there is always the chance for an inconsistency as the gap
nears completion.
The boundary regions are so conspicuous there is a
temptation to think of them as "particles" moving hither and
yon against a disregardable background. Sometimes the
movement is considerable and consistent, suggesting
"gliders" although for Rule 54 they tend to stay in place.
If Rule 54 particles move, it is more by diffusion than by
having a momentum. There also exists the possibility that
the particles collide, opening several avenues of further
evolution. They may disappear (pair annihilation), cross
over (solitons), or create some new kind of interstice
(particle creation or transmutation).
The scale of the field that is examined tends to influence
its interpretation, so it should be chosen as large as
possible to get the full effect of the dynamics of
imperfections. The accompanying figure shows only a moderate
scale, to keep its size within reasonable limits.
Some rules map a field of zeroes into a field of ones and
conversely; they are necessarily odd-numbered rules less than 128.
Should there be extended runs or gaps, they will tend to
alternate, but with boundaries which can move around. This
makes for another environment in which to perceive Class IV
when the mood strikes.
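The two conditions in that remark correspond to two bits of the rule number: sending the all-zero neighbourhood to 1 makes the number odd, and sending the all-one neighbourhood to 0 keeps it below 128. A one-line check:

  # rule maps 000 -> 1 (bit 0 set: odd) and 111 -> 0 (bit 7 clear: less than 128)
  swapping = [r for r in range(256) if (r & 1) and not (r & 128)]
  print(len(swapping))   # 64 rules, all odd numbers below 128
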
Once the discussion of de Bruijn diagrams begins, there are
other aspects of would-be Class IV rules which could then be
described.
References
S. Wolfram. Universality and complexity in cellular automata. Physica D, 10:1-35, 1984.