A New Kind of Science: The NKS Forum > Pure NKS > Class three behavior from class four rules

Registered: Jan 2004
Posts: 357

Class three behavior from class four rules

What happens when you combine elementary class four rule 110 and its mirror reflection, rule 124, in the same cellular automaton?

You might think the behavior would have to be as complex as the rules themselves. After all, combining the two in one cellular automaton certainly does not make matters simpler.

However, it turns out that the behavior can be less complex than the rules themselves. In fact, it can be class three random behavior.

For instance, the attached graphic shows class three random behavior from simple initial conditions and the mirror set of elementary rules 110 and 124.

What happens is that rule 110 executes on every odd time step and rule 124 on every even time step, so the output of rule 110 is the input to rule 124 and the output of rule 124 is the input to rule 110.

The effect of this oscillation is to damp out the irregularity and make the result more uniform, simplifying the behavior that normally occurs when either rule is employed by itself.
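In code, the alternation might be sketched like this (a hypothetical minimal version, not the program behind the attached graphic):

```python
# Hypothetical sketch: an elementary CA that applies rule 110 on odd time
# steps and its mirror reflection, rule 124, on even time steps.

def eca_step(cells, rule):
    """One step of an elementary CA with periodic boundary conditions."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def alternating_run(width, steps, rules=(110, 124)):
    """Evolve from a single black cell, switching rules on every step."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for t in range(steps):
        row = eca_step(row, rules[t % 2])  # rule 110 first, then 124, ...
        history.append(row)
    return history

rows = alternating_run(81, 40)
```

Plotting `rows` as black and white cells gives the kind of picture described.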

Question: Is this an example of programming class four rules to emulate class three behavior by virtue of the principle of universality?

Lawrence J. Thaden has attached this image:

__________________

Report this post to a moderator | IP: Logged

06-09-2006 12:45 PM
IanLuxmoore

NEW ZEALAND

Registered: Apr 2006
Posts: 10

If class 4 behaviour is on the border between order (classes 1 and 2) and chaos (class 3), then wouldn't this suggest that combining the two rules has pushed it over the edge into "chaos" (i.e. random behaviour), therefore increasing its "complicatedness" (to avoid the term complexity)? This intuitively seems feasible.

Personally I would order the classes 1, 2, 4, 3 for this exact reason, and Wolfram notes this as well - on page 242, for example, he mentions it several times.


06-10-2006 03:18 AM
Jason Cawley
Wolfram Science Group
Phoenix, AZ USA

Registered: Aug 2003
Posts: 712

No, it is not an example of programming a class 4 rule. Programming in that sense refers to changing only the initial condition used, not the rule itself.

What you are looking at is what I call an ICA - I for interactive - with a periodic interaction condition. An ICA is the "machine with input" form of a CA, with a bit (or more) fed in at each step that switches the rule it applies. This can be thought of as a more general rule with two halves of its rule table specified by the ECA rules, and an additional "vertical" or "global" bit that affects all cells on a given step, fed in by an outside sequence.

There are 65536 such basic ICA rules. You can readily generalize this to any number of rules - effectively allowing the external bit stream to have any number of colors. Then you can treat the interaction condition - the sequence of global bits used - as a kind of separate initial condition.

In programming terms, instead of a NestList of repeated application of the same function to its prior results, you have a FoldList that feeds in an external variable at each step, as well.
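That FoldList idea might be sketched in Python like this (a hypothetical analogue; the helper names are mine): the external bit stream selects which rule applies at each step, and a periodic 0,1,0,1,... interaction condition reproduces the alternating rule 110/124 automaton.

```python
# Hypothetical FoldList-style sketch: the interaction condition is an
# external bit stream, folded in one bit per step to select the rule.

def eca_step(cells, rule):
    """One step of an elementary CA with periodic boundary conditions."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def ica_fold(initial, interaction, rules=(110, 124)):
    """FoldList-style evolution: each external bit picks that step's rule."""
    history = [list(initial)]
    for bit in interaction:
        history.append(eca_step(history[-1], rules[bit]))
    return history

init = [0] * 41
init[20] = 1
# A periodic interaction condition 0,1,0,1,... gives the alternating CA.
rows = ica_fold(init, [t % 2 for t in range(30)])
```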

There are a number of threads on the forum about ICAs and code for running them. In general they give the same classes of behavior as ECAs, but also many finer gradations between them. In particular, the ability of the interaction condition to "nudge around" even a class 2 behavior makes it easier to mix class 3 or 4 rules with something stabler.

You get a much finer sense of gradation between behaviors as you tune the portion of the interaction condition applying one rule vs. the other. For some mixes the interaction condition is obvious, though - for example, it makes alternating bands in a pattern that otherwise lacks them.

You can find cases where the ICA cross of two class 2 rules is complex. That is the largest change I've seen in them, in rule-class terms. It arises because many class 2 rules have enough going on to support complexity, but don't show it "naturally" because they map earlier states to states that lack the patterns needed to trigger their more complicated behaviors.

That shows up as short transients where more seems to be going on when started from random initial conditions - but it then dies out. If, within the period of time the transients typically last, you switch to a different rule, you can send the next state to something that still has transient effects. So interleaving two such class 2 rules can sometimes give class-3-looking randomness.

You can also get a lot more stuff that looks 4-ish from mixing stability via a class 2 rule with more complicated behavior from a 3 or 4. You get local structures from the class 2 steps, but they are altered in complicated ways by the interleaved class 3 or 4 steps. As you tune the portion of 3 or 4 steps lower, you get something that looks class 2 stable but can be nudged around in minor ways.

There is little doubt that many ICAs are universal. Obviously they are, trivially, if one of the component rules is.

When the interaction condition is constrained to be exactly periodic with period 2, though, you won't reach the full range of ICA behaviors possible with a general, unrestricted pattern of application of the first or second rule. Such rapid switching tends to make class 3 behavior more common.

ICAs were a generalization of the GCA idea, which also switches the rule on each step, but gets the bit to use from some function over the system's own previous state. The G there stands for "global". The GCA set up was dreamed up by Wolfram and Paul Davies over dinner a few years ago, to explore different sorts of "emergence".

The ICA set up was invented at the 2004 NKS summer school at Brown, in part to explore claims by Peter Wegner that interactive computation reaches fundamentally beyond the space of classical computables. ICAs are useful, but I believe that extreme claim for them is inaccurate. The set up has precursors in the "non-deterministic Turing machine" idea, which Wolfram discusses in NKS but which was doubtless around before.

For every sequence of switches a GCA switch function might give rise to, there is an ICA sequence giving the same behavior externally. Since a GCA is constrained to be a single-valued function of the system's previous state, the reverse is not true, and ICAs are a strictly larger space than GCAs. To see this, notice that a GCA must cycle if it hits the exact same state again, while an ICA might get an external 0 the first time around and an external 1 the next, and so fail to cycle.
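The distinction can be sketched in code (a hypothetical minimal version; the parity switch function and the rule pair are my own illustrative choices): a GCA derives its global bit from the previous state, so identical states always produce identical successors.

```python
# Hypothetical GCA sketch: the switching bit is a single-valued function
# of the previous state (here, parity of the black-cell count), rather
# than an arbitrary external stream as in an ICA.

def eca_step(cells, rule):
    """One step of an elementary CA with periodic boundary conditions."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def gca_run(initial, steps, rules=(110, 124)):
    """GCA evolution: the global bit is computed from the state itself."""
    row = list(initial)
    history = [row]
    for _ in range(steps):
        bit = sum(row) % 2  # global bit derived from the current state
        row = eca_step(row, rules[bit])
        history.append(row)
    return history

init = [0] * 41
init[20] = 1
states = gca_run(init, 50)
```

Because the bit is a function of the state, any repeated state forces the whole evolution into a cycle, which an ICA's external stream can avoid.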

GCAs are still a natural model for some systems, and ICAs are a quite flexible set up and a natural model for many others. The search spaces are larger because the dependence on the externals is more involved and there are more cross rules, rising as fast as increasing neighborhood size would.

You can also generalize the idea to have a different rule applied at each (position, time) step in the evolution. This makes the interaction condition a whole array rather than a list of bits. A standard ICA then corresponds to feeding in uniform steps {0,0,0,...,0} then {1,1,1,...,1} etc., and a standard CA to feeding in {0,0,0,...,0} at every step.

You can use that set up to implement a probabilistic CA as well, with all the "coin tosses" effectively "pulled out" and "pre-generated". Naturally it is easier to implement the latter with a random call inside the step function, but the formal ability to separate them shows a direct correspondence between generalized ICAs and nondeterministic rules.
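As a sketch of that separation (names and the rule pair are illustrative, not from the thread), a generalized ICA can take a pre-generated (time x position) array of coin tosses, making the whole evolution a deterministic function of the initial condition plus the interaction array:

```python
import random

# Hypothetical sketch: a generalized ICA whose interaction condition is a
# whole (time x position) array. Each cell's rule at each step is chosen
# by its own pre-drawn bit, so the evolution is deterministic given the
# initial condition and the toss array.

def eca_cell(left, center, right, rule):
    return (rule >> (left * 4 + center * 2 + right)) & 1

def generalized_ica(initial, interaction_array, rules=(110, 124)):
    row = list(initial)
    n = len(row)
    history = [row]
    for toss_row in interaction_array:
        row = [
            eca_cell(row[(i - 1) % n], row[i], row[(i + 1) % n], rules[toss_row[i]])
            for i in range(n)
        ]
        history.append(row)
    return history

rng = random.Random(0)  # fixed seed: the "coin tosses" are pre-generated
tosses = [[rng.randint(0, 1) for _ in range(41)] for _ in range(30)]
init = [0] * 41
init[20] = 1
rows = generalized_ica(init, tosses)
```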

I hope this helps.


06-10-2006 03:38 PM

Registered: Jan 2004
Posts: 357

Thanks, Jason.

I see from the threads you directed me to that Alastair Hewitt posted this very same ICA on March 8, 2005.

You mentioned summer school. Are there proceedings published?

By the way, I wish you well in your NKS 2006 presentation next week!



06-10-2006 07:14 PM
Alastair Hewitt
Harvard Extension School
Cambridge, MA

Registered: Feb 2004
Posts: 35

Hi Lawrence,

I've been doing some investigations into ICAs for the last year or so. I've been looking for CA behavior that will essentially maintain a fairly consistent "band of randomness" that neither grows nor shrinks. This ICA is about as close as I've got with a simple set of alternating rules (this one uses a sequence of 40 rules).

From a basic understanding of entropy, I believe it is fundamentally impossible to achieve a non-repeating pattern that neither grows nor shrinks. However, the closer you get to this goal, the more interesting the internal behavior of the band of randomness. I've been experimenting with alternating a pair of rule sequences (one that grows the randomness and one that shrinks it) to arrive at a constant band.

This can be done using basic data-compression code to measure the amount of randomness in the current CA rows. Sending the binary data from the last few rows through a compression algorithm produces a compressed output, and the ratio between the size of the input data and the output data is a measurement of the randomness (more precisely, the predictability). When the randomness grows, the rule sequence that shrinks the randomness is chosen, and vice versa when the randomness shrinks.
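The feedback loop might be sketched like this (my own minimal illustration, not Alastair's Java code; the function names and the use of zlib's default deflate are my assumptions):

```python
import random
import zlib

# Hypothetical sketch of compression-ratio feedback: measure how poorly
# the recent rows compress, then pick whichever rule sequence pushes the
# measure back toward a target.

def compression_ratio(rows):
    """Compressed size over raw size; higher means less predictable."""
    data = bytes(cell for row in rows for cell in row)
    return len(zlib.compress(data)) / len(data)

def choose_sequence(recent_rows, target, grow_seq, shrink_seq):
    """Below the randomness target, pick the sequence that grows
    randomness; at or above it, pick the one that shrinks it."""
    return grow_seq if compression_ratio(recent_rows) < target else shrink_seq

rng = random.Random(1)
noisy = [[rng.randint(0, 1) for _ in range(64)] for _ in range(8)]
flat = [[0] * 64 for _ in range(8)]
```

A flat band compresses far better than a noisy one, so the ratio gives a usable (if compressor-relative) randomness signal.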

I was hoping to have some interesting results for this year's conference, but I still need to do some more work on it. If you imagine trying to herd cats, then you'll have a good idea of what I'm up against! One big problem is trying to "connect" the two rule sequences; they tend to have very different periodicities (even with the same number of rules). Another issue is that the randomness measure is really a measure of how unpredictable the pattern is relative to the compression algorithm being used. A process of "evolution" in the ICA pattern can ultimately expose the internal behavior of the compression method and make the whole system unstable.

Did you see this thread? Peter Wegner's work is well worth checking out (even if his conclusions are a bit of a stretch).

Alastair


06-12-2006 05:02 PM

Registered: Jan 2004
Posts: 357

Alastair,

What data compression routine do you use?



06-13-2006 03:45 AM
Alastair Hewitt
Harvard Extension School
Cambridge, MA

Registered: Feb 2004
Posts: 35

Most of what I do is in Java, so I use the standard Deflater/Inflater in java.util.zip. These in turn use the open-source ZLIB library (also used in the PNG image standard). However, to keep things simple I set the strategy to HUFFMAN_ONLY, which just does Huffman coding. You still get a good method for measuring predictability without introducing too much additional complexity.
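For comparison, Python's zlib module exposes the same ZLIB strategy option, so an equivalent Huffman-only measure can be sketched as follows (my own analogue, not the poster's Java code):

```python
import zlib

# Hypothetical Python analogue of the Java setup: zlib's Z_HUFFMAN_ONLY
# strategy corresponds to Deflater's HUFFMAN_ONLY, disabling LZ77
# string matching and leaving just the Huffman coding stage.

def huffman_ratio(data: bytes) -> float:
    """Compressed size over raw size using Huffman coding only."""
    comp = zlib.compressobj(9, zlib.DEFLATED, zlib.MAX_WBITS, 9,
                            zlib.Z_HUFFMAN_ONLY)
    out = comp.compress(data) + comp.flush()
    return len(out) / len(data)
```

Highly repetitive data still shrinks a lot (one symbol gets a short code), while data with many equally frequent symbols barely compresses at all.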

There's a comprehensive section on compression (including Huffman coding) in NKS.

Alastair


06-13-2006 12:32 PM

Registered: Jan 2004
Posts: 357

Defining Complexity

Ian,

Your suggestion that the classes should be ordered {1, 2, 4, 3} reminded me of a thread on the forum in which Tony Smith, from Australia, expressed gratitude over newfound interest in research on the edge of chaos.

You can find his remarks in the thread “The rise and fall of Langton’s Lambda”. What is interesting is the link there, in which he presents the same ordering of the four classes as you do.

I agree with you that the behavior of the ICA with rules 110 and 124 does, to use Jason Cawley’s terminology, “nudge” over to random class 3.

However, notice how the NKS book addresses the question of complexity in this regard on pages 557-558. If I read it correctly, it is saying that class 4 is more complex than class 3.

Also on page 1069, discussing complexity theory in the 1970's, Wolfram says:

"It was noted that both very ordered and very disordered systems normally seem to be of low complexity, and much was made of the observation that systems on the border between these extremes -- particularly class 4 cellular automata -- seem to have higher complexity."



06-16-2006 08:50 PM
IanLuxmoore

NEW ZEALAND

Registered: Apr 2006
Posts: 10

It is a tricky one, lol - I think there is some ambiguity with regard to terminology. "Complexity" implies something that "complex" doesn't. Complexity has almost by default taken on the meaning of stuff that is between order and chaos, thanks to books like Waldrop's Complexity. However, it really depends on your point of view as to how "complex" something is.

Class three is much more "complex" than class four if you are trying to describe the results to someone in detail.

Class four is much more "complex" than class three in terms of what you can do with it.


06-17-2006 05:05 AM
