Wolfram Science Group
Phoenix, AZ USA
Registered: Aug 2003
If I only had ten minutes with someone unacquainted with the subject, I don't think I'd try to explain what a CA is or how it works. I'd stay more general than that.
The successes of modern science have largely been built on mathematics - on finding relationships in nature that follow the various forms described by mathematical functions and equations. Many problems can be addressed directly this way. For problems involving large numbers of interacting elements, we can extend the same methods by focusing only on the overall properties we are interested in - averages and other statistical measures - provided the relationships among the elements of the system are simple enough. We leave out a lot of detail this way, but get to keep using mathematical techniques. In some systems, though, those details matter even for the eventual overall behavior, and traditional mathematical methods fail to crack them.
Wolfram looked for generalizations of mathematics that might get further with such cases in mind. The idea is to use computer programs, rather than mathematical equations, to describe the formal relations within a system. Broadly, this is a familiar idea from computer modeling. Wolfram then asked: how simple can these programs be and still generate the various sorts of behavior we see in natural systems? Do they need to be highly involved and specific? Math was useful because the same simple forms - a line, say, or a repeating wave - were seen over and over in system after system. Simple things tend to be general things, and what we know about them from other cases, and formally, can be applied in new places too. Can programs simple enough for the same to be true of them generate complex behavior like we see in nature, especially the sort traditional math wasn't letting us crack?
Rather than starting with very complicated computer models and trying to make them marginally simpler, Wolfram started at the other end. He took some of the simplest possible computer programs, enumerated whole types or classes of them, and then looked through those classes to see what sorts of behavior they could generate - much as one would look through conic sections, trigonometric functions, or the simplest differential equations, expecting to find patterns general enough to occur in a wide range of real systems.
He thought he'd have to dial up the allowed complexity of these classes of programs to get any interesting new behavior. He found, instead, that even the simplest full class of cases he looked at - a class of rules called cellular automata - already included behaviors as complicated as anything he had seen. This might have been something special about the first rules he looked at. So he examined a large range of other rules, each differing in setup and details, dropping feature after feature of the original class. Sometimes he had to allow a few additional relations among elements, but he found the same variety of behaviors again.
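To make the class of rules concrete: an elementary cellular automaton updates a row of black/white cells, where each cell's new color depends only on its own previous color and those of its two immediate neighbors. A minimal sketch (not from the original post; rule numbers follow Wolfram's standard scheme, so `rule=30` is the rule 30 discussed in NKS):

```python
# Minimal elementary cellular automaton, purely illustrative.
# Each cell is 0 (white) or 1 (black); the rule number's binary digits
# give the new cell value for each of the 8 possible neighborhoods.

def step(cells, rule):
    """Apply one step of an elementary CA rule to a row of 0/1 cells."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[(i - 1) % n]    # periodic boundary: wrap at the edges
        center = cells[i]
        right = cells[(i + 1) % n]
        # The three neighbors form a 3-bit index into the rule number.
        idx = (left << 2) | (center << 1) | right
        out.append((rule >> idx) & 1)
    return out

def run(rule, width=31, steps=15):
    """Evolve from a single black cell; return the list of rows."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        rows.append(cells)
    return rows

if __name__ == "__main__":
    for row in run(30):
        print("".join("#" if c else "." for c in row))
```

Printing the rows of rule 30 this way shows the seemingly random triangular pattern that became one of Wolfram's central examples.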
The range of behaviors reached by simple programs thus has two surprising characteristics, found empirically in computer experiments. One, even very simple programs can behave in surprisingly complicated ways. And two, the kinds of behavior they produce are remarkably stable across changes in the underlying rules, and don't seem to depend on those details in any fundamental way. Any sufficiently involved class of simple programs can generate the same broad categories of behavior - evolution to a fixed point, stable cycles, nesting, seemingly random complexity, and complicated interaction of localized structures - for some choice of particular rule. The details of the patterns are sometimes also dependent on the initial conditions, and sometimes largely insensitive to them.
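These categories can be seen empirically by running different elementary rules side by side. A small self-contained sketch (again not from the original post; the rule-to-category pairing - 250 repetitive, 90 nested, 30 seemingly random, 110 localized structures - is the standard one from NKS):

```python
# Run several elementary CA rules from a single black cell and print the
# resulting patterns. The category labels are the standard NKS
# classifications for these particular rules.

def step(cells, rule):
    """One update of an elementary CA (periodic boundary)."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def pattern(rule, width=31, steps=8):
    """Evolve from a single black cell; return the list of rows."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        rows.append(cells)
    return rows

EXAMPLES = {
    250: "repetitive (simple uniform growth)",
    90: "nested (Sierpinski-like triangles)",
    30: "seemingly random",
    110: "localized structures interacting",
}

if __name__ == "__main__":
    for rule, category in EXAMPLES.items():
        print(f"rule {rule}: {category}")
        for row in pattern(rule):
            print("".join("#" if c else "." for c in row))
        print()
```

Even in a few printed rows the difference is visible: rule 250 grows a plain checkerboard wedge, rule 90 builds nested triangles, and rule 30 looks irregular all the way down.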
The idea is then to use simple programs to model behaviors in nature that show similar overall patterns - including some that have proved hard to address with standard mathematics. Wolfram showed a number of simple examples of this: in crystals, in fluid flows, in biological forms and pigmentation patterns. And he is convinced it is much more general than those - a basic way systems, natural or formal, can and do behave. Perhaps including fundamental physics.
Wolfram then asked, in effect, why this should be so: what underlying formal facts might account for it? What is it that even simple programs are already able to do that allows them to generate such complicated behaviors, and that makes the same sorts of behavior crop up again and again in system after system? What do they all have in common, despite their detailed differences? Wolfram ties this back to significant formal results of the 20th century about universality in formal systems - results that came out of investigations into the foundations of mathematics, and in turn helped make the computer revolution possible.
Universality is what lets a practical computer with a single hardware setup emulate all kinds of behavior by varying its program or software. The program corresponds to the initial conditions, and the hardware corresponds to a particular rule. A rule qualifies as universal if there is some way of programming it - that is, some set of initial conditions - that gets it to produce the output of any other computable system. What Wolfram found is that even programs not explicitly set up to have the property of universality do in fact have it. And not only highly complicated ones unlikely to occur anywhere but in systems we engineer - as was already known - but even the simplest classes of rules. So simple, we can readily imagine natural systems behaving according to them.
So Wolfram speculates that the reason we see the same complex behavior in so many classes of system is that they are all past the threshold of universality. They are systems of equivalent internal sophistication, as it were. You can't limit what they are going to do beforehand without putting constraints on their initial conditions, because with full choice of those you could program them to behave now one way, now another. He thinks the variety and complexity we see in natural systems reflects the same formal variety he saw among simple programs. And that even some of the simplest of these already have enough going on that they could account for the most sophisticated things we see.
If this is right, it would provide a basis for a significant extension of science: from systems behaving according to simple mathematical relations, as we've used in the past, to systems behaving according to simple programs. And this extension would take us all the way to the most involved behaviors we see, without needing programs with enormously complicated, special setups or thousands to millions of lines of special code. Making this happen requires a pure study of computer programs, parallel to the study of pure mathematics in the past. And an experimental attitude toward what simple programs can do - since it can be arbitrarily hard to tell what even a simple program will do just by looking at its rules.
The gist, then, is that computer programs can go farther than traditional math as a formal basis for science, and one can get a lot just from the simple ones. The new "pure mathematics" of a future science based on this idea will be the formal study of the behavior of simple programs. The new "applied science" of this idea will be using simple programs to model all sorts of natural systems - and potentially also to engineer all sorts of technological systems.
We didn't invent computing. For a few decades we thought so, because the only systems we knew of that did it were systems we specifically engineered to do it reliably, on set tasks, and in ways predictable to us. But then we learned a bit more about computing, including how simple it can get and still meaningfully be called that. And we noticed that nature has been computing - if seemingly arbitrary things - all along (just as planets had been moving in ellipses long before Kepler noticed). So in a sense we find, retroactively, that we discovered rather than invented computing, or that we have just adapted a natural phenomenon to our purposes. With NKS, we have discovered the correspondence between what we thought we had invented not long ago and what nature has been doing all along.