NKS Implications for Practical Applications
I'm interested in the perspective of NKS experts, ideally Stephen Wolfram himself, on the following analysis. I am an AI researcher with experience in automated search, so my comments come from that context.
A fundamental insight of NKS is that a very simple structure (i.e. a simple set of rules) can produce unbounded complexity. This insight is characterized as exciting for science and engineering because it means we might be able to "mine the computational universe" for these simple structures, and from them derive wonderful and complex processes and artifacts that would otherwise be impossible to discover or design.
Yet based on my knowledge of search, I would draw the opposite conclusion. The NKS insight, which is genuinely novel and interesting, seems to actually have negative implications rather than the positive ones touted.
The main reason is that "mine the computational universe" is another way of saying "search." That is, if there are indeed very useful simple structures out there that produce complexity, then our next task is to find them. Finding a particular instance among many possibilities means search.
It seems that the message of NKS is that somehow we are now at an advantage in this search by virtue of the fact that what we need to search for is smaller than we might have expected. In other words, in the parlance of search, the number of dimensions in the search space may be much smaller than we'd imagined. And that is supposedly good because that makes search easier.
Let's look at CAs for illustration, without loss of generality. Wolfram points out that a few of the 256 elementary CA rules (each defined by an 8-bit lookup table) have the special property of genuine complexity. Rule 30 is an example (a minimal simulation sketch follows the list below). However, there are two very important facts that are left out:
1) Most of the 256 rulesets of this size do _not_ have this property and thus are not like rule 30.
2) The particular complex system codified in rule 30 is only _one_ such system, and likely not the one we are looking for. For example, we may be looking for a building architecture or a musical piece. Even though rule 30 is interesting, it is probably not the rule that generates exactly what we want.
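To make this concrete, here is a minimal sketch in Python of running an elementary CA from its rule number. Nothing in it is specific to rule 30: the rule number's 8 bits are simply the lookup table over the 8 possible (left, center, right) neighborhoods, and every one of the 256 rules is simulated the same way.

def step(cells, rule):
    # Each new cell is the rule-number bit indexed by its 3-cell neighborhood.
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

def run(rule, width=63, steps=30):
    row = [0] * width
    row[width // 2] = 1  # single seed cell in the middle
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

for row in run(30):
    print("".join(".#"[c] for c in row))

Printing rule 30 this way shows its familiar chaotic triangle; printing most of the other 255 rules shows trivially repetitive or dying patterns instead, which is fact #1 in action.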
Fact #2 is not an indictment of NKS: it simply means that we must search for the rule that actually gives us what we want. Hence the importance of search, or "mining." However, fact #1 is highly concerning because it means that there are massive discontinuities in the search space.
In general, an effective search requires the search space to be roughly correlated and orderly. In other words, if I shift one bit in my ruleset, ideally the CA that results is discernibly related to the one I had before. That's how search works: it's based on the assumption that taking one step doesn't land me in another universe.
However, it appears that in the world of interesting CAs, in fact taking one step _does_ land me in another universe. This is very bad news for search, because it means you cannot search this space systematically.
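That discontinuity is easy to see with the simulator above. As a crude complexity proxy (assumed purely for illustration), score a rule by the fraction of distinct rows in its evolution, then compare a rule against its 8 one-bit neighbors in rule-table space:

def score(rule, width=63, steps=100):
    # Crude complexity proxy: fraction of distinct rows in the evolution.
    history = run(rule, width, steps)
    return len({tuple(r) for r in history}) / len(history)

base = 30
print(base, score(base))
for bit in range(8):
    neighbor = base ^ (1 << bit)  # flip one bit of the 8-bit lookup table
    print(neighbor, score(neighbor))

The proxy is rough, but the pattern it exposes is the point: rule 30's chaotic evolution keeps nearly every row distinct, while several of its one-bit neighbors settle into short cycles or simple nested growth. One bit flipped, and the behavior class changes entirely.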
The fact that the ruleset that produces rule 30 is small does not help. The cost in search comes mostly from _evaluation_ rather than from the dimensionality of the ruleset. That is, in order to test any rule, I must run the CA and then convert whatever results into a simulation of the substrate I'm interested in. For example, if I am interested in building architectures, then I have to convert a run of rule 30 into its corresponding building architecture and then run a series of tests for whatever criteria I care about. All that running, conversion, and testing means that for any candidate ruleset I check, I incur a significant computational expense, regardless of the size of the ruleset.
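In sketch form, the per-candidate pipeline looks like this. decode_to_architecture and meets_criteria are hypothetical names standing in for whatever domain-specific conversion and testing the application needs; they are stubbed here only so the sketch runs.

def decode_to_architecture(history):
    return history  # placeholder: the real conversion is domain-specific

def meets_criteria(candidate):
    return False    # placeholder: the real tests are domain-specific

def evaluate(rule):
    history = run(rule, width=201, steps=1000)   # 1) run the CA: already costly
    candidate = decode_to_architecture(history)  # 2) convert into the target substrate
    return meets_criteria(candidate)             # 3) test against the design criteria

Every candidate pays for all three steps, and shrinking the ruleset shrinks none of them.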
Well, that's not particularly surprising, because that's exactly why people run search algorithms: they want to minimize the number of candidate solutions that must be checked.
Yet discontinuity in the search space implies that you _can't_ minimize the number of solutions you check because there is no meaningful relationship among neighboring CA rulesets.
In fact, the whole premise of NKS, i.e. that there are these diamonds in the rough, is actually an indictment of searching through such a space. In a good search space, the complexity of rulesets of the same size would be correlated, not wildly unrelated. When you moved up to a larger ruleset, you would get a commensurate boost in the complexity of the patterns that are generated. That would be a correlated landscape, the kind that is searchable and makes sense.
However, here Wolfram is saying: look, we have a completely uncorrelated landscape where, among the very smallest rulesets, there are totally idiosyncratic jewels that bear no relation to their neighbors, and no relation to the size of the ruleset.
The only way to search such a landscape is by pure random coin flips. There's just nothing else to go on. So that means if you have 256 rules and one of them is what you want, you will have to evaluate 128 rules on average!
That doesn't sound so bad, but imagine a space of 7 trillion rules. For example, Stephen Wolfram mentions that there are about 7 trillion such CAs with 3-color rules. So let's say that among those 7 trillion is exactly the rule we want, the one that generates some amazing solution to a particular problem. Then, if blind search is the only feasible approach, we will end up evaluating about 3.5 trillion solutions before we find it. For any nontrivial problem, that will take until the end of the universe.
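The back-of-envelope arithmetic, assuming (very generously) one second per full evaluation:

space_size = 7_000_000_000_000      # ~7 trillion 3-color rules
expected_evals = space_size / 2     # blind uniform search, one target rule
seconds_per_eval = 1.0              # assumed cost of run + convert + test
years = expected_evals * seconds_per_eval / (3600 * 24 * 365)
print(f"{expected_evals:.1e} evaluations, about {years:,.0f} years")

That comes to roughly 110,000 years even at one evaluation per second, and a realistic evaluation of a nontrivial candidate is far slower than a second.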
So I don't see how this can be useful in a practical sense. What would make more sense is if you could somehow generate more complex systems by adding more rules, i.e. by expanding the rule table from 8 entries to 16, or something like that. Then, if the new, higher-dimensional space had a relationship to the old lower-dimensional space, you would be able to leverage what you learned in the 8-rule search in order to step up to a new level of complexity. That's a logical, feasible search space with a meaningful heuristic. That's also how complexity is generated in most well-understood systems: you start simple and build up from there.
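Here is a sketch of what that feasible search would look like, assuming the correlated landscape the argument wishes for. The embedding convention is an assumption for illustration, and score() is assumed generalized from the proxy above to rule tables of either size.

import random

def embed(rule, bits=8):
    # Embed an 8-entry rule table into a 16-entry one by adding a neighborhood
    # cell that is initially ignored: old entry i covers new entries 2i and 2i+1.
    bigger = 0
    for i in range(bits):
        if (rule >> i) & 1:
            bigger |= 0b11 << (2 * i)
    return bigger

def hill_climb(rule, bits, steps=200):
    # Local search over one-bit neighbors; only sensible if neighbors behave similarly.
    for _ in range(steps):
        neighbor = rule ^ (1 << random.randrange(bits))
        if score(neighbor) >= score(rule):
            rule = neighbor
    return rule

seed = hill_climb(30, bits=8)              # refine in the small space
best = hill_climb(embed(seed), bits=16)    # warm-start the doubled space

On the uncorrelated landscape NKS actually describes, both calls degenerate into random sampling, which is the whole complaint.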
However, what NKS is claiming is that the dimensionality of the ruleset is uncorrelated with the complexity of the solution, AND that the complexity of the solution is uncorrelated among neighboring rulesets of the same dimensionality. In other words, the complex solutions are essentially randomly distributed throughout the spaces of rulesets (plural, because there are spaces of different dimensionality).
So it seems like this all boils down to something perhaps interesting, but not useful from a practical perspective. All it says is that if you blindly try out several trillion random numbers, one of them will win the lottery. Granted, the one that wins the lottery is in some sense incredible, but no principled method has been provided to reach it. On the contrary, evidence has been provided that there is likely NOT such a method (because the space is uncorrelated). Therefore, the implication is that there is no practical way to leverage this insight. All we have learned is that magic is out there, though we will likely never find it.