[Review article - NKS and social impact] - A New Kind of Science: The NKS Forum


# Review article - NKS and social impact

Posted by: Jason Cawley

(Newspaper article in the Vancouver Republic by Kevin Potvin)

Abstract -

Stephen Wolfram's A New Kind of Science has still not been looked at critically or historically. It should be: it just might be as important a book as the author annoyingly says it is.

The whole thing can be found here.

http://www.republic-news.org/archiv...in_politics.htm

A brief review of the article from me follows, in the next post.

Posted by: Jason Cawley

While I think the writer is right to raise the subject, and that in the long run NKS may have a significant impact on social thought and through it on social realities, there are a number of conceptual problems in the article that prevent him from seeing which way such tendencies are likely to go.

First, he treats randomness and equations as opposites. But there are equations that tell us how randomness typically behaves - they are called mathematical statistics. Gaussians occur, for example, when randomness of some characteristic and limited scale accumulates in an uncorrelated, serial fashion. Perfect randomness has a certain simplicity to it, in other words, that equations can capture. True, they typically involve idealizations. And as the size of samples falls, the math itself tells us that the results can be farther and farther from the overall infinite-limit prediction. Statistical math underspecifies small discrete cases - it says in effect "we can't and won't know until we see".
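The point can be sketched in a few lines of Python (my own illustration, not anything from the article or the book): averages of uncorrelated random steps behave in the orderly way the statistics predicts, and small samples stray farther from the infinite-limit prediction than large ones.

```python
import random
import statistics

random.seed(0)

def sample_mean(n):
    # Average of n uncorrelated +/-1 steps: for large n this is
    # approximately Gaussian with standard deviation 1/sqrt(n).
    return sum(random.choice([-1, 1]) for _ in range(n)) / n

small = [sample_mean(10) for _ in range(1000)]     # small discrete samples
large = [sample_mean(10000) for _ in range(1000)]  # close to the ideal limit

# Small samples scatter far from the infinite-limit prediction of 0;
# large ones hug it, just as the statistical math says.
print(round(statistics.stdev(small), 2))   # near 1/sqrt(10), about 0.32
print(round(statistics.stdev(large), 2))   # near 1/sqrt(10000), about 0.01
```

Nothing here is specific to NKS; it is exactly the kind of orderliness-from-uncoordinated-randomness that traditional statistics has always exploited.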

What is more interesting about complexity is that there often are correlations among micro elements rather than pure independence. Even where compressed mappings - like averages - do show effective overall independence or something close to it, micro ensembles may not. Statistics works by treating details of arrangements as inessential, by abstracting them away. When a system is actually complex rather than merely random, the particular micro arrangement matters.

In layman's terms, you can't understand programming by taking your computer's temperature. The temperature is an average over a whole ensemble of microstates, and if all you want to know is the pressure of a gas, that average is all you need. Randomness is in fact what allows the average, and only the average, to remain significant. The states of a computer's memory chip, on the other hand, are complicated, not just random. The detailed arrangement matters, not just an overall average. Because the role of each bit is functionally slightly different, and the bits are correlated by whatever rule the computer is operating under, they don't just add up or readily cancel.

This is not adequately described as "random" rather than "equational". It is other than equational, yes. But not because it is random. Rather, the details of the micro arrangement can matter to the overall behavior. We say the system is programmable. By that we mean that slight tweaks to some initial condition can make the overall system do arbitrarily different things, not on the level of averages but on the level of particular arrangements. It is flexible rather than merely random. NKS hypothesizes that most of the apparent randomness we see is in principle flexible in this way. We just typically do not see how to control the micro ensembles, so we track overall averages, which are easier to grab onto. That is a limitation of our analysis, not of the internal dynamics of the system.
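A toy construction in the spirit of NKS's elementary cellular automata makes the point concrete (my own sketch, not a model from the article): two initial conditions with exactly the same average content, evolved under the same simple rule, end up in different states. The arrangement, not the average, drives the outcome.

```python
def step(cells, rule):
    # One step of an elementary cellular automaton (NKS-style numbering),
    # with periodic boundary conditions.
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

def evolve(cells, rule, steps):
    for _ in range(steps):
        cells = step(cells, rule)
    return cells

# Two initial conditions with the SAME average (three 1s in 32 cells)
# but different micro arrangements: a block versus scattered cells.
a = [0] * 32
a[5] = a[6] = a[7] = 1
b = [0] * 32
b[5] = b[15] = b[25] = 1

ra = evolve(a, 110, 40)  # rule 110, the classic complex NKS rule
rb = evolve(b, 110, 40)

print("same state after 40 steps?", ra == rb)
```

A thermometer-style summary (the density of 1s) cannot distinguish the two starting points, yet the rule's correlated micro-dynamics take them to different places.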

All of that addresses his notion that randomness is something specific to NKS, or something OKS and traditional math could not deal with. They dealt with it, but by averaging random results - they used the overall orderliness resulting from a perfect lack of coordination to predict average outcomes. NKS suggests not that randomness occurs - something OKS fully recognized - but that details of particular ensembles can matter, that details are not in general uncorrelated or uncoordinated, and that averaging throws away relevant information.

At a later point in the article he speaks of what he imagines would be new in an attitude towards economics that embraced randomness, and describes it as "maximizing the use of resources". He doesn't notice that this is OKS thinking. Maximizing is locating the zero of a derivative; it relies on fixed points in continuous maps. And the underlying motivation is an implicitly linear model: if some resources aren't being utilized, then presumably any adjustment that uses more of them will produce better outcomes, because output is presumed to be a linear function of inputs.
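The hidden linearity is easy to exhibit with a pair of toy production functions (my own made-up numbers, nothing from the article): under a linear model, putting an idle resource to use always helps; add even a simple interaction term between inputs and the same move can make things worse.

```python
def linear_output(x, y):
    # Implicit OKS-style assumption: output is linear in inputs,
    # so more of any input is always better.
    return 3 * x + 2 * y

def interacting_output(x, y):
    # Toy non-linear alternative: the inputs interfere with each other
    # beyond a point (the -0.7*x*y cross term is invented for illustration).
    return 3 * x + 2 * y - 0.7 * (x * y)

base = (4, 1)
more = (4, 3)  # put more of the second, "underutilized" resource to use

print(linear_output(*more) - linear_output(*base))         # positive: always an improvement
print(interacting_output(*more) - interacting_output(*base))  # negative: more input, less output
```

The point is not that economies look like either formula, but that "use more of what's idle" is only automatically an improvement under the linear presumption.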

Where is attention to micro ensembles supposed to enter? He doesn't seem to know. The classical economics answer is "by equalization of marginals" - the derivative of value with respect to every varying input is to be equalized across all concrete alternatives for use of every resource. That is supposed to specify a unique micro ensemble as the best possible coordination among competing uses. In reality, all OKS economic theory can say is that feedbacks should naturally appear that may act as forces toward such an allocation whenever the present one is far from it (in the form of arbitrages or profitable trades etc).
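The "equalization of marginals" idea, and the arbitrage-like feedback that is supposed to drive it, can be sketched numerically (my own toy value functions, sqrt(x) and 2*sqrt(R-x), chosen for illustration and not drawn from the article):

```python
import math

R = 10.0  # fixed resource, split between use A (gets x) and use B (gets R - x)
x = 1.0   # start far from the optimal allocation

for _ in range(5000):
    marginal_a = 0.5 / math.sqrt(x)       # marginal value of A, d/dx sqrt(x)
    marginal_b = 1.0 / math.sqrt(R - x)   # marginal value of B, from 2*sqrt(R-x)
    # Arbitrage-like feedback: shift resource toward whichever use
    # currently has the higher marginal value.
    x += 0.1 * (marginal_a - marginal_b)

# The process settles where the marginals are equal - here at x = 2,
# since 0.5/sqrt(2) equals 1/sqrt(8).
print(round(x, 3))
```

Note what the sketch takes for granted: smooth, concave, fully known value functions and a single feedback that always pushes the right way. That is exactly the exquisitely tuned micro picture the next paragraph questions.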

On the macro level, OKS econ assumes details do not matter and so it averages, counting on randomness and lack of micro-correlation to produce statistics. On a micro level, OKS econ assumes the details not only matter but are tuned so exquisitely that every opportunity for improvement is wrung out of the microstate by hyperaware maximizers alert to every conceivable direct or indirect substitution. All competent economists know that neither is true, that neither reflects the actual calculations or mental models of practitioners, or captures real economic dynamics as the process of using limited information to arrive at a particular coordination. But they lack the formal methods to deal with the more intricate case.

NKS suggests that the proper formal method is to model economic systems as ensembles of simple programs, which when aggregated may be programmable rather than merely random. Economists who have hitherto mostly taken the economy's temperature may instead get some insight into which bits do what on the circuit board. This is not the same as accepting randomness. It is realizing that particulars of microstates can matter, that the system is flexible rather than merely random. Yes, predicting what flexible systems do can be hard, and the writer may think that is what "random" implies. But it is not so - randomness is in fact one thing OKS exploits to predict - a form of order on a meta-level - by restricting its attention to overall averages and the like.

He also tosses in some thoughts about the role of uncertainty that seem to me rather too pat. Basically he is trying to paint OKS and math as the empire of self-assured certainty, which is inaccurate and a slander, and to ascribe all possible virtues to uncertainty. This is a prejudice of philosophical skepticism, which thinks little or nothing can be known but imagines the world is populated by hidebound non-skeptics who refuse to admit it. NKS does not say "nothing can be known"; it says something more interesting than that. It says "here you can know deductively, and there you can experiment or manipulate but not know beforehand everything the experiment might do", with a definite line drawn between the two. This is not a claim about the non-existence of knowledge - quite the contrary.