A New Kind of Science: The NKS Forum : Powered by vBulletin version 2.3.0 A New Kind of Science: The NKS Forum > NKS Way of Thinking > Computer architecture
Kovas Boguta
Wolfram Science Group

Registered: Oct 2003
Posts: 38

Computer architecture

I'm browsing one of the computer architecture books by Hennessy and Patterson and came across this quote:

"Pitfall: Microcode implementing a complex instruction may not be faster than a sequence using simpler instructions."

This, of course, is why RISC is actually useful. But one wonders: what is the NKS explanation of this?

They are of course thinking of instructions and implementations that they engineer. But how is this true in the space of all possible simple programs? What is the landscape of this space?

This would be an interesting experiment to set up.
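One minimal way to set up such an experiment, purely as a sketch: pick a toy set of simple primitives with uniform cost, brute-force the shortest primitive sequence that reproduces a given "complex" instruction, and compare that against an assumed microcode cycle count. The primitives, the sample inputs, and the 4-cycle figure below are all made-up illustrations, not measurements from any real architecture:

```python
from itertools import product

# A toy instruction set of simple primitives, each costing one cycle.
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "dbl": lambda x: 2 * x,
    "neg": lambda x: -x,
}

def run(program, x):
    """Apply a sequence of primitive names, left to right, to the value x."""
    for op in program:
        x = PRIMITIVES[op](x)
    return x

def shortest_sequence(target, inputs, max_len=6):
    """Brute-force the shortest primitive sequence that matches `target`
    on all sample inputs; returns (length, program) or None."""
    for length in range(1, max_len + 1):
        for program in product(PRIMITIVES, repeat=length):
            if all(run(program, x) == target(x) for x in inputs):
                return length, program
    return None

# Hypothetical "complex instruction": x -> 2x + 2, assumed to take
# 4 microcode cycles in some imagined microcoded implementation.
MICROCODE_CYCLES = 4

result = shortest_sequence(lambda x: 2 * x + 2, inputs=range(-3, 4))
if result:
    length, program = result
    print(program, "takes", length, "cycles vs", MICROCODE_CYCLES, "microcode cycles")
    # → ('inc', 'dbl') takes 2 cycles vs 4 microcode cycles
```

Scaling this up — enumerating many target functions and mapping how well a handful of primitives covers them — would start to chart the "landscape" asked about above.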

But it's also interesting to think of it in terms of the NKS way of thinking. What does it mean for there to be, on average, no faster way of doing a computation than with a set of simple primitives?

It may seem to imply that the computation is irreducible. But it might just be the case that, in the world of instruction sets, most possible cases happen to be quite decently covered by a few primitives (or at least as well covered as they can be under the circumstances).

So I'm not sold that it "means" the computation is irreducible, but it does seem like it ought to mean something.

Everything is an expression.


Old Post 12-10-2004 07:43 AM
Jon Awbrey

Registered: Feb 2004
Posts: 558

RISCy Theorem Proving


I ran into a similar problem when I was just beginning to write theorem proving programs. For example, there are axiom systems for propositional logic that use umpteen axioms, some that use eight, and some, like the one that comes down to us from C.S. Peirce and Spencer Brown, that use only four initial axioms. There might be additional oomph in the more complicated axioms, but the overhead bottleneck comes at the point where you have to decide which axiom to use to advance a proof, and many times you are reduced to trying each of them at random, so the degree of the branch point is the critical factor.
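That bottleneck can be made concrete with a back-of-the-envelope count: if a blind search must try each of b applicable axioms at every step, the number of states visited out to depth d grows like b^d, so the degree of the branch point dominates everything else. A sketch, with purely illustrative axiom counts and proof depth:

```python
def nodes_explored(branching, depth):
    """Nodes visited by a blind breadth-first proof search that tries
    every axiom at every step: 1 + b + b^2 + ... + b^depth."""
    return sum(branching ** d for d in range(depth + 1))

# Four axioms vs. eight, blindly searching for a 10-step proof:
for b in (4, 8):
    print(b, "axioms:", nodes_explored(b, 10), "nodes")
```

Doubling the axiom set from four to eight multiplies the search space by roughly 2^depth, which is why a smaller set of initial axioms can win even if each individual axiom is less powerful.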

Jon Awbrey


Old Post 12-10-2004 02:18 PM



Forum Sponsored by Wolfram Research

© 2004-16 Wolfram Research, Inc. | Powered by vBulletin 2.3.0 © 2000-2002 Jelsoft Enterprises, Ltd. | Disclaimer | Archives