Grand Challenge: Journeys in Non-Classical Computation

In this fourth and last review of the Wetware-related Grand Challenge proposals, we take a look at the ‘Journeys in Non-Classical Computation’ project. This proposal differs from the rest in that it does not set out any direct goals, but rather proposes journeys down some of the less traveled roads of computer science to see where they lead.

The proposal has a range of suggestions, but in essence it encourages computer scientists to take an out-of-the-box approach when seeking solutions to problems in computer science. This includes taking into account many of the things Wetware discusses, including genetic algorithms, biomimicry, DNA computing, nanotechnology, chaos theory, quantum computing, fuzzy logic, cellular automata, parallel processing, artificial immune systems, evolving hardware and emergent properties.

Computer science has, for the last 60 years (more or less its entire life), been based on the computation theories usually associated with the computing pioneers Alan Turing and John von Neumann, what the proposal paper calls “classical computation theories” (pretty young to be called classical already).

The ‘Journeys’ proposal does not suggest that the classical methods should be abandoned, and acknowledges their great success. It rather suggests that computer scientists also look outside this classical frame for solutions. Even though it’s probably longer than most of you will read, I think the proposal’s list of the paradigms that define classical computing, together with possible non-classical extensions, gives the best picture of its aims:

    1: The Turing paradigm

      classical physics: information can be freely copied, information is local, states have particular values. Rather, at the quantum level information cannot be cloned, entanglement implies non-locality, and states may exist in superpositions.

      atomicity: computation is discrete in time and space; there is a before state, an after state and an operation that transforms the former into the latter. Rather, the underlying implementation substrate realises intermediate physical states.

      infinite resources: Turing machines have infinite tape state, and zero power consumption. Rather, resources are always constrained.

      substrate as implementation detail: the machine is logical, not physical. Rather, a physical implementation of one form or another is always required, and the particular choice has consequences.

      universality is a good thing: one size of digital computer, one size of algorithm, fits all problems. Rather, a choice of implementation to match the problem, or hybrid solutions, can give more effective results.

      closed and ergodic systems: the state space can be pre-determined. Rather, the progress of the computation opens up new regions of state space in a contingent manner.

    2: The von Neumann paradigm

      sequential program execution. Rather, parallel implementations already exist.

      fetch-execute-store model of program execution. Rather, other architectures already exist, for example, neural nets, FPGAs.

      the static program: the program stays put and the data comes to it. Rather, the data could stay put and the processing rove over it.

    3: The output paradigm

      a program is a black box: it is an oracle abstracted away from any internal structure. Rather, the trajectory taken by a computation can be as interesting, or more interesting, than the final result.

      a program has a single well-defined output channel. Rather, we can choose to observe other aspects of the physical system as it executes.

      a program is a mathematical function: logically equivalent systems are indistinguishable. Rather, correlations of multiple outputs from different executions, or different systems, may be of interest.

    4: The algorithmic paradigm

      a program maps the initial input to the final output, ignoring the external world while it executes. Rather, many systems are ongoing adaptive processes, with inputs provided over time, whose values depend on interaction with the open unpredictable environment; identical inputs may provide different outputs, as the system learns and adapts to its history of interactions; there is no prespecified endpoint.

      randomness is noise is bad: most computer science is deterministic. Rather, nature-inspired processes, in which randomness or chaos is essential, are known to work well.

      the computer can be switched on and off: computations are bounded in time, outside which the computer does not need to be active. Rather, the computer may engage in a continuous interactive dialogue, with users and other computers.

    5: The refinement paradigm

      incremental transformational steps move a specification to an implementation that realises that specification. Rather, there may be a discontinuity between specification and implementation, for example, bio-inspired recognisers.

      binary is good: answers are crisp yes/no, true/false, and provably correct. Rather, probabilistic, approximate, and fuzzy solutions can be just as useful, and more efficient.

      a specification exists, either before the development and forms its basis, or at least after the development. Rather, the specification may be an emergent and changing property of the system, as the history of interaction with the environment grows.

      emergence is undesired, because the specification captures everything required, and the refinement process is top-down. Rather, as systems grow more complex, this refinement paradigm is infeasible, and emergent properties become an important means of engineering desired behaviour (see the sketch after this list).

    6: The “computer as artefact” paradigm

      computation is performed by artefacts: computation is not part of the real world. Rather, in some cases, nature “just does it”, for example, optical Fourier transforms.

      the hardware exists unchanged throughout the computation. Rather, new hardware can appear as the computation proceeds, for example, by the addition of new resources. Also, hardware can be “consumed”, for example, a chemical computer consuming its initial reagents. In the extreme, nanites will construct the computer as part of the computation, and disassemble it at the end.

      the computer must be on to work. Rather, recent quantum computation results suggest that you don’t even need to “run” the computer to get a result!
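
To make the emergence point in the refinement paradigm concrete, here is a minimal sketch of my own, not from the proposal: Rule 110, a standard elementary cellular automaton, in Python. The local update rule is trivial and fully specified, yet the global patterns it produces were never written down in any top-down specification.

    # A minimal sketch of emergent behaviour, assuming Python 3.
    # Rule 110: each cell looks only at itself and its two neighbours,
    # yet complex global patterns emerge from the trivial local rule.

    RULE = 110  # the 8 bits of this number encode the whole update table

    def step(cells):
        """Apply the Rule 110 update to every cell, wrapping at the edges."""
        n = len(cells)
        return [
            (RULE >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    # Start from a single live cell and print the evolving rows.
    cells = [0] * 63 + [1]
    for _ in range(30):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)

The same flavour of bottom-up behaviour is what the proposal suggests engineering deliberately, rather than refining away.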

A thought-provoking list of ideas indeed, and as you can see, references to nature and biology run all through it, or as the proposal paper puts it:

    Many computational approaches seek inspiration in reality (mainly biology and physics), or seek to exploit features of reality. These reality-based computing approaches hold great promise. Often, nature does it better, or at the very least differently and interestingly. Examining how the real world solves its computational problems provides inspirations for novel algorithms (such as genetic algorithms or artificial immune systems), for novel views of what constitutes a computation (such as complex adaptive systems, and self-organising networks), and for novel computational paradigms (such as quantum computing).

    Meta-heuristic search techniques have drawn inspiration from physics (simulated annealing), evolution (genetic algorithms, genetic programming), neurology (artificial neural networks), immunology (artificial immune systems), plant growth (L-systems), social networks (ant colony optimisation), and other domains.
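
As a concrete taste of one of the techniques named above, here is a minimal genetic algorithm of my own devising, not an implementation from the paper: it evolves random bit strings towards the all-ones string (the classic OneMax toy problem), and all parameter values are arbitrary illustrative choices.

    # A minimal genetic algorithm sketch, assuming Python 3; the OneMax
    # problem and the parameter values are illustrative choices of mine.
    import random

    LENGTH, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 40, 60, 0.02

    def fitness(genome):
        """Number of 1-bits; maximal when the genome is all ones."""
        return sum(genome)

    def mutate(genome):
        """Flip each bit independently with a small probability."""
        return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genome]

    def crossover(a, b):
        """Single-point crossover of two parent genomes."""
        point = random.randrange(1, LENGTH)
        return a[:point] + b[point:]

    population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Fitness-biased ("roulette wheel") parent selection.
        weights = [fitness(g) + 1 for g in population]  # +1 keeps every weight positive
        parents = random.choices(population, weights=weights, k=2 * POP_SIZE)
        population = [mutate(crossover(parents[2 * i], parents[2 * i + 1]))
                      for i in range(POP_SIZE)]

    best = max(population, key=fitness)
    print("best fitness:", fitness(best), "of", LENGTH)

Even this toy version shows the nature-inspired ingredients at work: a population, fitness-biased selection, crossover and random mutation, with no refinement from a specification anywhere in sight.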

The proposal paper goes on to explain each of the proposed paradigm changes in more detail. If you’re intrigued by the list above, I suggest you read it; it contains a lot of interesting thoughts and ideas. At the end of the paper, six journeys in non-classical computing are proposed:

  • Non-Classical Physics – Quantum Software Engineering
  • Non-Classical Refinement – Approximate Computation
  • Computing in non-linear media – reaction-diffusion and excitable processors
  • Artificial Immune Systems
  • Non-Classical Interactivity – Open Dynamical Networks
  • Non-Classical Architectures – Evolving Hardware

All of these journeys use some or even most of the proposed non-classical methodologies, and some of them might even be seen as Grand Challenges of their own.

This is probably the most extensive proposal of the four. It holds a lot of highly interesting concepts and ideas, some of which I believe are new, while others have been around, mostly as niche subjects, for some time. I find that the proposal sometimes conflates computation with physical and biological mechanics; for example, I find it far-fetched to say that maggots eating flesh are thereby “performing a computation”, or that distillation is a computation that sorts chemicals. Unless, of course, one views the entire universe as one giant supercomputer.

This proposal is, to my mind, the most interesting of the Grand Challenge proposals, as it hits the Wetware nail right on the head. However, because of its lack of clear goals, I think it is unlikely to be selected as THE Grand Challenge.