
I've been mulling over alternative machine designs, amongst other things, lately, and the cellular automaton repeatedly comes up as a design that is clearly different, but also mostly useless. What follows is a basic idea for a cellular automaton acting as the memory of a greater machine, to which only some computations are deferred.

The memory being one-dimensional, each unit would only be able to communicate with the two units closest to it, and my basic idea has been careful to restrict functionality so that everything remains local. To increase the efficiency of the design, an individual unit of memory should be as large as is feasible.

I can imagine, say, eight states for each unit to be in, with the inert state being identity, merely returning its contents verbatim whenever probed. To control signal propagation, it would be important to have a waiting state; with one unit placed in it, the following states could be entered elsewhere, and other machinery could then wait for the waiting unit to revert to identity. All signal propagation could be local, using a mechanism in which one unit can be told to enter a state, most of which states would naturally induce the same in the neighbouring units. Controlling the direction could then be achieved by simply blocking either direction.
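A minimal sketch of this mechanism, under my own assumptions: the state names (`IDENTITY`, `WAITING`, `COPY`) and the exact local rule are hypothetical stand-ins, not something the design fixes; the point is only that each step touches nothing beyond a unit's immediate neighbour.

```python
from enum import Enum, auto

class State(Enum):
    IDENTITY = auto()  # inert: returns its contents verbatim when probed
    WAITING = auto()   # ends the wait when a signal reaches it
    COPY = auto()      # stand-in for one of the propagating states

def step(states, direction=+1):
    """One synchronous step of purely local propagation: a unit in a
    propagating state induces the same state in its neighbour on the
    given side; a WAITING neighbour is handled specially, reverting to
    IDENTITY so that other machinery can observe the signal's arrival."""
    nxt = list(states)
    for i, s in enumerate(states):
        if s not in (State.IDENTITY, State.WAITING):
            j = i + direction
            if 0 <= j < len(states):
                nxt[j] = State.IDENTITY if states[j] is State.WAITING else s
    return nxt

# a propagating state at the left end, a waiting unit at the right end
cells = [State.COPY] + [State.IDENTITY] * 4 + [State.WAITING]
while State.WAITING in cells:
    cells = step(cells)
# the signal has swept rightward and ended the wait at the far end
```

Blocking a direction, as in the prose above, would amount to never forming the pair on that side; the `direction` parameter here is merely the simplest way to show it.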

So, the greater machinery could cause changes to the memory by placing one end in the waiting state, telling the other end to propagate a signal towards it, and then having any other units wait for the waiting to end. An edge case is a single node being both start and end, which could be handled in hardware or software. My design also permits making two ends wait on a signal begun somewhere in the middle.
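The middle-start case can be sketched in the same fashion; again the state names and rule are my hypothetical stand-ins, here with a `SIGNAL` state spreading in both directions at once and each waiting end absorbing it:

```python
from enum import Enum, auto

class State(Enum):
    IDENTITY = auto()
    WAITING = auto()
    SIGNAL = auto()  # stand-in for a propagating state

def step_both(states):
    """One step of bidirectional local propagation: a SIGNAL unit induces
    both neighbours into SIGNAL, except that a WAITING neighbour instead
    reverts to IDENTITY, ending that end's wait."""
    nxt = list(states)
    for i, s in enumerate(states):
        if s is State.SIGNAL:
            for j in (i - 1, i + 1):
                if 0 <= j < len(states):
                    nxt[j] = (State.IDENTITY if states[j] is State.WAITING
                              else State.SIGNAL)
    return nxt

# two ends wait on a signal begun somewhere in the middle
cells = [State.WAITING, State.IDENTITY, State.SIGNAL,
         State.IDENTITY, State.WAITING]
while State.WAITING in cells:
    cells = step_both(cells)
```

The edge case of a single node being both start and end doesn't arise in this sketch; as the prose notes, it would need its own handling in hardware or software.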

These other states would simply induce the same state in the nodes propagated to, with the waiting state handling this specially. The third state would copy its contents to the next node. The fourth state would increment the value of its previous node, thus placing APL's iota function into the hardware. A decrementing state is unnecessary, as propagation works in either direction; the hardware should make it easy to reverse the memory in O(1) time, by translating how it's accessed as needed. I've not thought a great deal about other states this memory could be made to encapsulate, sans that of sorting itself.
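The incrementing sweep and the O(1) reversal can both be sketched; the names `iota_sweep` and `ReversibleMemory` are my own, and the Python loop stands in for what would be one local cell update per hardware step:

```python
def iota_sweep(values, start):
    """The incrementing state's effect after a full left-to-right sweep:
    each cell takes its left neighbour's value plus one, so the memory
    ends up holding start, start+1, ... - APL's iota, in the hardware."""
    out = list(values)
    out[0] = start
    for i in range(1, len(out)):      # one local step per cell
        out[i] = out[i - 1] + 1
    return out

class ReversibleMemory:
    """O(1) reversal by translating accesses rather than moving data."""
    def __init__(self, cells):
        self.cells = list(cells)
        self.backwards = False
    def reverse(self):
        self.backwards = not self.backwards   # constant time
    def __getitem__(self, i):
        n = len(self.cells)
        return self.cells[n - 1 - i if self.backwards else i]

iota_sweep([0] * 5, 1)  # → [1, 2, 3, 4, 5]
```

With the access translation in place, a decrementing sweep is just an incrementing sweep over the reversed view.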

The memory could be made to sort itself, by swapping based on comparisons, but the prime issue is the waiting state, as sorting doesn't progress linearly as needed. There are two primary options: a new signal which would only propagate while the memory is sorted, and which would end the waiting state upon reaching it, the waiting state otherwise refusing to revert to identity while sorting; or precisely timing the worst-case behaviour of the sorting and simply waiting until that much time has passed.
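The second option has a clean instance in odd-even transposition sort, which I offer as an illustration rather than as the design's method: every phase is nothing but neighbour compare-and-swaps, so it fits a one-dimensional locally-connected memory, and n phases are known to suffice for n cells, giving the controller an exact worst-case time to wait out.

```python
def transposition_sort(cells):
    """Odd-even transposition sort: alternate phases of compare-and-swap
    on even-aligned and odd-aligned neighbour pairs.  All communication
    is between adjacent cells, and n phases sort any n-cell memory."""
    cells = list(cells)
    n = len(cells)
    for phase in range(n):            # the precisely-timed worst case
        for i in range(phase % 2, n - 1, 2):
            if cells[i] > cells[i + 1]:
                cells[i], cells[i + 1] = cells[i + 1], cells[i]
    return cells

transposition_sort([4, 1, 3, 2])  # → [1, 2, 3, 4]
```

The first option, a sortedness-checking signal, would instead sweep the memory and propagate only while each pair it crosses is in order, ending the wait on arrival; it can finish early but costs an extra state.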

This is thus a basic computational memory that is obviously somewhat useful, by virtue of having its states selected for the purpose, rather than derived from more basic rules. It's very interesting to consider other specialized memories, such as for linked structures, but I leave this to another writing.