An interesting perspective on computing from Jaron Lanier.
I like it because it suggests computing approaches might begin converging on how the nervous system approaches problems. He takes on the dogma of elegant thought/theory handed down by giants like Shannon, Turing, von Neumann, and Wiener, pointing out that much of their viewpoint was constrained (to a degree) by sending signals down wires, which forces a particular temporal perspective: getting single data points over time.
A paraphrase: "If you model information theory on signals going down a wire, you simplify your task in that you only have one point being measured or modified at a time at each end… At the same time, though, you pay by adding complexity at another level… which leads to a particular set of ideas about coding schemes in which the sender and receiver have agreed on a temporal syntactical layer in advance… You stretch information out in time and have past bits give context to future bits in order to create a coding scheme… In order to keep track of a protocol you have to devote huge memory and computational resources to representing the protocol rather than the stuff of ultimate interest. This kind of memory use is populated by software artifacts called data structures, such as stacks, caches, hash tables, links and so on. They are the first objects in history to be purely syntactical… With protocols you tend to be drawn into all-or-nothing high-wire acts of perfect adherence in at least some aspects of your design… leads to… brittleness in existing computer software, which means that it breaks before it bends."
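The brittleness Lanier describes is easy to demonstrate. Here is a minimal sketch (the frame format is invented for illustration, not anything from Lanier): a strict decoder for a hypothetical length-prefixed wire format, where past bytes (the length header) give syntactic context to future bytes. Corrupt a single header byte and decoding fails outright; the protocol breaks before it bends.

```python
def decode_frames(data: bytes) -> list[bytes]:
    """Strictly decode a toy format: each frame is 1 length byte,
    then that many payload bytes. Any violation is fatal."""
    frames, i = [], 0
    while i < len(data):
        length = data[i]  # past bit(s) giving context to future bits
        i += 1
        if i + length > len(data):
            # all-or-nothing: no graceful degradation is possible
            raise ValueError("truncated frame: protocol violated")
        frames.append(data[i:i + length])
        i += length
    return frames

good = bytes([2, 72, 73, 1, 33])   # frames b"HI" and b"!"
bad = bytes([9]) + good[1:]        # one corrupted length byte

print(decode_frames(good))         # [b'HI', b'!']
try:
    decode_frames(bad)
except ValueError as e:
    print("catastrophic failure:", e)
```

Note how the decoder spends its state tracking the protocol itself (where am I in the frame?) rather than the payload of ultimate interest, which is exactly the overhead the paraphrase points at.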
So just as we neuroscientists are learning that a one- or even two-compartment model of a neuron is not enough, that distal vs. basal dendritic inputs have vastly nonlinear interaction effects, that neuropeptides are abundant and functionally important components of neural circuits, and that neurons are strongly influenced by ephaptic coupling, we come to the same conclusion: there is a lot more going on in our brains than wire transmission down axons, including plenty of volume transmission. This leads to a system with a constant minor presence of errors or noise, and to a world of approximation and guessing.
Another paraphrase: "The alternative, in which you have a lot of measurements available at one time on a surface, is called pattern classification… The distinction between protocols and patterns is not absolute: one can in theory convert between them. But it's an important distinction in practice… you enter into a different world that has its own tradeoffs and expenses. You're trying to be an ever better guesser instead of a perfect decoder. You probably start to try to guess ahead, to predict what you are about to see, in order to get more confident about your guesses. You might even start to apply the guessing method between parts of your own guessing process. You rely on feedback to improve your guesses… you enter into a world of approximation rather than perfection. With protocols you tend to be drawn into all-or-nothing high-wire acts of perfect adherence in at least some aspects of your design. Pattern recognition, in contrast, assumes the constant minor presence of errors and doesn't mind them. I've suggested that we call the alternative approach to software that I've outlined above 'Phenotropic.'… The goal is to have all of the components in the system connect to each other by recognizing and interpreting each other as patterns rather than as followers of a protocol that is vulnerable to catastrophic failures. One day I'd like to build large computers using pattern classification as the most fundamental binding principle, where the different modules of the computer are essentially looking at each other and recognizing states in each other, rather than adhering to codes in order to perfectly match up with each other."
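The pattern side of the contrast can be sketched just as briefly. This is a toy nearest-neighbour classifier over bit patterns (the template names and patterns are invented for illustration, and nearest-neighbour matching is just one simple stand-in for pattern classification): a few flipped bits shift the measured distance slightly, but the best guess survives. Errors degrade the answer; they don't break it.

```python
# Hypothetical "states" one module might recognize in another,
# represented as 8-bit patterns.
TEMPLATES = {
    "stripe": (1, 0, 1, 0, 1, 0, 1, 0),
    "block":  (1, 1, 1, 1, 0, 0, 0, 0),
    "solid":  (1, 1, 1, 1, 1, 1, 1, 1),
}

def hamming(a, b):
    """Count positions where the two patterns disagree."""
    return sum(x != y for x, y in zip(a, b))

def classify(pattern):
    """Guess the nearest template instead of demanding an exact match."""
    return min(TEMPLATES, key=lambda name: hamming(TEMPLATES[name], pattern))

noisy = (1, 0, 1, 1, 1, 0, 1, 0)  # "stripe" with one flipped bit
print(classify(noisy))            # stripe: distance 1 beats 3 and 3
```

Where the strict decoder raised an exception on one corrupted byte, the classifier simply tolerates the noise and returns its best guess, which is the tradeoff Lanier is pointing at: approximation and robustness in exchange for giving up the perfect decode.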