ANALOG, THE REVOLUTION THAT DARES NOT SPEAK ITS NAME

GEORGE DYSON

Science historian; author, Turing’s Cathedral: The Origins of the Digital Universe

No individual, deterministic machine, however universal this class of machines is proving to be, will ever think in the sense that we think. Intelligence may be ever-increasing among such machines, but genuinely creative intuitive thinking requires nondeterministic machines that can make mistakes, abandon logic from one moment to the next, and learn. Thinking is not as logical as we think.

Nondeterministic machines—or, better yet, nondeterministic networks of deterministic machines—are a different question. We have at least one existing proof that such networks can learn to think. And we have every reason to suspect that, once invoked within an environment without the time, energy, and storage constraints under which our own brains operate, this process will eventually lead, as Irving (Jack) Good first described it, to “a machine that believes people cannot think.”

Until digital computers came along, nature used digital representation (as coded strings of nucleotides) for information storage and error correction but not for control. The ability of a one-bit modification to change an instruction, a useful feature for generation-to-generation evolutionary mechanisms, becomes a crippling handicap for controlling day-to-day or millisecond-to-millisecond behavior in the real world. Analog processes are far more robust when it comes to real-time control.
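To make the asymmetry concrete, here is a minimal sketch, my illustration rather than anything in the essay: a single flipped bit in a digitally coded control value causes a large, discontinuous jump, while a comparable perturbation of a continuous signal stays proportionally small.

    # Hypothetical illustration: digital brittleness vs. analog robustness.

    def flip_bit(value: int, bit: int) -> int:
        """Flip one bit of a digitally encoded control value."""
        return value ^ (1 << bit)

    digital_setpoint = 100                # e.g., a speed command encoded in 8 bits
    print(flip_bit(digital_setpoint, 7))  # 228: one flipped bit more than doubles the command

    analog_setpoint = 100.0               # the same quantity as a continuous signal
    print(analog_setpoint + 0.5)          # 100.5: a small perturbation stays small

A lineage can afford an occasional large jump between generations; a control loop running in real time cannot.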

We should be less worried about having our lives (and thoughts) controlled by digital computers and more worried about being controlled by analog ones. Machines that actually think for themselves, as opposed to simply doing ever more clever things, are more likely to be analog than digital, although they may be analog devices running as higher-level processes on a substrate of digital components, the same way digital computers were invoked as processes running on analog components the first time around.

We’re currently in the midst of an analog revolution, but for some reason it’s a revolution that dares not speak its name. As we enter the seventh decade of arguing about whether digital computers can be said to think, we’re surrounded by an explosive growth in analog processes whose complexity and meaning lie not in the state of the underlying devices or the underlying code but in the topology of the resulting networks and the pulse frequency of connections. Streams of bits are being treated as continuous functions, the way vacuum tubes treat streams of electrons, or neurons treat pulse frequencies in the brain.
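As a hedged illustration of that last sentence, and not anything from the essay itself: a sliding-window average turns an all-or-nothing pulse train into a continuously varying rate, which is the sense in which a stream of bits can be read as a continuous function.

    # Illustrative sketch: reading a discrete pulse train as a continuous
    # rate signal, the way a neuron's firing rate carries an analog value
    # even though each individual spike is all-or-nothing.

    def pulse_frequency(bits, window=4):
        """Sliding-window pulse rate: fraction of recent ticks carrying a pulse."""
        rates = []
        for i in range(len(bits)):
            span = bits[max(0, i - window + 1):i + 1]
            rates.append(sum(span) / len(span))
        return rates

    stream = [0, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1]   # a discrete bit stream
    print(pulse_frequency(stream))                   # a smoothly varying rate

The information is carried by the window, not by any single bit: the meaning sits in the frequency of the pulses rather than in the state of the underlying device.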

Bottom line: I know that analog computers can think. I suspect that digital computers, too, may eventually start to think, but only by growing up to become analog computers first.

Real artificial intelligence will be intelligent enough not to reveal itself. Things will go better if people have faith rather than proof.