Computer scientist, UC Berkeley, School of Information; author, Search User Interfaces
We will find ourselves in a world of omniscient instrumentation and automation long before a stand-alone sentient brain is built, if one ever is. Let's call this world eGaia, for lack of a better word. In eGaia, electronic sensors (for images, sounds, smells, vibrations, anything you can think of) are pervasive, able to anticipate and arrange for the satisfaction of individuals' needs and to notify those who need to know of all that is happening. Automation allows for the cleaning of rooms and buildings, the driving of vehicles, the monitoring of traffic, the making and monitoring of goods, and even spying through windows (with tiny flying sensors). Already, major urban areas are covered with visual sensors, and more monitoring is coming. In Copenhagen, LED-based streetlights will turn on only when they sense someone biking down the road, and future applications of this sensor network might include signaling when to salt the road or empty the trash and, of course, alerting the authorities when suspicious behavior is detected on a street corner.
In eGaia, the medical advances will be astounding: synthetic biology will make smart machines that fix problems within our bodies, and intelligent implants will monitor and record our current and past physical states. Brain-machine interfaces will continue to improve, initially for physically impaired people but eventually providing a seamless boundary between people and the monitoring network. And virtual-reality-style interfaces will continue to become more realistic and immersive.
Why won’t a stand-alone sentient brain come sooner? The amazing progress in spoken-language recognition, unthinkable ten years ago, derives in large part from access to huge amounts of data, huge amounts of storage, and fast networks. It does not stem from breakthroughs in understanding human cognition, or even from significantly different algorithms; the improvements we see in natural-language processing are based on mimicking what people do, not on understanding or even simulating it. eGaia, by contrast, is already partly here, at least in the developed world.
This distributed nerve-center network, an interplay between the minds of people and their monitoring electronics, will give rise to a distributed technical-social mental system the likes of which has never been experienced before.