Sean Gallagher, for Ars Technica:
We’re already seeing some companies offloading an autonomous “brain” to the cloud. And it may not be too long before the same sorts of services used to build mobile digital assistants like Siri, Google’s Voice Search, and Cortana are helping physical robots understand the world around them. The result could be a sort of “hive mind,” where relatively inexpensive machines with some autonomous systems share a common set of cloud services that act as a group consciousness. Such a setup would allow a group of machines to constantly improve, adjusting operations as more experience is added to the collective memory. Theoretically, bots like this could not only interact with more complex environments, but they could engage people around them in a way that resembles a co-worker more than a calculator.
There’s no reason a central nervous system has to be entirely local. As the article notes, one of the greatest challenges for a robot brain is simply perception. In fact, Gill Pratt, Program Manager of DARPA’s Defense Sciences Office, says, “It’s very difficult to fit a computer with the size, weight, and power that you need to achieve really good perception onto a robot.” The article also raises the matter of onboard power requirements, which a cloud brain would certainly alleviate.
On the other hand, we’re already familiar with some of the challenges the cloud presents, latency chief among them. It’s one thing to outsource a brain; having to wait for it would be a great way to simulate senior moments.