Material: Misanthropic Principals – Matt Jones

Wednesday, February 19th, 02020 at 11:11 UTC

Former Principal at BERG, now working at Google in London on various top secret projects as Principal Designer in Google Research & Machine Intelligence. Matt Jones’ talk will look at some of the ideas around raw computing. If the Web were a material, he’d brute force his way to the best answer, maybe without even understanding how he got there — and that’s OK.

Because…

We’ve known Matt Jones for many years now, from way back in the early Dopplr days. He was one of our first choices back in 02016 when we started to plan Material Conference. We’ve followed his time at Dopplr, BERG and now Google and all the crazy ideas and acronyms that have come out of it.

From all the discussions we’ve had at events, it is clear that he and his colleagues have their finger on the pulse of what’s happening, while at the same time, creating the future as well. It’s one thing to watch, but totally another to participate!

We wanted to get Matt’s thoughts about the Web and the direction that he sees it taking. He goes into the architecture of how he sees technology and the Web evolving over the next few years, an evolution shaped largely by the costs of speed, energy and data.

Time and time again, we come back to the life-cycles of the Web and Internet. In the early days, it was mainframe computers and dumb terminals. Computers were expensive, so we centralised everything in one smart machine. The dumb terminals let you log in remotely, check your mail, or dial into BBSes. The next wave was the desktop computer: hardware got cheaper and cheaper, so we started doing more and more locally on our own machines. Then the pendulum swung again and we moved to cloud computing. More and more tasks were done on the Web rather than locally because connectivity became faster and cheaper.

Again, the pendulum is swinging and Matt explains where it is heading. We’re more of an Octopus than a Spider.


Matt’s most recent focus has been on working on A.I. systems, but he’s the first to say that it is machine learning and it isn’t magic. These algorithms get to their answer through learning and examples rather than explicit programming. They are learned systems. It isn’t as cool a name, but that’s the reality of A.I.

Some of this is done through ‘supervised learning’. This is a technique where you feed ‘good’ examples, like professional photos, into the tool and it ‘learns’ the qualities of what is ‘good’ or desired and what is ‘bad’ or undesired.
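To make the idea concrete, here is a minimal, hypothetical sketch of supervised learning in Python: a classifier is fitted to hand-labelled ‘good’ and ‘bad’ examples and then scores a new one. The feature values, labels and the use of scikit-learn are illustrative assumptions, not how Google Clips is actually trained.

```python
# Toy illustration of supervised learning, not Google's actual pipeline:
# a classifier is fitted to labelled examples and then scores new photos.
# Feature extraction is assumed to have happened elsewhere.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature vectors (e.g. sharpness, contains a face, lighting),
# labelled 1 for 'good' examples and 0 for 'bad' ones.
X_train = np.array([
    [0.9, 1.0, 0.8],   # sharp, has a face, well lit  -> good
    [0.8, 1.0, 0.7],   # good
    [0.2, 0.0, 0.3],   # blurry, no face, dark        -> bad
    [0.1, 0.0, 0.4],   # bad
])
y_train = np.array([1, 1, 0, 0])

model = LogisticRegression()
model.fit(X_train, y_train)            # 'learn' from examples, not explicit rules

new_photo = np.array([[0.85, 1.0, 0.9]])
print(model.predict_proba(new_photo))  # probability the new photo is 'good'
```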

From this they made their first simple product, Google Clips. It is a very simple camera without a viewfinder. It records and takes photos via A.I., based on what it has been trained to recognise as a ‘good photo’. There is a traditional shutter button, but it is really just a ‘vote’ button to help train the algorithm to ‘take more like this’.

There were several interesting side-effects from this paradigm shift. The expected one was that you take more, ‘better’ pictures; the unexpected one was that you tended to be in the photos more. If you have to worry less about composition, timing and location, you spend more time on the other side of the lens, in the pictures. That is the goal if you want to capture family memories.

With such a simple, ‘dumb’ device, all of this intelligence had to happen locally. This is not a ‘smart’, internet-connected, cloud-computing device; all the processing happens on the device itself. That means no data is sent back to Google for analysis: it stays private, kept on the device, just for you.

This concept is sometimes called “machine learning at the edge”. In this graph of interconnected devices, the smarts are no longer centralised but live at the edges of the graph, on each node, which is its own device. This is made possible by advances in TPUs. We know CPUs (Central Processing Units) and even GPUs (Graphics Processing Units); now devices are including TPUs (Tensor Processing Units). These are tiny, energy-efficient chips that are optimised for machine learning.
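As a rough illustration of what ‘running at the edge’ looks like in code, here is a minimal sketch using TensorFlow Lite, one common runtime for on-device models. The model file name, tensor shapes and the ‘good photo’ score are hypothetical; the point is that the whole loop runs locally and nothing leaves the device.

```python
# Minimal sketch of on-device inference with TensorFlow Lite.
# The model file and its input/output shapes are hypothetical.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="photo_scorer.tflite")  # hypothetical model
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A camera frame would arrive here; we fake one with random pixels.
frame = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()                                    # all compute stays on device
score = interpreter.get_tensor(output_details[0]["index"])
print("'good photo' score:", score)                     # nothing is sent to a server
```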

Right now, data cognition is like the old power-station approach: big and centralised. Google and others have massive data centres where everything is sent to be correlated, compiled and analysed. This model is expensive, bandwidth- and processor-intensive, and not very energy efficient. But times are changing, and this architecture of the Web is changing with them. It is moving towards (or being blended with) a federated computing/learning style. The computations and analysis happen on the device. The local machine learning runs the algorithms and builds new models of the world, then at night the devices ‘dream’ and securely and anonymously share their learnings.

This reduces the computing power needed centrally and pushes it to the edges, saving bandwidth and energy while keeping data secure and private. The new models can then be recombined, updated and redistributed. All these edge devices can learn from each other while keeping control of their own raw data.
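A toy sketch of this federated style, in the spirit of federated averaging: each device nudges the shared model using only its own private data, and only the resulting weights are combined centrally. The training step, data and learning rate are stand-ins, not Google’s actual federated learning system.

```python
# Toy federated averaging: each device trains on its own data and only the
# resulting model weights (never the raw data) are shared and combined.
import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    """One round of on-device 'training': nudge the weights towards the
    local data mean, standing in for real gradient descent."""
    return global_weights + lr * (local_data.mean(axis=0) - global_weights)

def federated_average(client_weights):
    """The server only ever sees weight vectors, which it averages."""
    return np.mean(client_weights, axis=0)

global_weights = np.zeros(3)
devices = [np.random.rand(20, 3) for _ in range(5)]    # each device's private data

for round_number in range(10):
    updates = [local_update(global_weights, data) for data in devices]
    global_weights = federated_average(updates)        # recombine and redistribute

print("shared model after 10 rounds:", global_weights)
```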

It recalls the concept of “other minds”, like octopus intelligence. Peter Godfrey-Smith describes an octopus more like a jazz band than a single entity: everyone is learning, waiting and riffing off each other rather than answering to some central authority.

When we think of these TPUs on devices doing machine learning locally, we trick ourselves if we think of them as being like a brain. If we consider them more like simpler functions, like a knee, then we can better articulate their use. Deep down, we have lizard-brain reactions to things. We don’t ‘think’ about them; they are just reactions we have learned or that are built in deep. TPUs know a ‘good’ photo simply because they do; it isn’t a matter of ‘thinking’ about it.

With time, the gap between the cost and size of computing an algorithm locally on a TPU and the energy usage of sending the data elsewhere is only going to get wider and wider.

The physics of moving data around the network takes a lot of energy. The minute you have to spin up a radio to transmit the data, you lose any gains you achieved with a centralised, big computer. WiFi, Bluetooth and the rest are energy intensive, even before you get to the privacy issues and latency of sending the raw data off the device.

TPUs will only get more and more efficient and certainly beat radio signals for energy usage and cost.
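As a back-of-the-envelope illustration of that trade-off, the sketch below compares the energy of transmitting a raw photo over a radio with the energy of scoring it locally. Every constant in it is an assumed placeholder, not a measured figure for any real radio or chip.

```python
# Back-of-the-envelope comparison of 'send the raw data over radio' versus
# 'run the model locally'. All constants are illustrative placeholders only.
IMAGE_BYTES = 2_000_000            # hypothetical raw photo size
RADIO_J_PER_BYTE = 1e-6            # assumed energy to transmit one byte
LOCAL_J_PER_INFERENCE = 0.05       # assumed energy for one on-device inference

radio_energy = IMAGE_BYTES * RADIO_J_PER_BYTE
print(f"transmit raw image: ~{radio_energy:.2f} J")
print(f"score it locally:   ~{LOCAL_J_PER_INFERENCE:.2f} J")
# Under these assumptions, shipping the raw data costs far more energy than
# deciding on-device and sending (at most) a tiny model update.
```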

So how does all this Octopus style thinking impact the Web?

The mantra “Be As Smart As A Puppy” is one way to look at A.I. and machine learning. Rodney Brooks said: “AI is terribly over-hyped; if you put the 50 smartest AI technologists in a room for 50 years, they’d be lucky if they could make anything as smart as a puppy.”

Everyone loves puppies, so they are a pretty good heuristic for how well you could program your ‘intelligence’, but they also offer another way of looking at the future of the Web and our interactions.

Working animals extend our abilities, especially our sensory abilities. They are companion species: not slaves to our whim, but partners in our actions. We can start to think of these devices not as sentient beings, but as extensions of our own senses. They might have some autonomy, but they are becoming more and more a part of our being.

Research has suggested that spider brains are partly made up of the web they spin. Their view of the world is integrated with non-biological components. That sounds a lot like us, from our simplest cyborg adaptation, reading glasses, to more complex devices like smartphones. They give us more powers than we would have without them.

Now we are shifting to individual devices that are smart and work in conjunction with the cloud. We are seeing more and more examples of these tandem human-machine processes.

Advanced chess, sometimes called centaur chess, is where a human and a computer work together. Google’s Smart Compose tries to be helpful when writing email replies. These tools are designed to help humans make the decision. They are not autonomous, they are only as smart as a puppy, but they can nudge us and help.

Could this be where the Web is heading? Right now we think of things in terms of Web pages and Web apps: we read and consume, or we interact. What if there is yet another mode? One where we work in tandem with a companion that is there to be our extended eyes and ears across Web distances?

You can view all the video recordings and subscribe to the Material podcast on the Material Archive site.