Yesterday, at the ongoing IEEE International Solid-State Circuits Conference (ISSCC), Yann LeCun, director of Facebook AI Research, presented a paper on the latest trends in, and the future of, deep learning hardware.
ISSCC is a five-day event held in San Francisco, where researchers present the latest advances in solid-state circuits and systems-on-a-chip. In his presentation, LeCun highlighted several AI trends companies should consider in the coming years.
Here are some of the highlights from his presentation:
Machines should be given some “common sense”
With the advancements in deep learning, computers’ understanding of images, audio, and text has improved significantly. This has allowed developers to build new applications such as information search and filtering, autonomous driving, real-time language translation, and virtual assistants. These advancements, however, depend heavily on supervised learning, which requires human-annotated data, or on reinforcement learning.
LeCun believes that in the coming decades, researchers should put their efforts into making machines learn the way humans do: through mere observation and occasional actions, or, in short, in a self-supervised manner. To do that, researchers need to find a way to instill some level of “common sense” in machines. This will require deep learning architectures much larger than the ones we have today.
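The core idea of self-supervised learning is that the training signal comes from the data itself rather than from human labels. As a minimal, hypothetical sketch (illustrative only, not taken from LeCun’s paper), a model can be given a “pretext task” such as predicting a hidden part of each sample from the visible part:

```python
import numpy as np

# Minimal self-supervised sketch (illustrative): the "label" for each
# sample is a hidden part of the sample itself -- no human annotation.
rng = np.random.default_rng(0)

# Generate signals whose second half is a linear function of the first.
n, d = 1000, 8
first_half = rng.normal(size=(n, d))
true_map = rng.normal(size=(d, d))
second_half = first_half @ true_map + 0.01 * rng.normal(size=(n, d))

# Pretext task: predict the hidden second half from the visible first
# half, here solved as an ordinary least-squares problem.
learned_map, *_ = np.linalg.lstsq(first_half, second_half, rcond=None)

# The model recovers the structure of the data from the data alone.
error = np.abs(learned_map - true_map).mean()
print(f"mean absolute error of learned map: {error:.4f}")
```

Real self-supervised systems replace the linear map with a deep network and the “hidden half” with masked words, future video frames, and the like, but the principle is the same: the supervision is manufactured from raw observations.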
LeCun, in his paper Deep Learning Hardware: Past, Present, and Future, wrote, “If self-supervised learning eventually allows machines to learn vast amounts of background knowledge about how the world works through observation, one may hypothesize that some form of machine common sense could emerge.”
Empowering machines with human-like capabilities will allow them to make complex decisions. These machines could help with critical issues such as detecting hate speech and inappropriate content on Facebook, enable virtual assistants to infer context the way humans do, and more.
Ahead of the presentation, LeCun, in an interview with Business Insider said, “There are cases that are very obvious, and AI can be used to filter those out or at least flag for moderators to decide. But there are a large number of cases where something is hate speech but there’s no easy way to detect this unless you have a broader context … For that, the current AI tech is just not there yet.”
Machine learning chips that can fit everyday devices
LeCun is hopeful that in the future we will see computer chips that can fit in everyday devices such as vacuum cleaners and lawnmowers. With a machine learning chip incorporated, any device would be able to make smart decisions; for instance, a lawnmower could recognize the difference between weeds and garden roses. We already have mobile devices with AI built into them for tasks like recognizing a user’s face to unlock the device. In the coming years, more work will go into making mobile computing chips more sophisticated.
LeCun also spoke about the need for hardware designed specifically for deep learning. Current hardware restricts developers to using batches of data in the learning and optimization phase of machine learning models. This will change in the coming years. “If you run a single image, you’re not going to be able to exploit all the computation that’s available to you in a GPU. You’re going to waste resources, basically, so batching forces you to think about certain ways of training neural nets,” he said.
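LeCun’s point about batching can be illustrated with a small NumPy sketch (the layer and batch sizes here are arbitrary, chosen for illustration). Pushing one image at a time through a dense layer means many small matrix–vector products, while a batch collapses them into a single matrix–matrix product that parallel hardware can exploit far better:

```python
import time
import numpy as np

# Illustrative only: one dense layer applied to 256 "images".
# On a GPU the gap is far larger; even on a CPU, the batched call
# lets the underlying BLAS library use the parallelism available.
rng = np.random.default_rng(0)
weights = rng.normal(size=(1024, 1024)).astype(np.float32)
images = rng.normal(size=(256, 1024)).astype(np.float32)

# One image at a time: 256 separate matrix-vector products.
start = time.perf_counter()
single = np.stack([image @ weights for image in images])
t_single = time.perf_counter() - start

# One batched call: a single matrix-matrix product.
start = time.perf_counter()
batched = images @ weights
t_batched = time.perf_counter() - start

# Both paths compute the same result (up to float32 rounding).
assert np.allclose(single, batched, rtol=1e-3, atol=1e-2)
print(f"per-image: {t_single:.4f}s  batched: {t_batched:.4f}s")
```

Hardware that stayed efficient on a batch of one, as LeCun suggests, would free researchers from having to design training procedures around this constraint.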
A new programming language for deep learning, which is more efficient than Python
LeCun believes that deep learning now needs a new programming language that is far more efficient than Python. In an interview with VentureBeat, he said, “There are several projects at Google, Facebook, and other places to kind of design such a compiled language that can be efficient for deep learning, but it’s not clear at all that the community will follow, because people just want to use Python.”
He believes that the imaginations of AI researchers and computer scientists tend to be tied to the hardware and software tools available to them. “The kind of hardware that’s available has a big influence on the kind of research that people do, and so the direction of AI in the next decade or so is going to be greatly influenced by what hardware becomes available. It’s very humbling for computer scientists because we like to think in the abstract that we’re not bound by the limitation of our hardware, but in fact, we are.”
To learn about the other trends LeCun shared, check out the Facebook AI blog.