Qualcomm recently released a white paper titled “The Future of AI is Hybrid.” In it, the company makes a clear case that for AI to develop to its maximum capabilities, it needs to be processed both in the cloud and at the edge. Computing at the edge improves cost, energy use, reliability, latency, and privacy: all of the things that make scaling and growing a technology difficult. And they’re right: for AI to work at its best, it needs more than one partner, more than one solution. But the greater lesson here is that the same is true for all technology moving forward.
When we hear the term “hybrid,” many of us think of hybrid cars, which run on both gasoline and electricity. The tech industry eventually adopted the term for things like hybrid cloud, where a company processes some of its data on the public cloud, some on a private cloud, and some in its own data center. The goal of these hybrid models in technology was the same as it was with hybrid cars: to reduce energy consumption, cut costs, and enhance performance.
Hybrid cars grew in popularity because they let drivers enjoy the best qualities of both gas and electric vehicles. The gas engine allows quick refueling and longer range before the car needs fuel; the electric motor cuts emissions and saves money. A similar concept holds for AI. AI needs somewhere powerful and stable for model training and heavy inference, which demand enormous compute for complex workloads. That’s where the cloud comes in. At the same time, AI also needs to happen fast. For it to be useful, it needs to process data close to where the action actually happens: at the edge, on devices like smartphones.
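To make the cloud/edge split concrete, the hybrid idea can be sketched as a simple routing policy: small or privacy-sensitive requests run on the device, while heavy workloads go to the cloud. This is a minimal illustration under assumptions of my own; the `route` function, the `InferenceRequest` shape, and the `EDGE_TOKEN_LIMIT` threshold are hypothetical and not taken from Qualcomm's paper.

```python
from dataclasses import dataclass

@dataclass
class InferenceRequest:
    tokens: int               # rough size of the workload
    privacy_sensitive: bool   # e.g. contains personal data

# Assumed on-device capacity; a real system would tune this per device.
EDGE_TOKEN_LIMIT = 512

def route(request: InferenceRequest) -> str:
    """Return 'edge' or 'cloud' for a given request."""
    # Requests small enough for the device run locally, which keeps
    # privacy-sensitive data on the phone and cuts latency and cloud cost.
    if request.tokens <= EDGE_TOKEN_LIMIT:
        return "edge"
    # Large workloads need the cloud's compute.
    return "cloud"

print(route(InferenceRequest(tokens=100, privacy_sensitive=True)))    # edge
print(route(InferenceRequest(tokens=4096, privacy_sensitive=False)))  # cloud
```

In practice the decision would weigh battery, network conditions, and model availability, but the core trade-off is the one the paper describes: the cloud for raw power, the edge for speed and privacy.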