Offline or Edge AI?
While business applications can benefit from being hosted on cloud infrastructure, consumer applications running on IoT devices are starting to move in the other direction, with more AI-based engines running on local hardware.
There are a few arguments for running an application this way:
- It’s more secure. Information isn’t being streamed back and forth to a service where it can be intercepted, and data isn’t stored outside the device. Fewer places where data lives means fewer places where it can be compromised.
- It’s faster. There’s minimal latency in data transfer since everything runs locally.
- It’s more reliable. Since there’s no dependence on an Internet connection, there’s no need to worry about outages or users not knowing how to connect.
If we assume that running AI on the edge or offline means training the model somewhere else on a large dataset, producing a classifier or encoder as a result, and then running that with a small footprint on a device with no means of Internet communication, then the drawback is that the device will never improve from external learning. A feedback mechanism could be built in so the device can still learn on its own, but not to the same extent as a cloud-hosted service. Then again, if the device’s purpose is to “pass the butter”, maybe it doesn’t need to learn continuously.
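The train-elsewhere, run-locally split can be sketched with a toy example. Everything here is hypothetical for illustration: the “cloud” side trains a trivial nearest-centroid classifier and exports only the learned parameters, and the “device” side loads those parameters and classifies with no network access. A real pipeline would more likely export something like an ONNX or TFLite model, but the shape of the workflow is the same.

```python
import json

# "Cloud" side: train a trivial nearest-centroid classifier on labeled data,
# then export only the learned parameters -- the device never sees the dataset.
def train(samples):
    buckets = {}
    for label, features in samples:
        buckets.setdefault(label, []).append(features)
    # One centroid (per-feature mean) per label.
    return {
        label: [sum(col) / len(col) for col in zip(*rows)]
        for label, rows in buckets.items()
    }

# "Device" side: a small-footprint classifier that runs entirely offline.
def classify(model, features):
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(features, model[label]))
    return min(model, key=dist)

training_data = [
    ("hot", [30.0, 0.2]), ("hot", [34.0, 0.3]),
    ("cold", [2.0, 0.8]), ("cold", [-1.0, 0.9]),
]
exported = json.dumps(train(training_data))  # shipped to the device once

on_device_model = json.loads(exported)       # loaded at boot, no network needed
print(classify(on_device_model, [28.0, 0.25]))  # -> hot
```

Note that the exported artifact is small and static: the device can never get better than the parameters it shipped with, which is exactly the drawback described above.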
However, an edge AI device might still have connectivity and can send data to, and receive data from, another service that uses it to retrain and revise the classifiers / encoders that run locally. The device can still work offline and doesn’t need to stream information for real-time interaction. Data storage might still be a concern, but eavesdropping on live traffic, less so.
There are other potential variations along the edge vs. offline spectrum. There can be a one-way edge, in which the device can receive updates to its models but can’t send any data back. The benefit is that the device can still improve over time from cloud training.
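A one-way edge setup might look like the following sketch. The update “server” here is a hypothetical stand-in for a real model registry reached over HTTPS; the key property is that the device only pulls model versions and never pushes anything upstream.

```python
# One-way edge sketch: the device periodically pulls model updates but never
# uploads data. `make_update_server` is a hypothetical stand-in for a real
# model registry endpoint.
def make_update_server(published):
    version, weights = published

    def fetch(current_version):
        if version > current_version:
            return version, weights  # newer model available for download
        return None                  # device is already up to date

    return fetch

class EdgeDevice:
    def __init__(self, version, weights):
        self.version = version
        self.weights = weights

    def check_for_update(self, fetch):
        update = fetch(self.version)  # download only; nothing is sent back
        if update is not None:
            self.version, self.weights = update

device = EdgeDevice(version=1, weights=[0.1, 0.2])
fetch = make_update_server(published=(2, [0.15, 0.18]))
device.check_for_update(fetch)
print(device.version)  # -> 2
```

The only information leaving the device is its current model version number, which is what makes this configuration attractive when privacy is the priority.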
There are also hybrid devices that stream to the cloud when online and fall back to running offline when disconnected. Android voice typing circa 2012 comes to mind.
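The hybrid pattern reduces to a fallback: try the cloud model first, and use the smaller local model when the network call fails. In this sketch, `cloud_infer` and `local_infer` are hypothetical stand-ins; a real implementation would wrap an actual network request that raises on connection failure.

```python
# Hybrid sketch: prefer the cloud model when reachable, fall back to a
# smaller on-device model when offline.
def hybrid_infer(audio, cloud_infer, local_infer):
    try:
        return cloud_infer(audio), "cloud"
    except ConnectionError:
        return local_infer(audio), "local"

def local_infer(audio):
    return audio.upper()           # toy stand-in for the small local model

def cloud_up(audio):
    return audio.upper() + "!"     # toy stand-in for the larger cloud model

def cloud_down(audio):
    raise ConnectionError("offline")  # simulates a dropped connection

print(hybrid_infer("hello", cloud_up, local_infer))    # -> ('HELLO!', 'cloud')
print(hybrid_infer("hello", cloud_down, local_infer))  # -> ('HELLO', 'local')
```

The design choice worth noting is that the local model is the floor, not an error state: the user gets a (possibly lower-quality) result either way, which matches how early hybrid voice typing behaved.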
Security, speed, and reliability are going to be the driving factors for running more AI locally. This will become more important as IoT devices need more coordination to run.