How does current AI work?
To understand what Tiny AI is, let us first understand how current AI solutions work. At a surface level, data is first collected (a large amount of data!) from sensors or other sources; this collected data is then sent to centralized cloud servers, which process and analyze it with the required algorithms, and we finally get our result. Pretty straightforward, isn't it?
What is the need for Tiny AI?
Well, let's go one step beneath the surface. Once the data is collected, it has to be sent to centralized cloud servers, which requires the data-sending unit to be always online. Since the data comes from many sources and has to reach the center from everywhere, the bandwidth has to be broad to minimize traffic congestion. The algorithms that process this data at cloud data centers are powerful and hence demand heavy computational power all the time, which emits a lot of carbon into the environment and limits the speed and privacy of such AI applications. Therefore, we need something smaller (or tiny!) that consumes less power and works as efficiently as current AI does.
How does Tiny AI solve these problems?
Tiny AI promises to pack enough computational power into small chips that AI tasks can be executed on the device itself, without the need to transfer the data to centralized cloud servers. This, of course, doesn't mean that centralized cloud servers will fade away completely; larger AI models that require heavy and complex algorithms will still have to make use of these data centers.
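One common technique behind shrinking models to fit on-device is weight quantization: storing weights as 8-bit integers instead of 32-bit floats, cutting memory use by 4x. The sketch below is a minimal, simplified illustration of symmetric int8 quantization, not the method any particular vendor uses; the function names are hypothetical.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric int8 quantization: map float32 weights into [-127, 127]
    using a single scale factor, shrinking storage 4x."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# Example: quantize a small random weight matrix
w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)  # close to w, but stored in a quarter of the space
```

The trade-off is exactly the one the article describes: the quantized weights take far less memory and power to use, at the cost of a small, bounded loss of precision (here, at most half the scale factor per weight).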
Tiny AI has already started to reach end users: Google recently announced that Google Assistant can now run on users' phones without sending requests to its servers. Other tech giants like Apple, IBM, and Amazon are also researching this field.
Benefits and challenges
Talking about benefits, Tiny AI clearly reduces a lot of energy consumption and makes the whole process more secure and faster, since the data is collected and processed on the same device. This also reduces traffic to the centralized cloud servers, which results in faster execution of other AI tasks. IoT (Internet of Things) devices will be better coordinated, opening up many other possibilities.
However, implementing all this is very challenging, especially when it comes to shrinking the algorithms to reduce energy consumption on such a small device! Clearly, there is no room for error in areas like healthcare and automated vehicles, so Tiny AI has to be highly accurate and fast. Also, "with great power comes great responsibility": with AI becoming handy and easy to access, things like deepfakes may rise, and it may become easier to fool automated security systems. There has to be a well-planned safeguard against such potential harms.