Tiny Artificial Intelligence


What is Tiny AI?

Tiny AI, also known as TinyML in reference to machine learning (ML), is currently a major focus for AI researchers. The aim is to shrink artificial intelligence (AI) algorithms, especially those that consume large quantities of data and computational power, such as natural language processing (NLP) models like Google’s BERT. According to the MIT Technology Review, the larger version of BERT has 340 million parameters, and training it just once consumes enough electricity to power a US household for 50 days.

“Over the past few years there’s been an arms race of sorts in AI and an effect of this competition is that we’ve seen some ML models become enormous in the race to achieve high performance,” says Nick McQuire, vice president of enterprise research at CCS Insight. “Microsoft, for instance, recently introduced the Turing Natural Language Generation model, the largest one ever published at 17 billion parameters.” 


Why do we need Tiny AI?

Training sophisticated AI takes a huge amount of energy. Sumant Kumar, director of digital transformation at consultancy firm CGI UK, notes that the carbon footprint of training a single AI model can be as much as 284 tonnes of carbon dioxide equivalent (CO2e) – five times the lifetime emissions of an average car. As AI adoption grows, it has become clear that the technology needs to become greener, which is one of the factors pushing Tiny AI forward.
Another factor is the need to run inference and sophisticated models on resource-limited devices at the edge, for use cases like robotics, automated video security and anomaly detection in manufacturing. “To get increasing intelligence out of the data center and into better performing consumer electronics, cars and medical devices, AI needs to run on much smaller microprocessors, often powered by batteries,” says Tim Ensor, director of AI at consultancy business Cambridge Consultants. 
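One of the standard techniques for fitting models onto battery-powered microprocessors of the kind Ensor describes is post-training quantization: storing weights as 8-bit integers instead of 32-bit floats, which cuts model size roughly fourfold. The sketch below illustrates the idea in plain Python; the function names and the simple symmetric scaling scheme are illustrative assumptions, not the API of any particular TinyML framework.

```python
# Illustrative sketch of symmetric post-training 8-bit quantization,
# a common Tiny AI/TinyML technique. Function names are hypothetical.

def quantize(weights, num_bits=8):
    """Map float weights onto signed integers of num_bits, returning
    the integer codes and the scale needed to recover the floats."""
    qmax = 2 ** (num_bits - 1) - 1            # 127 for int8
    largest = max(abs(w) for w in weights)
    scale = largest / qmax if largest else 1.0
    codes = [round(w / scale) for w in weights]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from the integer codes."""
    return [c * scale for c in codes]

weights = [0.42, -1.73, 0.05, 1.1, -0.6]
codes, scale = quantize(weights)
restored = dequantize(codes, scale)
# Each restored weight differs from the original by at most half a
# quantization step (scale / 2), while each value now needs only 1 byte.
```

The accuracy cost is the rounding error bounded by half the quantization step; production frameworks refine this with per-channel scales and calibration data, but the storage saving is the same.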

