How Edge Devices Can Help Mitigate the Global Environmental Cost of Generative AI
Running AI on local, private devices also saves the expense of sending queries and data across the network (and through the internal data-routing systems at a cloud provider) and sending the answers back.
Finally, the limited processing power of local devices, compared with the massive resources available in cloud data centers, enforces a form of discipline on AI software companies, app developers, and users. Not all generative AI queries require the resources of a cloud-based model such as GPT-4. By reallocating a portion of AI tasks to the edge, we can take advantage of on-device processing, which delivers efficient computation with minimal power draw. A balanced strategy that deliberately distributes AI workloads across cloud and edge can improve performance and minimize energy consumption.
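As a rough illustration of such a balanced strategy, the sketch below routes each query to an on-device model or a cloud model based on a crude complexity estimate. All names, thresholds, and per-query energy figures are illustrative assumptions for this sketch, not measurements or APIs from any vendor.

```python
# Hypothetical sketch of cloud/edge query routing.
# Thresholds and energy figures below are made-up assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Query:
    prompt: str
    needs_long_context: bool = False  # e.g., long-document summarization

# Assumed per-query energy costs in watt-hours (illustrative only).
EDGE_WH = 0.1
CLOUD_WH = 3.0

def route(query: Query, edge_token_limit: int = 256) -> str:
    """Send short, simple prompts to the on-device model; escalate the rest."""
    approx_tokens = len(query.prompt.split())  # crude proxy for token count
    if query.needs_long_context or approx_tokens > edge_token_limit:
        return "cloud"
    return "edge"

def estimated_energy_wh(destinations: list[str]) -> float:
    """Aggregate energy estimate for a batch of routed queries."""
    return sum(EDGE_WH if d == "edge" else CLOUD_WH for d in destinations)

queries = [
    Query("What's the weather like?"),
    Query("Summarize this 50-page contract", needs_long_context=True),
]
routes = [route(q) for q in queries]
print(routes)                       # ['edge', 'cloud']
print(estimated_energy_wh(routes))  # 3.1
```

In practice the routing signal could be richer (battery level, device thermals, model availability), but even a simple heuristic like this captures the idea of reserving cloud-scale compute for queries that actually need it.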
As technology providers begin to distribute generative AI capabilities to personal devices and start to gather data on the economics of various query types and where those queries run, we expect that they will start to surface these calculations for users, allowing people to make individual cost-based decisions about how much AI processing power they consume.
Taking Efficient Edge AI Technologies to the Cloud
The technology that enables edge devices, such as smartphones and tablets, has evolved to be both powerful and power-efficient. Users expect these devices to be fast, responsive, and capable of lasting a full day on a single battery charge.
In fact, modern smartphones have surpassed the power of IBM's Deep Blue supercomputer, which gained fame for defeating chess grandmaster Garry Kasparov in 1997.6 What's even more impressive is that these powerful mobile devices consume significantly less energy than an LED light bulb.7
This remarkable energy efficiency is the result of decades of innovation in the field. Lean computing instruction sets have been developed to process data using fewer operations, while systems-on-chip integrate multiple components into a single chip to reduce power consumption.
Such innovations have allowed Qualcomm Technologies, Inc. to deliver record-breaking power-efficient cloud AI processing products. This showcases the significant potential of edge technologies in addressing the energy challenge associated with processing AI models in the cloud.
AI Might Mitigate Its Own Efficiency Problems
AI tools are well suited to optimizing complex systems and can be used to reduce energy requirements and environmental impacts.8 AI may even help offset some of the impacts of human-caused climate change. Research from Boston Consulting Group states that "AI can accelerate climate action by taking climate modeling to the next level, enabling new approaches to climate education, and supporting breakthroughs in climate science, climate economics, and fundamental research."9