
Beyond the Buzz: The Forgotten Costs of Powering AI/ML Workloads

Artificial intelligence and machine learning are trending topics across the media, having captured the imaginations of companies and consumers alike with the promise that these technologies will revolutionize industries and lifestyles in the near and distant future.

As many once-only-imagined projects, like autonomous vehicles, become reality, the coverage has been celebratory and optimistic about how they will benefit society.

But what is often underreported is the cost such advancements impose on the world. The proliferation of internet-connected devices and the growing demand for compute-intensive applications require a massive expansion of data center capacity, and with it a massive increase in electricity consumption. Because those costs are absorbed by the huge tech companies leading these initiatives, the public has largely been unconcerned, but the environmental impact of AI/ML workloads will be paid by all.

Electrical power requirements once tracked Dennard scaling while processors followed Moore's law, doubling transistor counts roughly every two years. Since 2012, however, the compute consumed by machine learning systems has grown exponentially, doubling every 3.4 months according to AI research group OpenAI. The U.S. Department of Energy estimates that the largest data centers today each require more than 100 megawatts of power capacity – enough to power around 80,000 U.S. households. With the media buzz promising that AI will soon be everywhere, energy consumption on this scale is unsustainable.
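To put those two growth rates side by side, here is a minimal back-of-the-envelope sketch in Python. The 2-year and 3.4-month doubling periods come from the paragraph above; the 6-year horizon and the starting value of 1.0 are illustrative assumptions, not figures from the source.

    # Illustrative comparison of the two doubling rates cited above.
    # Assumptions: a 2-year doubling period (Moore's-law era) versus the
    # 3.4-month doubling OpenAI reported for ML compute since 2012.
    # The 6-year horizon and the implicit starting value of 1.0 are
    # arbitrary choices for illustration only.

    def growth_factor(years: float, doubling_period_years: float) -> float:
        """Total growth after `years` given a fixed doubling period."""
        return 2 ** (years / doubling_period_years)

    horizon = 6  # years (hypothetical window, e.g. 2012 onward)

    moore = growth_factor(horizon, 2.0)          # 2^(6/2) = 8x
    ml_compute = growth_factor(horizon, 3.4 / 12)  # 2^(72/3.4) ≈ 2.4 million x

    print(f"Moore's-law pace over {horizon} years: {moore:,.0f}x")
    print(f"3.4-month doubling over {horizon} years: {ml_compute:,.0f}x")

The point of the sketch is simply that a 3.4-month doubling period compounds several orders of magnitude faster than the roughly 8x that a 2-year doubling period yields over the same span.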

To contain the looming energy crisis as AI/ML applications inevitably proliferate, a solution like Prodigy, the world’s first universal processor, is needed to deliver significant improvements in performance, energy consumption, server utilization and space requirements in hyperscale data centers.

By provisioning Prodigy in current and future data centers in place of the slate of non-optimized chips available today, the industry will be able to achieve significant “green” benefits, such as:

  • CO2 emission reduction of 600 million tons per year – the equivalent of eliminating the entire airline industry
  • 10 years of data center expansion without the need to expand power usage at all
  • Enablement of single hardware platforms that can handle multiple AI workloads at a time, eliminating redundant systems and avoiding the need to move massive data sets from one system to another
  • Democratization of AI that will enable sustainable products and solutions across a broad spectrum of applications and markets

The need for energy-efficient data centers is not a new concept, but with the rapid acceleration of power-hungry AI/ML workloads placing increased strain on an already taxed electrical grid, the costs are too high to continue to ignore. By enabling 4x lower annual data center TCO, Tachyum’s Prodigy is an ideal solution to these issues, advancing the entire world toward a greener era.

[Infographic: What happens in an internet minute (2020)]