# Embracing the Potential of LLMs: Debunking the Energy Myths
## Chapter 1: Understanding the Misconceptions
Each week, there's a recurring theme in my social media feeds: someone claiming that GPT and other large language models (LLMs) "consume excessive electricity," suggesting they pose a threat to society due to greenhouse gas emissions. This perspective is fundamentally flawed for several compelling reasons.
### Section 1.1: The Reality of LLM Usage
It’s crucial to recognize that LLMs serve millions of users, so the energy spent on training and inference is amortized across an enormous number of individual queries. Furthermore, datacenters are rapidly transitioning to renewable energy sources; Microsoft Azure, for instance, is predominantly powered by green energy.
Additionally, there are already models comparable to GPT-3 that are significantly smaller, in some cases up to 1,000 times less resource-intensive. We are still at the beginning of this technological journey, having only just unlocked its potential, and we are already witnessing remarkable advances in efficiency.
LLMs are not only capable but transformative, enabling a multitude of applications, smart robotics, and even future androids. They promise to bring immense, sustainable benefits to humanity.
#### Subsection 1.1.1: The Fallacy of CO2 Arguments
### Section 1.2: Addressing Skepticism
Critics often cite studies or images to support their claims against LLMs. However, these skeptics frequently lack an understanding of the actual utility and benefits of LLMs, overlooking the points made above.
The assertion that LLMs are "useless" is reminiscent of fringe theories and reflects a misunderstanding of their value. The same CO2 argument could be used to justify banning the manufacture of virtually any product, from food to consumer technology.
## Chapter 2: Insights from Ark Invest
Ark Invest has projected that the energy and training costs associated with LLMs will decrease by a factor of 20 or more. This year, generative AI made headlines, with tools like DALL-E 2 and ChatGPT significantly enhancing the productivity of knowledge workers, roughly doubling efficiency on coding tasks, for instance.
The annual decline in AI training costs has been around 70%. The expense of training a large model to the GPT-3 standard dropped dramatically from $4.6 million in 2020 to $450,000 in 2022, with expectations for this trend to continue through 2030.
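To see how that compounding plays out, here is a minimal Python sketch. It is an illustration under the stated assumption of a constant 70% annual decline from the $4.6 million figure cited for 2020, not a calculation taken from Ark's report:

```python
# Assumption: a constant ~70% annual decline in training cost,
# starting from the ~$4.6M figure cited for a GPT-3-level model in 2020.

def projected_training_cost(start_cost: float, annual_decline: float, years: int) -> float:
    """Compound a fixed annual cost decline over a number of years."""
    return start_cost * (1 - annual_decline) ** years

START_2020 = 4_600_000  # cited cost to train a GPT-3-level model in 2020 (USD)
DECLINE = 0.70          # ~70% cost reduction per year

for year in range(2020, 2031):
    cost = projected_training_cost(START_2020, DECLINE, year - 2020)
    print(f"{year}: ${cost:,.0f}")
```

Under this simple model, 2022 comes out near $414,000, broadly in line with the ~$450,000 figure above, and extending the same trend toward 2030 shows why a 20x-or-greater reduction is well within reach.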
At full adoption, AI could boost global labor productivity by about $200 trillion, far exceeding the total salaries of knowledge workers, which stand at approximately $32 trillion.
More efficient LLM architectures developed by open-source contributors will likely be adopted by the industry, and these innovations could prove thousands of times more efficient than current models.
Regardless of the hurdles, technological advancement is inevitable.
If you are involved in technology, you are likely already aware of the realities behind the critiques. It is disheartening to witness such shortsightedness in these discussions.