By Patrick Smith, VP, EMEA Field CTO, Pure Storage
AI can do more and more. Think of any topic and an AI or genAI tool can effortlessly generate an image, video or text on it. Yet the environmental impact of that output is often forgotten: generating a single image with AI consumes roughly as much power as fully charging a mobile phone. That is a relevant fact when you consider how many organizations are betting on AI.
After all, training AI models requires huge amounts of data, and massive data centers are needed to store it. Estimates suggest that AI servers could, in an average scenario, consume between 85 and 134 TWh of power annually by 2027, roughly equivalent to the total amount of energy the Netherlands consumes in a year.
The message is clear: AI consumes a lot of energy and will, therefore, have a clear impact on the environment.
To create a useful AI model, a number of things are needed, including training data, sufficient storage space and GPUs. Each component consumes energy, but GPUs consume by far the most power. According to researchers at OpenAI, the amount of computing power used in AI has doubled every 3.4 months since 2012. That explosive growth is likely to continue, given the popularity of AI applications, and its environmental impact will grow with it.
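To get a feel for what doubling every 3.4 months means, the compounding can be sketched in a few lines of Python. This is an illustrative back-of-the-envelope calculation, not a figure from the article or from OpenAI:

```python
# Illustrative sketch: how compute demand compounds if it doubles
# every 3.4 months, per the growth rate observed by OpenAI researchers.

def compute_growth(months: float, doubling_period: float = 3.4) -> float:
    """Return the multiplicative growth factor after `months` months."""
    return 2 ** (months / doubling_period)

# Growth over one year and over five years at that pace:
one_year = compute_growth(12)    # roughly 11.5x
five_years = compute_growth(60)  # roughly 205,000x
```

Even if the real-world rate slows, any doubling period measured in months rather than years implies order-of-magnitude jumps in compute, and hence energy demand, within a few years.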
Organizations wishing to adopt AI should therefore carefully weigh its added value against its environmental impact. A decision maker is unlikely to shelve a project on environmental grounds alone, so this is about having your cake and eating it: looking at the bigger picture and picking technology that meets both AI and sustainability goals. In addition, the underlying infrastructure and the GPUs themselves need to become more energy-efficient. At its recent GTC user conference, NVIDIA highlighted exactly this, paving the way for more to be achieved with each GPU at greater efficiency.
Three industries are central to training and deploying an AI model: the storage industry, the data center industry and the semiconductor industry. To reduce AI's impact on the environment, each of these sectors needs to take steps to improve sustainability.
In the storage industry, concrete steps can already be taken to reduce the environmental impact of AI. All-flash storage solutions, for example, are significantly more energy-efficient than traditional disk-based (HDD) storage; in some cases they can deliver a 69% reduction in energy consumption compared to HDD. Some vendors go beyond off-the-shelf SSDs and develop their own flash modules, allowing the array's software to communicate directly with the flash. This makes it possible to maximize the capabilities of the flash and achieve even better performance, energy usage and efficiency, meaning data centers require less power, space and cooling.
Data centers can take a sustainability leap with better, more efficient cooling techniques and by making use of renewable energy. Many organizations, including the EU, use Power Usage Effectiveness (PUE) as a metric: the ratio of the total power entering a data center to the power actually used by the IT equipment inside. While reducing PUE is a good thing, it is a blunt and basic tool that does not account for, or reward, the efficiency of the technology installed within the data center.
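The PUE ratio described above can be sketched in a few lines. The facility figures below are purely illustrative assumptions, not measurements from any real data center:

```python
# Sketch of the PUE (Power Usage Effectiveness) calculation:
# total facility power divided by the power reaching IT equipment.
# An ideal facility scores 1.0; lower is better.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio for the given power draws (in kW)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kw / it_equipment_kw

# A hypothetical facility drawing 1,500 kW in total,
# of which 1,000 kW reaches servers, storage and networking:
print(pue(1500, 1000))  # 1.5
```

Note that the metric says nothing about what the 1,000 kW of IT load actually accomplishes, which is exactly the bluntness the article points out: a facility full of inefficient hardware can still post a flattering PUE.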
The demand for energy is insatiable, not least because semiconductor manufacturers, especially those making the GPUs that form the basis of many AI systems, keep making their chips more powerful. Twenty-five years ago, a GPU contained around one million transistors, measured around 100 mm² and drew relatively little power. Recently announced GPUs contain 208 billion transistors and consume 1,200 W each. The semiconductor industry needs to become more energy-efficient. This is already happening, as highlighted at the recent NVIDIA GTC conference, where CEO Jensen Huang said that thanks to advances in the chip manufacturing process, GPUs are doing more work and so are more efficient despite the increased power consumption.
It has been clear for years that AI consumes huge amounts of energy and can therefore have a negative environmental impact. Demand for AI-generated programs, projects, videos and more will keep growing in the coming years. Organizations embarking on an AI initiative need to carefully measure the impact of their activities. Especially with increased scrutiny on emissions and ESG reporting, it is vital to understand the repercussions of AI's energy consumption in detail and mitigate them wherever possible.
Initiatives such as moving to more energy-efficient technology, including flash storage, or improving data center capabilities can reduce that impact. Every sector involved in AI can and should take concrete steps towards a more sustainable course, and it is important to keep investing in the right areas to combat climate change.