Intel, AMD Talk Up Biggest AI Challenges, Benefits at Data Center World
AI challenges include data center power and cooling, software accuracy and privacy, and keeping up with the rapid, massive changes the technology is driving.
DATA CENTER WORLD — Like most IT conferences these days, Data Center World 2024 is focused on changes brought about by artificial intelligence (AI). Besides talking up AI’s benefits, the data center conference is looking at AI challenges such as the increase in power and space consumption. Speakers at the Washington, D.C., event are exploring ways to overcome these challenges. (Data Center World is run by Informa Tech, Channel Futures' parent company.)
Representatives from Intel and AMD – chip makers that seek to challenge Nvidia’s dominant position in GPUs that power AI – spoke about AI’s impact on the entire IT ecosystem during a Data Center World keynote session. As with other speakers at the show, they agreed that AI is here to stay and must be addressed. But hardware isn’t the only thing that must advance to meet performance and sustainability demands.
Jen Huffstetler, chief product sustainability officer of Intel, said the three keys to achieving sustainability with AI are optimizing AI models, software, and system architectures. Liquid cooling and denser systems play a role, but there are other ways to improve data center efficiency and overcome AI challenges.
“Innovation needs to keep happening on the hardware,” Huffstetler said. “But the software side, I think it's the biggest lever you've got. When the software is not tightly coupled with that hardware, if it's not utilizing everything that's inside that chip, you're not getting the benefit out of it.”
Laura Smith, corporate vice president of engineering solutions and AMD fellow, stressed the importance of an open ecosystem of hardware and software that gives organizations the flexibility to ride current and future AI waves.
“When that next ChatGPT comes in, it is probably going to be very different,” Smith said. “So, the next time there’s a big breakthrough and somebody has a really fantastic idea, am I going to be able to adopt it? That’s one reason at AMD that we believe strongly in an open ecosystem and partnerships, both on the hardware attributes and products that form a data center, but also in how we feel about software stack.”
Huffstetler and Smith emphasized the importance of data as well when implementing AI, particularly generative AI. Data accuracy and privacy are other key AI challenges for IT professionals.
“A lot of enterprises feel like whoever solves how to actually access and leverage the data they have will have an advantage,” Smith said.
IT Professionals Under Pressure from AI
Smith said making AI sustainable requires data center modernization.
“What we’re seeing is IT professionals are under a lot of pressure,” she said. “There's this urgency to figure out AI, this push to quickly adopt AI. But when you turn it over to the IT professionals, it creates a lot of adjacent problems for them to solve. Where are they going to find the data center space? Where are they going to find the power? How are they going to fund the project? And we're seeing a big push to modernize existing data centers because as infrastructures age, they become a little more hands-on to manage, and the performance degrades over time.
“We generally work in enterprises that rely on legacy applications," continued Smith. "And so we can't let go of the legacy applications. We need to effectively manage and reduce the cost and the power consumption associated with those legacy applications to help us make room for the new investment.”
Huffstetler pointed out that AI-powered software using machine learning (ML) can automate processes and decrease energy consumption by as much as 30%.
“I look at it as two lenses: deploying AI for sustainability versus how you make AI sustainable,” she said. “You can deploy AI in many different use cases across your enterprise to lower the energy consumption with classic ML techniques. We've seen many examples where customers are modulating the server, modulating the processor power using software and energy management tools."
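Huffstetler's point about modulating processor power with software can be sketched as a simple utilization-driven power-cap policy. The function below is an illustrative assumption, not Intel's actual tooling; the cap range and the linear scaling rule are made up for the example, and real energy managers (e.g., RAPL-based tools) use far richer heuristics:

```python
def choose_power_cap_watts(cpu_utilization: float,
                           min_cap: float = 150.0,
                           max_cap: float = 350.0) -> float:
    """Pick a per-socket power cap based on recent CPU utilization.

    Illustrative policy: scale the cap linearly between min_cap and
    max_cap so lightly loaded servers are allowed to draw less power.
    """
    if not 0.0 <= cpu_utilization <= 1.0:
        raise ValueError("utilization must be between 0 and 1")
    return min_cap + (max_cap - min_cap) * cpu_utilization

# A server idling at 20% utilization gets a lower cap than one at 90%.
print(choose_power_cap_watts(0.2))  # 190.0
print(choose_power_cap_watts(0.9))  # 330.0
```

The idea is the one Huffstetler describes: software observes the workload and steps processor power down when full performance isn't needed, recovering energy that a statically provisioned server would waste.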
Omdia: AI Challenges for Power Can Be Overcome
Vlad Galabov, Omdia’s research director for cloud and data center, said the research firm forecasts data center power capacity will double over the next five years. He said AI will take up about 30% of data center power capacity by 2025, and that is with a small number of large companies deploying large clusters. By 2030, many more organizations will deploy AI.
“I’m sure your first question is, ‘Where is this power going to come from?’” Galabov said.
Galabov said there are four factors that can help overcome power constraints. The first is IT footprint consolidation from denser servers. He pointed out that today’s processors have 10 times as many cores as the servers they are replacing. Another factor is improved IT utilization from software tools and professional services that will help IT teams make better decisions and avoid overprovisioning on-premises and in the cloud. He also expects further improved power use effectiveness (PUE), partially because of regulation. The final factor is microgrids that operate independently of larger utility grids.
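Two of these factors reduce to simple arithmetic. PUE is the standard efficiency ratio (total facility power divided by IT equipment power, with 1.0 as the theoretical ideal), and the consolidation gain follows from the core-count multiple Galabov cites. A minimal sketch, with sample wattages that are purely illustrative:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.

    1.0 is the theoretical ideal; everything above it is overhead
    (cooling, power distribution losses, lighting, and so on).
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

def servers_after_consolidation(old_servers: int, core_multiple: int = 10) -> int:
    """Servers needed if each new machine has core_multiple times the cores."""
    return -(-old_servers // core_multiple)  # ceiling division

# A facility drawing 1,800 kW to power 1,000 kW of IT load:
print(pue(1800.0, 1000.0))              # 1.8
# Replacing 95 legacy servers with 10x-core machines:
print(servers_after_consolidation(95))  # 10
```

Consolidation attacks the denominator's footprint (fewer, denser servers doing the same work), while the PUE improvements Galabov expects from regulation attack the overhead in the numerator.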
Data Center AI at an Inflection Point
“Design with the future in mind,” Data Center World chairman Bill Kleyman said. “Every data center is going to become an AI data center. It just depends on how quickly they can get there, what technologies they use to get there.”
Intel’s Huffstetler advised data center professionals to “embrace the wave,” despite the AI challenges.
“AI has taken off. We know that we are embarking upon an inflection point, and one that is going to come faster than any [previous] inflections,” she said. “This one is happening at an astounding rate. Everybody is now thinking, ‘How do I wrap my arms around this challenge?’ It's coming fast. It's coming now.”