AI and advanced applications are straining current technology infrastructures

A majority of executives in a recent survey, 76%, say their current infrastructures are not ready to meet upcoming demands -- namely, AI and associated analytic workloads.
Written by Joe McKendrick, Contributing Writer


If you're looking at moving your team or organization to artificial intelligence in a big way, you may need to examine, and prepare to invest in, the infrastructure underneath: data capacity, processing capacity, tooling, and related resources.

While the world gets enmeshed in debates over the efficiency and risks of artificial intelligence, the matter of supporting infrastructure doesn't get enough attention. It appears many current systems may not be ready to handle AI workloads.


A majority of executives in a recent survey, 76%, feel their current infrastructures "will be unable to scale to meet upcoming demands" -- namely, AI and associated analytic workloads. The survey of 1,288 executives, released by Hitachi Vantara, also finds 60% report being simply "overwhelmed" by the amount of data they manage. By 2025, the report's authors predict, large organizations will be storing more than 65 petabytes of data. (It wasn't that long ago that a terabyte was a huge load.)

The Hitachi data mirrors findings out of the AI Infrastructure Institute (AIII), which found that only 26% of teams were "very satisfied" with their current AI/ML infrastructure. The big tech companies, of course, have the huge budgets, staffs, and capacity to make AI happen. Teams within these companies "built their own AI/ML infrastructure from scratch because there was nothing on the market to support their efforts," the AIII report authors state.


They add that lately, "we have seen a rapid proliferation of new tools and platforms that allow enterprises and small to medium businesses to benefit from the intelligence revolution. However, building the right AI/ML infrastructure that fits specific company needs is still a significant challenge."

Take raw storage capacity, for example. The Hitachi survey finds data storage needs may double in two years. Where will all that data go? There's the cloud, right? Hold that thought, the survey's authors caution. Cloud is part of the solution, "but not a silver bullet," they point out. About 27% of data center workloads will be in public clouds by 2025, and another 21% will be co-located. About half of data center workloads, 49%, will remain within company walls -- either in more traditional on-premises systems or in private clouds.

To complicate things a little more, IT executives estimate that they lack control over nearly half of the data flowing through their enterprises. This is "dark data": collected and stored, but never used.


It's not just data capacity that needs shoring up -- tools are important, too. "The growth of any AI/ML team is a journey, and at each stage you need different tools," the AIII report observes. "At an early stage, with only a few top-notch data scientists, your tooling needs are much simpler. But as your team grows, you need newer and better tools to deal with that growth. Traditional enterprise IT considerations, like role-based access control and security, suddenly become important, as does ongoing monitoring and maintenance."

Additional needs arising with the rapid growth of AI include feature stores, as well as visibility into data versioning and lineage. "Some discover data versioning and lineage too late, after regulation or a public mistake highlights the need for it," the AIII authors state. "As teams grow and compete for in-house resource scheduling across GPUs, it becomes essential. At each stage, new must-have tools rise to the surface rapidly."

The big-tech companies "built their own tools from scratch because there was nothing on the market to support their needs, but that approach is largely out of reach for other enterprises that don't have an army of developers," the AIII authors state. "It's also unsustainable, as technical debt and maintenance of those tools quickly becomes a nightmare, even as commercial tools start to bypass internally built systems with their capabilities."


The AIII analysts "expect more and more tech companies to replace parts of their home-rolled stack with commercial or open-source alternatives in the next five years. We expect that most enterprises in the early majority stage will not craft their own tools and instead focus on writing smaller tools that close the gap between modular pieces of the stack."  
