The real-time revolution is here, but it's unevenly distributed

Real time is still very much a work in progress.
Written by Joe McKendrick, Contributing Writer

If you listen to enough conference talks, vendor pitches, and analyst pronouncements, you could be forgiven for assuming that every organization on the planet can now sense and respond to events within milliseconds of their happening. 

While this is not yet the case, there's good reason for the hunger for real time. The exciting new things coming on the scene -- artificial intelligence, predictive analytics, embedded systems, streaming apps, real-time location monitoring, and alert systems -- all depend on real-time technology to function. However, real time is still very much a work in progress.

Also: Real-time Ubuntu is available in AWS Marketplace now, and it's ideal for rapid prototyping

Industry surveys show that real time is more dream than reality. In supply chain management, for example, 77% of executives actively seek real-time shipment visibility, but only 25% currently use it, a survey by Tive finds. Similarly, only 23% of enterprises in a survey by Unisphere Research and ChaosSearch indicate that information is available to end users in real time. 

The spottiness of real-time capabilities might actually be okay for most situations. "Most businesses don't need real-time data," says Nick Amabile, CEO of DAS42. It comes down to whether a requirement is operational or analytical. "Operational systems often require real-time data for use cases around information security -- such as monitoring for security threats, personalization efforts in marketing, logistics, shipping trends, cost optimization, improved customer experience, fraud detection, and trading strategies," Amabile says. 

Also: What technology analysts are saying about the future of AI

Analytical needs, on the other hand, can have some degree of latency. "For analytical use cases, we first define an SLA for acceptable latency," says Amabile. "Perhaps user-facing reporting needs to be real-time, but executive reporting can be hours old. For example, stakeholders often request real-time data and reporting for use cases where batch processing can still be acceptable."
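The idea of defining a latency SLA per use case can be sketched in a few lines. This is an illustrative example, not a real DAS42 tool: the use cases and thresholds are hypothetical, chosen to mirror the distinction Amabile draws between user-facing and executive reporting.

```python
from datetime import timedelta

# Hypothetical latency SLAs per use case: user-facing views need fresh
# data, while executive reporting can tolerate hours of lag.
LATENCY_SLAS = {
    "user_facing_dashboard": timedelta(seconds=5),
    "fraud_alerts": timedelta(seconds=1),
    "executive_reporting": timedelta(hours=4),
}

def batch_is_acceptable(use_case: str, batch_interval: timedelta) -> bool:
    """True if a batch pipeline running at this interval meets the SLA."""
    return batch_interval <= LATENCY_SLAS[use_case]

# An hourly batch job satisfies executive reporting but not fraud alerts.
print(batch_is_acceptable("executive_reporting", timedelta(hours=1)))  # True
print(batch_is_acceptable("fraud_alerts", timedelta(hours=1)))         # False
```

Writing the SLA down first, as Amabile suggests, turns "do we need real time?" into a concrete comparison rather than a stakeholder preference.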

Managers may also want to be selective in what becomes real time, since it also means extensive and expensive upgrades in infrastructure. "There is a wide disparity in the level of readiness for real-time deployments across different organizations," says Tyson Trautmann, VP of engineering for Fauna. "Larger organizations, and particularly those in technology-centric industries like finance, e-commerce, and tech services, often have robust infrastructures capable of handling real-time data. But these capabilities were often built by adding complex layers over legacy products that did not natively support real-time data." This also brings a "high operational burden," he adds.

Also: State of IT report: Generative AI will soon go mainstream, says 9 out of 10 IT leaders 

Is it worth the effort and expense to make the move? "The infrastructure and complexity in building, running, and operating real-time systems are often not in line with the benefits of moving from batch to true real-time," says Amabile. "Often, near real-time is just as valuable as true real-time for most use cases."
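One common way to get "near real-time" at a fraction of streaming's cost is micro-batching: buffering events briefly and flushing them on a short interval. The sketch below is a minimal, hypothetical illustration of that pattern, not any vendor's implementation.

```python
import time

class MicroBatcher:
    """Buffer events and flush them every few seconds -- 'near real-time'
    closes most of the gap to true streaming at far lower operational cost."""

    def __init__(self, flush_every: float, sink):
        self.flush_every = flush_every
        self.sink = sink                    # callable receiving a list of events
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, event):
        self.buffer.append(event)
        if time.monotonic() - self.last_flush >= self.flush_every:
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink(self.buffer)
            self.buffer = []
        self.last_flush = time.monotonic()

# Demo: a zero-second interval flushes on every add.
batches = []
b = MicroBatcher(flush_every=0.0, sink=batches.append)
for e in range(3):
    b.add(e)
print(batches)  # [[0], [1], [2]]
```

In production, the flush interval is simply tuned to whatever latency SLA the use case actually requires.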

Since real time also means moving analytic data at lightning speed from source to system, care must be taken in ensuring this data is vetted and trustworthy. "The growth of data has created large amounts of complexity for enterprises to govern, manage, and assess, often with expansive datasets from many different sources," says Sam Pierson, senior vice president at Qlik. "It's crucial that organizations have a strong data strategy and infrastructure in place to ensure the freshest data, from valid and trusted sources, is what is being used in real-time, or decisions are likely to lead to the wrong outcomes."

Data quality issues need to be addressed upfront. "With real-time data, there's often less time to clean and prepare the data before it's used," says Trautmann. "This can lead to decisions based on incomplete or inaccurate data, potentially leading to poor outcomes."  
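With little time to clean data in flight, streaming pipelines often rely on cheap inline checks and quarantine anything that fails them. The record fields and rules below are hypothetical, included only to show the shape of the pattern Trautmann describes.

```python
def validate(record: dict) -> bool:
    """Cheap, inline checks suited to streaming: required fields and value
    ranges. Heavier cleaning (dedup, enrichment) is deferred downstream."""
    required = {"order_id", "amount", "ts"}
    if not required <= record.keys():
        return False
    return isinstance(record["amount"], (int, float)) and record["amount"] >= 0

stream = [
    {"order_id": 1, "amount": 19.99, "ts": 1700000000},
    {"order_id": 2, "ts": 1700000001},                  # missing amount
    {"order_id": 3, "amount": -5, "ts": 1700000002},    # bad value
]
clean = [r for r in stream if validate(r)]
quarantine = [r for r in stream if not validate(r)]
print(len(clean), len(quarantine))  # 1 2
```

Routing bad records to a quarantine queue, rather than silently dropping them, keeps real-time decisions from resting on incomplete data while preserving the evidence for later repair.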

Also: 5 technologies that will transform enterprises, according to Gartner

The issue of real-time trust "is taking on even more importance in a world where generative AI is growing in interest and use," says Pierson. "Being able to trust the data that is being provided to employees, knowing for certain that it is valid and appropriate for how it's being used, is essential to maintaining regulatory compliance and data security and governance while also enabling decisions in the moment that deliver the right impact."

A well-functioning and trustworthy real-time or streaming system "requires complex architecture, infrastructure, and programming skills beyond the scope of a typical data science or data engineering team," says Amabile. "Additionally, there are many other considerations that must be made around release and deployment, monitoring and logging, governance, security, and integration across lines of business applications, customer-facing applications, and analytics systems."

The good news is that there are tools and platforms that make real time more real -- even for smaller or medium-sized organizations with limited IT budgets. Over the past decade, "the emergence of new real-time infrastructure offerings has allowed a much broader range of organizations to take advantage of real-time capabilities," Trautmann says. 

"Cloud service providers like Amazon Web Services, Google Cloud, and Microsoft Azure have rolled out managed services tailored for real-time processing, including streaming data services and real-time analytics," he says. "The rise of distributed, in-memory, and time-series databases addresses the need for efficient real-time data workloads. Open-source offerings like Apache Kafka, Apache Flink, and Apache Storm have further enriched the real-time data processing ecosystem."

Also: Generative AI and the fourth why: Building trust with your customer

In addition, "the growth of edge computing has also enhanced real-time processing, particularly for IoT applications, while 5G technology's potential for lower latency and higher data handling capabilities opens new frontiers for real-time applications," Trautmann adds.

Still, "adoption of real-time data infrastructure is unevenly distributed today," says Pierson. "We often see gaps between more tech-focused industries, which tend to adopt real-time patterns earlier, compared to industries such as e-commerce and finance, which might take more time. Most organizations also have a myriad of data management technologies and cloud providers, creating complexity and complications when accessing and delivering data in real time. Governance and privacy regulations are another complicating factor."
