The real-time revolution is underway, but not for everyone

If you listen to enough tech conferences, sales pitches and analyst briefings, you might believe that every organization on the planet can now detect and respond to events within milliseconds of their occurrence.

This is not the case. At least not yet. But the push toward real time is worth examining: the exciting new technologies arriving on the scene – artificial intelligence, predictive analytics, embedded systems, streaming applications, real-time location tracking, alerting systems – all depend on real-time data to work. Yet these projects are still, more often than not, works in progress.

Worse, industry studies show that real time remains more dream than reality. In supply chain management, for example, 77% of executives want real-time visibility into shipments, but only 25% currently have it, according to a survey by Tive. Similarly, only 23% of companies interviewed in a survey by Unisphere Research and ChaosSearch say their information is available in real time.

“Most companies don’t need real-time data”

“Most companies don’t need real-time data,” says Nick Amabile, CEO of DAS42. The real question, he argues, is whether the need is operational or analytical.

“Operational systems often need real-time data for use cases such as security threat monitoring, marketing personalization, logistics, cost optimization, customer experience improvement, fraud detection and business strategy,” explains Amabile.

Analytical needs, on the other hand, can usually tolerate some latency.

The question of infrastructure cost

“For analytical use cases, we first define a service-level agreement (SLA) for acceptable latency,” explains Amabile. “It may be that reports for end users must be real time, while reports for management can be several hours old. Stakeholders often ask for real-time data and reports where batch processing would still be acceptable.”
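That triage can be captured in a few lines. The sketch below is illustrative only – the report names and SLA thresholds are invented, not taken from the article – and simply routes each workload to streaming, micro-batch or batch processing based on its agreed latency budget:

```python
from datetime import timedelta

# Illustrative only: report names and latency thresholds are invented,
# not taken from the article.
SLAS = {
    "fraud_alerts": timedelta(seconds=1),      # operational, real time
    "user_dashboard": timedelta(minutes=5),    # near real time
    "management_summary": timedelta(hours=6),  # batch is acceptable
}

def processing_mode(report: str) -> str:
    """Pick a processing strategy from the agreed latency SLA."""
    sla = SLAS[report]
    if sla <= timedelta(seconds=5):
        return "streaming"
    if sla <= timedelta(minutes=15):
        return "micro-batch"
    return "batch"

print(processing_mode("fraud_alerts"))        # streaming
print(processing_mode("management_summary"))  # batch
```

The point of writing the SLA down first is that it makes the cost conversation concrete: only the workloads that genuinely need a sub-second budget pay for streaming infrastructure.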


Managers may also want to be selective about where real time is truly needed, because delivering it means building, and paying for, expensive IT infrastructure.

“There is a big disparity in the level of readiness for real-time deployments between organizations,” says Tyson Trautmann, vice president of engineering at Fauna. “Large companies, especially technology-focused ones in finance, e-commerce and technology services, often have robust infrastructure capable of processing data in real time. But these capabilities were often built by adding complex layers on top of existing products that didn’t natively support real-time data.” That approach also carries a high operational load, he adds.

Moving analytical data at lightning speed from source to system

So do the extra work and budget justify the move to real time? “The infrastructure and the complexity of building, running and maintaining real-time systems are often not commensurate with the benefits of moving from batch processing to true real time,” says Amabile. “Often, near real time is just as useful as real time.”


Since real time also means moving analytical data at lightning speed from source to system, care must be taken to ensure that the data is verified and trustworthy.

“The growth in data volumes has created complexities for companies in terms of governance, management and evaluation, often with datasets coming from many different sources,” says Sam Pierson, senior vice president at Qlik. “It is essential that organizations have a solid data strategy and infrastructure in place to ensure that the most recent data, from valid and reliable sources, is what gets used in real time. Otherwise, real-time decisions risk producing erroneous results.”

The big question of data quality in real time


Data quality issues need to be addressed from the outset. “With real-time data, there is often less time to clean and prepare the data before it is used,” says Trautmann. “This can lead to decisions based on incomplete or inaccurate data, and ultimately to poor outcomes.”
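One common mitigation is to validate records inline as they stream in, rather than relying on an offline cleanup pass that a real-time pipeline has no time for. The sketch below is illustrative only; the field names and plausibility rules are invented for the example:

```python
# Illustrative sketch: filter incomplete or implausible records at
# ingestion time, before any real-time decision consumes them.
def is_valid(record: dict) -> bool:
    required = {"sensor_id", "timestamp", "value"}
    if not required <= record.keys():
        return False  # incomplete record
    value = record["value"]
    # Hypothetical plausibility range for this sensor type.
    return isinstance(value, (int, float)) and 0 <= value <= 1000

stream = [
    {"sensor_id": "a1", "timestamp": 1, "value": 3.2},
    {"sensor_id": "a2", "timestamp": 2},               # missing field
    {"sensor_id": "a3", "timestamp": 3, "value": -5},  # implausible
]
clean = [r for r in stream if is_valid(r)]
print(len(clean))  # 1
```

Rejected records would typically be routed to a dead-letter queue for later inspection rather than silently dropped, so data quality problems remain visible.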


The issue of trust in real-time data “is becoming even more important in a world where generative AI is attracting ever more interest and use,” says Pierson. “Being able to trust the data provided to employees, knowing with certainty that it is valid and appropriate for how it is used, is essential to maintaining regulatory compliance as well as data security and governance, while enabling in-the-moment decisions that have the intended impact.”


A real-time or continuous system that works well and can be trusted “requires a complex architecture, infrastructure and programming skills that go beyond the scope of a typical data science or engineering team,” explains Amabile. “In addition, many other considerations come into play around production deployment, monitoring, governance, security and integration between business applications, customer-facing applications and analytics systems.”

Turning to the cloud for real time?


The good news is that tools and platforms now make real time more attainable, even for small and medium-sized companies with limited IT budgets. Over the past decade, “the emergence of new real-time infrastructure offerings has allowed a much wider range of organizations to take advantage of real-time capabilities,” says Trautmann.


“Cloud service providers such as Amazon Web Services, Google Cloud and Microsoft Azure have launched managed services tailored to real-time processing, including data streaming and real-time analytics services,” he adds. “The rise of distributed, in-memory and time-series databases answers the needs of real-time data workloads. Open-source offerings such as Apache Kafka, Apache Flink and Apache Storm have further enriched the real-time data processing ecosystem.”
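The core primitive these streaming engines provide can be illustrated with a minimal, self-contained sketch: a tumbling-window aggregation over an event stream, the kind of computation systems like Apache Flink or Kafka Streams perform continuously at scale. The event data here is invented for the example:

```python
from collections import defaultdict

def tumbling_window_avg(events, window_sec):
    """Group (timestamp, value) events into fixed, non-overlapping
    windows and average each one: a toy version of the windowed
    aggregation that streaming engines run over unbounded data."""
    buckets = defaultdict(list)
    for ts, value in events:
        window_start = ts // window_sec * window_sec
        buckets[window_start].append(value)
    return {start: sum(v) / len(v) for start, v in sorted(buckets.items())}

# Simulated sensor readings: (unix_timestamp, reading)
events = [(0, 10.0), (3, 20.0), (5, 30.0), (9, 50.0), (11, 40.0)]
print(tumbling_window_avg(events, window_sec=5))
# {0: 15.0, 5: 40.0, 10: 40.0}
```

A production engine adds what this sketch omits: handling of late and out-of-order events, fault tolerance, and emitting each window's result as soon as it closes rather than after the stream ends.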


In addition, “the growth of edge computing has also improved real-time processing, especially for IoT applications, while the potential of 5G for lower latency and greater data throughput opens up new frontiers for real-time applications,” adds Trautmann.

Source: ZDNet.com
