Applications and products that react to events in real time, such as self-driving cars, heart monitors and flight bookings, require a reliable data flow to function correctly. Think about the disruptions caused when Facebook, AWS or Slack go down. Or take an application designed to monitor electrocardiogram (ECG) signals for sudden changes that could signal a heart attack: if the data stream is interrupted, it might raise an erroneous heart attack alert. The same logic applies to products that use data to monitor machinery, optimize processes or detect behaviors associated with fraud or cyber attacks. These applications don’t just rely on data; they rely on real-time data streams. You can’t afford processing breakdowns when this much data is critical to your product and business.

Redundancy at every level for resiliency 

High-velocity data introduces new challenges to data processing. The faster data is created, the more of it can be lost if the system fails, and the more challenging it becomes to process. We suggest real-time stream processing over batch for its resiliency. Brokers like Kafka, unlike traditional databases, replicate data across multiple nodes, providing a distributed persistence layer that survives the loss of any single node. Running the data processing layer on Kubernetes adds another safety net: failed services are restarted automatically, so with idempotent, replayable processing the system can recover without duplicating or missing a single data point. Redundancy throughout the system prevents data from being lost while keeping it moving quickly.
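To make the redundancy concrete, here is a minimal sketch of the Kafka-side settings that typically back this pattern. The keys follow Kafka's own configuration naming; the broker addresses are placeholders, the replication factor is set when the topic is created, and end-to-end exactly-once delivery also depends on how consumers commit their offsets.

```python
# Sketch of Kafka settings for a redundant, loss-resistant stream.
# Broker addresses are hypothetical placeholders.

# Topic-level: keep three copies of every partition, and reject writes
# unless at least two replicas are in sync.
topic_settings = {
    "replication.factor": 3,       # set at topic creation time
    "min.insync.replicas": 2,
}

# Producer-level: wait for all in-sync replicas to acknowledge each
# message, and let the broker deduplicate retried sends.
producer_settings = {
    "bootstrap.servers": "broker-1:9092,broker-2:9092,broker-3:9092",
    "acks": "all",
    "enable.idempotence": True,
}
```

With these settings, losing one broker neither drops in-flight messages nor duplicates them on retry, which is the property the paragraph above describes.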

Control puts stream processing (and redundancy) to the test

Let’s look at Control, a telemetry company specializing in race car data connectivity. When a car zooms around the track, every millisecond counts. Race cars already contain hundreds of sensors that enable engineers to monitor car parts and optimize performance. Along with those sensors, Control provides three in-car modems that ensure data flows seamlessly from the car to remote race engineers. The modems automatically switch between cell towers during the race to optimize data flow. Their consistent connectivity gives racers an edge over competitors relying on batch processing, allowing sophisticated machine learning models to act in real time. And if one modem fails, the race isn’t lost: the remaining two keep the data flowing.
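The failover logic behind redundant links can be sketched in a few lines. This is a hypothetical illustration, not Control's actual code: the `Modem` class and `send_with_failover` helper are invented names, and real modem switching happens at the network layer rather than in application code.

```python
# Hypothetical sketch of link failover: try each redundant modem in turn,
# so the telemetry stream only fails if every link is down at once.

class Modem:
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def send(self, payload):
        if not self.healthy:
            raise ConnectionError(f"{self.name} is down")
        return f"{self.name} sent {payload}"

def send_with_failover(modems, payload):
    """Attempt each modem in order; raise only if all of them fail."""
    for modem in modems:
        try:
            return modem.send(payload)
        except ConnectionError:
            continue  # fall through to the next redundant link
    raise ConnectionError("all modems are down")

# With the first link down, the second transparently takes over.
modems = [Modem("modem-1", healthy=False), Modem("modem-2"), Modem("modem-3")]
print(send_with_failover(modems, "lap-telemetry"))
```

The design choice mirrors the hardware setup: three independent links mean two simultaneous failures still leave the stream intact.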

Microservices create resilient systems for streaming and consuming data

Control’s streaming solution combines Kafka, Kubernetes, ML models, a series of devices and a collection of other developer tools. The Quix platform helps Control manage the complex integrations and individual services so that Control and other users can focus on building data products that work every time, with no interruptions.
