Yesterday DataTorrent’s first product, DataTorrent RTS, became generally available. DataTorrent RTS is claimed to be the first platform on top of Hadoop capable of processing more than 1 billion data events per second.

Founders Phu Hoang and Amol Kekre noticed a gap in the market while they were both working at Yahoo! They observed real demand for technologies that let you see what is happening in real time, rather than identifying problems only after they have happened and the analytics have caught up. “We are seeing increasing interest in stream-processing platforms for real-time analytics, as a complement to data warehouses and Apache Hadoop,” said Jason Stamper, Research Analyst, 451 Group. “Enterprise adoption of stream-processing requires fast, in-memory processing of a large volume of events at scale — and in many cases a fault-tolerant architecture too.”

As well as allowing for proactive management rather than management in hindsight, Hoang believes DataTorrent is prepared for the impending explosion of the Internet of Things. “By 2020, the number of smartphones, tablets, and personal computers in use will reach 7.3 billion units. Additionally, the number of Internet of Things (IoT) connected devices will grow to 26 billion units, a nearly 30-fold increase from 0.9 billion in 2009”, states the DataTorrent press release. Although the amount of human-generated data exploded relatively recently and companies are scrambling to process and harvest this influx of data, the rise of machine and sensor data has barely begun, and technologies must adapt to keep up with the relentless volume of generated data to come.

David Hornik of August Capital, whose firm has invested $8M in DataTorrent, acknowledges the ways in which DataTorrent will improve existing Hadoop technology to cope with new performance and latency demands. “Everybody acknowledges big data matters, and Hadoop is a perfect container, but that doesn’t help run your business. How do you take advantage of these nodes distributed around the world?” he states. “When you can process a billion data points in a second, there are a lot of possibilities.”

Read more here.
(Photo credit: Peter Riou)



