Dataconomy
MapR Performance Benchmark Exceeds 100 Million Data Points Per Second Ingest

by Eileen McNulty
September 9, 2014
in News

Enterprise Hadoop vendor MapR has demonstrated that its data platform can ingest over 100 million data points per second. The performance results were unveiled at the sold-out Tableau Conference in Washington, using OpenTSDB running on the MapR Distribution, including the in-Hadoop NoSQL database MapR-DB.

Using only four nodes of a 10-node cluster, the MapR team accelerated OpenTSDB ingest performance 1,000-fold. In the coming years, data processing at these speeds will become increasingly necessary: the growing interest in real-time analytics and the explosion of Internet of Things applications mean data processing often needs to happen at breakneck speeds.
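For context on what "ingesting data points" means here: OpenTSDB accepts time-series samples as plain-text `put` lines (metric name, Unix timestamp, value, and tag key-value pairs) over its TCP port, and writes them to its backing store, in this case MapR-DB. The sketch below only formats samples into that wire format; the metric name, host tag, and values are illustrative, and actually sending the lines would require a running OpenTSDB endpoint.

```python
# Minimal sketch of OpenTSDB's telnet-style "put" ingest format.
# Metric and tag names here are made up for illustration.
def to_put_lines(metric, samples, tags):
    """Format samples as OpenTSDB put lines.

    metric:  metric name, e.g. "sys.cpu.user"
    samples: iterable of (unix_timestamp, value) pairs
    tags:    dict of tag-name -> tag-value (at least one tag is required)
    """
    tag_str = " ".join(f"{k}={v}" for k, v in sorted(tags.items()))
    return [f"put {metric} {ts} {val} {tag_str}" for ts, val in samples]

lines = to_put_lines("sys.cpu.user", [(1410249600, 42.5)], {"host": "node1"})
print(lines[0])  # put sys.cpu.user 1410249600 42.5 host=node1
```

At 100 million points per second, the bottleneck is not formatting lines like these but the storage engine's write path, which is what the MapR-DB benchmark exercised.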

In an interview with Dataconomy about Hadoop’s role in the Internet of Things, MapR’s Chief Marketing Officer explained: “What’s required is a platform that can scale very quickly. We’re not just asking ‘Can the system handle terabytes or petabytes?’, but ‘Can it handle millions or billions of small, individual files?’ A hundred million files is not that large from an Internet of Things perspective, so these systems need to scale to a billion, or even a trillion files. That is the area that MapR has provided a platform for from the very beginning.”

“The ability to combine deep predictive analytics with real time capabilities is an absolute requirement. So an integrated, in-Hadoop database is a key feature.”

Cisco estimates that there will be approximately 50 billion connected devices by 2020. This unprecedented level of data generation will push the boundaries of existing data platforms. Ingestion speeds like this stand MapR in good stead for the coming explosion of data that the Internet of Things will bring.


(Image credit: Shashi Bellamkonda)


Tags: Hadoop, MapR, Weekly Newsletter


COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.