MapR Performance Benchmark Exceeds 100 Million Data Points Per Second Ingest

by Eileen McNulty
September 9, 2014
in News

Enterprise Hadoop vendor MapR has just demonstrated that its data platform is capable of ingesting over 100 million data points per second. These impressive performance results were unveiled at the sold-out Tableau Conference in Washington, running OpenTSDB on the MapR Distribution, which includes the in-Hadoop NoSQL database MapR-DB.

Using only four nodes of a 10-node cluster, the MapR team accelerated OpenTSDB performance by 1,000 times. In the coming years, data processing at these speeds will become increasingly necessary: the growing interest in real-time analytics and the explosion of Internet of Things applications mean that data often needs to be processed at breakneck speed.
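To make the workload concrete: OpenTSDB stores individual time-series data points, each consisting of a metric name, a timestamp, a numeric value, and a set of tags, and writes them as rows into the backing store, in this case MapR-DB. The sketch below shows what such a write looks like against a stock OpenTSDB 2.x HTTP endpoint; the endpoint URL, metric name, and tags are assumptions for illustration only, and the benchmark above is essentially this kind of operation repeated at a rate of over 100 million points per second across the cluster.

```python
# Minimal sketch: writing a small batch of data points to OpenTSDB's HTTP API.
# Assumes an OpenTSDB 2.x instance on localhost:4242 (the default port); the
# metric name and tags are hypothetical and purely illustrative.
import time
import requests

OPENTSDB_PUT_URL = "http://localhost:4242/api/put"

def put_points(points):
    """Send a batch of data points; each becomes one row in the backing store."""
    resp = requests.post(OPENTSDB_PUT_URL, json=points, timeout=5)
    resp.raise_for_status()

now = int(time.time())
batch = [
    {
        "metric": "sensor.temperature",                        # hypothetical metric
        "timestamp": now,                                      # Unix epoch seconds
        "value": 21.7,
        "tags": {"device_id": "dev-0001", "site": "berlin"},   # hypothetical tags
    },
    {
        "metric": "sensor.temperature",
        "timestamp": now,
        "value": 22.3,
        "tags": {"device_id": "dev-0002", "site": "berlin"},
    },
]

put_points(batch)
```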

In an interview with Dataconomy about Hadoop’s role in the Internet of Things, MapR’s Chief Marketing Officer explained: “What’s required is a platform that can scale very quickly. We’re not just asking ‘Can the system handle terabytes or petabytes?’, but ‘Can it handle millions or billions of small, individual files?’ A hundred million files is not that large from an Internet of Things perspective, so these systems need to scale to a billion, or even a trillion files. That is the area that MapR has provided a platform for from the very beginning.”

“The ability to combine deep predictive analytics with real-time capabilities is an absolute requirement. So an integrated, in-Hadoop database is a key feature.”

Cisco estimates that there will be approximately 50 billion connected devices by 2020. This unprecedented level of data generation will push the boundaries of existing data platforms. Ingestion speeds like this stand MapR in good stead for the coming explosion of data that the Internet of Things will bring.


(Image credit: Shashi Bellamkonda)


Tags: Hadoop, MapR, Weekly Newsletter
