General Electric has announced a data lake software approach intended to make gathering and analyzing Big Data more productive for large enterprises. A data lake is a large object-based storage repository that holds data in its native format until it is needed.
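The defining property described above, storing data untouched in its native format and deferring interpretation until read, is often called schema-on-read. The article gives no implementation details, so the following is only a minimal toy sketch of that idea; the `MiniDataLake` class, its method names, and the sample sensor record are all hypothetical illustrations, not GE's or Pivotal's actual software.

```python
import hashlib
import json
import tempfile
from pathlib import Path


class MiniDataLake:
    """Toy data lake: raw bytes are stored exactly as received
    ('native format'), a catalog records metadata, and parsing is
    deferred until a consumer asks for structure (schema-on-read)."""

    def __init__(self, root: str):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)
        self.catalog = {}  # object id -> metadata about the raw object

    def ingest(self, name: str, payload: bytes, source: str) -> str:
        """Store the payload untouched; no schema is imposed at write time."""
        oid = hashlib.sha256(payload).hexdigest()[:12]
        (self.root / oid).write_bytes(payload)
        self.catalog[oid] = {"name": name, "source": source, "bytes": len(payload)}
        return oid

    def read_raw(self, oid: str) -> bytes:
        """Return the original bytes, byte-for-byte unchanged."""
        return (self.root / oid).read_bytes()

    def read_json(self, oid: str):
        """Schema-on-read: interpret the bytes only when queried."""
        return json.loads(self.read_raw(oid))


# Usage: ingest a (hypothetical) sensor reading in its native JSON form,
# then parse it only at query time.
lake = MiniDataLake(tempfile.mkdtemp())
oid = lake.ingest("engine-42.json", b'{"temp_c": 612, "rpm": 9800}',
                  source="jet-engine")
record = lake.read_json(oid)
print(record["temp_c"])  # the bytes stayed in native form until this parse
```

The point of the sketch is the separation of concerns: ingestion is cheap because it imposes no schema, and each consumer decides how to interpret the raw objects later.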
Developed in partnership with platform-as-a-service (PaaS) provider Pivotal, the data lake approach delivers a claimed "2,000x" improvement in analysis time, which the companies say will allow large enterprises to spend less time and money managing intensive processes and focus more on turning the data into action — improving supply chains, customer service and operations.
“Big and fast data is a critical piece of how modern industry is reinventing itself in order to innovate and compete,” said Paul Maritz, CEO of Pivotal. “The new industrial data lake architecture answers the call for the fast and highly scalable management of the unique industrial big data that is helping global enterprises transform their operations and build a new class of applications.”
The new tool is designed to give industrial customers worldwide, including airlines, railroads, hospitals and utilities, better access to Big Data, as well as to improve the analysis and storage of up to petabytes of industrial information.
The approach lets operators learn more about whatever complex system they are running, such as jet engines, factories or oil platforms, so that it can perform better, faster, more efficiently and at lower cost.
The utility of this service is not limited to airlines. Enterprises in other sectors could use the data-lake approach to uncover inefficiencies of their own. It is safe to assume we will hear a lot more about data lakes in the coming months.