A recent survey conducted by Paradigm4, a computational database company, found that big data is proving a challenge for data scientists, not because of the volume of data being produced, but because of the variety of data types these professionals have to handle.
“The increasing variety of data sources is forcing data scientists into shortcuts that leave data and money on the table,” said Marilyn Matz, CEO of Paradigm4. “The focus on the volume of data hides the real challenge of data analytics today. Only by addressing the challenge of utilizing diverse types of data will we be able to unlock the enormous potential of analytics.”
According to the survey, 71 percent of data scientists said that big data has made analytics more difficult, and that the variety of data was to blame. Notably, 36 percent of respondents said that because their data is large and cannot easily be moved into their analytics software, gaining insights from it is a lengthy process. As the survey puts it, "these issues cause data scientists to omit data from analyses and prevent them from maximizing the value of their work."
Further results from the survey include the following:
- Despite the hype around the Hadoop software platform, fewer than half of respondents (48 percent) have used Hadoop or Spark, and of those, 76 percent said it was too slow, took too much effort to program, or had other limitations
- 91 percent said they’re using complex analytics on their Big Data now or plan to within the next two years
- Nearly half of data scientists (49 percent) said they’re finding it more difficult to fit their data into relational database tables
- 39 percent said their job had become more stressful with the growth of Big Data
The survey results are based on responses from 111 people in the United States who identified themselves as data scientists.