The space race began decades ago, but it has now reached unprecedented heights that are surpassed on a near-daily basis.
Back then, the contest was about which nations could get into space at all. Now, it is about how far into space a nation can go. With each nation’s every attempt (successful or otherwise) to cross newer, farther frontiers come larger and larger volumes of data. The number of Earth-observing satellites nearly doubled in 2014 alone.
Open Data: The Final Frontier?
Certainly, the instruments used to record this data are incredibly advanced. But the fact remains that more information is sent back than machines currently have the capacity to interpret; it is not clear whether existing supercomputers, as equipped and formidable as they are, could effectively process and interpret it all. Opening the data up could bring in new developers and professionals who might create solutions for handling it.
But the issues with making this data open are manifold. The first is sheer size: connections fast enough to transfer such massive volumes are available to governments alone, and an individual developer, group or collective may lack the infrastructure even to access the data.
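A quick back-of-the-envelope calculation illustrates the access problem; the 1 PB archive size below is a hypothetical figure chosen purely for illustration:

```python
# Rough transfer times for a hypothetical 1 PB data archive
# over link speeds an individual or small group might realistically have.

PETABYTE_IN_BITS = 1e15 * 8  # 1 petabyte expressed in bits

for label, bits_per_second in [
    ("100 Mbps home broadband", 100e6),
    ("1 Gbps fibre connection", 1e9),
    ("100 Gbps research backbone", 100e9),
]:
    seconds = PETABYTE_IN_BITS / bits_per_second
    print(f"{label}: {seconds / 86400:.1f} days")

# 100 Mbps home broadband: 925.9 days
# 1 Gbps fibre connection: 92.6 days
# 100 Gbps research backbone: 0.9 days
```

At household speeds, a single petabyte would take years to download; only government-grade backbones make the transfer tractable.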
A related issue is format: the data arrives in no single standard form, and the formats in which it is downloaded, stored, accessed and interpreted may not be obvious or easily understood by an untrained eye.
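As one concrete illustration, much astronomical data is distributed as FITS files, a self-describing format whose headers carry the metadata needed to interpret the payload. A minimal sketch of inspecting such a file with the open-source astropy library (the filename here is a placeholder):

```python
# Inspect a FITS file, the de facto standard container for astronomical
# data, using astropy. "observation.fits" is a placeholder filename.
from astropy.io import fits

with fits.open("observation.fits") as hdul:
    hdul.info()              # list the header-data units (HDUs) in the file
    header = hdul[0].header  # metadata: instrument, coordinates, units, ...
    data = hdul[0].data      # the image or table payload (may be None)
    print(header.get("TELESCOP"), data.shape if data is not None else "no data")
```

Even with such tooling freely available, knowing which header keywords matter, and what the numbers physically mean, still takes domain training.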
A possible workaround might be training programmes for interested developers, continually updated and restructured to keep pace with the burgeoning volumes of data and the expanding frontiers being explored.
This, however, could bring problems of its own, closely tied to the question of open-sourcing the data in the first place. In an ideal world, space exploration would be a collaborative operation benefiting the entire planet; in practice, a significant portion of the ‘space race’, though advantageous to humanity as a whole, is played out as a competition among a select few nations.
Space exploration, the world over, is as of now carried out exclusively by national governments, and this data is hosted on government servers. With this comes the very real risk of cyber espionage, which state agencies, private organisations and individuals alike increasingly have the means and skills to carry out. The smallest vulnerability in a massive system dealing with this volume of big data could have significant repercussions for national security, especially in politically charged times such as these.
With a largely non-collaborative space race, data protection remains nationalised.
Selling or outsourcing this data would mean privatising integral cogs of a nation’s space operations, which raises the concerns outlined above. There are myriad database management systems (DBMS) currently available, many of them open source; however, the massive amount of data collected and generated by spacecraft, satellites and rovers is not only extremely intricate but deeply interrelated. Even if this data were broken down into simpler, less tangled chunks or packets, it would still need to be coordinated, relayed and interpreted, which brings us back to the initial problem: terrestrial data-transfer infrastructure is not equipped to handle it.
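The bookkeeping behind such chunking is straightforward to sketch, even if coordinating it at planetary scale is not. A minimal sketch in Python, where the file name and chunk size are illustrative assumptions:

```python
# Stream a large file in fixed-size chunks, checksumming each one so a
# receiver can verify and reassemble them in order. The file name and
# chunk size are illustrative assumptions, not any agency's protocol.
import hashlib

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB per chunk (assumed)

def stream_chunks(path):
    """Yield (sequence_number, sha256_digest, payload) for each chunk."""
    with open(path, "rb") as f:
        seq = 0
        while chunk := f.read(CHUNK_SIZE):
            yield seq, hashlib.sha256(chunk).hexdigest(), chunk
            seq += 1

for seq, digest, payload in stream_chunks("telemetry.bin"):
    print(f"chunk {seq}: {len(payload)} bytes, sha256={digest[:12]}")
```

Splitting the data is the easy part; the hard part, as above, is moving and recombining millions of such chunks over links that were never sized for them.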
What’s Next for Big Data and Space
Big data transferred by satellites is not used for the space race alone, however. Finance, insurance, agriculture, forestry, fishing, mapping, manufacturing, shipping, mining, sustainable industries and energy enterprises all use eyes and ears in the sky to make better decisions and improve operational efficiency with actionable data sent, ultimately, from space. The consequences of utilising big data reach the very grassroots of a nation, something especially true for agrarian economies such as those of South-East Asia.
With the current volumes of big data, the resources available to handle them are also unevenly distributed. Of the ten supercomputers in the world that could handle these volumes of data, six are in the United States and owned by its government.
Plans are currently in the works to build what will be the world’s largest radio telescope. Although this will be a worldwide collaborative effort, it will be sited in South Africa and Australia. Work on the Square Kilometre Array (SKA), which will then become the largest radio telescope ever built, is due to begin in 2018.
The data generated by the SKA will amount to about 700 terabytes per second (TB/s), “equivalent to all the data flowing on the Internet every two days”. According to NASA, scientists are already using cloud computing to store data at this scale, but vulnerabilities in the cloud, however intensely fortified, could lead to a data leak in the style of recent breaches, except with repercussions that are far more serious. Existing data-management firms such as Oracle could, of course, be called on to help find management solutions.
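A quick arithmetic check puts that rate in perspective:

```python
# Back-of-the-envelope check on the 700 TB/s figure cited above.
TB_PER_SECOND = 700
SECONDS_PER_DAY = 86_400

tb_per_day = TB_PER_SECOND * SECONDS_PER_DAY
print(f"{tb_per_day:,} TB/day = {tb_per_day / 1e6:.1f} exabytes/day")
# 60,480,000 TB/day = 60.5 exabytes/day
```

At roughly sixty exabytes a day, no single facility could simply archive the raw stream, a scale that explains why cloud-based, distributed storage is being considered at all.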
In addition to management, the visualisation and interpretation of this data is important; solutions to this problem must be arrived at concurrently. Sorting through the kind of big data received and sent by newer and larger satellites each day is a gargantuan task that no human could sort, catalogue or organise unaided, a fact NASA itself acknowledges.
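One common pattern for such machine-assisted sorting (an illustrative approach, not one the article’s sources specify) is unsupervised clustering: grouping similar records so humans review a handful of clusters rather than millions of individual observations. A minimal sketch with scikit-learn on synthetic stand-in data:

```python
# Cluster synthetic "observation" feature vectors so that a human can
# review a few groups instead of every record. The data here is random
# stand-in material, not real satellite output.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
observations = rng.normal(size=(10_000, 8))  # 10k records, 8 features each

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(observations)
for cluster, count in zip(*np.unique(labels, return_counts=True)):
    print(f"cluster {cluster}: {count} observations")
```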
Sponsorship of space-data collection platforms (satellites, interplanetary vehicles and the like) has now begun to shift from government to more privatised, commercial funding.
Skybox Imaging, a satellite-operating firm that monitors terrestrial surfaces to track change, built and launched the world’s smallest high-resolution imaging satellite. In August 2014, the company was acquired by Google Inc. for US$500 million.
In 2013, Monsanto, the American multinational agrochemical and agricultural biotechnology corporation, acquired The Climate Corporation, which records geoclimatic data and turns it into actionable information for farmers.
Although interplanetary exploration has thus far remained government-owned, funded and regulated, the realm of space, satellites and space-related devices is now becoming increasingly privatised, bringing increasingly detailed research into climatology, topography and tomography, and a greater ability to predict tectonic and climatic events.
These satellites, just like the increasingly advanced telescopes recording information, data and images from outer space, now capture more layered, complex information that needs real-time collection and interpretation. The buck does not stop with the management and sorting of this data, however: it must finally be made both accessible and understandable to the public via programs, graphics, maps and interactive user interfaces such as Google Maps or its more detailed terrestrial sibling, Google Earth. This could give the software sector a definitive, and fairly rapid, boost.
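Even a rudimentary rendering shows the gap between raw arrays and a public-facing picture. A minimal sketch with matplotlib, using synthetic data as a stand-in for a real gridded measurement:

```python
# Render a synthetic gridded field (a stand-in for, say, surface
# temperature) as a map-style image a non-specialist can read at a glance.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
grid = rng.normal(size=(180, 360)).cumsum(axis=1)  # fake smooth global field

plt.imshow(grid, cmap="viridis", extent=[-180, 180, -90, 90])
plt.colorbar(label="synthetic measurement (arbitrary units)")
plt.xlabel("longitude")
plt.ylabel("latitude")
plt.title("Illustrative rendering of a gridded dataset")
plt.savefig("rendered_map.png")
```

Production interfaces like Google Earth layer interactivity, tiling and streaming on top of exactly this kind of rendering step.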
Space-related big data is all-encompassing not only in what it records, but in the endless possibilities it opens for humanity, in terms of both scientific advancement and economic activity.
Writer and communications professional by day, musician by night, Anuradha Santhanam is a former social scientist at the LSE. Her writing focuses on human rights, socioeconomics, technology, innovation and space, world politics and culture. A programmer herself, Anuradha has spent the past year studying and researching, among other things, data and technological governance. An amateur astronomer, she is also passionate about motorsport.
More of her writing is available here and she can be found on Twitter at @anumccartney.
(Image credit: NASA’s Marshall Space Flight Center)