In this article, I’m going to suggest three spaces that enterprise-class organizations should invest in this year. The basis for these suggestions is as follows:

  • Reduce operational costs over the long term
  • Shift funds from operations to investment
  • Orient the company for changes in the compute landscape
  • Accelerate the ability to manage change

Lofty goals from a guy who can barely balance a checkbook, but I hope the reasoning I propose resonates with you. One downside is that I do not consider NPV in my suggestions; even if I could come up with a generalized formula to accommodate this overwhelmingly important factor (which I can’t), it still wouldn’t do any good. These suggestions are designed to change your posture toward the marketplace, not assure a return on investment in three years or less. They represent fundamental shifts in technology, process engineering, and the way your business is run. Reasonably speaking, each one of these suggestions should be viewed over a three-year time frame. They are tectonic changes that require executive sponsorship, change control, risk assumption, and raw leadership. They are not to be taken lightly, and the return, although manifold, will accrue over the long term.

So here we go.

1) Your plan for cloud computing.
Cloud is not something you can just jump into if you are a mid-market or enterprise-class organization. There are steps you have to take, organizational changes you have to consider, and legalities you have to wrestle with. It is not simply a direction that a CIO can dictate; rather, it must be a staged, comprehensive plan that involves all executives in an organization.

  • Planning for cloud computing will require a significant investment of your resources. Whether you are looking at a private, public, or hybrid cloud, I cannot stress enough the importance of this investment as a precursor to any hardware or services investment you make in cloud computing. In my mind, there are three components that have to be defined before a conversation can occur with a cloud provider, assuming that you believe you will move most of your workloads into the cloud.
  • One, what is the strategy for moving interrelated workloads into the cloud, and how will it unfold? This will require you to analyze your software and compute infrastructure, establish SLAs with the business units, understand the interrelationships between workloads, and describe the risk of failure. Understanding this piece of the puzzle will clarify the challenges your lines of business experience with respect to both performance and process, and will help stage your migration and deployment plan.
  • Two, what does a post-cloud organizational design look like? Do you need server admins? What about infrastructure architects? How will moving to cloud affect software development? Do you need network admins and domain power users anymore? What role will security play in the future? Cloud fundamentally changes the way that business is done; it automates the provisioning process and ties platform decisions to business imperatives rather than the preferences of a manager. Many of the operational roles extant in your organization will be rendered obsolete, and many new roles will have to be created. It is vital to consider the effect on how your organization is structured, the technical talent you need on staff, and the responsibilities of the roles you have established.
  • Three, what process changes will be introduced into your organization after the adoption of cloud computing? How are servers and storage requisitioned? How are they billed back to the business? How are exceptions handled? How will your new organizational design manage these challenges? The modifications introduced into your provisioning and support functions must be understood and modeled before deployment. That model will function as the backbone of the workflow you establish for these functions.
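To make the requisition, chargeback, and exception questions concrete, here is a minimal Python sketch of how that workflow might be modeled before deployment. The rate card, budget threshold, and field names are illustrative assumptions, not figures from any particular cloud provider:

```python
from dataclasses import dataclass

# Hypothetical hourly rate card per instance size (illustrative numbers only)
RATE_CARD = {"small": 0.05, "medium": 0.10, "large": 0.40}

@dataclass
class Requisition:
    business_unit: str
    size: str
    hours_used: float = 0.0
    approved: bool = False

def approve(req: Requisition, budget_remaining: float) -> bool:
    """Exception handling: auto-approve only if one always-on month fits the budget."""
    projected = RATE_CARD[req.size] * 24 * 30
    req.approved = projected <= budget_remaining
    return req.approved

def chargeback(reqs: list) -> dict:
    """Roll metered usage up into a per-business-unit monthly bill."""
    bill = {}
    for r in reqs:
        if r.approved:
            bill[r.business_unit] = bill.get(r.business_unit, 0.0) \
                + RATE_CARD[r.size] * r.hours_used
    return bill

reqs = [Requisition("finance", "small"), Requisition("marketing", "large")]
for r in reqs:
    approve(r, budget_remaining=500.0)
reqs[0].hours_used, reqs[1].hours_used = 720.0, 100.0  # metered over the month
print(chargeback(reqs))
```

A real implementation would pull usage from the provider’s metering API and push the bill into your ERP system; the point is that this logic has to be designed before the cloud deployment, not after.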

2) What role will mobile applications play in your organization going forward?
Mobile access is not remarkable because it gives users the option of accessing and manipulating data from edge devices; it is remarkable when users do their actual work through those devices. Value is marginal if a user merely has the option of leveraging data to fulfill a job function; it is an entirely different animal if the employee must leverage that data to make decisions. Mobile applications that shape workflow and define productivity will deliver a level of optimization and process automation that virtually guarantees cost savings and a shorter deployment cycle for whatever it is you are rolling out. Mobile applications are not merely a way to free your employees from a desk; they ensure the resiliency of your organization’s communications, operations, and decision-making.

Real-time operations are a good example of this collaboration. Decision-making accelerates dramatically when your value chain is mobile-enabled, and the cost savings can be substantial. As employees in the field work on projects, they can update labor and material costs in real time and feed them directly into SAP. Goods can be identified, counted, and tracked, ensuring both the elasticity of the supply chain and the accuracy of invoicing. Tethering time entry to geo-location services can ensure your employees are where they should be, when they say they are. The possibilities are endless.
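As an illustration of the geo-location idea, here is a minimal Python sketch that accepts a time entry only if the device reporting it is within a fixed radius of the job site. The coordinates and the 500 m radius are hypothetical:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def validate_time_entry(entry_lat, entry_lon, site_lat, site_lon, radius_km=0.5):
    """Accept a clock-in only if the reporting device is within radius_km of the site."""
    return haversine_km(entry_lat, entry_lon, site_lat, site_lon) <= radius_km

# A clock-in a block or two from the (hypothetical) site is accepted;
# one from across town is rejected.
print(validate_time_entry(29.7604, -95.3698, 29.7610, -95.3700))
print(validate_time_entry(30.0000, -95.0000, 29.7610, -95.3700))
```

In production the entry coordinates would come from the device’s GPS fix, and the radius would be tuned per site; the check itself is this simple.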

3) Get ready for Big Data.
Invest in two different areas specific to this space: the data science organization I am going to suggest you start building, and the instrumentation needed to start collecting Big Data.

Data science is a fledgling field in corporate America. In a nutshell, it is the science of extracting knowledge from data. From a technical perspective, it includes the disciplines of mathematics, signal processing, probability theory, statistics, pattern recognition, data warehousing, and even philosophy. However, a data science team is more than just a propeller head coming out of Yale. There are three roles that need to be filled:

  • One, the business analyst: this is the individual who will capture business-specific requirements as well as existing processes. Without a consultant who can understand (and document) the as-is process and the business challenges of a line of business, it will be impossible to drive any real value from the plethora of information that Big Data will provide. This individual needs knowledge not just of the organization’s technical and business environment, but also some functional understanding of the promise and challenges of Big Data technologies like Hadoop.
  • Two, the scientist: we’re talking about an individual with a desire to go beneath the surface of a problem, find the questions at its heart, and distill them into a very clear set of hypotheses that can be tested. This often entails the associative thinking that characterizes the most creative scientists in any field. Data scientists provide the basic intellectual, philosophical, and creative underpinning of any Big Data strategy. They are the ones who look at data and try to predict interrelationships that the rest of us might not see, using algorithms, number crunching, and raw intellectual and creative horsepower to make 1+1=3.
  • Three, the Big Data engineer: you need an engineer who can use tools like Hadoop, HBase, and Cassandra to capture, store, and process the data from various ingestion streams. These technical engineers work at the most fundamental level of data, aggregating, cleansing, and normalizing it for ingestion.
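A tiny Python sketch of the engineer’s cleanse-and-normalize step, using a made-up CSV feed with inconsistent casing, whitespace, and units. At production scale this logic would run inside Hadoop or a similar framework rather than in memory, but the shape of the work is the same:

```python
import csv
import io

# Hypothetical raw sensor feed: mixed casing, stray whitespace, inconsistent units
RAW = """device_id,temp,unit
 A-1 , 98.6 , F
a-2,37.0,C
A-1,99.1,F
"""

def normalize(row):
    """Cleanse one record: trim fields, canonicalize IDs, convert everything to Celsius."""
    device = row["device_id"].strip().upper()
    temp = float(row["temp"])
    if row["unit"].strip().upper() == "F":
        temp = (temp - 32) * 5 / 9
    return {"device_id": device, "temp_c": round(temp, 2)}

reader = csv.DictReader(io.StringIO(RAW), skipinitialspace=True)
records = [normalize(r) for r in reader]
print(records)
```

Notice that after normalization the two Fahrenheit readings and the Celsius reading become directly comparable; without this step, any downstream analysis the scientist runs is comparing apples to oranges.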

The other area you need to think about is instrumentation. I’ve written blogs about this topic in the past, so suffice it to say that both hardware and software tools that capture data relevant to operations in your company are fundamental to expanding the breadth of information available to your Data Science team for analysis. The heavy lifting associated with big data is a function of instrumentation – the ability of an application or a device to collect runtime intelligence around usage levels, errors, user behavioral patterns, and ultimately the statistical analysis of this information. After all, without the “big” in big data, you’re missing half the story. An uninterrupted flow of data in massive sets across disparate data elements is critical to creating data associations that will ultimately drive sales, operational efficacy, and profits.
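To illustrate what application-level instrumentation can look like, here is a minimal Python sketch of a decorator that records call counts, errors, and latency for every invocation. The function being measured and the in-memory sinks are hypothetical stand-ins for a real telemetry pipeline:

```python
import functools
import time
from collections import Counter

# In-memory telemetry sinks; a real deployment would ship these events downstream
CALLS = Counter()
ERRORS = Counter()
LATENCY_MS = {}

def instrumented(fn):
    """Record call counts, errors, and latency for every invocation of fn."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        CALLS[fn.__name__] += 1
        try:
            return fn(*args, **kwargs)
        except Exception:
            ERRORS[fn.__name__] += 1
            raise
        finally:
            LATENCY_MS.setdefault(fn.__name__, []).append(
                (time.perf_counter() - start) * 1000.0)
    return wrapper

@instrumented
def lookup_part(part_id):
    """Hypothetical business function we want runtime intelligence about."""
    if not part_id:
        raise ValueError("empty part id")
    return part_id.upper()

lookup_part("ax-200")   # one successful call
try:
    lookup_part("")     # one failing call, still counted
except ValueError:
    pass
print(CALLS["lookup_part"], ERRORS["lookup_part"])  # 2 1
```

Applied across an application portfolio, this kind of lightweight capture is what produces the usage levels, error rates, and behavioral patterns your Data Science team will analyze.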

Obviously, this list is incomplete. There are many other sectors to invest in: refreshing hardware, investing in your people, accelerated hiring, marketing, database normalization, new technologies, and even leveraging partner relationships. Each one will have a compelling story for your organization. However, the three I have mentioned today are somewhat “off-the-radar” in that they are long-term visions for CIOs rather than immediately actionable value propositions. They are future-state technologies that sit on the border between marketing hype and market displacement. For the visionaries who elect to invest hard dollars in these technologies today, the future promises a leaner, more agile, and more profitable environment. For the others, I expect a bloody and agonizing conflict between those that catch up and those that fall by the wayside.

Jamal is a regular commentator on the Big Data industry. He is an executive and entrepreneur with over 15 years of experience driving strategy for Fortune 500 companies. In addition to technology strategy, his concentrations include digital oil fields, the geo-mechanics of multilateral drilling, well-site operations and completions, integrated workflows, reservoir stimulation, and extraction techniques. He has held leadership positions in Technology, Sales and Marketing, R&D, and M&A in some of the largest corporations in the world. He is currently a senior manager at Wipro where he focuses on emerging technologies.

