_[“IDC predicts that the market for big data will reach $16.1 billion in 2014, growing 6 times faster than the overall IT Market.”](http://www.forbes.com/sites/gilpress/2013/12/12/16-1-billion-big-data-market-2014-predictions-from-idc-and-iia/ "IDC predicts that the market for big data will reach $16.1 billion in 2014, growing 6 times faster than the overall IT Market.")_
### Variety (Business Intelligence)
The variety of your data comes from the instrumentation of business applications, and from the flexibility with which application developers can utilize, measure and normalize a virtually unlimited range of data types. Your business analysts need to be free to pull in any data feed they can get their hands on. The variety of data, as complex as it typically is, needs to be supported by data velocity, data veracity and data volume. Why the other three V’s, you ask? You have to access data within the right timeframe to make smart decisions, the data has to be trustworthy for the sake of accuracy, and your systems have to support the scale of the information being collected.
It’s critical for the CIO and CFO to invest resources to ensure a business’s infrastructure can support the creativity and rapid insight that executives need to guide the company by setting strategic goals and cutting costs where appropriate. Failing to establish systems and practices that support the entire organization, from the bottom up, will likely lead to poor decision making and unnecessary costs down the road. In the end, smart companies will spend money instrumenting their business applications to give themselves the flexibility and scalability to use their data analytically and effectively.
_[“…it’s the variety in data that holds the most potential for exploitation.” – Edd Dumbill, Contributor, Forbes](http://www.forbes.com/sites/edddumbill/2013/12/31/big-data-variety-means-that-metadata-matters/ "…it's the variety in data that holds the most potential for exploitation. – Edd Dumbill, Contributor, Forbes")_
### Volume (Cloud Orchestration)
To support your volume of data, it’s important to understand Cloud Orchestration and how it fits into your architecture. Not all clouds and their services are built alike. You can’t always rely on one instance, region or vendor for uptime or affordability. Cloud Orchestration compels you to map your application to the market and understand how to meet your needs for scale, speed and cost. Similarly, you need to map the deployment architecture to reliably provide the right performance and capacity. Just assuming one cloud instance or data center can support your needs is a significant risk to the business: the golden rule of infrastructure is redundancy and scale. Avoid dependency on any single vendor’s pricing, scale appropriately, and assume the worst-case scenario when building your infrastructure.
All clouds have outages. What happens if an outage occurs during the high point of an enterprise’s sales cycle and suddenly they are flying blind? Or your systems go down during a sales presentation? How much money did you just leave on the table? Capacity planning and scale not only prevent revenue loss; they also accelerate revenue recognition by helping you get your product to market faster. Cloud orchestration can deliver scale on demand, but first you have to automate the processes driving it so it doesn’t become an additional cost center.
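The failover logic behind that redundancy can be sketched in a few lines. This is a toy illustration, not any vendor’s API: the endpoint names and the probe function are hypothetical stand-ins for real health checks or cloud SDK calls.

```python
def first_healthy(endpoints, probe):
    """Walk an ordered list of regions/vendors and return the first one
    whose health probe succeeds, so a single outage never takes the
    whole system down."""
    for endpoint in endpoints:
        try:
            if probe(endpoint):          # e.g. an HTTP health check with a timeout
                return endpoint
        except OSError:
            continue                     # unreachable: fall through to the next tier
    raise RuntimeError("all regions and vendors are down")

# Hypothetical tiers: primary region, secondary region, different vendor.
endpoints = ["us-east-primary", "eu-west-secondary", "other-vendor-backup"]

# Simulate an outage in the primary region.
active = first_healthy(endpoints, probe=lambda e: e != "us-east-primary")
# active == "eu-west-secondary"
```

In production the probe would be an actual health check against each provider, but the shape of the decision is the same: ordered tiers, cheap checks, and no single point of failure.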
### Veracity (Data Assurance)
Trust through data assurance is critical to making smarter decisions. In today’s hostile environments, there are many threats to your data. Accidental or malicious incidents can and will alter data more frequently than you think. If data goes unchecked, it’s subject to tampering; if the data is flawed, you’ll make the wrong decision and hear all about it in the press. How much of an impact can altered data have on your business? Just ask Target, TJ Maxx or Sony for starters.
Technologies such as Guardtime’s KSI are key to making sure you can audit your data and its integrity across your entire repository. Data security is a multi-billion dollar market, and it’s about more than just encryption and user access. By not scrutinizing the data itself and locking it down, you are leaving yourself wide open. The NSA and Target both had encryption and user security, but they weren’t watching the data itself. Data changes went unnoticed, and the end result? Well, you can read about that in the New York Times.
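The underlying idea is simple to sketch, even though Guardtime’s actual KSI uses keyless, hash-tree-based signatures rather than the plain digest comparison below. Assuming records are JSON-serializable dictionaries, a minimal integrity audit might look like this (the function names are illustrative, not any vendor’s API):

```python
import hashlib
import json

def fingerprint(record):
    """Deterministic SHA-256 digest of a record; keys are sorted so
    the same content always hashes to the same value."""
    blob = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

def audit(records, registered):
    """Return the IDs of records whose current digest no longer matches
    the digest registered when the data was first stored."""
    return [rid for rid, rec in records.items()
            if fingerprint(rec) != registered.get(rid)]

records = {"tx1": {"amount": 100}, "tx2": {"amount": 250}}
registered = {rid: fingerprint(rec) for rid, rec in records.items()}

records["tx2"]["amount"] = 9250       # simulated tampering
tampered = audit(records, registered)  # ["tx2"]
```

The point is that encryption and access control alone would never have flagged that change; only a periodic audit of the data against its registered fingerprints catches it.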
### Velocity (Systems Automation)
Continuous deployment and rapid product development give companies the ability to be nimble and react to changes in the market. Staying ahead of market forces is one thing, but good data velocity can keep you ahead of the curve and make sure you’re hitting your goals. Gone are the days when you could have monthly or quarterly product release cycles; now releases are daily, if not hourly, and your data environment is no different.
Systems automation allows for fluid action in your environment, making sure the latest tools are deployed to scale BI efforts. By leveraging systems automation, enterprises can speed up (velocity) the variety and volume of their data efforts while ensuring the accuracy (veracity) of the data. As a bonus, fast systems will allow you to rapidly iterate, roll out bug fixes and new features, and tune your systems for maximum performance. Technologies like Chef have taken the guesswork out of deploying redundant systems while allowing your developers to focus on marketable features rather than spending time tinkering with infrastructure.
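The trick tools like Chef rely on is idempotent convergence: describe the desired state, compare it to the current state, and apply only the differences, so re-running the automation is always safe. A minimal sketch of that idea (in Python rather than Chef’s Ruby DSL, with made-up state keys):

```python
def converge(current, desired, apply_change):
    """Bring `current` state in line with `desired` state, invoking
    `apply_change` only for keys that actually differ. Running it a
    second time is a no-op."""
    changed = []
    for key, want in desired.items():
        if current.get(key) != want:
            apply_change(key, want)   # e.g. install a package, write a config file
            current[key] = want
            changed.append(key)
    return changed

current = {"nginx": "1.24"}
desired = {"nginx": "1.25", "monitoring-agent": "enabled"}
log = []
converge(current, desired, lambda k, v: log.append(k))   # both keys change
converge(current, desired, lambda k, v: log.append(k))   # rerun is a no-op
```

Because every run converges toward the same declared state, the same automation handles first deployment, drift correction and disaster recovery, which is what makes scale-on-demand affordable rather than an extra cost center.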
* * * * * * * *
I’m reminded of a line from Frank Herbert’s Dune: “he who controls the spice controls the universe.” Replace spice with data and you have an apt description of the world we live in. Your data is quickly approaching cosmic proportions. At one time our greatest challenge was getting more data into systems to become smarter; today the challenge is taming, protecting and accelerating the data, and trusting that the data informing your decisions leads to positive outcomes rather than infamy in major publications.