With big data comes big data responsibility. Simply collecting large amounts of data isn’t helpful in and of itself. To use data to its full potential, it is vital that companies have strategies in place for collecting, storing, managing, and utilizing that data. Unfortunately, many MSPs place too much focus on collecting data and too little on getting the most out of it.
The earliest attempt to define the essential attributes of big data management came from analyst Doug Laney in a 2001 META Group research note (META Group was later acquired by Gartner). His system, known as the “3 V’s” framework, provided an outline for understanding and managing data assets. As time has gone on, other companies have expanded upon that framework. Today the most commonly referenced big data management attributes feature a fourth V:
Every organization collects data from a range of different sources, often including transactions, content, and internal data, among others. The 4 V’s provide a framework for leveraging and maintaining that data to produce superior value.
Let’s take a deeper look at each of the 4 V’s and examine how MSPs can analyze them to achieve superior results.
When discussing big data, analysts use the term volume to describe the amount of data involved in the process when considering all the different possible data sources.
Pound for pound, few types of companies collect more data than MSPs. Not only do they collect a lot of internal operations data, but they are often responsible for their clients’ data as well. According to IBM, 90% of today’s data has been created in the last 2 years, so it’s easy to see why data volume can quickly become a serious concern. That concern isn’t just about storing large amounts of data, but about accessing, managing, and effectively using it.
The volume of data a business produces on a daily, weekly, or monthly basis can be overwhelming. Not only does your team need to manage traditional data points, like tickets opened, employee time, and sales deals closed, but you may also need to track and monitor data generated by social media networks. Historical data adds another layer: curating information from employee reports, customer logs, and other sources gives your team a long-term view of trends and patterns. Having more data sources to analyze allows you to build a bigger picture, but it also challenges your team to maintain both a storage system for this information and a user-friendly system for analyzing and explaining these data points.
The term velocity is used to describe the rate at which data is processed and the pace at which the data flows from the customer or consumer to the management team.
With the widespread use of computers and the introduction of mobile technology like tablets and smartphones, the flow of data can be overwhelming. In addition to traditional means of interaction, like phone calls, surveys, and emails, customers and clients are leveraging the power of social media in order to be heard, creating an almost continuous stream of data.
MSPs in particular often deal with rapid influxes of data, which can be overwhelming if you don’t have a system that allows you to store, categorize, and filter incoming data in real time. Being able to filter out the “noise” and focus on the information that streamlines your decision-making process is critical for identifying potential problems and implementing solutions and alternatives well in advance.
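As a minimal sketch of the idea, the routing step might look like the following. The field names (`source`, `priority`) and the priority threshold are hypothetical examples, not part of any particular product:

```python
# Minimal sketch of filtering "noise" from an incoming event stream.
# Field names and the priority threshold are hypothetical examples.

def filter_events(events, min_priority=3):
    """Split incoming events into actionable alerts and archived noise."""
    alerts, archive = [], []
    for event in events:
        if event.get("priority", 0) >= min_priority:
            alerts.append(event)   # worth a human's attention now
        else:
            archive.append(event)  # keep for later trend analysis
    return alerts, archive

incoming = [
    {"source": "monitoring", "priority": 5, "msg": "server down"},
    {"source": "social", "priority": 1, "msg": "new follower"},
    {"source": "ticketing", "priority": 4, "msg": "SLA breach risk"},
]

alerts, archive = filter_events(incoming)
```

Note that the low-priority events are archived rather than discarded, so they remain available for the historical trend analysis discussed above.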
In the context of big data, variety is used to describe both the different types of data as well as the different sources it comes from.
Management analysts and experts often tout that businesses “can’t manage what they can’t measure”, meaning that you can only be accountable for the data and processes that are available to you at any given time. With the influx of new data provided by social media and mobile technology as well as traditional data generated by your clients and customers, the sheer variety of your data can cloud your management picture.
Businesses collect two broad varieties of data: structured data, which fits into predefined fields (such as ticket counts or billing records), and unstructured data, such as emails, documents, and social media posts.
The best way to address data variety is to streamline your information into one singular view so that your managers have a clearer picture of what’s happening with any client or customer.
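One way to picture that single view is to merge records about the same client from several sources into one consolidated record. This is only an illustrative sketch; the source names and fields are invented for the example:

```python
# Hypothetical sketch: merging records about the same client from several
# sources (ticketing, billing, CRM) into one unified view keyed by client ID.

def unified_view(*sources):
    """Combine per-client fields from every source into one dict per client."""
    view = {}
    for source in sources:
        for client_id, fields in source.items():
            view.setdefault(client_id, {}).update(fields)
    return view

ticketing = {"acme": {"open_tickets": 3}}
billing = {"acme": {"balance_due": 1200.0}}
crm = {
    "acme": {"account_manager": "J. Doe"},
    "globex": {"account_manager": "A. Smith"},
}

clients = unified_view(ticketing, billing, crm)
```

With the sources consolidated under one key per client, a manager can look up a single record instead of cross-referencing three systems.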
Data veracity refers to the trustworthiness of the data collected.
Companies with high-volume data operations are more likely to suffer from low-quality data. In turn, business decisions made based on bad data are unlikely to result in the desired outcome.
On a big-picture scale, poor data quality is estimated to cost the US economy up to $3.1 trillion every year. In the MSP industry, high data volume and variety are commonplace, and because of that, occasional data abnormalities are unavoidable. The most important thing to remember about veracity at your MSP is to take the time to clean incoming data and filter out records of poor quality. Maintaining clean, reliable data is critical to the health of your business, since your data is the foundation of your decision-making process.
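The clean-up step above can be sketched as a simple validation pass. The quality rules and field names here are assumptions for illustration; real rules would depend on your own data sources:

```python
# Illustrative sketch of a veracity check: validate incoming records and
# set aside those that fail basic quality rules. Rules and fields are
# hypothetical examples.

def clean(records):
    """Split records into valid entries and rejects needing review."""
    valid, rejected = [], []
    for rec in records:
        ok = (
            isinstance(rec.get("client"), str) and rec["client"].strip()
            and isinstance(rec.get("hours"), (int, float)) and rec["hours"] >= 0
        )
        (valid if ok else rejected).append(rec)
    return valid, rejected

raw = [
    {"client": "acme", "hours": 4.5},
    {"client": "", "hours": 2},         # missing client name
    {"client": "globex", "hours": -1},  # negative hours logged
]

valid, rejected = clean(raw)
```

Keeping the rejects rather than silently dropping them lets your team investigate recurring abnormalities at their source.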
There is a lot to consider when attempting to improve your data operations. The 4 V’s (volume, velocity, variety, and veracity) encompass the most critical demands of effective, data-driven decision making. Learn more about how to successfully manage big data using BrightGauge in our free white paper: