“Information is the oil of the 21st century, and analytics is the combustion engine.” As early as 2011, Peter
Sondergaard, a Senior Vice President at Gartner, predicted a
change in data management strategies known as big data, the pursuit of which would create an unprecedented
amount of information of enormous variety and complexity.
He was right. Today, organizations store vast amounts of data, much of it across multiple, disparate databases
that are unable to talk to each other. It’s a problem that’s exacerbated by mergers and acquisitions, where
new data sets are inherited.
Organizations generally understand the power behind analytics, but how do you make it work culturally and
technically? We take a look at the barriers to data analytics success and suggest new approaches that buck
the system, with dramatic results.
The Technology Challenge
Different departments will always need separate access rights. HR data, for example, likely needs different
access rules than financial data. Highly sensitive data should be accessible only to authorized
personnel, while other data, such as sales and marketing figures, might need to be shared among cross-functional
teams during certain time intervals.
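As a rough sketch of such per-department rules (the dataset names, roles, and dates here are invented purely for illustration), access checks might combine a role whitelist with an optional sharing window:

```python
from datetime import date

# Hypothetical access rules: HR and finance data stay within their own
# departments, while the sales forecast is shared cross-functionally,
# but only during a defined time window.
ACCESS_RULES = {
    "hr_records":     {"roles": {"hr"}},
    "finance_ledger": {"roles": {"finance"}},
    "sales_forecast": {"roles": {"sales", "marketing", "analytics"},
                       "window": (date(2024, 1, 1), date(2024, 6, 30))},
}

def can_access(role: str, dataset: str, today: date) -> bool:
    """Return True if `role` may read `dataset` on `today`."""
    rule = ACCESS_RULES.get(dataset)
    if rule is None or role not in rule["roles"]:
        return False
    window = rule.get("window")
    if window is not None:
        start, end = window
        return start <= today <= end
    return True
```

In practice these rules would live in the API layer in front of each data source, so the data owner sets the policy once and every consumer is subject to it.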
A common approach to this problem is to store all “sharable” data in a data warehouse, an expensive and
time-consuming approach. Some 80 percent of the effort in a typical data project is
focused solely on cleaning data. Furthermore, extracting data from its original source into a warehouse
duplicates that data, increasing storage demands. Another problem is that data, such as sales forecasts,
ages quickly. Without a way to continually and automatically update that data in the warehouse – in
real time – your analytics will be founded on outdated information.
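The staleness problem can be seen in a toy comparison (the names and numbers are invented): a warehouse extract is a one-time copy, while an API-style read goes back to the source at query time.

```python
# A toy "source of truth": the department's own sales forecast.
source = {"q3_forecast": 1_200_000}

# Warehouse approach: extract a copy once, then analyze the copy.
warehouse_copy = dict(source)

# API-style approach: fetch from the source at query time.
def live_read(key):
    return source[key]

# The forecast is later revised at its source...
source["q3_forecast"] = 950_000

# ...and only the live read reflects the change;
# the warehouse copy still holds the original extract.
stale_value = warehouse_copy["q3_forecast"]   # 1_200_000
current_value = live_read("q3_forecast")      # 950_000
```

Without a continuous refresh pipeline, every analysis built on the copy is built on the old number.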
The Cultural Challenge
To become a true data-driven organization, a cultural shift is necessary. Change always prompts concern. Fear of
change and the data fiefdoms that follow are among the main reasons why data analytics projects fail. Data owners fear they’ll
lose relevance or control of their data if they are forced to share data sets with other departments or agencies,
or if external expertise is brought in.
Without a Shift, Data Analytics Is Destined to Fail
Welcome to the world where technological complexity and cultural fiefdoms are killing data
analytics projects. It comes as no surprise that 60% of such
projects fail.
How can organizations counteract these challenges and find a way to connect disparate data (using only the
original data sources) while gaining buy-in from the team? The answer lies in an unlikely source –
Application Programming Interfaces (APIs).
Addressing the Technology Challenge – Break Down Silos with APIs
Data may be today’s oil, but it will be tomorrow’s oxygen. Mobile devices, IoT, and cloud applications generate
vast data streams. We’ve come to expect access to valuable information at our fingertips.
Enterprise data problem solving has also changed. Gone are the days when software giants, such as Microsoft, SAP,
Oracle and MicroStrategy, were one-stop-shops for addressing your data challenges. Today you can mix and
match data from different systems, without the help of the big guys.
Thanks to APIs, disparate systems can now interact with one another and exchange data. Because APIs are
lightweight (they eliminate the need for traditional hard-coded system integration), modern, flexible, and
less risky than other data-sharing approaches, API use is booming.
Concurrently, data warehouses are losing relevance. They still have a role to play, but they are no
longer the single, predominant source of data in the enterprise. And
that’s OK. It’s not necessary to maintain a “golden
record” of all the data entities in your organization. And that’s where APIs truly excel. They
allow you to work with real-time data (as opposed to historical data) and real-time analytics to provide
a better understanding of what’s going on at any given time.
Addressing the Cultural Challenge – Take an API-Enabled Iterative Approach
With APIs, fear and entrenched data fiefdoms are a thing of the past. Instead of grabbing data
from a department’s database, cleaning it and prepping it for analysis, the data stays right where it is, under
the control of the data owner. Opening your API also helps you maintain the health of your business
intelligence program by promoting data hygiene. Knowing that their data will be shared, data owners
instinctively become more accountable for keeping that data clean. With a data warehouse approach, by contrast,
once the data leaves the department, data owners no longer feel responsible for it.
APIs also support an iterative approach to analytics. Data owners can decide what to share based on
what they feel most comfortable with. As they see the fruits of their sharing, they start giving up
their data monopoly. It’s a nimble and cost-effective approach that increases team buy-in.
Of course, it doesn’t happen overnight. How can your organization achieve this shorter, nimbler path to
actionable data insights? Read more about what we call the Minimal Viable Prediction (MVP) approach.
How to Turn Your APIs into a Powerful Analytics Foundation – Meet the API-in-a-Box
More and more businesses are embracing an API business model. But how do you enable this API-driven
analytics transformation? Allow us to introduce the API-in-a-Box.
An API-in-a-Box is a containerized API adapter that can be deployed in a plug-and-play fashion, quickly and
cost-effectively. It integrates disparately stored data by providing a safe passage for non-sensitive data,
or for data that a department has given a green light to share. With an API-in-a-Box, data remains in situ
at its original source but is accessed in real time.
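A minimal in-process sketch of that adapter idea (the record and field names are hypothetical; a real API-in-a-Box would wrap this logic in an HTTP service inside a container) shows how data stays at the source while only approved fields are served:

```python
# The department's data stays in situ; the adapter is the only way in.
DEPARTMENT_DB = [
    {"order_id": 1, "region": "EMEA", "amount": 5400, "customer_ssn": "redacted"},
    {"order_id": 2, "region": "APAC", "amount": 2100, "customer_ssn": "redacted"},
]

# Fields the data owner has green-lit for sharing. Sensitive fields
# (here, customer_ssn) never leave the department.
SHARED_FIELDS = {"order_id", "region", "amount"}

def get_orders():
    """Serve only approved fields, read live from the source on each call."""
    return [{k: v for k, v in row.items() if k in SHARED_FIELDS}
            for row in DEPARTMENT_DB]
```

Because the adapter reads from the live source on every call, consumers always see current data, and the data owner can tighten or widen `SHARED_FIELDS` without any downstream migration.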
APIs are a proven method for encouraging cross-departmental collaboration, analytics, and reporting, while
facilitating the identification and correction of data discrepancies. Teams maintain full control of their data
and can provide exact rules as to who can access that data.
An API-in-a-Box can be spun up in an extremely short period of time, eliminating the time-consuming data
integration problem. Plus, as data errors are found and corrected and one department’s data is merged with another’s,
actionable insights start to emerge, and the barriers of fear and fiefdom begin to break down.
Go Ahead, Resist the Big Bang Approach
The traditional approach to data analytics is often risky, big-bang thinking. Some
of these projects have worked, but those successes are few and far between. They call for a huge planning
endeavor, one that’s beyond the time and resources of many organizations. That old
safeguard, the data warehouse, has also run its course as the stalwart of business intelligence initiatives.
As the arguments in this piece show, it’s time for a new approach.
Using new technology concepts (API-in-a-Box) and iterative approaches (Minimal Viable Prediction), results
emerge, sometimes in a matter of weeks rather than months or years, and at a fraction of the cost of the
traditional approach. Data owners become heroes as new, actionable insights are achieved. A culture shift
starts to take place as more people pull in the direction of a data culture.
The field of APIs for data analytics is still new, but it’s an approach we vehemently advocate for long-term
success. Give it a try.