Evolution of data platforms: Using the right data for the right outcomes

Thought leader Esteban Kolsky takes on the big question: What will data platforms look like now that big data's hype is over and big data "solutions" are at hand? This is the second part.
Written by Paul Greenberg, Contributor

Last week, I ran my friend and thought leader Esteban Kolsky's first part on how to think about data in a post-big data world -- and it was a hit. Now, to end the suspense, we have part two, where EK looks at how data impacts B2C and B2B differently and what has to be done with that data. All this came as a result of research Esteban was asked to do by Radius, as I mentioned last week. Kudos to them for inspiring this excellent piece.

So, now, Esteban, onto part two.


What did you think of the last post on the evolution of data?

If you know where you came from, you know where you are going. And to see where we are going, we need this second part: How have data platforms evolved?

Evolution of Data Platforms: Putting Data to Work

Data has been the "blood" of business since the early days of computing in the 1960s. Along the way, change brought different tools to collect, store, process, and manage data, all the while never asking the essential questions: Why are we doing this? Why do we need data? What are we doing with it? How do I know if we are doing the right things?

There is a difference in how data affects traditional mass-market (B2C) organizations and those that sell to other businesses (B2B). The issues relate not just to the potential size of the market (a B2C company targets hundreds of millions of potential customers, versus a B2B company targeting a few hundred to a few thousand other organizations) but also to the complexity of the data models used.

We've had data aggregation in mass-market organizations for a long time -- buying and aggregating credit card information, for example, or collecting and sharing loyalty information about customers' needs and actions. In B2C organizations, we call that segment differentiation. While this aggregation is interesting for those organizations, it is also far simpler than doing it in a B2B world. And the tools built to solve those problems focus on managing large quantities of data, not on finding the best data.
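
To make the contrast concrete, here is a minimal sketch of B2C segment differentiation in Python. The loyalty fields, thresholds, and segment names are hypothetical, chosen only to illustrate how aggregated consumer data collapses into a handful of coarse segments.

```python
# A minimal, hypothetical sketch of B2C segment differentiation: bucketing
# customers into coarse segments from aggregated loyalty data.
from collections import defaultdict

loyalty_records = [
    {"customer_id": "c1", "annual_spend": 3200, "visits_per_month": 6},
    {"customer_id": "c2", "annual_spend": 450,  "visits_per_month": 1},
    {"customer_id": "c3", "annual_spend": 1800, "visits_per_month": 3},
]

def segment(record):
    """Assign a coarse segment from spend and visit frequency."""
    if record["annual_spend"] > 2000 and record["visits_per_month"] >= 4:
        return "loyal-high-value"
    if record["annual_spend"] > 1000:
        return "mid-value"
    return "occasional"

segments = defaultdict(list)
for record in loyalty_records:
    segments[segment(record)].append(record["customer_id"])

print(dict(segments))
# Millions of B2C customers collapse usefully into a few such buckets;
# the same move tells you very little about 200 named B2B accounts.
```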

[Chart: the evolution of data platforms and data storage over time, with the pros and cons of each model]

In a B2B world, the problem becomes more complex and requires better tools and processes to resolve. We cannot usefully say that, for example, 120 of our 200 potential targets belong to the same segment: a segment that broad makes it impossible to differentiate targets and very hard to serve them properly.

We need better data models that are more focused on our sales processes and our customers' intents.

The amount of data is not the problem in B2B that it is in B2C markets -- the potential markets are far smaller, making the data easier to find -- but the quality and aggregation of the data pose a bigger challenge. This is where data platforms begin to shine.

A combination of easy access to the right data for multiple purposes and the generation of effective insights is what makes data platforms and insight engines so powerful -- not just the data aggregation. It is not about the amount of data stored and potentially used, but about the necessary data being used at the appropriate time. We need to rely on aggregated data to optimize processes, and on the insights generated from those transactions.

The chart shows the evolution of data platforms and data storage over time, including the pros and cons of each. It also includes guidelines as to what each model can accomplish and how it works best.

Matching these tools to the planned outcomes of using data is the first step in generating insights, and it is not without problems.

The amount of data stored after big data systems were deployed has created, or exacerbated, three critical problems for B2B companies seeking to optimize data in their digital transformation initiatives:

1. Trusted data aggregation: When organizations have too much data from too many sources, some of the data may overlap or even conflict. Treating insights as the outcome of aggregation lets a platform distinguish data sources that have proven trustworthy, based on the outcomes of the processes that used them, from sources that should be used less (a minimal sketch of this follows the list).

2. Simple access and updates: Just because data was collected, aggregated, and stored does not mean it will never change. In our always-on-the-go world, data changes anywhere from several times a day (in the case of operational data) to every few weeks (in the case of segmentation or identity data).

3. Optimized data use: Business functions rely on having the most accurate and most current data to make the proper decisions. Businesses in the digital era demand the right data at the right time, to make the right decisions.
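
As promised above, here is a minimal Python sketch of the first problem: resolving conflicting values for the same account field across sources using per-source trust scores. The source names, trust scores, and record fields are all hypothetical; in a real platform, the scores would be fed back from the outcomes of the processes that consumed the data.

```python
# A minimal, hypothetical sketch of trust-weighted conflict resolution
# across overlapping data sources for a single B2B account.

source_trust = {"crm": 0.9, "purchased_list": 0.4, "web_form": 0.6}

records = [
    {"source": "crm",            "account": "Acme", "employees": 1200},
    {"source": "purchased_list", "account": "Acme", "employees": 800},
    {"source": "web_form",       "account": "Acme", "employees": 1150},
]

def resolve(field, candidates):
    """Pick the value backed by the most trusted source."""
    best = max(candidates, key=lambda r: source_trust[r["source"]])
    return best[field], best["source"]

value, source = resolve("employees", records)
print(f"Acme employees = {value} (trusted source: {source})")
# The key idea: the trust scores are not static -- they should be
# raised or lowered based on the outcomes of processes that used them.
```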

Until now, organizations have used most of the tools described in the chart to aggregate, consolidate, and centralize data into different versions of platforms used by myriad processes and functions. While this has given them a starting point and the illusion of movement and progress, the reality is that, for B2B organizations, making data-based decisions in real time requires not just more data but better-quality data and insights. Tools that simply aggregate data, with no regard for how it will be used, cannot do that job.

Master data management (MDM), for example, is a discipline and set of tools focused on aggregating as much data as possible to create a better centralized store of data that then becomes "the single source of truth." While it creates massive, optimized data stores, it carries no purpose or knowledge of the outcome, so it misses the mark on knowing or highlighting what data to collect, store, and use for each different process.
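
A minimal sketch of that MDM pattern, assuming hypothetical record shapes: duplicates keyed on a shared identifier are merged into one "golden record" per entity, with no notion of which process will use it or how it performed.

```python
# A minimal, hypothetical sketch of MDM-style consolidation into a
# single source of truth: merge duplicates into one golden record.

raw_records = [
    {"email": "jane@acme.com",   "name": "Jane Doe", "phone": None},
    {"email": "jane@acme.com",   "name": None,       "phone": "555-0100"},
    {"email": "ken@initech.com", "name": "Ken Ito",  "phone": None},
]

golden = {}
for record in raw_records:
    merged = golden.setdefault(record["email"], {})
    for field, value in record.items():
        # Keep the first non-empty value seen for each field.
        if value is not None and merged.get(field) is None:
            merged[field] = value

print(golden)
# Note what is missing, per the critique above: nothing records which
# process used a record, whether the merge was right, or the outcome.
```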

Customer data platforms (CDPs) are a more modern model that intends to solve the problem of MDM systems by going beyond simply creating a single source of truth to focus on uses for that data. A CDP consolidates data, provides access to many different systems, and ensures that data flows freely from A to Z and is used appropriately in between for known processes. While it does a good job of creating an operational data store (ODS) in the cloud, it fails in the same places where huge data stores failed: It does not know what happened to the data once it was used, whether the data was good or needs to be updated or improved, or what the outcomes were.
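
By way of contrast with the MDM sketch, here is a minimal sketch of the CDP idea: events from different systems flow into a unified profile that known downstream processes can read. The event shapes and system names are hypothetical.

```python
# A minimal, hypothetical sketch of a CDP-style flow: events from many
# systems update a unified customer profile (an in-memory stand-in for
# a cloud operational data store).

profiles = {}

def ingest(event):
    """Fold an event from any source system into the unified profile."""
    profile = profiles.setdefault(event["customer_id"], {"events": []})
    profile["events"].append((event["system"], event["type"]))
    if event["type"] == "email_opened":
        profile["engaged"] = True

ingest({"customer_id": "c1", "system": "marketing", "type": "email_opened"})
ingest({"customer_id": "c1", "system": "sales",     "type": "demo_booked"})

print(profiles["c1"])
# As with MDM, the sketch (like the CDP it mimics) stops here: it never
# learns whether using the profile produced a good outcome.
```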

It Is About Using the Right Data for the Right Outcomes

Very often, data-use initiatives are focused on achieving one or more KPI changes: increase revenue, decrease costs, generate more customers, and many others. Methodologies already exist for achieving these familiar outcomes. For example, to acquire more customers, the steps are simple: identify a target segment, understand what it needs, create an offering that fulfills that need, generate marketing that matches the need and identifies customers, apply high-pressure sales tactics, and close the deal.


Thus, the next time more customers are needed, a variant of the above will be implemented, and results will follow. It has proven to work so many times that it will keep being repeated -- usually the same way as before. With data platforms and aggregated data, the system can optimize the model above so that it reflects both real-time needs and the lessons and insights learned from past occurrences.

If an organization wanted to optimize or transform based on lessons learned, it would create some sort of post-event report and analyze what could have been done differently. This is a way to generate delayed (or post-facto) insights. It works, but it does not help optimization: by the time we learn what should have been done, the event is over and changes can no longer be made.

With real-time insights, artificial intelligence, and advanced analytics, learning and changes can happen in near real time. Optimizations happen quickly, and results are captured dynamically, creating insights that improve the event this time and every time after. Campaigns can be optimized with a better message, sales cycles can use updated pricing and even bundling, marketers can reuse content that has already shown results, and so forth.
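
As one hypothetical illustration of in-flight optimization, here is a minimal epsilon-greedy bandit sketch that shifts campaign traffic toward the better-performing message while the campaign is still running, instead of waiting for a post-event report. The messages and response rates are simulated, not drawn from the research behind this piece.

```python
# A minimal, hypothetical sketch of optimizing a campaign while it runs:
# an epsilon-greedy bandit over two candidate messages.
import random

messages = ["message_a", "message_b"]
true_response_rate = {"message_a": 0.03, "message_b": 0.06}  # unknown in practice
sent = {m: 0 for m in messages}
responded = {m: 0 for m in messages}

def choose(epsilon=0.1):
    """Mostly exploit the best observed message; sometimes explore."""
    if random.random() < epsilon or all(sent[m] == 0 for m in messages):
        return random.choice(messages)
    return max(messages, key=lambda m: responded[m] / max(sent[m], 1))

random.seed(7)
for _ in range(5000):
    m = choose()
    sent[m] += 1
    if random.random() < true_response_rate[m]:
        responded[m] += 1  # the insight is captured as the event happens

for m in messages:
    print(m, "sent:", sent[m],
          "observed rate:", round(responded[m] / max(sent[m], 1), 4))
# Traffic drifts toward the better message mid-campaign -- the opposite
# of learning only from a report after the campaign has ended.
```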

Instead of waiting until the end of an event -- or even the next event -- to see whether changes worked (ignoring the fact that conditions have already changed in between), an organization can use insights from aligning the right data, the right processes, and the right outcomes to improve operations on the fly.

Of course, doing this requires not just a different set of tools (ones not focused solely on storing and aggregating data) but also a different mentality. And a different methodology.

And for that, we have a framework -- but you will have to download the paper to see what it is.

Back to you, my friend.


Thank you, Esteban. This was awesome. Come back if you want to expand on this -- anytime.

NOTICE: The CRM Watchlist 2019 registration and the registration for The Emergence Maturity Index Award for 2019 are closing on Sept. 30, with no extensions possible. So, if you want to make sure that you are part of these (see the links in the names for the details), then you have roughly two weeks to go (a little less). I'd do it, if I were you, though I'm not. You, that is.

To request the registration form for either of them, please email me at paul-greenberg3@the56group.com.

Previous and related coverage:

What to do with the data? The evolution of data platforms in a post big data world

Thought leader Esteban Kolsky takes on the big question: What will data platforms look like now that big data's hype is over and big data "solutions" are at hand?

The past, present, and future of streaming: Flink, Spark, and the gang

Reactive, real-time applications require real-time, eventful data flows. This is the premise on which a number of streaming frameworks have proliferated. The latest milestone was adding ACID capabilities, so let us take stock of where we are in this journey down the stream -- or river.

Arcadia Data brings natural language query to the data lake

Arcadia Data provides a search engine-style text box as its latest query interface, bringing BI natural language query to the data lake.

This startup thinks it knows how to speed up real-time analytics on tons of data

Making sense of the vast amounts of data gathered by businesses is a problem that Iguazio says it has cracked.
