
Shopping for data

by Ivo-Paul Tummers on 16th November 2018

We live in an age of seemingly endless possibilities for new insights, trends and uses of data. In this new era, decision-making rests more than ever on information. Yet many organisations still face significant challenges in their digital transformation. The word ‘connectedness’ appears in many policy documents and 2020 visions, and it is not limited to the connected car or home automation. It concerns all areas of connection: human, social and technological. Behavioural change thus becomes one of organisations’ biggest challenges. Social, technological and business trends are strongly interlinked and bring new players and new balances of power to the market. An important development is the increasing dominance of the end customer.

Connect versus collect

You can hardly imagine it, but less than ten years ago internet shopping was still in its infancy. The average retailer had no more than a digital brochure on the internet, and you physically visited a store. Now the provider has to serve you via multiple channels. This creates growing tension between supply-driven and demand-driven partners across the total chain. Insight into the demand, behaviour and motives of end customers and other players in the chain is decisive for dominance and profitability.

Whilst the use of these resources does not automatically lead to success, it is clear that companies with better information are better equipped to meet customer demand, and even to predict it. What is striking is that more and more business models are driven by information, in contrast to product-centric or transaction-driven models. The market is evolving from a collection of individual channels into an ecosystem of integrated data and services. The trends are transparency, openness and virtual cooperation within networks. Users want control over their data and how it is presented.

This has a big impact on your information technology landscape. Many organisations are currently focused on integrating everything and centralising their data sources. This approach, however, is also likely to disappear: in the future, organisations will move back to leaving data where it lives instead of constantly copying and moving it.

In ten years you will not spend your time designing a data model; you will connect data to roles and tasks. The data mash-up is generated and will evolve into multi-structured models. Forget central data storage and databases. Release the ‘possession’ of data. Try to imagine that we will move to distributed models in which data sources are shared. Analyst firm Gartner calls this a connect versus collect strategy. In short, you offer data to third parties and in turn use data from third parties. In the network economy, ownership is no longer paramount.

Trust

This will inevitably raise concerns about trust and security. When you offer value, people will respond to your offer. In that perspective, data can be considered a language: it will constantly develop and adapt, and dialects will arise. Once again, metadata becomes more important than the data itself. That is a difficult concept to grasp. It is about semantics, value and quality. How can I trust the quality of data I do not own?

Trust is an emotion, and this also applies to the reliability of data. External data may well fall short of your own quality standard, but in the era of digital transformation you still have to use it. So how do we increase trust? One possible policy is to trust the data until there is reason to mistrust it. The alternative is the exact opposite: distrust it until there is reason to trust.

Both perspectives have obvious challenges and limitations. If you do not dare to trust, for example, this can cost you market position while competitors press on. So what is the solution? Checking the source. A possible approach is triangulation: you search for and verify multiple sources that refer to the specific data. Or can you identify other parties that have also labelled this data as useful? In Facebook terms, parties giving it a thumbs-up.
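The triangulation idea can be sketched in a few lines of code: accept a value only when enough independent sources agree on it. This is a minimal illustration, not a production implementation; the `quorum` threshold and the example sources are assumptions for the sake of the sketch.

```python
from collections import Counter

def triangulate(readings, quorum=2):
    """Accept a value only if at least `quorum` independent
    sources report the same value; otherwise return None."""
    if not readings:
        return None
    value, votes = Counter(readings).most_common(1)[0]
    return value if votes >= quorum else None

# Three hypothetical sources reporting the same customer's postcode.
sources = ["1012AB", "1012AB", "1013XY"]
print(triangulate(sources))  # two sources agree -> 1012AB
```

In practice the ‘votes’ would be weighted by the trust you place in each source, which is exactly where the metadata about provenance and quality comes back in.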

Data integration and governance

Now I want to return to the fundamental importance of metadata: the granular data that accompanies all activities, such as store history, side effects of pharmaceutical products, weather statistics, agricultural trends and many other types of information. What the experts are trying to convey is that we all long for advanced analysis possibilities, and they challenge us to get used to their vision of integrated services that communicate with each other: no longer large software components in which we collect everything ourselves and keep it clean. Today, an information infrastructure with dynamic analysis algorithms, based on distributed services, can only be realised if you are a brand-new organisation: a fast-growing start-up without history, archives, existing systems or paper information carriers. I do not expect many readers can answer to that description. And software providers such as IBM and Microsoft are not ready yet either.

Recently the term ‘data lake’ has grown in popularity, and everyone gives it their own interpretation. Let me not venture onto thin ice by doing likewise. The one thing all interpretations have in common is that, as an organisation, you will have to be able to manage three pairs of streams: data behind your company wall and data from third parties; data at rest and streaming data (for example, databases versus feeds); and structured and unstructured data. And it is correct that the future is to join, not to gather.
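The three pairs of streams can be thought of as three independent axes along which every source in your landscape is profiled. A minimal sketch, with the source names purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    internal: bool    # behind your company wall vs. from a third party
    streaming: bool   # feed vs. data at rest
    structured: bool  # schema-bound vs. free-form

def profile(src: DataSource) -> tuple:
    """Place a source on the three axes a data lake must manage."""
    return (
        "internal" if src.internal else "external",
        "streaming" if src.streaming else "at rest",
        "structured" if src.structured else "unstructured",
    )

crm = DataSource("CRM", internal=True, streaming=False, structured=True)
feed = DataSource("social feed", internal=False, streaming=True, structured=False)
print(profile(crm))   # ('internal', 'at rest', 'structured')
print(profile(feed))  # ('external', 'streaming', 'unstructured')
```

Eight combinations in total, and a data lake worthy of the name has to accommodate all of them.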

To get there, my advice is to focus on your data integration competencies: importing, transforming and then managing the data. In doing so you create a so-called data lake foundation: a basis that will be able to handle all these forms of data in the future and so drive behavioural change in your market. This can be realised very well today, in manageable steps. VIQTOR DAVIS is currently implementing such data lake foundations for various European companies, with industry-leading scalability that makes it easy to keep pace with the growth of data. To capitalise on this foundation quickly, you need clear business objectives of your own. Is your omni-channel strategy central, for example? Or self-service? Governance, moreover, belongs with your own board and not with an external specialist, who can at most support it.
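The import-transform-manage triad above can be sketched as a toy pipeline. This is a hedged illustration, not VIQTOR DAVIS’s actual implementation: the record fields, the source name and the in-memory catalogue are all assumptions, standing in for real landing zones and metadata stores.

```python
import hashlib
import json
from datetime import datetime, timezone

# Manage: a metadata catalogue recording the provenance of every record.
catalogue = {}

def ingest(record: dict, source: str):
    """Import: land the raw record and register where it came from."""
    raw = json.dumps(record, sort_keys=True)
    key = hashlib.sha256(raw.encode()).hexdigest()[:12]
    catalogue[key] = {
        "source": source,
        "landed_at": datetime.now(timezone.utc).isoformat(),
    }
    return key, record

def transform(record: dict) -> dict:
    """Transform: normalise field names for downstream consumers."""
    return {k.strip().lower(): v for k, v in record.items()}

key, raw = ingest({" Name ": "Alice", "City": "Utrecht"}, source="partner_feed")
clean = transform(raw)
print(clean)  # {'name': 'Alice', 'city': 'Utrecht'}
```

Note that the metadata (source, landing time) is kept separate from the data itself, which is precisely the governance responsibility that stays with your own board.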

Design-once, run-anywhere simplicity promotes efficient business operations, even as your environment evolves. This brings us to fit-for-purpose data. The moment data is released, you immediately have the basis for ‘connect’. All trends can suddenly accelerate: was that not also the case with internet shopping? Can you wait? No. Before you realise it, you must shop for data.

Need to find out more? Get in touch.