From Brexit to Trump: How organizations and data prepare for and respond to political events

Politics and data do mix, apparently, so we try to count the ways.
Written by George Anadiotis, Contributor

As the Big Data London event coincided with a new development in the Brexit story, backstage discussions could not possibly have remained unaffected. In some cases they even took center stage, with panelist Kim Nilsson from Pivigo quoted as saying that "if we have a hard Brexit, the data industry is going to die".

The newly elected US president has drawn parallels between himself and the Brexit movement, so even though comparing leaving a federated union of nations with changing a head of state is like comparing apple pie to Marmite, it is worth looking at what organizations can do with data to cope with political change.


Predictions are hard, especially about the future. Even more so when trying to predict something as volatile and intangible as political sentiment on a massive, real-world scale. Attempts to use technology for this purpose have notoriously failed, and even if everyone were listed in the telephone directory or on social media and willing to truthfully express their intentions (which they are not), technology has its limitations.

Natural Language Processing (NLP), the branch of AI that tries to interpret language, can identify clearly articulated commands, as showcased by personal assistants like Siri and Google Assistant. But going from that to interpreting expressions and arguments that range from the ludicrous to the elaborate, and may feature sarcasm, cultural references, and elliptic or erroneous use of language, is a challenge nobody can claim to be up to at the moment.
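To see why this is hard, consider the simplest form of sentiment analysis: a lexicon-based scorer that counts positive and negative words, roughly how early social-media sentiment tools worked. The word lists and sentences below are invented for illustration; sarcasm defeats this approach because the words say one thing while the speaker means another.

```python
# Minimal lexicon-based sentiment scorer (hypothetical, for illustration).
POSITIVE = {"great", "win", "strong", "love", "best"}
NEGATIVE = {"bad", "lose", "weak", "hate", "worst"}

def naive_sentiment(text: str) -> int:
    """Crude sentiment score: positive word count minus negative word count."""
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# A sincere statement is scored plausibly:
print(naive_sentiment("A strong economy is great news"))          # 2
# But sarcasm reads as positive, because the lexicon only sees the words:
print(naive_sentiment("Oh great, what could possibly go wrong"))  # 1
```

Modern systems use far richer models than word counts, but sarcasm, cultural references, and irony remain hard for the same underlying reason: the literal content of the text underdetermines the intent.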


Can organizations really prepare for the impact of political events? Talend reported on the case of Travis Perkins, a company involved in contracts with the public sector in the UK. Such contracts can run over the course of many years, and companies with supply chains extending worldwide need as good an approximation of currency fluctuations as possible - otherwise, they run the risk of underestimating their costs. The prospect of Brexit was recognized as having the potential to impact the value of the British pound, so the organization would have benefited immensely from a somewhat credible projection of its trajectory.

Solutions like Talend help organizations integrate and process millions or even billions of data points, but that's not enough. The trouble is that even financial-sector organizations specializing in forex, with state-of-the-art technology and tons of data at their disposal, can offer predictions ranging from daily to weekly at best: nowhere near the 5-10 year horizon needed in this case, so data to facilitate hedging options was not of much use. Even organizations with less demanding scenarios were mostly caught unaware, did not bother to prepare at all, or did what they did in top secrecy.


If predicting and preparing for political events are more or less out of the question, how can organizations at least measure their impact post-mortem in order to react? Business Intelligence (BI) and analytics may have some consolation to offer here. Traditional BI relied mostly on data warehouses that were notoriously hard to build, deploy, operate, and update. This meant that the end-to-end process required to go from metric and KPI definition to being able to see and explore results was an expensive and time-consuming operation available to only a handful of organizations.

Technologies like the cloud and NoSQL databases, along with the advent of agile approaches to BI, have democratized access to BI and enabled more organizations to gain analytical insights from data. For example, Birst reported that an unnamed client of theirs was able to go from management decision to a working dashboard in six weeks. This may seem long in business time, but taking into account the complexity and urgency of the task, it would have been unheard of in the recent past. This goes to show how modern BI can be more effective - but it's the definition of metrics that is actually the hard part here.

Defining metrics is hard; interpreting them is harder, as correlation does not necessarily imply causation. Image: Scott Adams

Dealing with such a broad challenge, defining how to measure it, and locating, integrating, and using the required data is hard. Although the metrics used by the client Birst reported on were not discussed, some approximations are known. A colleague recently pointed out, in a discussion of upcoming IPOs, that the result of the US election may impact them directly. Naturally, a number of other variables may affect the number and valuation of IPOs, cognitive biases abound, and correlation does not always imply causation, but this example shows how something intangible can be approached using metrics.
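The correlation caveat is easy to demonstrate numerically: any two series that trend in the same direction over the same period will correlate strongly, causal link or not. The figures below are made up for illustration; they are not real IPO or election data.

```python
# Pearson correlation of two upward-trending series (made-up data).
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ipo_count = [10, 12, 15, 18, 22, 25]        # hypothetical quarterly IPO counts
unrelated = [100, 104, 111, 118, 127, 133]  # any other rising metric

# Near-perfect correlation, despite no causal link between the series:
print(round(pearson(ipo_count, unrelated), 3))  # 1.0
```

A metric built on such a correlation can look compelling on a dashboard while measuring nothing causal, which is exactly why defining and interpreting metrics is the hard part.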


Ready or not, changes happen and organizations are impacted. What's data got to do with this - do changes affect their data, and can data help in dealing with the change? In our research, the most obvious answer we got from every vendor we talked to was "we can cope, because we run in the cloud". Policies and regulations regarding data differ from country to country, and as a result organizations have to be extra careful as to where their data resides and how it is handled. This may have an impact not just on their obligations with respect to data collection and privacy, but also on their bottom line.

For example, many British betting and gaming companies operate in places like Malta and Gibraltar because of the tax and data regimes in effect. At the same time, being active in the UK imposes different rules for data collected and processed there. The landscape was anything but clear as it was, and the Brexit prospect has added to the fear, uncertainty, and doubt. But one of the things being cloud-based has brought is the ability for organizations to effectively virtualize and outsource their data centers, enabling them to move data hosted in the cloud at will and at very short notice.

A more challenging aspect has to do with identifying and acting upon information pertaining to new circumstances. For example, AstraZeneca (another Talend client) and other pharma companies operating in the UK have to navigate a complex landscape of rules and regulations pertaining to medicine packaging, labelling, and ingredients. The rules differ between the UK and the EU, and it is not clear at the moment which will apply, or when. Pharma companies need not only to identify and collect information in huge legislative corpora, but also to figure out what it says and whether it applies to them.

In this case, NLP may be of assistance. As Gary Richardson, KPMG's Head of Data Engineering, reported in his keynote, KPMG has managed to significantly cut costs and time to deliver, while making more productive use of its workforce, by using such techniques to scan documents for due diligence purposes. Similar approaches may be used in other domains to the same effect.
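The starting point for this kind of document triage can be sketched very simply: flag documents in a corpus that mention terms relevant to the rules in question, so humans only read the candidates. KPMG's actual pipeline is not public, and real systems use far richer NLP than term matching; the term list and documents below are invented for illustration.

```python
# Sketch of keyword-based document triage over a (hypothetical) legislative corpus.
RELEVANT_TERMS = {"packaging", "labelling", "ingredient", "marketing authorisation"}

def flag_relevant(documents):
    """Return (name, matched_terms) pairs for documents mentioning any relevant term."""
    flagged = []
    for name, text in documents.items():
        lowered = text.lower()
        hits = sorted(t for t in RELEVANT_TERMS if t in lowered)
        if hits:
            flagged.append((name, hits))
    return flagged

corpus = {
    "directive_a": "Rules on medicine packaging and labelling of ingredients.",
    "directive_b": "Provisions on fishing quotas in territorial waters.",
}
print(flag_relevant(corpus))
# [('directive_a', ['ingredient', 'labelling', 'packaging'])]
```

Even this crude filter illustrates where the savings come from: the expensive human review is spent only on documents the machine has flagged, rather than on the whole corpus.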

The original idea for this article was submitted by the Birst PR team. Information was compiled based on interviews with Richard Neale, EMEA Marketing Director at Birst, Ciaran Dynes, VP Products & Marketing at Talend, and others. The author's travel expenses were covered by Datastax.
