BY NINA WEBER, 03/05/2016
The digital universe – all the data we create, store or copy – doubles in size every year, and the data accumulated over the past two years now constitutes 90% of all the data that currently exists in the world (ScienceDaily, 2013). Without doubt, Big Data is one of the most significant developments of the 21st century to date. But what does Big Data actually mean, and what is the potential of this new development?
Simply put, Big Data is the accumulation and analysis of huge amounts of data which cannot be processed by traditional applications and which require new and more complex programmes. Beyond question, this is an incredibly useful and profitable development, which allows companies not only to gain insights into markets and customer behaviour but also to develop completely new products, or to specialise in providing analytical programmes for Big Data, as IBM and Cisco already do (Nambiar, 2015). According to the International Data Corporation, the Big Data and analytics market is currently growing at over 23% every year and is estimated to reach $48.6 billion in 2019 (IDC, 2015).
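To put those IDC figures in perspective, the growth rate compounds. The back-of-the-envelope sketch below is illustrative only: the projection beyond 2019 is an assumption for the sake of arithmetic, not part of the IDC forecast.

```python
def project(value, rate, years):
    """Compound growth: value * (1 + rate) ** years."""
    return value * (1 + rate) ** years

# Figures from the text: roughly 23% annual growth, $48.6bn in 2019.
rate = 0.23
market_2019 = 48.6  # $bn

# Projected size two years on, if the same rate were to hold
# (an illustrative assumption, not IDC's own projection):
market_2021 = project(market_2019, rate, 2)
print(round(market_2021, 1))  # 73.5
```

At that pace a market roughly half-doubles every two years, which is why even small differences in the assumed rate move the headline figure substantially.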
However, one area in which Big Data has seemingly yet to gain a foothold is the public sector. Whilst Big Data could probably improve almost all areas of government intervention, one is of particular importance: central banking. In light of the financial crisis of 2008 and the subsequent sovereign debt crisis in Europe, central banks around the globe face an existential question: how can we protect ourselves from another crisis of this magnitude? The answer might well be Big Data.
In the run-up to the financial crisis of 2008, the Bank of England based its decision-making on summary financial statements and statistics available only quarterly and only in standardised numerical format (Bholat, 2015). This lack of detailed, real-time information reduced the Bank’s ability to react appropriately and promptly to problems in the financial market. At the time, the Federal Reserve and the Bank of England focussed on lowering interest rates and saw the cause of the crisis in a lack of liquidity; later research suggested that risk management in the financial industry was the more likely problem, and that central banks did not react appropriately, probably because they misread the situation (Taylor, 2008). As outlined in a PwC report on the 2014 Central Banking Forum, Big Data analytics could help detect irregularities in markets whilst they are developing and could therefore play a vital part in crisis detection and prevention, working “in a way comparable to seismic activity indicators for earthquakes” (PwC, 2015).
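The PwC report does not specify a method, but the "seismic indicator" idea can be sketched with a very simple statistical irregularity detector: flag any observation that deviates sharply from its own recent history. The window, threshold and data below are illustrative assumptions, not anything a central bank actually runs.

```python
from statistics import mean, stdev

def rolling_zscores(series, window=20, threshold=3.0):
    """Return (index, z-score) for points lying more than `threshold`
    standard deviations from the mean of the preceding `window` points."""
    flags = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma == 0:
            continue
        z = (series[i] - mu) / sigma
        if abs(z) > threshold:
            flags.append((i, z))
    return flags

# A calm, mildly cyclical series with one injected shock at index 30:
calm = [100.0 + 0.1 * (i % 5) for i in range(40)]
calm[30] = 115.0
print(rolling_zscores(calm))  # only the shock at index 30 is flagged
```

Real market surveillance would of course use far richer models, but the principle is the same: granular, frequent data is what makes the baseline (and hence the deviation) visible at all.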
The Bank of England did recognise the potential of Big Data after the financial crisis and has now laid out a strategic plan with the use of data at its heart. In a 2014 report entitled ‘Big Data and Central Banks’, the Bank set forth the benefits of using data to understand patterns in both the housing and employment markets and to obtain greater knowledge of the financial industry. One particularly interesting part of the Bank’s new plan is the ‘One Bank Research Agenda’, which embodies the Bank’s new commitment to making data sets accessible to the public in order to crowdsource possible solutions (Bholat, 2015). Thus, contrary to the idea that more data would only increase the surveillance powers of government institutions, it could, in fact, give individuals more influence over policy decisions.
The important difference for central banks is the new ability to collect granular rather than merely aggregate data. Instead of relying on firms’ and banks’ overall financial statements, data can now be collected and analysed more frequently and in greater detail. The benefit of this becomes especially clear in the light of financial crises. As the Bank of England report states, central banks in countries that suffered a financial crisis were often unable to pinpoint financial fragilities because of inaccurate or incomplete aggregate data (Bholat, 2015). Big Data analytics, and central banks’ access to more detailed information on financial markets, could therefore enable a more precise and prompt reaction to these fragilities.
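The point about aggregates hiding fragilities can be made concrete with a toy example (the banks and capital ratios below are invented for illustration, as is the regulatory floor): an average can sit comfortably above a minimum while one institution is close to failure.

```python
# Hypothetical capital ratios for five banks (invented figures).
ratios = {"Bank A": 0.14, "Bank B": 0.12, "Bank C": 0.13,
          "Bank D": 0.02, "Bank E": 0.15}

floor = 0.08  # an assumed regulatory minimum, for illustration

# Aggregate view: the average ratio clears the floor comfortably.
aggregate = sum(ratios.values()) / len(ratios)
print(f"average ratio: {aggregate:.3f}")  # 0.112

# Granular view: one institution is far below the floor.
fragile = {name: r for name, r in ratios.items() if r < floor}
print(fragile)  # {'Bank D': 0.02}
```

A supervisor seeing only the aggregate would conclude the system is sound; only the granular data reveals where the fragility actually sits.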
But can this new commitment by central banks to Big Data really prevent another financial crisis? Without doubt, it will enable central banks to spot irregularities faster and more frequently. Whether a crisis can be prevented, however, will ultimately depend on the actions taken in response to the problems detected. Moreover, one could argue that the political dependence of central banks necessarily influences their ability to react rationally and on the basis of data alone. On this view, introducing Big Data tools into central banking would have only a limited impact, as the banks’ actions would still be shaped by the political consensus. One particular problem is public expectation of what central banks can do. As the PwC report states, if the economy relies on specific “too big to fail” banks, the central bank has no option but to bail out the institution in question to prevent any further damage (PwC, 2015). However, this is likely to pose a problem in future crises, as public opinion is now largely against bailouts of any form and the political consensus is likely to adapt accordingly (Kopicki, 2013).
Nevertheless, this concern lacks substantiation. Cross-sectional studies show that overall macroeconomic performance is not influenced by the degree of central banks’ independence from government (Alesina and Summers, 1993). It therefore seems likely that access to more data, together with the ability to analyse it in greater detail, will give central banks a crucial tool to interpret issues more accurately and to base their actions on more reliable information. A further benefit of Big Data will be central banks’ ability to forecast the effects of potential interventions in more detail.
As the Bank of England recently stated, Big Data will enable central bankers to start from richer data and develop theories from the results, instead of taking theoretical macroeconomic ideas and fitting the data to the theory (Bholat, 2015). Whilst this new approach to central banking seems promising, the jury is still out as to whether it will significantly enhance central banks’ ability to prevent future financial crises.
Alesina, Alberto and Summers, Lawrence H. (1993, May). Central Bank Independence and Macroeconomic Performance. Available at: http://www.econ.ucdenver.edu/smith/econ4110/Alesina%20Summers%20-%20Central%20Bank%20Independence%20and%20Macro%20Performance.pdf. Last accessed 20 February 2016.
Bholat, David (2015). Big Data and Central Banks. Available at: http://www.bankofengland.co.uk/publications/Documents/quarterlybulletin/2015/q108.pdf. Last accessed 20 February 2016.
IDC (2015, November 9). New IDC Forecast Sees Worldwide Big Data Technology and Services Market Growing. Available at: http://www.idc.com/getdoc.jsp?containerId=prUS40560115. Last accessed 20 February 2016.
Kopicki, Allison (2013, September 26). Five Years Later, Poll Finds Disapproval of Bailout. Available at: http://economix.blogs.nytimes.com/2013/09/26/five-years-later-poll-finds-disapproval-of-bailout/?_r=0. Last accessed 20 February 2016.
Nambiar, Raghunath (2015, September 28). Thinking Bigger! Cisco + IBM – Collaboration of giants brings industry-leading solution for big data analytics. Available at: http://blogs.cisco.com/datacenter/biginsights. Last accessed 20 February 2016.
PwC (2015, February). At a crossroads: the future of central banking. Available at: http://www.pwc.com/gx/en/financial-services/publications/assets/pwc-central-banking-forum-2014.pdf. Last accessed 20 February 2016.
Taylor, John B. (2008, November). The Financial Crisis and the Policy Response: What Went Wrong. Available at: http://www.nviegi.net/teaching/taylor1.pdf. Last accessed 20 February 2016.
SINTEF (2013, May 22). Big Data, for better or worse: 90% of world’s data generated over last two years. ScienceDaily. Available at: www.sciencedaily.com/releases/2013/05/130522085217.htm. Last accessed 20 February 2016.