Thursday, 19 November 2015

Using big data to turn global catastrophic risks into opportunities

Big data is currently transforming both the public and private sectors by increasing efficiency, transparency and productivity whilst also promoting sustainability. As the ability to utilise intelligent data analytics increasingly distinguishes today’s winners, data is fast becoming the oil of the 21st century. Organisations and countries that manage to harness this new commodity will secure sustainable economic growth, in the same way that access to cheap fossil fuel resources conferred an advantage in the past.

The proliferation of mobile technology, wireless sensors, social media and the Internet of Things provides a means of monitoring socio-economic activity, consumption of resources, transactions, human mobility and environmental change. Recent advances in data science now make it possible to cope with the technical challenges of collecting, managing and extracting actionable insights from big data. Much of the exciting research has focused on the three V’s that define big data (volume, velocity and variety); the volume of data alone is growing at around 40% per year (Figure 1). The sheer size and complexity of the data being created by internet devices (Figure 2) implies a need to move beyond simple linear models and embrace sophisticated modelling approaches. Many organisations sit on a treasure chest of data which, when combined with external data, offers enormous potential.

Measuring and monitoring the UN’s sustainable development goals will require better processes to utilise big data. The UN Statistical Commission has established a global working group to provide strategic vision, direction and coordination of a global programme on Big Data for official statistics. There are numerous challenges ahead that will require multidisciplinary teams to process raw data, extract insights and produce dashboards to enable intelligent decision-making. Fortunately, this revolution has already started in the insurance sector.
Figure 1: Amount of big data created each year.

There are many contenders when it comes to identifying the most threatening global catastrophic risks. Over the centuries, epidemics, earthquakes, floods and windstorms have competed for the position of deadliest disaster. Those with the highest death tolls include the Black Death of 1348 that wiped out up to 60% of Europe’s population and the Spanish Influenza of 1918 that killed between 40 and 100 million people. The costliest catastrophe, with estimated economic losses now exceeding $235 billion, is the earthquake and tsunami that hit Tōhoku, Japan in 2011, resulting in meltdowns at the Fukushima nuclear power plant.  

Reinsurance organisations quantify and compare catastrophic risks in terms of potential financial losses. Since 1987, when AIR Worldwide released the first catastrophe model, reinsurers have benefited from the scientific rigour of catastrophe models to assess risk. The financial losses associated with a particular peril are simulated by combining the hazard, exposure and vulnerability. While impact is clearly important, the frequency of catastrophic events must also be estimated in order to develop adequate risk management systems. Big data comprising historical events, crowd-sourced data and computer-simulated output forms the ingredients of a catastrophe (CAT) model. As the science matures and both practitioners and academics seek to cooperate, the growing need for a collaborative platform has been met in the form of the Oasis Loss Modelling Framework.

There are many opportunities to use big data to improve the assessment and management of global catastrophic risks. At present, risk assessment is largely a backward-looking exercise in which a catalogue of historical extreme events forms the basis of the analysis. In many cases, an assumption is made that the risk has not changed during the historical period. This approach is defensible if the hazard, exposure and vulnerability are not changing over time. In reality, all three can vary, and both data and advanced modelling techniques are required to understand the complex interactions.

Emerging risks, such as terrorism, lack a historical catalogue and forward-looking predictive models are required. Natural disasters such as windstorm and flood are affected by climate change and overreliance on the past may underestimate future risk. Satellites and drones are helping to collect data to better understand exposure and vulnerability. Crowd-sourcing can also be used effectively to encourage people to build resilience to disasters and develop disaster risk management strategies. 

Scientific models allow insurers to evaluate the risk associated with natural disasters. The probability of exceeding a specific financial loss is calculated using advanced quantitative modelling. At the core of this risk modelling is the need to determine the relationship between a particular measure of the hazard, such as wind speed or rainfall, and the resulting financial losses. Catastrophe models involve the computationally intensive process of using geographical information systems (GIS) to describe the spatial variation of exposure and vulnerability for a particular portfolio of buildings. By running numerous simulations of extreme events that vary in time and space, the catastrophe model assesses the chances of experiencing losses of different magnitudes. These models can be broken down into modules describing the hazard, exposure, vulnerability and financial components. The development of these modules relies on access to a skilled team of scientists, engineers, statisticians and actuaries.
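To make the exceedance calculation concrete, here is a minimal Monte Carlo sketch in Python. The vulnerability curve, storm frequency, hazard distribution and exposure figures are all illustrative assumptions, not parameters from any real catastrophe model.

```python
import math
import random

def poisson_draw(lam, rng):
    """Poisson-distributed event count via Knuth's multiplication method."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

def damage_ratio(gust_mps):
    """Toy vulnerability curve: no damage below 20 m/s, total loss at 80 m/s."""
    return min(1.0, max(0.0, (gust_mps - 20.0) / 60.0))

def exceedance_probability(threshold, n_years=20_000, event_rate=0.4,
                           value_at_risk=10e6, seed=7):
    """Estimate P(annual loss > threshold) for a toy windstorm portfolio.

    Each simulated year draws a Poisson number of storms; each storm's
    peak gust feeds the vulnerability curve, which converts hazard into
    a damage ratio applied to the value exposed per event.
    """
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_years):
        year_loss = 0.0
        for _ in range(poisson_draw(event_rate, rng)):
            gust = rng.lognormvariate(math.log(25.0), 0.4)  # hazard intensity
            year_loss += damage_ratio(gust) * value_at_risk
        if year_loss > threshold:
            exceed += 1
    return exceed / n_years
```

Repeating the calculation at increasing thresholds traces out the loss exceedance curve that reinsurers price against.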


Opportunities arise at the interface of novel data, advanced modeling and a willingness to innovate business practices. The transition to using quantitative models to automate decision-making, remove inefficiencies and prioritize resources is already taking place in many organisations.  

Big data is providing the ability to offer weather insurance for farmers. Data from weather stations or satellites can be used to construct an index that tracks the losses that have arisen due to extreme weather events. With the availability of low-cost wireless sensors and higher resolution information, the accuracy and feasibility of this innovative type of insurance is improving. 
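As a concrete illustration, the payout rule for a simple parametric drought contract might look like the following sketch; the strike, exit and payout figures are invented for illustration and do not come from any real product.

```python
def index_payout(rainfall_mm, strike_mm=100.0, exit_mm=40.0, max_payout=500.0):
    """Parametric drought cover sketch (all parameters illustrative).

    Zero payout when seasonal rainfall reaches the strike level or above,
    full payout at the exit level or below, and a linear scale in between.
    Because the payout depends only on the observed index, no loss
    adjuster needs to visit the farm.
    """
    if rainfall_mm >= strike_mm:
        return 0.0
    if rainfall_mm <= exit_mm:
        return max_payout
    return max_payout * (strike_mm - rainfall_mm) / (strike_mm - exit_mm)
```

Settling claims on an index rather than on assessed losses keeps administration costs low, which is what makes this type of cover feasible for smallholder farmers.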

Many countries are now using public-private partnerships as a means of structuring national catastrophe programs to protect against natural disasters. New Zealand’s Earthquake Commission (EQC) provided primary natural disaster insurance that protected the owners of residential properties from the 2010 and 2011 Christchurch earthquakes. Early warning systems are another innovation and these rely on timely access to information – social media is playing an important role in communicating alerts. 

Figure 2: Number of internet devices being used each year.

Great opportunities exist for the private sector to use big data to monitor business activities and interactions with customers. These data reveal what works and what does not, and are helping to increase efficiency in many sectors. Success relies on being profitable and also managing risk when making decisions – big data is helping to provide actionable insights for both.

Data is also becoming a valuable source of insight into the preparedness of firms to cope with shocks that might arise from regulation, technology and climate change. Pension funds consume such information in order to make long-term decisions about companies. Novel datasets and surveys are available to assess the true value of firms and to better understand how their activities are likely to be aligned with future opportunities in an effort to strengthen resilience. Key decisions in the face of future uncertainty can be supported by data, and those who understand how to utilise big data are more likely to prosper.

Risk reduction strategies tend to be reactive, as it is easier to justify allocating resources in the aftermath of a disaster. As the tenure of politicians and business leaders is relatively short, they rarely have the stamina to support long-term strategies that will not reap a reward until the next disaster. Furthermore, many responses involve incremental solutions that fail to grasp long-term opportunities.

Talk of strengthening resilience is a growing trend, replacing less optimistic discussions about risk management. Resilience implies more than risk reduction: it can be viewed as the capacity to adapt, recover and transform in response to adverse events. There is an important role here for big data in encouraging proactive, transformative solutions as opposed to incremental changes. New sources of data and innovative decision-support tools could identify strategic actions and allow companies to be rewarded for transforming early. Performance metrics could help investors identify companies that are already transforming and positioning themselves to make the most of future opportunities, and that deserve to be rewarded now for their foresight.


Wednesday, 1 April 2015

Forecasting demand using Big Data

As we walk, cycle or drive around Oxford, make telephone calls, send texts or emails and do our shopping, many of us are unaware of exactly how much data is being generated by our activities. "Big data" is a catch-phrase for describing the overwhelming volume, velocity and variety of this stream of information. Big data has the potential to provide many opportunities for the public and private sectors, offering a means of fusing different sources of information and supporting decision-making in real-time.

Perhaps the most interesting aspect of big data is how it deepens our understanding of human behaviour seen through the collective actions of many individuals. We tend to consume services following the temporal cycles in our everyday lives. There are three evident cyclical patterns based around the hour of day, the day of the week and the season of the year. All of these patterns can be seen in electricity consumption, call centre activity, internet usage, financial transactions, traffic flow and the use of healthcare services.

Fortunately the repetition of these patterns offers potential for accurate demand forecasting. Services can be delivered with greater efficiency if staff and limited resources are scheduled in order to meet forecasted demand. The National Grid has been balancing supply and demand for years and knows the value of accurate forecasts. If they get it wrong, the lights go out and everybody notices. While power outages still happen in many countries, we take it for granted in the UK that we have reliable access to electricity at all times. Amazingly, we are relatively tolerant of imbalances in supply and demand in other sectors and this may explain why sophisticated demand forecasting is not widely utilised.

Take healthcare, for example. The NHS has a target that 95% of patients arriving at A&E should be seen within four hours. Until recently there was little information about the performance of our local hospitals or indeed how they compare with the rest of the country. Now weekly A&E data about the percentage of patients seen within four hours is available. This week the John Radcliffe Hospital A&E scored 87.1%, slightly below the national average of 91.5%. Here is a chance for Oxford City to become smarter.

There are many opportunities to use big data and quantitative models to forecast demand, develop early warning systems and improve staff scheduling. The graph below shows the average hourly A&E arrivals at the John Radcliffe for different days of the week. We immediately see the hour-of-the-day effect, with low demand during the night and two peaks at 12:00 and 18:00. Most striking is the near doubling of arrivals in the early hours of Saturday and Sunday, which can be attributed to the effect of weekend partying and pubs closing at 23:00 on Friday and Saturday nights. While A&E staff are well aware of the additional burden caused by weekend festivities, the data analysis paints a clear picture of its impact on arrivals.
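The aggregation behind a chart like this is straightforward. Here is a minimal sketch using only the Python standard library; the function name and data layout are my own, not taken from any hospital system.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def hourly_profile(arrival_times):
    """Average arrivals per (weekday, hour) slot from a list of datetimes.

    Counts arrivals in each calendar hour, fills in zeros for quiet
    hours, then averages over every occurrence of the same weekday/hour
    slot. The result can double as a simple seasonal forecast: expected
    demand next Monday at 12:00 is the historical Monday 12:00 average.
    """
    counts = defaultdict(int)          # (date, hour) -> arrivals that hour
    for t in arrival_times:
        counts[(t.date(), t.hour)] += 1
    start = min(t.date() for t in arrival_times)
    end = max(t.date() for t in arrival_times)
    totals = defaultdict(list)         # (weekday, hour) -> per-day counts
    day = start
    while day <= end:                  # include zero-arrival hours
        for hour in range(24):
            totals[(day.weekday(), hour)].append(counts[(day, hour)])
        day += timedelta(days=1)
    return {slot: sum(v) / len(v) for slot, v in totals.items()}
```

Filling in the zero-arrival hours matters: averaging only over hours that saw at least one patient would bias the quiet night-time slots upwards.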
Big data can facilitate a better understanding of social behaviour and the effect of the environment. Arrivals increase on bank holidays. Temperature is another important factor with arrivals increasing in warm weather. But this is just the start. Forecasts of extreme weather events and information about social events could be used to construct an accurate model.
