Analytics for Insurance. Tony Boobier.

Author: Tony Boobier
Publisher: John Wiley & Sons Limited
Genre: Foreign educational literature
ISBN: 9781119141082

Figure 1.5 The hierarchy of analytics

      ■ Analytics which simply reports on what has happened or what is happening, generally known as descriptive analytics. In insurance, this might relate to the reporting of claims as at a given date, for example.

      ■ Analytics which seeks to predict, on the balance of probabilities, what is likely to happen next, which we call ‘predictive analytics.’ An example of this is the projection of insurance sales and premium revenue, allowing insurers to take a view as to what corrective campaign action might be needed.

      ■ Analytics which not only anticipates what will happen next but also what should be done about it. This is called ‘prescriptive analytics’ on the basis that it ‘prescribes’ (or suggests) a course of action. One example might be the activities within a contact center. Commonly known as ‘next best action,’ perhaps this would be better expressed as ‘best next action,’ as it provides the contact center agent with the insight to position the best next offer to make to the customer in order to close the deal.
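The three tiers above can be made concrete with a toy sketch. All figures, the linear-trend ‘prediction’ and the capacity-threshold ‘prescription’ below are invented for illustration only; they stand in for the far richer models an insurer would actually use.

```python
# Toy illustration of descriptive, predictive and prescriptive analytics
# on hypothetical monthly claim counts.

claims = [120, 135, 150, 160, 178]  # invented monthly claim counts

# Descriptive: report what has happened.
def describe(history):
    return {"latest": history[-1], "average": sum(history) / len(history)}

# Predictive: project the next value with a simple least-squares trend line.
def predict_next(history):
    n = len(history)
    x_mean = (n - 1) / 2
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in enumerate(history)) / \
            sum((x - x_mean) ** 2 for x in range(n))
    return y_mean + slope * (n - x_mean)

# Prescriptive: recommend an action based on the prediction.
def recommend(history, capacity=180):
    if predict_next(history) > capacity:
        return "add claims-handling capacity"
    return "no action needed"

print(describe(claims))
print(round(predict_next(claims), 1))
print(recommend(claims))
```

Each tier builds on the one below it: the prescriptive step is only as good as the prediction it consumes, which in turn depends on the quality of the descriptive record.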

      It need not unduly concern us that predictive and prescriptive analytics are probabilistic in nature. The insurance industry is based on probability, not certainty, so to that extent insurers should feel entirely comfortable with the approach. One argument is that prediction is a statistical approach that responds only to large numbers. This might suggest that these methods are more relevant to retail insurance (where larger numbers prevail) than to specialty or commercial insurances, which are more niche in nature. Increasingly, the amount of data available to provide insight in niche areas is helping to reassure sceptics who might previously have been uncertain.

      In all these cases there is an increasing quality of visualization, whether in the form of dashboards, advanced graphics or some type of graphical mapping. Such visualizations are increasingly important as a tool to help users understand the data, but judgments based on the appearance of a dashboard are no substitute for the power of the analytical solution ‘below’ the dashboard. One analogy is that of an iceberg, with 80% of its volume below the waterline. It is much the same with analytics: 80% or more of the true value of analytics is out of sight of the user.

      The same may be said of geospatial analytics – the analytics of place – which incorporates geocoding into the analytical data to give a sense of location to any decision. Increasingly, geospatial analytics (the technical convergence of bi-directional GIS and analytics) has allowed geocoding of data to evolve from an isolated set of technical tools into a serious contributor to the analysis and management of multiple industries and parts of society.
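A minimal sketch of the ‘analytics of place’: given geocoded claim locations (the coordinates and the flood-zone centre below are hypothetical), a great-circle distance check can flag which claims fall within a given radius of an event.

```python
from math import radians, sin, cos, asin, sqrt

# Great-circle (haversine) distance between two lat/lon points, in km.
def haversine_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

# Hypothetical geocoded claims and a hypothetical flood-zone centre.
claims = {
    "C1": (51.5074, -0.1278),  # central London
    "C2": (53.4808, -2.2426),  # Manchester
}
flood_centre = (51.5, -0.12)

# Flag claims within 10 km of the flood zone.
flagged = [cid for cid, (lat, lon) in claims.items()
           if haversine_km(lat, lon, *flood_centre) <= 10.0]
print(flagged)
```

The point is not the formula itself but that, once location is encoded in the data, proximity becomes just another queryable attribute in the decision.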

Overall, it is important to emphasize that analytics is not the destination; what matters is what is done with it. Analysis provides a means to an end, contributing to a journey from the data to, for example, the provision of customer delight (Figure 1.6). The ultimate destination might equally be operational efficiency or better risk management. The insight provided should feed into best practices, manual and automatic decisioning, and strategic and operational judgments. To that extent, the analytical process should not sit in isolation from the wider business but rather be an integral part of the organization, which we might call the ‘analytical enterprise.’

Figure 1.6 From ‘data’ to ‘customer delight’

      1.1.3 Next Generation Analytics

      Next generation analytics is likely to be ‘cognitive’ in nature, not only providing probabilistic insight based on some degree of machine learning but also offering a more natural human interface (as opposed to requiring machine coding). Cognitive analytics is not ‘artificial intelligence’ or ‘AI’ out of the mold of HAL in Kubrick's ‘2001: A Space Odyssey’ but rather represents a different relationship between the computer and the user. We are already on that journey, as evidenced by Siri, Cortana and Watson. Some commentators are already beginning to describe ‘cognitive’ analytics as ‘soft AI.’ This trend is likely to continue as an answer to the enormous volumes of data, which appear to be growing exponentially, and the need for enhanced computer assistance to help sort them. Cognitive analytics may also have a part to play in the insurance challenges of skill shortages and the so-called demographic explosion.

      Forms of cognitive computing are already being used in healthcare and asset management, and it is only a matter of time before they find their way into mainstream insurance activities.

      Coupled with this is the likely emergence of contextual analytics. Insurance organizations will become increasingly good at knowing and optimizing their own performance, but unless consideration is given to what is happening outside the organization – for example, amongst competitors – these viewpoints are formed in a vacuum. The American computer scientist Alan Kay expressed it succinctly: ‘Context is worth 80 IQ points.’

      In the cold light of day, there are two key objectives which insurers need to adopt: first, to outperform direct competitors; second, to achieve strategic objectives. To do one and not the other is a job only partly completed. Often, but not always, the two go hand in hand.

      Outperformance of competitors by insurers may be measured in various ways:

      ■ Financial performance – profit, revenue, profitable growth.

      ■ Customers – retention, sentiment, propensity to buy more products.

      ■ Service – both direct and through third parties such as loss adjusters, who are considered, by extension, part of the insurer itself.

      ■ Staff – retention, sentiment.

      These issues need to be considered in the context of the wider environment, for example the macro-economy or the risk environment. In a time of austerity, or where there is rapid growth in the cost of living, individual families may choose to spend more on food than on insurance products. At a time when the agenda of insurers has been dominated by risks associated with capital and solvency, perhaps their eyes have been temporarily taken off the ball in terms of other risks such as underwriting risk, reputational risk and political risk, but that position is relatively easily and quickly remedied.

      1.1.4 Between the Data and the Analytics

      Big Data, in either its structured or unstructured forms, does not flow naturally into analytic outcomes – which usually take the form of reports, predictions or recommended actions – but relies on intermediate processes which exist ‘between the data and the analytics.’

      How this is done in practice is a matter for the technical experts, but in simple terms the raw data needs to be captured, then brought into the system where it is filtered, cleansed and usually stored. Massive volumes of data lend themselves to complex sorting systems or ‘landing zones,’ most of which have their own language and jargon. Often a datamart or staging layer is created to ensure that an analytical outcome can be produced relatively quickly. The process by which data is moved through the system is referred to as ETL, or ‘extract, transform, load.’
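The capture-filter-cleanse-store sequence described above can be sketched as a toy ETL pipeline. The records, field names and cleansing rules here are all invented, and a plain list stands in for the datamart or staging layer:

```python
# Minimal extract-transform-load sketch over hypothetical raw claim records.
raw_records = [
    {"claim_id": " c-001 ", "amount": "1200.50", "status": "OPEN"},
    {"claim_id": "c-002", "amount": "bad-data", "status": "closed"},
    {"claim_id": "c-003", "amount": "980.00", "status": "open"},
]

def extract(source):
    # In practice this step would read from files, queues or APIs.
    return list(source)

def transform(records):
    # Filter and cleanse: drop rows with unparseable amounts,
    # trim identifiers and normalise status codes.
    cleaned = []
    for rec in records:
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue  # reject the dirty row
        cleaned.append({
            "claim_id": rec["claim_id"].strip(),
            "amount": amount,
            "status": rec["status"].lower(),
        })
    return cleaned

def load(records, warehouse):
    # Here the 'warehouse' is just a list standing in for a datamart.
    warehouse.extend(records)
    return warehouse

warehouse = []
load(transform(extract(raw_records)), warehouse)
print(len(warehouse))  # 2 rows survive cleansing
```

Real pipelines add scheduling, error queues and audit trails around exactly these three stages, which is why the staging layer matters: it is where dirty rows are caught before they reach the analytics.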

      There are other alternatives, such as ‘data warehouse appliances,’ which take a parallel processing approach and create a modular, scalable, easy-to-manage database system. These high-speed solutions allow very rapid computing by providing an alternative to traditional linear processing, and often come with pre-bundled analytical and geospatial capabilities. In effect this is a ‘plug and play’ approach to Big Data and Analytics. They serve as a reminder that, as was experienced with the internet in its early days, both organizations and individuals will increasingly press for computing power in the form of analytics to be provided ‘at speed.’ It doesn't seem that long ago that, in a domestic environment, connecting to the internet was accompanied by some form of whistling and other strange noises down the telephone line. Now instant 4G connectivity is expected anytime, anyplace, anywhere – within reason. Perhaps in that light, if one level of differentiation between technology vendors is the breadth and depth of analytical capability, the other differentiating factor may well be speed.