
2 posts from September 2016

12 September 2016

How London Market Insurers can navigate the challenges of deploying predictive analytics

In my first London Market focused blog, "How London Market insurers are responding to the perfect storm of challenges", I touched on the competitive advantages that could arise from exploiting Data Analytics.

I believe that predictive analytics is key to boosting underwriting returns in the London Market – even in very soft market conditions – most notably by pairing advanced analytics capabilities and tools with internal data enhanced by third-party data and Big Data. The improved accuracy of risk analysis, segmentation and technical pricing will strengthen an underwriter’s decision-making in implementing cycle management strategies, both in terms of:

  • walking away with greater confidence from a given risk if an appropriate premium cannot be commanded; or
  • deepening their understanding of a given risk profile, so that rate reductions against expiring terms can (if needed) be offered to secure highly lucrative business at still-attractive prices, or so that rate increases at renewal can be robustly supported.

Another angle is understanding exposures, and thus aggregations, better (potentially in real or near-real time), which could also lead to more realistic Minimum & Deposit premiums, or the ability to calculate and collect adjustment premiums earlier, improving cashflow management.
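As a minimal sketch of the adjustment premium mechanics mentioned above (all figures, rates and the function name are hypothetical, for illustration only), the deposit is charged up front on estimated exposure and the balance is collected once actual exposure is known – so the earlier actual exposure data arrives, the earlier the adjustment can be billed:

```python
def adjustment_premium(rate, actual_exposure, deposit_premium, minimum_premium):
    """Premium still due after the deposit, subject to the policy minimum."""
    earned = max(rate * actual_exposure, minimum_premium)
    return max(earned - deposit_premium, 0.0)

# Hypothetical cargo cover: deposit based on an estimated 40,000 tonnes at
# 2.50 per tonne; actual shipments turn out to be 52,000 tonnes.
extra = adjustment_premium(rate=2.50, actual_exposure=52_000,
                           deposit_premium=100_000, minimum_premium=100_000)
print(extra)  # 30000.0
```

With near-real-time exposure feeds, this calculation could be run during the policy period rather than months after expiry.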

Counterarguments to this view often cite the lack of credible data volumes or sufficiently granular data, poor data quality, issues around the timeliness of data, data that is largely in paper format, and industry inertia in embracing the latest technology developments as critical barriers to widespread advances in data analytics in the London Market. The current soft market conditions also present a key challenge – how do you justify the level of investment you need to make today, with combined ratios nudging 100%, in the expectation of improved underwriting returns in the future?

The data conundrum can be addressed from a number of different directions:

  • The major brokers already hold a vast amount of data, and have been digitising large chunks of it as part of data analytics-as-a-service propositions (albeit not necessarily focussed on enhancing insurers’ underwriting and pricing capabilities). In an ideal world, or further into the future at least, broker presentations of risks would be 100% digital, allowing insurers to capture much more granular information on exposures (such as comprehensive details of individual aircraft or marine vessels within the fleet to be insured, and the associated telemetry and geo-spatial/GPS data over prior periods of cover) and more detailed claims information at the point of underwriting.
  • Paper-based documentation and PDF versions of documents held internally – whether detailed appendices to claims files, policy slips, loss adjusters’ reports, legal reports, detailed listings of vessels insured, or detailed listings of places of business for multinational insureds – can be digitised into valuable data. For example, this could be through scanning into digital document management systems and deploying text mining, “scraping” data from PDFs, or manually re-keying specific data items into structured data formats.
  • Trade organisations and associations, such as the International Union of Marine Insurance, often capture detailed exposure and claims data at the global industry level, offering a much richer data set than most insurers could hope to build.
  • A myriad of diverse data vendors have been springing up at an increasing rate, providing specialised data that supplements the plethora of open data sources that are relatively underemployed in the insurance sector.
  • Real time exposure data and telemetry can be captured via next generation telematics, RFID technology deployed in conjunction with cargo containers, and the ever growing wealth of data coming from Internet of Things connections. Insureds are already taking advantage of these new technologies, and recognise that these data sources can provide a clearer understanding of risk and operational resilience, which insurers can also seek to leverage from an underwriting and pricing perspective.
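The text-mining step in the second bullet can be sketched in a few lines. This is a deliberately minimal example – the slip text, field names and identifiers below are all hypothetical – showing how, once paper documents have been scanned and OCR’d, structured records can be scraped from the raw text:

```python
import re

# Hypothetical OCR output from a scanned appendix listing insured vessels.
slip_text = """
Vessel: MV NORTHERN STAR   IMO: 9321483   GT: 52,000
Vessel: MV ATLANTIC DAWN   IMO: 9176187   GT: 38,500
"""

# Scrape each vessel line into a structured record.
pattern = re.compile(
    r"Vessel:\s*(?P<name>.+?)\s+IMO:\s*(?P<imo>\d{7})\s+GT:\s*(?P<gt>[\d,]+)"
)
vessels = [
    {"name": m["name"], "imo": m["imo"],
     "gross_tonnage": int(m["gt"].replace(",", ""))}
    for m in pattern.finditer(slip_text)
]
print(vessels[0]["name"])  # MV NORTHERN STAR
```

In practice the patterns would be far messier and OCR errors would need handling, but the principle – turning paper listings into queryable exposure data – is the same.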

Technology and advanced software solutions are fundamental in capturing, storing and effectively analysing data from so many sources and in so many formats in one place. The costs of data storage have been falling rapidly, and the advent of Cloud computing has changed the perception of what is possible with gargantuan data sets. At the same time, a number of London Market players are re-examining their systems strategy, and any decisions around new platforms should ideally seek to future-proof their requirements in relation to data analytics.

Quantifying the future benefits of deploying data analytics, in order to support a business case for such initiatives, is not straightforward in the pricing conditions currently faced by the London Market. For comparison purposes, in the US SME commercial lines market, insurers have been able to demonstrate improvements in class of business level loss ratios to the tune of 2-3 percentage points (and much higher improvements for specific subsets of business) through enhanced risk segmentation and keener pricing. The emerging game changer for the SME commercial lines market is the advance in analytics from descriptive (what happened) and diagnostic (why it happened) to predictive (what is likely to happen) and prescriptive (determining and ensuring the right outcome). Drawing on enriched data sets and the latest tools, this shift has not only allowed insurers to anticipate what will happen and when, but also to respond proactively. It has also opened the possibility of new types of coverage in the future, relying much more on real time pricing, for example. The additional benefit of first mover (or at least early follower) advantage in offering a ground-breaking new product or form of coverage should also be taken into consideration in building a compelling business case.
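To make the 2-3 percentage point figure concrete, the arithmetic below (with hypothetical loss and expense ratios) shows how a modest loss-ratio improvement moves a book from roughly break-even to a clear underwriting profit:

```python
# Hypothetical book with a combined ratio nudging 100%, split into loss
# and expense components. A 3-point loss-ratio improvement from better
# segmentation and pricing takes the book into underwriting profit.

def combined_ratio(loss_ratio, expense_ratio):
    return loss_ratio + expense_ratio

before = combined_ratio(loss_ratio=0.62, expense_ratio=0.37)
after = combined_ratio(loss_ratio=0.62 - 0.03, expense_ratio=0.37)
print(f"{before:.0%} -> {after:.0%}")  # 99% -> 96%
```

On a soft-market book, that swing is the difference between writing for scale and actually earning an underwriting return – which is the crux of the business case.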

The US specialty lines market, which is a more comparable barometer for the London Market, has seen much less in the way of active usage of advanced analytics techniques to date, and remains an area of significant opportunity. However, the growing volume of business derived through delegated authorities and binders in the London Market – which typically encompasses niche personal lines, commercial lines and specialty risks – means that some of the benefits achieved by US commercial lines insurers should be capable of being translated to the London Market.

Any improvements in loss ratios achieved by London Market players through the application of advanced analytics techniques would likely largely arise in the short term from either:

  • non-renewing business that can now be shown to require significant rate increases which the market will not bear;
  • securing significant rate increases on renewals now that the risk profile is better understood; or
  • writing select tranches of new business at actual premium to technical premium ratios which are very attractive, but which have not been recognised as high quality risks by competitors using less sophisticated underwriting and pricing approaches.

The first of these channels would have an adverse impact on insurers’ expense ratios, as declining premium volumes from non-renewals would leave the fixed expense base spread over a smaller premium base. It would then be incumbent upon insurers to pursue profitable premium growth within the second and third of these channels to offset the impact – which would have its challenges in the current soft market. To ensure that the combined ratio is not adversely impacted in the short term, insurers would potentially also need to implement strategic cost reduction initiatives, including the deployment of new technologies to cut acquisition costs and target specific areas of variable expenses.

With the challenges of the soft market set to continue in the near term, the pressure on margins provides an impetus for a step change in the use and development of advanced analytics, supplemented by targeted strategic cost reduction programmes. These analytics innovations will be crucial in keeping one step ahead of the competition, especially when the market starts to harden.

If you would like to discuss these issues, or the impact of emerging technology or data and analytics on your industry, then contact our Data & Analytics team.

01 September 2016

Data and analytics: what will be the tax function’s role?

The tax function is one of the largest consumers of data in organisations. According to a joint study we carried out with the Manufacturers Alliance for Productivity and Innovation (MAPI), tax typically spends more than half of its time gathering data. It’s something that tax functions excel at, even though this is becoming more challenging as tax data is housed in many locations and the volume of data is increasing. But that same study also found that tax functions spend less than a third of their time analysing the data they’ve collected. And that’s a problem.

The volume of data available and our increasing ability to analyse and use it is having a major impact on companies and on every function within an organisation. We’re creating an environment where data is the basis for decision-making – and the major business functions are working hard to adapt to that world. Tax mustn’t be left behind.

That’s why the impact of data and analytics on the tax function is the subject of the third paper in our Tax Function of the Future thought leadership series. We look in detail at the data challenges facing tax functions and what they need to do to get themselves in shape for today and for what’s to come.

The central message is that the tax function must shift its focus if it’s to harness the full potential of data analysis. That means concentrating on the analysis of the data it collects, and on creating strategic partnerships throughout the organisation.

This will take, among other things, excellent technology. But overall, spending on technology in the tax function remains stubbornly low – according to the Tax Executives Institute 2011-2012 Corporate Tax Department Survey, it amounts to less than 5% of the budget in most cases. Our MAPI study found that 80% of tax functions recognise that better technology and integration would improve their tax effectiveness, and yet three-quarters don’t have a tax technology strategy, or any plans to develop one. Clearly, this has to change.

If they’re to truly add organisational value in a data-driven world, tax functions must also break out of the silos that many find themselves in. Tax has relevance to decisions made across the organisation and the tax function has a real opportunity here to show how it can bring value to the business – as long as the function is in line with the wider business strategy.

This means that tax functions have to be far more actively involved in data management and analytics – and in the budgeting and planning process for data projects – than they have been in the past. Historically, the function has taken a passive role in data strategy but the data challenges that the function faces, and the critical importance of data use and analytics to its future value, make active involvement absolutely critical. A coherent data strategy should include people from tax, finance, IT and other functions if a complete business case is to be made.

If you would like to discuss these issues, or the impact of emerging technology or data and analytics on your industry, then contact our Data & Analytics team.