
New developments in the computation of mortality rates: An actuary’s bread and butter

The computation of mortality rates has traditionally been the bread and butter of actuaries. The first mathematicians to venture into the actuarial field most likely spent their days analysing mortality rates and conducting life valuations. Nowadays, the work of actuaries is much more varied—which is a welcome development for most—but are we sometimes neglecting this core skill?

Milliman researchers in Paris certainly aren't, and their new research, hot off the press on 22 February 2017, represents a significant development in mortality and longevity risk modelling. It is vital reading for anyone working in this sphere.

My colleagues have developed a robust statistical methodology to correct the implicit inaccuracies of national mortality tables, which are widely used in sophisticated mortality and longevity risk modelling. The results are striking.

Here I take a closer look at the relevance of these national mortality tables, the problems with them, and the corrections available in order to enhance mortality and longevity risk models. I will touch on the key technical points behind these developments from an Irish/UK perspective, leaving the rigorous mathematical explanations to the underlying research publications—the 2017 publication can be found here and the 2016 publication can be found here.

The use of national mortality tables
In Ireland and the UK, to set basic mortality assumptions in our pricing and reserving work, we tend to use insured lives mortality tables, such as the Continuous Mortality Investigation (CMI) tables. However, national mortality tables based on the population as a whole are also used extensively in mortality and longevity risk modelling, where a greater quantity of data is required.

National mortality tables are used to calibrate stochastic mortality models and to derive mortality improvement assumptions, and they feature in sophisticated mortality risk management models, Solvency II internal models, the pricing of mortality/longevity securitisations, and bulk annuity transactions.
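
To make the first of these uses concrete, the sketch below shows one way a national mortality table might be used to calibrate a simple stochastic mortality model, the classical Lee-Carter model. It is purely illustrative and is not the methodology developed in the Milliman research; the NumPy array `m` of central death rates and the function name are assumptions of this sketch.

```python
# Illustrative sketch only -- not the methodology from the Milliman research.
# A classical Lee-Carter fit to a national mortality table, assuming `m` is a
# NumPy array of central death rates m(x, t) with ages in rows and calendar
# years in columns (e.g., rates taken from a population-level data set).
import numpy as np

def fit_lee_carter(m):
    """Fit log m(x,t) = a_x + b_x * k_t via the standard SVD approach."""
    log_m = np.log(m)
    a = log_m.mean(axis=1)                    # a_x: average log mortality by age
    centred = log_m - a[:, None]              # remove the age profile
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)
    b = U[:, 0]                               # b_x: age-specific sensitivity
    k = s[0] * Vt[0, :]                       # k_t: period mortality index
    # Impose the usual identifiability constraints: sum(b_x) = 1, sum(k_t) = 0.
    scale = b.sum()
    b, k = b / scale, k * scale
    a, k = a + b * k.mean(), k - k.mean()
    return a, b, k

# The fitted k_t would then typically be projected forward (for example, as a
# random walk with drift) to generate stochastic mortality scenarios.
```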

Bulk annuity transactions are popular in the UK market, with a number of large deals executed during 2016, including the ICI Pension Fund’s two buy-in deals completed in the wake of Brexit, totalling £1.7 billion. Legal & General completed a £2.5 billion buyout agreement with the TRW Pension Scheme in 2014.

Longevity hedging (in particular, the use of longevity swaps) is also an attractive approach to the de-risking of pension schemes, and it equally requires the use of national mortality tables. Transactions range from the large-scale £5 billion Aviva longevity swap in 2014 to the recent, more modest, £300 million longevity swap completed between Zurich and SCOR in January 2017.

While the use of internal models to calculate mortality and longevity risk capital requirements under Solvency II is not prevalent in the Irish market, owing to the size of companies and the amount of risk retained, reinsurers are likely to be looking at such models. In the UK, larger companies may opt to use internal models if they retain large exposures.

Indeed, national mortality tables also typically inform mortality improvement assumptions for all companies, as the analysis of improvements requires large volumes of data. Therefore, even companies that do not use sophisticated mortality and longevity risk modelling techniques are implicitly affected by these new developments in the construction of national mortality tables.
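
As a purely illustrative sketch (again, not the Milliman methodology), the snippet below derives crude annual improvement rates from a national table of mortality rates; the array name `q` and its layout (ages in rows, calendar years in columns) are assumptions here.

```python
# Illustrative only: crude annual mortality improvement rates from a national
# table of mortality rates q(x, t), with ages in rows and years in columns.
import numpy as np

def annual_improvements(q):
    """improvement(x, t) = 1 - q(x, t) / q(x, t - 1) for each age x and year t."""
    q = np.asarray(q, dtype=float)
    return 1.0 - q[:, 1:] / q[:, :-1]
```

In practice, such raw ratios are noisy and would typically be smoothed across ages and calendar years before informing an improvement assumption.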


Member of Parliament Steve Webb on pensions in the UK

Like millions of people around the globe, retirees throughout Britain are facing increasing uncertainty when it comes to saving for their golden years. From unreliable markets to mismanaged pensions, the retirement landscape has never been more volatile.

In this Milliman Visionary interview, former MP and Minister of State for Pensions Steve Webb addresses the threat to retirement in the United Kingdom and beyond, as well as some possible solutions.

Big data, consumers, and the FCA

In November 2015 the Financial Conduct Authority (FCA),1 a UK financial services regulator, announced that it intended to investigate the use of “big data”2 in retail general insurance in the UK. In September 2016, it announced that it was not, after all, going to pursue this investigation. Why this apparent turnaround?

The opportunities that big data provides to general insurers are widely acknowledged and are the reason insurers are investing heavily in this area. But with such opportunities come potential threats: big data could lead to better service and outcomes for many consumers, but could it also lead to some consumers effectively being excluded from the market, or to the exploitation of consumers who are less price-sensitive than others? These are the concerns that the FCA sought to address when it announced its investigation in November 2015.

Since then, the FCA has been gathering and evaluating relevant information, mostly relating to private motor and home insurance. It has found considerable evidence that the use of big data benefits users of insurance, through products and services being better tailored to individual needs, through more focused marketing and better customer service, and through increased feedback to consumers about the risks that they run and how to manage them effectively, most notably for those with telematics auto insurance.

While its concerns remain, the FCA concluded from this preliminary investigation that the increasing use of big data is “broadly having a positive impact on consumer outcomes, by transforming how consumers deal with retail GI firms, streamlining processes and encouraging more innovation in products and services.” As a result, it has decided that there is no immediate need either to push ahead with the full investigation it had originally proposed or to change its regulatory framework in response to any issues raised. However, it will continue to monitor big data, in particular watching for any related data protection risks and seeking to understand how big data is used in pricing.

Full details of the FCA’s views can be found in its Feedback Statement FS16/5.


1The FCA regulates the conduct of the UK financial services market and shares with the Prudential Regulation Authority the prudential regulation of businesses within that market.
2There is no universally accepted definition of “big data.” In the context of its investigation, the FCA considered big data very broadly, embracing data sets that are larger or more complex than those hitherto typically used by the insurance industry, data sets derived from new sources such as social media, and the emerging technologies and techniques increasingly adopted to generate, collect, and store such data sets, and then to process and analyse them.