Tag Archives: Jonathan Glowacki

Validating machine-learning models

While machine-learning techniques can improve business processes, predict future outcomes, and save money, they also increase modeling risk because of their complex and opaque features. In this article, Milliman’s Jonathan Glowacki and Martin Reichhoff discuss how model validation techniques can mitigate the potential pitfalls of machine-learning algorithms.

Here is an excerpt:

An independent model validation carried out by knowledgeable professionals can mitigate the risks associated with new modeling techniques. Despite the novelty of machine-learning techniques, several methods can safeguard against overfitting and other modeling flaws. The most important requirement for model validation is that the team performing the validation understand the algorithm. A validator who does not understand the theory and assumptions behind the model is unlikely to perform an effective validation of the process. After demonstrating an understanding of the model theory, the following procedures are helpful in performing the validation.

Outcomes analysis refers to comparing modeled results to actual data. For advanced modeling techniques, outcomes analysis is a simple yet useful approach to understanding model interactions and pitfalls. One way to understand model results is to plot the range of an independent variable against both the actual and predicted outcomes, along with the number of observations. This lets the user visualize the univariate relationship within the model and see whether the model is overfitting to sparse data. To evaluate possible interactions, cross plots can also be created to look at results in two dimensions rather than one. Dimensionality beyond two dimensions becomes difficult to evaluate, but looking at simple interactions does provide an initial, useful understanding of how the model behaves across independent variables….
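The binned actual-versus-predicted comparison described above can be sketched in a few lines. This is an illustrative helper, not the authors' implementation; the column names, the loan-to-value feature, and the toy data are all assumptions.

```python
import numpy as np
import pandas as pd

def outcomes_analysis(df, feature, actual, predicted, n_bins=10):
    """Summarize actual vs. predicted outcomes across bins of one
    independent variable, along with observation counts.

    Bins with few observations where predicted diverges from actual
    can signal overfitting to sparse data."""
    binned = pd.qcut(df[feature], q=n_bins, duplicates="drop")
    return (df.groupby(binned, observed=True)
              .agg(actual_mean=(actual, "mean"),
                   predicted_mean=(predicted, "mean"),
                   n_obs=(feature, "size")))

# Toy data: default probability rises with a loan-to-value-like feature.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 1000)
df = pd.DataFrame({"ltv": x,
                   "actual": (rng.uniform(size=1000) < x).astype(int)})
df["predicted"] = df["ltv"]  # a perfectly calibrated toy model
print(outcomes_analysis(df, "ltv", "actual", "predicted"))
```

In practice the same summary would be plotted, with observation counts on a secondary axis, so sparse bins are immediately visible.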

…Cross-validation is a common strategy to help ensure that a model isn't overfitting the sample data on which it is developed. Cross-validation has long been used to safeguard the integrity of other statistical methods, and with the rising popularity of machine-learning techniques it has become even more important. In cross-validation, a model is fitted using only a portion of the sample data. The model is then applied to the remaining portion to test performance. Ideally, a model performs equally well on both portions of the data. If it doesn't, the model has likely been overfit.
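The holdout procedure described in the excerpt can be sketched as follows; the function names and the polynomial toy model are assumptions for illustration, not part of the original article.

```python
import numpy as np

def holdout_validate(X, y, fit, predict, train_frac=0.7, seed=0):
    """Fit on one portion of the sample data, then score on the
    held-out portion. A large gap between the two errors suggests
    the model has been overfit."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(train_frac * len(X))
    train, test = idx[:cut], idx[cut:]
    model = fit(X[train], y[train])
    mse = lambda i: float(np.mean((predict(model, X[i]) - y[i]) ** 2))
    return mse(train), mse(test)

# Toy example: a modest-degree polynomial fit to noisy sine data.
X = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * X) + np.random.default_rng(1).normal(0, 0.2, 200)
fit = lambda X, y: np.polyfit(X, y, deg=3)
predict = lambda coefs, X: np.polyval(coefs, X)
train_err, test_err = holdout_validate(X, y, fit, predict)
print(f"train MSE {train_err:.3f}, test MSE {test_err:.3f}")
```

Raising `deg` far above 3 makes the train error shrink while the test error grows, which is exactly the divergence the excerpt warns about. K-fold cross-validation repeats this split several times and averages the results.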

Predictive analytics for the mortgage industry

How can predictive analytics help Government National Mortgage Association (GNMA/Ginnie Mae) issuers decide whether to buy out a nonperforming loan? In their article “Enhanced vision,” Milliman’s Jonathan Glowacki and Makho Mashoba provide perspective on an algorithm used to identify loans that are likely to bounce back so they can be reissued in a mortgage-backed security.

Here is an excerpt:

The XGBoost model, like similar algorithms, is easy to implement. Once the mechanics of the technique are understood and the parameters are tuned correctly, the model can be run on a data set to produce accompanying predictions. The model can be updated continuously each month based on new data feeds. Pointing an XGBoost program toward a new data set and running it again is virtually all that is needed to refresh the results. It is also possible to retune the parameters for the update to further enhance the results.
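The monthly refresh workflow described above might look like the sketch below. XGBoost itself requires the separate `xgboost` package, so this sketch substitutes scikit-learn's gradient-boosting classifier as a stand-in; the `make_monthly_feed` helper and its simulated loan data are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(42)

def make_monthly_feed(n=500):
    """Simulate one month's loan data: two features and a
    re-performance flag correlated with the first feature."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

model = GradientBoostingClassifier(n_estimators=50, max_depth=3)
for month in range(3):   # each new data feed
    X, y = make_monthly_feed()
    model.fit(X, y)      # point the model at the new data and rerun
    print(f"month {month}: train accuracy {model.score(X, y):.2f}")
```

Note that refitting from scratch each month discards earlier history; in practice one would typically pool past feeds (or warm-start the booster) rather than train on the latest month alone.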

A use case for this type of model is to pursue early buyouts for mortgages with a high probability of re-performing and potentially forgo early buyouts for mortgages with a low probability of re-performing, as long as this policy is consistent with GNMA servicing guidelines.

This same technique can be applied to a variety of data for other purposes. Predictive analytics can capture predictive power from internal data, whether from established, go-to data sets or by bringing together data from across an organization to make predictions. It can also help a firm leverage industry data and other outside sources to forecast trends or improve decisions. This case is a concrete example of how the tool should yield a higher return on investment on GNMA early buyouts.

Considering the growing amounts of data available, the mortgage industry should pay attention to predictive analytics tools. Investing in the technology has proven to generate significant returns. GNMA issuers are just one group that can benefit; predictive analytics can be combined with many other techniques and tools to increase efficiencies within the mortgage industry. The future depends on it.

Lender credit risk transfer considerations for government-sponsored enterprises

One of the roadblocks for lender credit risk transfer (CRT) has been a lack of knowledge and understanding of the risk/reward profile of a potential lender CRT transaction. This article by Milliman’s Madeline Johnson and Jonathan Glowacki provides an overview of lender CRT and uses public information to demonstrate the expected premium and loss rates for a potential lender CRT transaction.

This article was originally published in the March/April 2017 issue of Secondary Marketing Executive.

Credit risk sharing transactions considerations for insurers

Credit risk sharing transactions offered by Fannie Mae and Freddie Mac present a new business opportunity for insurance companies seeking to invest capital. Milliman’s Jonathan Glowacki and Michael Jacobson say insurers must first understand the risks associated with these transactions before investing in or insuring them. Their Contingencies article “The trillion-dollar marketplace” provides some perspective.

Here’s an excerpt from the article:

Given FHFA’s focus on de-risking the GSEs, mortgage credit risk offerings are likely to continue to become more prevalent in the marketplace as the GSEs seek to meet their annual conservatorship scorecard requirements and reduce capital. According to FHFA’s 2015 conservatorship scorecard, Fannie Mae and Freddie Mac were instructed to collectively transact credit risk transfers on reference pools of mortgages of at least $270 billion for the year. In actuality, the GSEs’ transactions covered reference pools exceeding $400 billion of the nearly $1 trillion of mortgages acquired by the GSEs in 2015. The 2016 scorecard requires the GSEs to transfer the credit risk on at least 90 percent of the unpaid principal balance of targeted groups of newly acquired mortgages, which represents the majority of expected acquisitions. Thus, it can be assumed that there will be a similar level or greater amount of credit risk transferred in 2016, assuming GSE mortgage acquisition levels remain consistent with 2015 acquisitions.

Insurance companies will have the opportunity to participate in this marketplace in 2016 through investment opportunities in the STACR and CAS debt structures as well as by writing credit insurance through anticipated ACIS and CIRT transactions. While the debt offerings require principal outlays equal to 100 percent of the notional amount of the securities, the credit insurance transactions to date have typically required collateral of only 15 to 20 percent of the credit risk assumed. The collateral requirements for the credit insurance transactions vary based on the rating of the insurance entities assuming the risk and the type of participation. For context, the $2.8 billion of credit insurance risk placed through the 10 Freddie Mac ACIS transactions in 2015 required minimum collateral of approximately $440 million (or approximately 16 percent of the risk assumed).
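A quick sanity check confirms the collateral percentage quoted in the excerpt, using only the two dollar figures it provides:

```python
# Figures quoted in the excerpt: $440M minimum collateral against
# $2.8B of credit insurance risk assumed across the 2015 ACIS deals.
risk_assumed = 2.8e9
min_collateral = 0.44e9
ratio = min_collateral / risk_assumed
print(f"{ratio:.1%}")  # 15.7%, i.e., "approximately 16 percent"
```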

These debt securities and insurance opportunities may offer attractive risk/return profiles to strategic companies in the insurance sector. However, before entering into such agreements, it is important to understand the risk profile of the underlying collateral and the performance volatility inherent in the structure of the transactions. With data being published by the GSEs, it is now easier than ever before to evaluate the risk profiles of these exposures.

Student loan debt at for-profit colleges

For-profit colleges attract students through innovative scheduling and online educational opportunities. However, 44% of defaults on federal loans come from students at for-profit colleges. In his latest Insight article, Milliman’s Leighton Hunley examines some of the possible causes of these for-profit defaults as he revisits the issue of student loan debt. The article also highlights student loan debt and delinquency trends.

For more analysis on this issue read Leighton and Jonathan Glowacki’s article “The student loan debt crisis in perspective.” The authors also offer some reform ideas in the article.

Estimating credit losses for the lifetime of a loan

On December 20, 2012, the Financial Accounting Standards Board (FASB) issued a proposed Accounting Standards Update (ASU) that discusses changes to the ways banks recognize and account for potential credit losses (the ASU is “Financial Instruments—Credit Losses,” Subtopic 825-15). A simple summary of the update is that the FASB proposes that banks and other financial institutions modify recognition of impairment from a “probable loss” to a “lifetime of loss” estimate.

For mortgages, this means changing the basis of the impairment provision from a provision for losses arising from the current delinquency inventory to a provision for all mortgages, recognized at origination. Impairment provisions for delinquent loans are typically estimated using a roll-rate model based on recent experience.
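A roll-rate model of the kind mentioned above can be sketched as a Markov-style projection: loans migrate monthly between delinquency states according to observed "roll rates," and the provision is driven by the balance projected to reach default. The states, roll rates, balances, and loss severity below are all illustrative assumptions, not figures from the article.

```python
import numpy as np

# States: current, 30 days past due, 60 dpd, 90+ dpd, default.
# Each row gives the probability of rolling from that state to each
# state in the next month (rows sum to 1; default is absorbing).
roll = np.array([
    [0.97, 0.03, 0.00, 0.00, 0.00],
    [0.40, 0.30, 0.30, 0.00, 0.00],
    [0.15, 0.10, 0.30, 0.45, 0.00],
    [0.05, 0.00, 0.05, 0.60, 0.30],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

# Current balance in each state ($ thousands, hypothetical).
inventory = np.array([9000.0, 500.0, 300.0, 200.0, 0.0])

for _ in range(12):           # project the inventory 12 months forward
    inventory = inventory @ roll

loss_severity = 0.35          # assumed loss given default
provision = inventory[-1] * loss_severity
print(f"defaulted balance: {inventory[-1]:.0f}, provision: {provision:.0f}")
```

Under the proposed lifetime-of-loss standard, the projection horizon would extend to the remaining life of each loan and apply to all mortgages at origination, not just the delinquent inventory.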

Milliman’s Eric Wunder and Jonathan Glowacki provide a methodology to estimate credit losses (including losses on loan repurchases) for the lifetime of a loan in this article.