AuditOne Advisory – Estimating NMD Average Life From Bank Data

Kruskal Hewitt, a Senior Associate in our ALM practice, has prepared a document below outlining approaches to the calculation of estimated average life on non-maturity deposit (NMD) accounts using an institution’s internal historical data. Because the assumptions made on NMD average life are a, if not the, critical driver of EVE (economic value of equity) rate-sensitivity, we thought it would be worthwhile sharing this document. Please forward this to whoever in your organization has responsibility for managing your IRR/ALM modeling. The document contains Kruskal’s contact information for those who have questions or would like additional information. Thank you.

Estimating NMD Average Life From Bank Data

This note will describe in detail four methods for estimating historical average life of non-maturity deposits (NMDs) using bank-specific data.  There are many approaches for estimating this; while all are valid, some methods are easier and/or more accurate than others.

What follows is a discussion of NMD average life and its significance, followed by the four methods.  The knowledgeable reader should proceed directly to the methods.

Average Life Assumptions in EVE Simulations

NMD historical average life is important because it represents a key model assumption used in interest rate risk models’ estimation of economic value of equity (EVE).  EVE is calculated as the net present value (NPV) of assets minus the NPV of liabilities.

NMD for many banks, especially smaller banks, is the largest component of liabilities, which explains the large impact that these assumptions have.  They determine what point on the discount curve will be used to discount cash flows associated with NMDs.  In a positive yield curve environment, the longer the assumed average life, the higher the assumed discount rate, the lower the resulting NPV of NMDs, and the higher the calculated EVE.

In other words, all other things held constant, in a positive yield curve environment (which is typically though certainly not always the case), increasing the assumed NMD average life has the effect of increasing EVE asset-sensitivity (or decreasing liability-sensitivity) under rising rate scenarios.  Higher average life is equivalent to slower decay (or run-off) rates on NMD accounts.
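
To make the mechanics concrete, here is a minimal sketch with hypothetical figures (the NMD balance is treated as a single bullet cash flow, and the rates are made up to represent an upward-sloping curve):

    # Illustrative only: discounting the same NMD balance at a longer point on a
    # positive yield curve lowers its NPV and therefore raises calculated EVE.
    npv_assets = 950.0     # hypothetical NPV of assets
    nmd_balance = 800.0    # hypothetical NMD balance, treated as one bullet cash flow

    for avg_life, rate in [(3, 0.02), (7, 0.04)]:  # longer assumed life -> higher discount rate
        npv_nmd = nmd_balance / (1 + rate) ** avg_life
        eve = npv_assets - npv_nmd
        print(f"avg life {avg_life}y at {rate:.0%}: NPV of NMDs = {npv_nmd:.1f}, EVE = {eve:.1f}")

Running this shows the NPV of the NMDs falling, and EVE rising, as the assumed average life is stretched from three years to seven.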

Given the high sensitivity of EVE results to the NMD assumptions, regulators have stressed the importance of analyzing internal bank data to help formulate those assumptions.  In our experience, banks that have done so typically find longer estimated life for NMD accounts as compared to the earlier default assumptions provided by FDICIA 305 (or the OTS), particularly for community banks.

Setting Average Life Assumptions

There are three elements that should be considered by a bank in setting average life assumptions:

  • Historic experience
  • Management’s judgement of the future behavior of their NMD depositors
  • Peer banks’ assumptions

The most important thing is that a bank clearly document what it chooses and why.

Estimating Average Life From Bank Data

Estimating average life is problematic!  Even with great data, how one measures – i.e., which technique is employed – will give different answers.  There are further complications.  What one measures, the number of accounts or account balances, will give different answers.  Account balances can vary significantly over time.  Transaction account balances, for example, can move up and down without it having any run-off implications.  Another consideration:  A bank cannot have accounts with a longer life than the bank’s existence, which complicates the estimation exercise for a younger bank.

The interaction of account balances and account life can have a meaningful impact on true (as opposed to modeled) EVE.  Consider two accounts that have been open for ten years, one with a balance of $1 for the first nine years and $999,991 in the tenth, the other with a constant balance of $100,000.  The first account has had little value for the bank (i.e., in discounted terms), while the second had a great deal of value.  But both have had an average balance of $100,000.
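
A rough way to see the difference in discounted terms (a sketch only, assuming the value of the funding each year is simply the balance held that year, discounted at an assumed 4%):

    # Two hypothetical ten-year balance paths with the same simple average balance
    acct_1 = [1.0] * 9 + [999_991.0]   # essentially empty until a large deposit in year ten
    acct_2 = [100_000.0] * 10          # steady balance throughout

    rate = 0.04  # assumed discount rate, for illustration only
    for name, path in [("account 1", acct_1), ("account 2", acct_2)]:
        avg_balance = sum(path) / len(path)
        pv = sum(bal / (1 + rate) ** (year + 1) for year, bal in enumerate(path))
        print(f"{name}: average balance = {avg_balance:,.0f}, PV of yearly balances = {pv:,.0f}")

Both accounts show the same average balance, but the steady account comes out worth more in present-value terms because its balances were on the books earlier; the timing of the balances, not just their average, drives the discounted value.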

It may seem that the easiest approach is to measure account balances.  However, unless there is detailed data available to calculate the daily average balance by account, account balances are problematic.  They can be affected by a variety of external and internal factors that have nothing to do with “decay”.  They can be skewed by individual large accounts, as well as by transaction activity as noted above.

The number of accounts open, and how long they have been open, is more straightforward to measure, but it has the drawback of either assuming today’s balance is a good proxy for the average balance (method A below) or that every account has the same value to the bank (methods B and C below).

Because average life estimates depend on a bank’s historical data, the data available can limit the choice of estimation technique.

What follows is a discussion of various techniques, the required data resources, the strengths and weaknesses, and a detailed how-to.

METHODS

A. Current Average Account Life

Data required.  The account opening date and current account balance.

Pros/Cons.  If a bank has the data, this is a very accurate and easy-to-calculate measure of the current average life.  If a bank recalculates each month or quarter, predictive trends will be identifiable.  However, there are two drawbacks.  First is the reliance on the current balance as a proxy for average balance.  This may or may not be reasonable; some accounts have monthly cycles, or other seasonality.  Second, if a bank is relatively young with accounts that have been open since the start of the bank, this technique will understate the true average life.  The amount of understatement is related to the proportion of accounts open since day one.

How to calculate.

For all accounts: SUMPRODUCT(Account Life, Account Balance) / SUM(Account Balance)
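
In code, this is simply a balance-weighted average of account ages.  A minimal sketch, assuming a list of (years open, current balance) pairs pulled from the core system (the data below is made up):

    # Method A sketch: balance-weighted average account life
    # Each tuple is (years the account has been open, current balance) -- illustrative data
    accounts = [(2.5, 40_000.0), (7.0, 15_000.0), (11.3, 90_000.0), (0.8, 5_000.0)]

    weighted_life = sum(life * bal for life, bal in accounts)
    total_balance = sum(bal for _, bal in accounts)
    print(f"Current average account life: {weighted_life / total_balance:.1f} years")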

B. Open-Close Technique to Estimate Decay

Data required.  The ability to identify which specific accounts are open at various past dates.

Pros/Cons.  This technique is the default methodology, as virtually every bank can identify the individual open accounts at specific dates in the past.  It is simple to calculate and to update.  The problem is that the account balance has no role in the estimation.  As a result, every account is assumed to be of the same value to the bank.

How to calculate.  The discussion below is for annual cohorts, but other periods (monthly, quarterly, semi-annually) can be substituted.

  • For each NMD type (DDA, Savings, NOW, MMA), segment historical data according to the accounts on the books (open) at the start of each year (or quarter, etc.).
  • E.g., as of December 31, 2012, count how many accounts were open, Act12(O).  As of December 31, 2013, count how many of the Act12(O) closed during the year, Act12(C).
  • Act12(C) / Act12(O) = AnnualDecayRate(12)
  • Repeat for five years, and average the annual decay rates:
  • 1 / AverageAnnualDecayRate = AverageLife (in years)
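
A minimal sketch of the same arithmetic, assuming that for each year-end you can pull the set of account numbers open on that date (the IDs below are made up):

    # Method B sketch: annual decay rates from the set of open account IDs at each year-end
    open_at = {
        2012: {"A", "B", "C", "D", "E", "F", "G", "H", "I", "J"},
        2013: {"A", "B", "C", "D", "E", "F", "G", "H", "K"},       # I and J closed during 2013
        2014: {"A", "B", "C", "D", "E", "F", "K", "L"},            # G and H closed during 2014
    }

    decay_rates = []
    for year in sorted(open_at)[:-1]:
        start = open_at[year]                # Act(O): accounts open at the start of the period
        closed = start - open_at[year + 1]   # Act(C): of those, the ones closed during the period
        decay_rates.append(len(closed) / len(start))

    avg_decay = sum(decay_rates) / len(decay_rates)
    print(f"Average annual decay rate: {avg_decay:.1%}; implied average life: {1 / avg_decay:.1f} years")

Note that accounts opened during a year (such as "K" and "L" above) do not affect the decay rate; only closures among the accounts open at the start of each period count.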

C. Tail Analysis

Data required.  The account opening date and current account balance.

Pros/Cons.  The open-close technique assumes a constant decay rate.  If one assumes a constant decay rate, then in theory there is always something left – i.e., the “tail”.  An average life of four years means that after eight years, 10% will remain.  An average life of six years means that after 12 years, 11% will remain.  An average life of ten years predicts after twenty years 12% will remain.  This technique can be used to estimate the expected life of accounts for a young bank where more than 10% of accounts have been open for as long as the bank.
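
Those percentages follow directly from the constant-decay assumption: the fraction remaining after n years is (1 - 1/L)^n, where L is the average life in years.  A quick check:

    # Remaining fraction after n years under a constant annual decay rate of 1/L
    for life, years in [(4, 8), (6, 12), (10, 20)]:
        remaining = (1 - 1 / life) ** years
        print(f"average life {life}y: {remaining:.0%} remaining after {years} years")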

How to calculate.  Build an Excel spreadsheet.
In cell A1 put the number of years the bank has been open (this serves as the starting guess for the average life, which you will adjust below).
In cell A3 put 0, in A4 put 1, … in A53 put 50.
In cell B2 put the formula =1/A1.
In cell B3 put 1.
In cell B4 put the formula =(1-$B$2)*B3, and copy this formula down to cell B53.
In column C, place the percentage of accounts open since the beginning of the bank in the cell that corresponds to the number of years the bank has been open (e.g., if the bank has been open seven years, put the percentage in cell C10).
Change the value in cell A1 until the value in column B adjacent to the percentage in column C is as close as possible to that percentage.
The final value in A1 is the average life of the sample.
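
The same search can be done outside of a spreadsheet.  Below is a minimal sketch (with made-up inputs) that looks for the average life whose constant-decay survival curve matches the observed share of accounts still open since the bank’s first day; equivalently, the closed form is L = 1 / (1 - share^(1/years)).

    # Method C sketch: solve for the average life L such that (1 - 1/L)**years_open
    # matches the observed share of accounts still open since the bank opened
    years_open = 7          # how long the bank has existed -- illustrative
    share_remaining = 0.35  # share of accounts open since day one -- illustrative

    best_life, best_gap = None, float("inf")
    for tenths in range(11, 501):               # candidate average lives from 1.1 to 50.0 years
        life = tenths / 10
        gap = abs((1 - 1 / life) ** years_open - share_remaining)
        if gap < best_gap:
            best_life, best_gap = life, gap

    print(f"Estimated average life: {best_life:.1f} years")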

D. Average Life of Time Weighted Average Balance

Data required.  The daily balance of each account since the day the account was opened.

Pros/Cons.  Arguably this is the Cadillac of measures, as it accounts for both the size of each account’s balance and how long that balance has actually been on the books.  There is one significant drawback: banks generally won’t have the required data.

How to calculate.  For each account, calculate the time-weighted average balance (remembering to account for weekends and holidays).  For all accounts: SUMPRODUCT(Account Life, Account Average Balance) / SUM(Account Average Balance)
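
A minimal sketch of that calculation, assuming daily balance histories are available for each account (the data below is made up, and calendar days are used so weekends and holidays are covered):

    # Method D sketch: average life weighted by each account's time-weighted average balance
    # Each entry: (years the account has been open, daily balances since opening) -- illustrative
    accounts = {
        "acct_1": (3.0, [12_000.0] * 1095),                   # steady balance for three years
        "acct_2": (1.5, [50_000.0] * 300 + [5_000.0] * 248),  # balance dropped part-way through
    }

    weighted_life = 0.0
    total_avg_balance = 0.0
    for life, daily_balances in accounts.values():
        avg_balance = sum(daily_balances) / len(daily_balances)  # time-weighted average balance
        weighted_life += life * avg_balance
        total_avg_balance += avg_balance

    print(f"Average life weighted by average balances: {weighted_life / total_avg_balance:.1f} years")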

CONCLUSION

There are different ways to calculate the average life of NMDs.  If the bank has the account opening date, then technique A offers advantages; otherwise, B may be a more sensible option.  Remember that calculating historical average life is an intermediate step in setting the model average life assumption; management must then look into its crystal ball and assess whether average life in the future will be longer or shorter than the data shows it has been in the past.

Remember, too, that if rates are expected to go higher (as is presently the case), then shading to the shorter side is more conservative; upward movements in rates don’t produce as big an increase (or as small a decrease) in EVE.  Conversely, if rates can only go lower, biasing to the longer side is more conservative.  Of course, we always recommend accuracy over bias, and hopefully this write-up will help in that regard.

If you have questions on the content, please contact Kruskal Hewitt, Senior Associate, via our Contact Us page.