AuditOne Advisory – FDIC Issues Guidance on Model Risk Management

Our Co-CEO Jeremy Taylor has prepared (below) a summary of the Guidance recently issued by the FDIC on model risk management. It’s an important topic as more and more institutions make use of models, whether in-house or vendor-supplied, for a widening range of purposes. To the extent models feed into (or even substitute for) decision-making in various banking contexts, it’s important to have a disciplined process for ensuring the integrity of the models themselves, of the data and assumptions fed into them, and of the governance and controls surrounding model access, changes, back-testing, etc.

Please share this with colleagues having responsibilities related to models and how they’re being used within your organization. Thank you, Bud

FDIC Issues Guidance on Model Risk Management

The FDIC has just released FIL-22-2017, Supervisory Guidance on Model Risk Management, identical to what was previously issued back in 2011 by both the OCC (Bulletin 2011-12, superseding 2000-16) and the FRB (SR Letter 11-7).  The NCUA has yet to follow suit; its website refers readers to the OCC and FRB publications.

The first thing to note is that the FIL specifically exempts sub-$1 billion banks unless they have significant reliance on models.  But that exemption applies to putting in place a framework or program for model risk management (MRM).  Many of the FIL’s provisions instead apply to managing risk associated with individual models, such as those for IRR, suspicious activity monitoring, and the ALLL, each of them already widely used by smaller banks and already covered by previous guidance statements.  For most small banks, therefore, FIL-22-2017 reinforces existing guidance rather than introducing new requirements.

It’s unclear where “significant” model reliance will kick in.  But for larger banks, at or closing in on the $1 billion threshold, there are some key steps to keep in mind, if they’re not already in place:

  1. Assign model management responsibility.  The vendor management (VM) process provides a useful parallel.  Each vendor will have primary contact(s) within the bank, but it is expected that someone (typically within Compliance, IT or Finance) will be assigned overall VM program management/coordination responsibility.  The same general comment applies to MRM.
  2. Compile an inventory of all models (both vendor-supplied and in-house-developed) in use across the bank.  That database should include updated information on key items such as model validation and SOC reporting[1].
  3. Develop a Model Risk Policy document.  As with any new policy development, it’s probably easiest to start with a template from a vendor (like LexisNexis/Sheshunoff, BCG or Young & Associates) and then customize it.

While smaller (i.e., exempted) institutions may not want to worry about #3, the first two items above are relatively straightforward – and prudent for institutions of any size.  Besides, it never hurts to get out ahead of formal requirements (and to impress your regulator in the process).


[1] We have issued two recent Advisories on these important topics:


AuditOne Advisory – Estimating NMD Average Life From Bank Data

Kruskal Hewitt, a Senior Associate in our ALM practice, has prepared a document below outlining approaches to the calculation of estimated average life on non-maturity deposit (NMD) accounts using an institution’s internal historical data. Because the assumptions made on NMD average life are a critical (if not the critical) driver of EVE (economic value of equity) rate-sensitivity, we thought it would be worthwhile sharing this document. Please refer this to whoever in your organization has responsibility for managing your IRR/ALM modeling. The document contains Kruskal’s contact information for those who have questions or would like additional information. Thank you.

Estimating NMD Average Life From Bank Data

This note will describe in detail four methods for estimating historical average life of non-maturity deposits (NMDs) using bank-specific data.  There are many approaches for estimating this; while all are valid, some methods are easier and/or more accurate than others.

What follows is a discussion of NMD average life and its significance, followed by the four methods.  The knowledgeable reader should proceed directly to the methods.

Average Life Assumptions in EVE Simulations

NMD historical average life is important because it represents a key model assumption used in interest rate risk models’ estimation of economic value of equity (EVE).  EVE is calculated as the net present value (NPV) of assets minus the NPV of liabilities.

NMD for many banks, especially smaller banks, is the largest component of liabilities, which explains the large impact that these assumptions have.  They determine what point on the discount curve will be used to discount cash flows associated with NMDs.  In a positive yield curve environment, the longer the assumed average life, the higher the assumed discount rate, the lower the resulting NPV of NMDs, and the higher the calculated EVE.

In other words, all other things held constant, in a positive yield curve environment (which is typically though certainly not always the case), increasing the assumed NMD average life has the effect of increasing EVE asset-sensitivity (or decreasing liability-sensitivity) under rising rate scenarios.  Higher average life is equivalent to slower decay (or run-off) rates on NMD accounts.
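This chain of effects can be illustrated with a minimal numeric sketch (the balance and yield-curve figures are hypothetical, and simple annual compounding stands in for a full discounting model):

```python
# Hypothetical, simplified illustration: discount a $1MM NMD balance at the
# yield-curve point corresponding to its assumed average life.
def nmd_npv(balance, avg_life_years, curve):
    rate = curve[avg_life_years]  # real models interpolate along the full curve
    return balance / (1 + rate) ** avg_life_years

curve = {2: 0.015, 5: 0.025}     # positively sloped: longer tenor, higher rate

npv_short = nmd_npv(1_000_000, 2, curve)   # ~ $970,662
npv_long = nmd_npv(1_000_000, 5, curve)    # ~ $883,854

# Longer assumed life -> higher discount rate -> lower NPV of the liability,
# and since EVE = NPV(assets) - NPV(liabilities), a higher calculated EVE.
```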

Given the high sensitivity of EVE results to the NMD assumptions, regulators have stressed the importance of analyzing internal bank data to help formulate those assumptions.  In our experience, banks that have done so typically find longer estimated life for NMD accounts as compared to the earlier default assumptions provided by FDICIA 305 (or the OTS), particularly for community banks.

Setting Average Life Assumptions

There are three elements that should be considered by a bank in setting average life assumptions:

  • Historic experience
  • Management’s judgement of the future behavior of their NMD depositors
  • Peer banks’ assumptions

The most important thing is that a bank clearly document what it chooses and why.

Estimating Average Life From Bank Data

Estimating average life is problematic!  Even with great data, how one measures, i.e., which technique is employed, will give different answers.  There are further complications.  What one measures, the number of accounts or account balances, will also give different answers.  Account balances can vary significantly over time; transaction account balances, for example, can move up and down without any run-off implications.  Another consideration: a bank cannot have accounts with a longer life than the bank’s own existence, which complicates the estimation exercise for a younger bank.

The interaction of account balances and account life can have a meaningful impact on true (as opposed to modeled) EVE.  Consider two accounts that have been open for ten years, one with a balance of $1 for the first nine years and $999,995 in the tenth, the other with a constant balance of $100,000.  The first account has had little value for the bank (i.e., in discounted terms), while the second had a great deal of value.  But both have had an average balance of approximately $100,000.

It may seem that the easiest approach is to measure account balances.  However, unless there is detailed data available to calculate the daily average balance by account, account balances are problematic.  They can be affected by a variety of external and internal factors that have nothing to do with “decay”.  They can be skewed by individual large accounts, as well as by transaction activity as noted above.

The number of accounts open, and how long they have been open, is more straightforward to measure, but it has the drawback of either assuming that today’s balance is a good proxy for the average balance (method A below) or that every account has the same value to the bank (methods B and C below).

The extent of a bank’s historical data can also limit the choice of estimation technique.

What follows is a discussion of various techniques, the required data resources, the strengths and weaknesses, and a detailed how-to.


A. Current Average Account Life

Data required.  The account opening date and current account balance.

Pros/Cons.  If a bank has the data, this is a very accurate and easy-to-calculate measure of the current average life.  If a bank recalculates each month or quarter, predictive trends will be identifiable.  However, there are two drawbacks.  First is the reliance on the current balance as a proxy for average balance.  This may or may not be reasonable; some accounts have monthly cycles, or other seasonality.  Second, if a bank is relatively young with accounts that have been open since the start of the bank, this technique will understate the true average life.  The amount of understatement is related to the proportion of accounts open since day one.

How to calculate.

For all accounts: Average Life = SUMPRODUCT(Account Life, Account Balance) / SUM(Account Balance)
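This calculation can be sketched in code (a minimal sketch; the account records and the 365.25-day year convention are illustrative assumptions):

```python
from datetime import date

# Hypothetical account records: (opening date, current balance)
accounts = [
    (date(2010, 6, 1), 50_000.0),
    (date(2015, 3, 15), 20_000.0),
    (date(2016, 1, 4), 5_000.0),
]

def current_avg_account_life(accounts, as_of):
    """Balance-weighted average account age in years:
    SUMPRODUCT(account life, balance) / SUM(balance)."""
    weighted = sum(((as_of - opened).days / 365.25) * bal
                   for opened, bal in accounts)
    return weighted / sum(bal for _, bal in accounts)

life = current_avg_account_life(accounts, as_of=date(2017, 6, 1))  # ~5.35 years
```

Recomputing each quarter and tracking the trend, as suggested above, is just a matter of re-running this with a new as-of date.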

B. Open-Close Technique to Estimate Decay

Data required.  The ability to identify which specific accounts are open at various past dates.

Pros/Cons.  This technique is the default methodology, as virtually every bank can identify the individual open accounts at specific dates in the past.  It is simple to calculate and to update.  The problem is that the account balance has no role in the estimation.  As a result, every account is assumed to be of the same value to the bank.

How to calculate.  The discussion below is for annual cohorts, but other periods (monthly, quarterly, semi-annually) can be substituted.

For each NMD type (DDA, Savings, NOW, MMA), segment historical data according to the accounts on the books (open) at the start of each year (or quarter, etc.).
E.g., as of December 31, 2012, count how many accounts were open, Act12(O).  As of December 31, 2013, count how many of the Act12(O) accounts closed during 2013, Act12(C).
Act12(C) / Act12(O) = AnnualDecayRate(12)
Repeat for five years, and average the annual decay rates:
1 / AverageAnnualDecayRate = AverageLife (in years)
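A minimal sketch of the open-close calculation, using hypothetical cohort counts:

```python
# Hypothetical annual cohorts: (accounts open at year start,
#                               of those, accounts closed during the year)
history = [(1000, 180), (1050, 200), (1100, 210), (1150, 230), (1200, 250)]

def open_close_average_life(cohorts):
    """Average the annual decay rates, then invert:
    1 / average annual decay rate = average life in years."""
    decay_rates = [closed / opened for opened, closed in cohorts]
    avg_decay = sum(decay_rates) / len(decay_rates)
    return 1.0 / avg_decay

life = open_close_average_life(history)   # ~5.16 years
```

Note that, as the Pros/Cons above point out, every account counts equally here; balances never enter the calculation.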

C. Tail Analysis

Data required.  The account opening date and current account balance.

Pros/Cons.  The open-close technique assumes a constant decay rate.  If one assumes a constant decay rate, then in theory there is always something left – i.e., the “tail”.  An average life of four years means that after eight years, 10% will remain.  An average life of six years means that after 12 years, 11% will remain.  An average life of ten years predicts after twenty years 12% will remain.  This technique can be used to estimate the expected life of accounts for a young bank where more than 10% of accounts have been open for as long as the bank.

How to calculate.  Build an Excel spreadsheet.
In cell A1 put the number of years the bank has been open.
In cell A3 put 0, in A4 put 1, … through A53 put 50.
In cell B2 put the formula =1/A1.
In cell B3 put 1.
In cell B4 put the formula =(1-$B$2)*B3, and copy this formula down through cell B53.
In column C, place the percentage of accounts open since the beginning of the bank in the cell that corresponds to the number of years the bank has been open (if the bank has been open seven years, put the percentage in cell C10).
Adjust the value in cell A1 until the value in column B adjacent to the percentage in column C matches it as closely as possible.
The final value in A1 is the average life of the sample.
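The same goal-seek can be sketched in code (a minimal sketch; the whole-year search grid mirrors the spreadsheet’s trial values in column A):

```python
def tail_implied_avg_life(bank_age_years, pct_still_open, max_life=50):
    """Find the constant-decay average life whose survival curve, after
    bank_age_years, best matches the observed fraction of accounts that
    have stayed open since the bank's founding (the "tail")."""
    best_life, best_err = None, float("inf")
    for life in range(1, max_life + 1):
        decay = 1.0 / life                           # cell B2: annual decay rate
        surviving = (1.0 - decay) ** bank_age_years  # column B at the bank's age
        err = abs(surviving - pct_still_open)
        if err < best_err:
            best_life, best_err = life, err
    return best_life

# Matches the examples above: ~10% remaining after 8 years implies a 4-year life
life_a = tail_implied_avg_life(8, 0.10)    # -> 4
life_b = tail_implied_avg_life(20, 0.12)   # -> 10
```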

D. Average Life of Time Weighted Average Balance

Data required.  The daily balance of each account since the day the account was opened.

Pros/Cons.  Arguably this is the Cadillac of measures, as it accounts for both the account balance history and the account life.  There is one significant drawback: banks generally won’t have the required data.

How to calculate.  For each account, calculate the average daily balance (remembering to account for weekends and holidays).  For all accounts: Average Life = SUMPRODUCT(Account Life, Account Average Balance) / SUM(Account Average Balance)
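A minimal sketch of this calculation, using hypothetical daily-balance histories:

```python
# Hypothetical accounts: (life in years, daily balances over that life)
accounts = [
    (2.0, [100.0] * 730),    # steady $100 balance for two years
    (6.0, [50.0] * 2192),    # steady $50 balance for six years
]

def avg_life_time_weighted(accounts):
    """SUMPRODUCT(account life, average daily balance)
    / SUM(average daily balance)."""
    rows = [(life, sum(bals) / len(bals)) for life, bals in accounts]
    weighted = sum(life * avg_bal for life, avg_bal in rows)
    return weighted / sum(avg_bal for _, avg_bal in rows)

life = avg_life_time_weighted(accounts)   # (2*100 + 6*50) / 150 = ~3.33 years
```

Unlike methods B and C, the ten-years-of-$1 account from the earlier example would carry almost no weight here, which is exactly the point of this measure.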


There are different ways to calculate the average life of NMDs.  If the bank has the account opening date, then technique A offers advantages; otherwise, B may be a more sensible option.  Remember that calculating historical average life is an intermediate step in setting the model average life assumption; management must then look into its crystal ball and assess whether average life in the future will be longer or shorter than the data shows it has been in the past.

Remember, too, that if rates are expected to go higher (as is presently the case), then shading to the shorter side is more conservative; upward movements in rates don’t produce as big an increase (or as small a decrease) in EVE.  Conversely, if rates can only go lower, biasing to the longer side is more conservative.  Of course, we always recommend accuracy over bias, and hopefully this write-up will help in that regard.

If you have questions on the content, please contact Kruskal Hewitt, Senior Associate, at Contact Us.


AuditOne Advisory – SOC 1 Changes with the new SSAE 18 Standard

AuditOne Advisory

From Bud Genovese, Chairman

Your financial institution should be receiving annual SOC 1 controls reports (formerly called “SAS 70” reports) from your major service providers. These reports are based on reviews called SOC – Service Organization Controls. The AICPA has just modified the SSAE 16 attestation standard for performing a SOC 1 review. Effective May 1, 2017, SSAE 16 has been replaced by SSAE 18. The major changes that SSAE 18 presents are reviewed in this advisory, written by Robert Kluba, our Technology Practice Co-Director.  Please forward this to appropriate personnel in your firm, such as IT management or the person responsible for vendor management compliance. We hope you enjoy this technical update, thank you! – Bud

SOC 1 Summary of Changes from the SSAE 16 Standard to the SSAE 18 Standard

Service providers that store or process information for third parties should be able to provide an annual SOC (Service Organization Controls) report to customers when requested.  A SOC 1 report focuses on the controls over financial reporting; if the information handled by the service provider relates to financial statements, then a SOC 1 review and report should be completed.  The SOC 1 format was created under the SSAE 16 standard, which replaced the SAS 70 standard.  Effective May 1, 2017, a SOC 1 report is completed under the SSAE 18 AICPA attestation standard.  The standard requires that the report be labeled only “SOC 1” and not reference or use “SSAE 18” as part of the report or title.  This advisory presents the major changes that apply to SOC 1.

SOC 2 and SOC 3 reports are completed according to the AICPA Trust Service Principles. SOC 2 and SOC 3 reports are focused on the controls related to compliance and operation of the service provider. A SOC 2 or SOC 3 report provides documented assurances that operational safeguards are in place that relate to one, or all, of the following trust service principles: security, availability, processing integrity, confidentiality, or privacy. The following changes do not affect the SOC 2 and SOC 3 reports, as the SSAE 18 does not apply to them.

SSAE 18 Changes That Apply to SOC 1

Subservice Organizations:

SSAE 18 requires service organizations to implement processes that monitor the controls at subservice organizations. This new requirement means service organizations must state the vendor management controls they have in place for subservice providers (for example, a colocation facility).

Complementary Subservice Organization Controls:

SSAE 18 introduces the concept of “Complementary Subservice Organization Controls,” which will be included in the service provider’s system description. These are the controls that the service organization assumes its subservice organizations have in place, and that customers must now take into account when relying on the system description. This addition to the system description is similar to the Complementary User Entity Controls section.

Signed Written Assertion Requirement:

The written assertion is the statement found within the SOC report where the service organization asserts that the system description provided is true and complete. This statement has always been contained within the SOC 1 reporting document, but having the service organization sign it was previously optional; under SSAE 18 a signed written assertion is required. Like many firms, AuditOne, Inc. has already been requiring service providers to sign this section as a way to strengthen the credibility of the report.

Service Auditor Risk Understanding:

SSAE 18 requires service auditors to obtain a more in-depth understanding of the development of the subject matter than previously required, in order to better identify the risks of material misstatement in an examination engagement. This enhancement should lead to a better linkage between the assessed risks and the nature, timing, and extent of attestation procedures performed in response to those risks.

AuditOne Inc. Delivers Effective and Efficient SOC Audits

AuditOne Inc.’s skilled audit, technical and security experts deliver the highest quality, cost-effective, responsive SOC services in the industry. Please contact me or Bud Genovese to review how we can make the SOC audit an effective and efficient experience for your firm. I will be more than happy to help you understand why AuditOne Inc.’s user-friendly process and focus make it the market-leading smart choice.

Robert Kluba is the Technology Practice Co-Director of AuditOne LLC, the nation’s leading firm with a sole focus on financial institution internal audit and consulting services. AuditOne LLC affiliates with AuditOne Inc., a PCAOB-registered CPA firm that specializes in SOC audits for service providers. Under Managing Director Bud Genovese, AuditOne Inc. has positioned itself to deliver affordable SOC reviews utilizing hands-on technical staff. The AuditOne group of technical experts can also assist with SOC-related risk assessments and penetration testing requirements. Contact Robert Kluba or Bud Genovese for more information.