
Public Pensions: Why Do 100% Required Contribution Payers Have Decreasing Fundedness?  


19 August 2017, 16:21

In yesterday’s post, I looked at the press release from a public pensions advocacy group.

One key point was this quote from the paper the press release pointed to:

Pension funds are resilient and well managed. They have stood the test of time for more than 100 years through economic ups and downs. If state and local legislators had kept their side of the bargain over the years by making scheduled payments on time, most critics of public pensions would have to find another hobby. (page 2)

Hmmm. John Bury attacked that "if," but I'm going to look at the plans where the states/localities did pay their full payments. At least, as those payments were calculated. We'll get back to that in a bit.


I explained this before, but let’s look at it again:

What’s this normal cost thing with regards to pensions? Is this related to ARCs (actuarially required contributions)?

Essentially, there are two pieces to the ARC:

- the normal cost
- the amortized cost of the unfunded liability

The normal cost is essentially the actuarial present value (so, that includes all the life contingent items, discount rate, etc.) of the pension benefit accrued for the current year.

So keep in mind the two bits: part of the payment is to cover newly-accrued benefits (that is, pension benefits earned by employees during the current financial reporting period, and thus an operating cost); the other part is to cover the shortfall of the pension fund.

When I talk about 100% payers below, they’re covering both parts.
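
The two pieces above can be sketched in a few lines. This is a hedged illustration only: the normal cost, unfunded liability, discount rate, and amortization period are made-up numbers, and I've used the simplest (level-dollar) amortization; real plans use more elaborate methods.

```python
# Illustrative sketch of the two ARC pieces. All figures are made-up
# assumptions, not any particular plan's numbers.

def level_dollar_amortization(unfunded_liability, rate, years):
    """Level-dollar payment that pays off the unfunded liability
    over `years` at the assumed discount `rate` (ordinary annuity)."""
    annuity_factor = (1 - (1 + rate) ** -years) / rate
    return unfunded_liability / annuity_factor

def arc(normal_cost, unfunded_liability, rate, years):
    """ARC = normal cost + amortized cost of the unfunded liability."""
    return normal_cost + level_dollar_amortization(unfunded_liability, rate, years)

# Hypothetical plan: $50M normal cost, $400M unfunded liability,
# 7.5% assumed return, 30-year amortization.
payment = arc(50e6, 400e6, 0.075, 30)
print(f"ARC: ${payment / 1e6:,.1f}M")
```

Note that the amortization piece depends entirely on the assumed rate and period, which is where a lot of the mischief discussed below comes in.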


So here’s the question: why are those plans with “full contributions” not fully-funded?

They're paying 100% of what they're supposed to, at least since 2001. (I can't speak to earlier periods, based on the data I get from the Public Plans Database.)

I've done a more recent extract from the Public Plans Database, which includes fiscal year 2016. Mind you, less than half the plans have numbers for 2016 in there… indeed, in the graphs below, you'll see I stopped at 2014 because I was missing 2015 and 2016 data for some of the plans. I had a total of 72 plans in my group.

So here is what I did: I found all the plans that contributed 100% (or more) of the ARC for the years I had data for them. Then I made a histogram of their funded ratios for each year. Sometimes they didn’t appear in certain years.

I also graphed the median funded ratio for each year.
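
The selection and median logic just described can be sketched like so. The plan records below are made-up stand-ins for the Public Plans Database extract, just to show the mechanics.

```python
# Sketch of the selection logic: keep plans that paid >= 100% of the ARC
# in every year we have for them, then take median funded ratios by year.
# Records are made-up: (plan, fiscal year, percent of ARC paid, funded ratio).
from statistics import median

records = [
    ("Plan A", 2001, 1.00, 1.02), ("Plan A", 2007, 1.00, 0.90),
    ("Plan A", 2014, 1.05, 0.80),
    ("Plan B", 2001, 1.00, 0.95), ("Plan B", 2007, 1.10, 0.88),
    ("Plan B", 2014, 1.00, 0.76),
    ("Plan C", 2001, 0.60, 1.10), ("Plan C", 2007, 0.70, 0.85),
    ("Plan C", 2014, 0.50, 0.60),
]

# Drop any plan that ever paid less than 100% of its ARC.
full_payers = {plan for plan, _, _, _ in records}
for plan, _, pct_arc, _ in records:
    if pct_arc < 1.0:
        full_payers.discard(plan)

# Median funded ratio by year, among the full payers only.
by_year = {}
for plan, year, _, ratio in records:
    if plan in full_payers:
        by_year.setdefault(year, []).append(ratio)
medians = {year: median(vals) for year, vals in sorted(by_year.items())}
print(medians)
```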

High-level movement:
- The median funded ratio in 2001 was 100%. YAY!
- Hmmm, the median funded ratio dropped to 89% in 2007. Odd, given that 2001-2007 wasn't that bad a period. I picked 2007 because it was before the market drop, no matter which start/end month of the fiscal year.
- And the median funded ratio dropped to 78% in 2014.

Do you see why 80% is a much more popular benchmark than the proper 100%?


So I started with 100% ARC payers, but maybe I wasn’t fair. Sure, they started with a median 100% fundedness, but there were plenty of under-funded plans in the group in 2001.

Why not just restrict it to the full-funders with funded ratios >=100% in 2001?

This drops the 72 plans down to 36. I’ll keep the same vertical axis, though:

- Yes, they had a higher median in 2001
- And also in 2007, but not by a lot
- And in 2014… their median was the same as the original group

So the issue isn’t that there were underfunded plans in the original group.

Heck, I decided to look at all plans that were >=100% funded in 2001, no matter their contribution history. That’s 75 plans.

This is what I got:

Interesting how the 2014 funded ratio medians kept ending up in the same place.


So let’s look at some scatterplots. I’m going to go in reverse order this time.

First, let’s look at plans that were at least fully-funded in 2001:

I labeled a few of the points, and I especially made sure the Kentucky plans were highlighted on this chart. The median point is marked in red. By the way, these are percentage point differences, so Kentucky's ERS could decrease more than 100 points: its funded ratio dropped from 126% in 2001 down to 19% in 2016. I just took a straight difference, a change of -107 percentage points.

So now let’s look at just the plans that made 100% ARC:

There are some odd things going on here: West Virginia Teachers is a very special case, because it was so poorly funded in 2001. There is a whole history there, which includes actors such as NCPERS pointing at it and saying "See! You shouldn't switch to defined contribution!"

And lastly, the fully funded in 2001, making 100% ARC since then:

Hmmm, Kentucky County waaaaay down there. Isn’t that interesting.


Yeah, what is going on?

Let’s look at how median funded ratios have done, and include a line for all plans:

Look at that.

So it really calls into question whether the ARC actually defines what should be contributed.


Working Paper —
Actuarial Inputs and the Valuation of Public Pension Liabilities and Contribution Requirements: A Simulation Approach
from the Center for Retirement Research at Boston College.



This paper uses a simulated public pension system to examine the sensitivity of actuarial input changes on funding ratios and contribution requirements. We examine instantaneous and lagged effects, marginal and interactive effects, and effects under different funding conditions and demographic profiles. The findings emphasize the difficulty of conducting cross-sectional analyses of public pension systems and point to several important considerations for future research.

The paper found that:

Discount rates, salary growth rates, cost methods, and mortality tables all influence funding ratios and contribution requirements. Without considering these effects, comparisons of funding ratios across pension systems will produce biased results.

The discount rate assumption is the most influential actuarial input on funding ratios and contribution requirements. We show that a plan can postpone required contributions by raising its discount rate assumption, but its funding condition deteriorates in the long run. In contrast, if a plan reduces its discount rate by 1 percentage point, and its investment returns continue at the level that was previously assumed, it will take approximately seven years for the funding ratio to return to its original level and an even longer time period for the ARC to return to its original level (though the exact length of time depends on investment returns and the baseline discount rate assumption).

The effects of actuarial inputs greatly depend on plan characteristics such as demographic profiles and asset levels, and also interactions with other actuarial inputs. Because of the interactive effects, it is difficult to standardize funding ratios or pension obligations by only controlling for a single actuarial input. With better data on plan characteristics (such as information on mortality tables and age distributions), simulations could be used to standardize pension liabilities. In the absence of that information, improved consistency in financial reporting (such as requiring a single cost method) is an effective way to facilitate better comparisons of financial conditions across pension plans.

The policy implications of the findings are:

The valuation (or measurement) of public pension liabilities and contribution requirements is highly sensitive to the choice of several actuarial assumptions, which should be considered when assessing the financial condition of public pension systems.

The sensitivity of liability and contribution requirement valuations to actuarial assumptions and methods depends on the demographic profile of pension participants.

Making more optimistic assumptions reduces the liability and contribution valuations in the short term, but, over time, more optimistic assumptions can have substantive and harmful effects on pension liabilities and contribution requirements.

The full paper is here.

I highlighted the part that is significant: a sensitivity to valuation parameters, especially the discount rate used.

And public pension plans get to choose that for themselves. Many plans have been revising that parameter downward… slowly. But the point is that the plans are often optimistic on parameter choice.

And it goes beyond that. The discount rate is just the most obvious parameter choice. There are many others that go into the mix.
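
As a rough illustration of that sensitivity, here is a made-up benefit stream valued at two assumed rates. These are not any plan's actual figures; the point is simply the leverage a one-point discount rate change has on the measured liability.

```python
# Illustrative only: how the discount rate choice moves a measured
# pension liability. A flat $10M/year benefit stream for 30 years.

def pv_annuity(payment, rate, years):
    """Present value of a level annual benefit stream (ordinary annuity)."""
    return payment * (1 - (1 + rate) ** -years) / rate

liability_75 = pv_annuity(10e6, 0.075, 30)   # 7.5% assumption
liability_65 = pv_annuity(10e6, 0.065, 30)   # one point lower

print(f"at 7.5%: ${liability_75 / 1e6:,.1f}M")
print(f"at 6.5%: ${liability_65 / 1e6:,.1f}M")
print(f"liability increase from a 1-point cut: {liability_65 / liability_75 - 1:.1%}")
```

The same cash flows, and roughly a ten percent bigger liability on paper just from the rate choice. Real plans, with longer-tailed and growing benefit streams, can be even more sensitive.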


Many plans are using approaches to determining the ARC – in both the normal cost and amortization of the unfunded liability – that will drive the funded ratio to decrease even when economic conditions are good.

It's not even a matter of the most egregious practices, like spiking, early retirement inducements, DROP benefits, and retroactive granting of benefits. It's that the valuation approaches inherently assume somebody will pay more tomorrow to make up for losses.
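
One concrete way this happens, sketched with made-up numbers: an open, level-percent-of-payroll amortization backloads the payments (they grow with payroll), so the first-year payment can be less than the interest accruing on the unfunded liability, and the shortfall grows even when the ARC is paid at 100%.

```python
# Hedged illustration of negative amortization. All figures are made-up:
# $400M unfunded liability, 7.5% assumed return, 3.5% payroll growth,
# 30-year level-percent-of-payroll amortization.

def level_percent_first_payment(ual, rate, growth, years):
    """First-year payment of a level-percent-of-payroll amortization:
    payments grow at `growth`, discounted at `rate` (growing annuity)."""
    factor = (1 - ((1 + growth) / (1 + rate)) ** years) / (rate - growth)
    return ual / factor

ual, rate, growth = 400e6, 0.075, 0.035
payment = level_percent_first_payment(ual, rate, growth, 30)
interest = ual * rate

print(f"first-year amortization payment: ${payment / 1e6:,.1f}M")
print(f"interest accruing on the UAL:    ${interest / 1e6:,.1f}M")
# When payment < interest, the UAL grows despite a 100% ARC contribution.
```

In this sketch the first-year payment runs several million dollars short of the interest on the unfunded liability, so the hole deepens in year one even with full payment. And if the amortization period is reset ("open") every year, the backloaded years never arrive.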

That is why I'm as cynical as Mark Glennon about projected ramp-ups of contributions... in the future.

I don’t want to give them any credit for giving more later. Edgar did that before in Illinois. Christie said that in New Jersey, and never made full payments. Connecticut said that, and they’re in trouble now. And those are the ones that never made full payments under their optimistic approaches.

But crazily enough, even the plans that are doing what they're supposedly required to do keep falling farther and farther behind… I get real suspicious.

There’s obviously something inherently wrong.

Underlying spreadsheet.

Related Posts
Wisconsin Wednesday: Is Benefit Growth Moderate?
Kentucky Pensions Even Closer to the Brink: New Assumptions, New Report
New York State Climate Investing Goals: Who's On the Hook?