Monday, December 02, 2019

Fraser Institute Claims That Alberta Public Sector Employees Are Overpaid

The Fraser Institute is notorious for pushing a hard-right, neo-liberal approach to economic and fiscal policy.  In my experience, their methods are at best sloppy analysis and at worst outright distortion in service of a particular policy agenda.

This past week they published a battery of "reports" that they claim show that public sector workers are being paid considerably more than their peers in the private sector.  They have produced these reports for Alberta, Ontario, and British Columbia (so far).  Presumably this is because, once you have set up the analysis for one province, it's pretty much boilerplate to apply it to other provinces with similar datasets available.

For the sake of simplicity, I will focus my commentary on the Alberta version of this report.

The Fraser Institute's approach to its work is pseudo-academic at best.  They make a show of citing sources, and so on.  However, it is always a problem when they primarily cite their own past work in their analysis.  The references in this paper boil down to:

a)  Several Statistics Canada datasets and reports used as raw data.
b)  One 20-year-old paper which presumably establishes a method for comparing private and public sector pay data.  (Note:  I am not able to locate a copy of this paper in the resources I have access to, so I cannot verify this.)
c)  The Alberta Government's 2019 budget.
d)  Four papers from the Fraser Institute which involve one or more of the same authors as this report.

Why is this significant?  Largely because heavy self-citation in research tends to reinforce the authors' existing biases, and it suggests the authors are not looking broadly at their field when doing their work.  This is a fundamental flaw in the work.

The second aspect of the report that stands out to me is the hand-waving in its discussion of methodology.  What little is provided falls well short of what a future researcher would need in order to take the raw data and arrive at the same results.  A good example of this pops up early in the paper, where they describe doing an analysis using two models:
[Image from the report: a table of regression results for "Model 1" and "Model 2", including an "Adjusted R^2" row]

At best the authors wave their hands at the question of what Model 1 and Model 2 actually are, stating only that Model 2 contains "controls" for a variety of variables.  They never really describe the models used, leaving that to the reader's imagination.  Further, for readers without a background in statistics, the line "Adjusted R^2" is basically meaningless.  The coefficient of determination typically yields values between 0 and 1, with values near 1 indicating a near-perfect fit and values near 0 (or below) indicating a poor fit, so what we see here is arguably a moderate fit at best.  We would need some domain analysis to decide whether that tells us anything meaningful, and the report's authors do not provide the analysis needed to judge whether their model is reasonable.
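
For readers unfamiliar with the term, here is a minimal sketch (mine, not the report's) of how adjusted R^2 relates to ordinary R^2.  The sample size and predictor count below are hypothetical, purely for illustration.

```python
# Minimal sketch: adjusted R^2 penalizes ordinary R^2 for every predictor
# ("control") added to the model, so a model can't look better simply by
# piling on more variables.
def adjusted_r_squared(r_squared: float, n: int, p: int) -> float:
    """n = number of observations, p = number of predictors/controls."""
    return 1 - (1 - r_squared) * (n - 1) / (n - p - 1)

# Hypothetical values for illustration only -- not figures from the report.
print(adjusted_r_squared(r_squared=0.45, n=20_000, p=12))  # ~0.4497
```

Even with the number in hand, whether a given adjusted R^2 is "good enough" depends entirely on domain context, which is exactly the analysis the report does not provide.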

[Update:  Dec 2 13:33]
Another major issue with the study is that it fails to examine the composition of the workforces it is comparing, and makes no attempt to correct for weighting issues that could skew the outcomes for the public and private sector workforces.  For example, does the public sector workforce skew heavily towards professional and certified tradespeople, and did the study's authors account for that?  Similarly, there is real value in comparing specific subgroups:  does an engineer working in government make more or less than their counterparts in private industry?  The toy sketch after this note illustrates how composition alone can produce an apparent "premium".
[/Update]
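
To make the composition point concrete, here is a toy sketch with made-up numbers (purely hypothetical, not data from the report or from Statistics Canada).  If the public sector employs a larger share of professionals, an aggregate comparison can show a sizable "premium" even when pay within each occupation is identical across sectors.

```python
# Hypothetical illustration of a composition (weighting) effect.
# Each sector maps occupation -> (share of workforce in %, average wage).
public  = {"professional": (70, 90_000), "clerical": (30, 50_000)}
private = {"professional": (30, 90_000), "clerical": (70, 50_000)}

def aggregate_average_wage(sector: dict) -> float:
    """Workforce-weighted average wage across occupations."""
    return sum(share * wage for share, wage in sector.values()) / 100

print("Public aggregate: ", aggregate_average_wage(public))   # 78,000
print("Private aggregate:", aggregate_average_wage(private))  # 62,000
# Within each occupation the pay is identical across sectors, yet the
# aggregate gap is roughly 26% -- driven entirely by who works where.
```

This is exactly the kind of weighting effect a credible comparison needs to control for, and then explain transparently.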

There are a number of framing issues in this report that I also take exception to.  First, because they are working at such a high level, we never really get to understand differences in the composition of the workforce, nor can we tell whether any discrepancies that do exist are specific to particular disciplines.  The use of what they euphemistically call "non-wage benefits" in their analysis is profoundly problematic, in part because the data on that subject is very limited, but also because the authors admit this and then proceed to use it anyway to perpetuate the idea that public sector workers are overpaid.


Basically, the authors are saying "well, we haven't got any real data here, so let's speculate wildly to forward our agenda."  This is intellectually dishonest, and frankly it is downright garbage.  Age of retirement and job loss, to take two examples, simply do not represent any kind of quantifiable "wage premium".  Pensions arguably do, but then it becomes important to compare them with private sector savings vehicles such as Deferred Profit Sharing Plans (DPSPs), RRSP matching schemes, share purchase plans, and bonuses, all of which are forms of "non-salary compensation" that are not part of the government compensation landscape.  Of course, these matters are not evaluated at all by the authors (probably because the needed statistical data simply does not exist).

This leads me to the fundamental assumption of these reports:  that private sector compensation is the baseline against which everything else should be measured.  Those who have lived with stagnant wages, declining benefits, and increasingly scarce full-time employment options might well take exception to that assumption.

Just because something is "good for business" doesn't make it "good for people", or "good for government".  Government does not live by the same vagaries of the "free market" as private enterprise does.  Complaining about government job compensation because it doesn't align with the "race to the bottom" economics currently dominating the private sector strikes me as simply demanding that government treat its workers as badly as, or worse than, the private sector does, rather than insisting that the private sector up its game and treat workers better.

