What is the point of PEFA?

26 Apr 16

PEFA has become the standard tool for measuring public financial management systems – but has its very success made it misleading?

Launched in 2005, the Public Expenditure and Financial Accountability (PEFA) framework has become the ‘go-to’ measure of quality in public finance systems. It has led the way for harmonising diagnostics on PFM and influenced a range of new tools for measuring specific parts of the PFM system. Now, more than ten years on, the framework has been revised significantly for the first time. But countries and their donor partners need to do more to make sure the new indicators are used appropriately.

But has PEFA’s undoubted success undermined its early ambitions? Today, PEFA is an integral part of donors’ internal processes, influencing programming decisions over budget support, use of country systems, and PFM reforms. It has also been adopted by some countries as a benchmark for their own reform efforts, or for the quality of PFM in local governments. In Nepal, the PFM reform coordination team is called the “PEFA Secretariat”. In this way it has gone beyond early expectations. However, we have learnt from the “Doing Business” indicators that global indicators can influence incentives, and not always for the best.

These incentives have changed PEFA from a good tool for understanding what PFM systems look like into a bad tool for determining what PFM systems should look like. It is standard practice after a PEFA assessment for a country (or a set of external consultants) to develop a PFM roadmap specifying what reforms will be implemented. A number of these plans have directly targeted improvements in almost every PEFA indicator – Kiribati, Tonga and the Maldives are three examples from our previous work. This exposes PEFA’s three main limitations:

• It measures the “form” of a PFM system but not always how well it “functions”. Most of the 30 indicators check whether a process has been established, not whether it works. In Uganda, laws comply with 70% of PEFA’s good practice, but implementation complies with just 50%, leaving an estimated implementation gap of around 20 percentage points [Andrews 2013].

• It has limited depth and coverage. To stay affordable, PEFA focuses on the processes of central government, and particularly the finance ministry. But that can mask problems in the PFM systems of line ministries, local governments and service delivery units. A health resource tracking exercise in Timor, for example, found large arrears, unrecorded in the central FMIS, distorting spending in health centres.

• It misses the important interactions between actors in the PFM system. PEFA does not assess politics, organisational arrangements, or the relationships between processes. Yet the quality of PFM depends on all three. In Tonga, for example, the Ministry of Finance struggles to coordinate its planning and budgeting processes, while in Kiribati the staff simply are not in place to attempt it. Both would score poorly against the same indicator – but do they really face the same problem?

By using PEFA in too formulaic a manner, reforms can improve PEFA scores without making systems work better, or, worse still, they can draw resources away from the actual problems facing macroeconomic management or service delivery. Our work with the World Bank in the Pacific shows that most reforms in Tonga and Kiribati had only weak links to problems with macroeconomic management and service delivery.

Will the new framework be a game changer? The new PEFA is bigger, and possibly better, but it still uses the same underlying logic, and we’ve heard from ‘testing’ that the new PEFA yields much the same scores as the old one. This is not a problem for the PEFA framework itself, which remains the most comprehensive attempt to measure PFM systems. However, if it continues to be used in the same way, then we should be ready for a surge in ‘cheap’ fiscal strategies that are not adhered to (the new indicator [PI-CFS]); semi-independent project evaluation units (the new [PI-PIM]) that simply sign off politically-motivated projects without question; and performance plans and reports (the new [PI-23]) that have little impact on institutional incentives to deliver stronger services.

We would like to see developing countries and their partners use PEFA like Norway. Norway is the only OECD country to publish a PEFA assessment, which is an interesting fact in its own right. The self-assessment gave the government several C and a couple of D scores, with particular weaknesses in internal audit and procurement. The Norwegian response was very rational: the procurement systems needed improving, but internal audit reforms were not necessary given the strong internal controls already in place. In other words, processes are not good or bad simply because a PEFA score says so.

Exactly how to change the rules of the game is not clear. PEFA provides a useful framework for initial engagement on PFM reforms, but it is not an end in itself. Other diagnostic tools – like Public Expenditure Tracking Surveys – may be more useful for understanding how PFM links to problems with services. Public Expenditure Reviews will give at least some insight into broader macroeconomic management. However, there will be no substitute for a better dialogue process, like the one being promoted by CABRI for the Effective Institutions Platform. If we cannot find a solution, the next PEFA revision will need to be much more radical.
