The Economics Of Deflation

In a speech at Imperial College Business School, Deputy Governor Ben Broadbent explores the economics of deflation. He explains what the theoretical risks are, how the MPC could respond were risks to materialise and why, while the MPC must be ‘watchful’, the chances of sustained deflation are ‘low’.

This week the ONS announced that CPI had fallen to 0%, ‘the first time the UK has failed to record positive annual inflation for over 50 years.’ Some have claimed that ‘deflation is a problem “because it encourages people to postpone consumption”. … and this can lead to a vicious circle in which prices are then depressed further.’ This is ‘at best a partial description of the problem, at worst a bit misleading. It leaves out two important considerations.’

First, any incentive to delay purchases is offset by an increased propensity to consume when, as now, nominal incomes are rising. Second, the incentive to save applies whenever nominal interest rates are higher than inflation and doesn’t ‘suddenly appear when expected inflation dips below zero’. Likewise, debt deflation is driven by nominal incomes falling much faster than nominal interest rates and would not occur when incomes are growing. What we have seen in the past few months is, instead, ‘what some have termed “good” deflation’: an improvement in the terms of trade which has led to an increase in nominal income growth.

In theory, even when periods of mild deflation – like that seen in Britain in the second half of the 19th century and up to the First World War – lead to falls in nominal wages, ‘there doesn’t look to be anything particularly bad about negative inflation’, so long as the appropriate level of the real interest rate is high enough that policy does not have to contend with the zero lower bound. When that is the case, interest rates can compensate for widespread price falls.
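The logic of both this point and the savings argument above can be put in a single line using the standard Fisher relation; the notation below is added for illustration and does not appear in the speech.

```latex
% Fisher relation: real rate = nominal rate - expected inflation
\[
  r \;=\; i - \pi^{e},
  \qquad\text{and with the zero lower bound } i \ge 0:\quad
  r \;\ge\; -\pi^{e}.
\]
% The incentive to save depends on the real rate r being positive, not on expected
% inflation \pi^{e} dipping below zero; and mild deflation is manageable so long as
% the warranted real rate r^{*} can still be delivered by a nominal rate
% i = r^{*} + \pi^{e} that stays at or above zero.
```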

‘Clearly, we are not in the same position today’ and given the cuts in interest rates made necessary by the financial crisis ‘there is now less room to deal with any entrenched and protracted deflation, in prices and wages alike’ should that occur. ‘The evidence suggests that unconventional policy is effective: even if they don’t circumvent it entirely, asset purchases help soften the constraint of the zero lower bound. Second, with the financial system gradually improving my guess is that the neutral real rate of interest is more likely to rise than fall over the next couple of years.’

Nonetheless, the constraints of the effective lower bound mean the MPC must be ‘watchful’ for signs that ‘falling prices will beget yet weaker or even negative wage growth – that today’s “good” deflation will metastasise into something a good deal worse’.

‘What are the chances of such a thing?’ There is evidence that price falls have ‘already fed through to spending and household confidence’ and that the propensity to consume has dominated the propensity to save. The historical experience also suggests that deflation is very unlikely to persist. Looking at 3,300 data points from across the world, in only 70 has inflation actually been negative, and ‘in the majority of those inflation was positive the following year. Of the 24 instances in which prices have fallen for more than one year, all but four – two in Japan, two in Bahrain – have occurred in developing countries with pegged exchange rates.’

The presence of a credible monetary policy framework reduces even further the risk of protracted deflation. ‘If people expect inflation generally to be on target, over the medium term, the central bank has to do less to actively ensure it’ and policy does not need to respond to ‘one-off hits to the price level.’

Ben concludes that over the next couple of years ‘headline inflation will get a sizeable kick upwards, once the falls in food and energy prices drop out of the annual comparison. In the meantime, the UK labour market continues to tighten – low inflation won’t be the only influence on pay growth this year – and the boost imparted by lower commodity prices to real incomes is already apparent in higher consumer spending, here and in other countries too.’

The New Complexity

On microscopes and telescopes is a speech given by Andrew G Haldane, Chief Economist, Bank of England. It contains some important insights into the question of how to model the current state of the financial system. For example, there may be greater scope to co-ordinate macro-prudential tools; one way of doing so is to develop macro-prudential instruments which operate on an asset-class basis, rather than on a national basis. Here are some of his remarks.

At least since the financial crisis, there has been increasing interest in using complexity theory to make sense of the dynamics of economic and financial systems. Particular attention has focussed on the use of network theory to understand the non-linear behaviour of the financial system in situations of stress. The language of complexity theory – tipping points, feedback, discontinuities, fat tails – has entered the financial and regulatory lexicon.

Some progress has also been made in using these models to help design and calibrate post-crisis regulatory policy. As one example, epidemiological models have been used to understand and calibrate regulatory capital standards for the largest, most interconnected banks – the so-called “super-spreaders”. They have also been used to understand the impact of central clearing of derivatives contracts, instabilities in payments systems and policies which set minimum collateral haircuts on securities financing transactions.

Rather less attention so far, however, has been placed on using complexity theory to understand the overall architecture of public policy – how the various pieces of the policy jigsaw fit together as a whole. This is a potentially promising avenue. The financial crisis has led to a fundamental reshaping of the macro-financial policy architecture. In some areas, regulatory foundations have been fortified – for example, in the micro-prudential regulation of individual financial firms. In others, a whole new layer of policy has been added – for example, in macro-prudential regulation to safeguard the financial system as a whole.

This new policy architecture is largely untried, untested and unmodelled. This has spawned a whole raft of new, largely untouched, public policy questions. Why do we need both the micro- and macro-prudential policy layers? How do these regulatory layers interact with each other and with monetary policy? And how do these policies interact at a global level? Answering these questions is a research agenda in its own right. Without attempting to answer them definitively here, I wish to argue that complexity theory might be a useful lens through which to begin exploring them. The architecture of complex systems may be a powerful analytical device for understanding and shaping the new architecture of macro-financial policy.

Modern economic and financial systems are not classic complex, adaptive networks. Rather, they are perhaps better characterised as a complex, adaptive “system of systems”. In other words, global economic and financial systems comprise a nested set of sub-systems, each one itself a complex web. Understanding these complex sub-systems, and their interaction, is crucial for effective systemic risk monitoring and management.

This “system of systems” perspective is a new way of understanding the multi-layered policy architecture which has emerged since the crisis. Regulating a complex system of systems calls for a multiple set of tools operating at different levels of resolution: on individual entities – the microscopic or micro-prudential layer; on national financial systems and economies – the macroscopic or macro-prudential and monetary policy layer; and on the global financial and economic system – the telescopic or global financial architecture layer.

The architecture of a complex system of systems means that policies with varying degrees of magnification are necessary to understand and moderate fluctuations. It also means that taking account of interactions between these layers is important when gauging risk. For example, the crisis laid bare the costs of ignoring systemic risk when setting micro-prudential policy. It also highlighted the costs of ignoring the role of macro-prudential policy in managing these risks. That is why the post-crisis policy architecture has sought to fill these gaps. New institutional means have also been found to improve the integration of these micro-prudential, macro-prudential, macro-economic and global perspectives. In the UK, the first three are now housed under one roof at the Bank of England.

He concludes:

This time was different: never before has the world suffered a genuinely global financial crisis, with every country on the planet falling off the same cliff-edge at the same time. This fall laid bare the inadequacy of our pre-crisis understanding of the complexities of the financial system and its interaction with the wider economy, locally but in particular globally. It demonstrated why the global macro-financial network is not just a complex adaptive system, but a complex system of systems.

The crisis also revealed gaps and inadequacies in our existing policy frameworks. Many of those gaps have since been filled. Micro-prudential microscopes have had their lens refocused. Macro-prudential macroscopes have been (re)invented. And global telescopes have been strengthened and lengthened. Institutional arrangements have also been adapted, better enabling co-ordination between the micro, macro and global arms of policy. So far, so good.

Clearly, however, this remains unfinished business. The data necessary to understand and model a macro-financial system of systems is still patchy. The models necessary to make behavioural sense of these complexities remain fledgling. And the policy frameworks necessary to defuse these evolving risks are still embryonic. More will need to be done – both research and policy-wise – to prevent next time being the same.

For example,

There may be greater scope to co-ordinate macro-prudential tools. One way of doing so is to develop macro-prudential instruments which operate on an asset-class basis, rather than on a national basis. This would be recognition that asset characteristics, rather than national characteristics, may be the key determinant of portfolio choices and asset price movements, perhaps reflecting the rising role of global asset managers.

There has already been some international progress towards developing asset market specific macro-prudential tools, specifically in the context of securities financing transactions where minimum collateral requirements have been agreed internationally. But there may be scope to widen and deepen the set of financial instruments covered by prudential requirements, to give a richer array of internationally-oriented macro-prudential tools. These would then be better able to lean against global fluctuations in a wider set of asset markets.

Is Higher Market Volatility The New Normal?

Chris Salmon, Executive Director, Markets, Bank of England, gave a speech “Financial Market Volatility and Liquidity – a cautionary note”. He suggests that financial market conditions have changed quite noticeably over the past year or so, with significantly higher volatility. For example, on 15 October (the sharp intra-day swing in US Treasuries) and 15 January (the removal of the Swiss franc’s currency floor), the immediate intra-day reaction to the news was unprecedented. On 15 October the intra-day change in 10-year US bond yields was 37 bps, with most of this move happening within just an hour of the data release. The intra-day range represented nearly eight standard deviations, exceeding the price moves that happened immediately following the collapse of Lehman Brothers. On 15 January, the Swiss franc appreciated by 14%. The intra-day range was several times that number, and market participants continue to debate the highest traded value of the franc on the day.
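For readers unfamiliar with the ‘standard deviations’ yardstick, the sketch below shows how such a figure is computed from a history of daily moves. The simulated history is an assumption for illustration only, not the data behind the eight-standard-deviation estimate quoted above.

```python
# How an intraday move is expressed in standard deviations of historical daily changes.
# The history of daily yield changes here is simulated (an illustrative assumption);
# in practice one would use the actual series of daily changes in the 10-year yield.
import numpy as np

rng = np.random.default_rng(42)
daily_changes_bps = rng.normal(0, 4.7, 2_500)   # hypothetical daily moves, in basis points

move_bps = 37                                    # intraday range on 15 October, as quoted
z_score = move_bps / daily_changes_bps.std()
print(f"{z_score:.1f} standard deviations")      # close to 8 with these assumed inputs
```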

Could such events imply that a number of major asset markets have become more sensitive to news, so that a given shock causes greater volatility? Is this the new normal?

If so, what is the root cause? We know the macroeconomic outlook has changed and perhaps become less certain. Central banks have reacted, and in some cases pushed the innovation envelope further. Unsurprisingly these developments have been associated with more volatile financial markets. Given that policy-makers have previously been concerned that persistently tranquil financial markets could encourage excessive risk-taking, this is not necessarily unwelcome. But recent months have also seen a number of short episodes of sharply-higher volatility which coincided with periods of much-impaired market liquidity. There are good reasons to believe that the severity of these events was accentuated by structural change in the markets. There must be a risk that future shocks could have more persistent and more widespread impacts across financial markets than has been the case in the recent past.

Market intelligence suggests that uncertainty surrounding the global outlook has been one factor, itself in no small part a consequence of the unexpected rough halving in the price of oil since last summer. This uncertainty can be seen in the recent increase in the dispersion of economists’ forecasts for inflation in the US, UK and euro area during 2015. Central banks themselves have reacted to the changed global outlook and monetary policy makers have been active in recent months. Indeed, so far this year 24 central banks have cut their policy rates. Moreover, the decisions by the ECB and three other central banks since last summer to set negative policy rates have raised questions about where the lower bound for monetary policy lies.

But he suggests there are two other factors in play, which may indicate a more fundamental change, and lead to continuing higher volatility.

First, market makers have become more reluctant to commit capital to warehousing risk. Some have suggested that this reflects a combination of reduced risk tolerance since the financial crisis, and the impact of regulation designed to improve the resilience of the financial system. This reduction in market-making capacity has been associated with increased concentration in many markets, as firms have become more discriminating about the markets they make, or the clients they serve. And this trend has gone hand-in-hand with a growth in assets under management by the buy-side community. The combination has amplified the effect of the intermediary sector’s reduced risk-warehousing capacity on the provision of liquidity in times of market stress, relative to the past.

The second is the evolution of the market micro-structure. Electronic platforms are now increasingly used across the various markets. In some cases regulation has been the cause, but in others, such as foreign exchange markets, firms have over a number of years increasingly embraced electronic forms of trading. This includes using ‘request-for-quote’ platforms to automate processes previously carried out by phone. Electronic platforms are effective in pooling liquidity in ‘normal’ times but, at least as currently calibrated and given today’s level of competition, they have the potential to contribute to discontinuous pricing in periods of stress if circuit-breakers result in platforms shutting down. There has been much commentary about the temporary unavailability of a number of electronic trading platforms in the immediate aftermath of the removal of the Swiss franc peg.

Oil Price Falls And Monetary Policy

In a speech at Durham University Business School, MPC member Ian McCafferty considers the factors contributing to the recent fall in the oil price, the impact on inflation and its likely persistence and how, given this analysis, UK monetary policy should respond.

Ian argues that, as with the oil price falls seen in 1985 and 1998, ‘there is merit in examining recent oil price developments, and the implications for the outlook for the oil market, through the prism of hog-cycle theory.’ As with the livestock markets on which hog-cycle theory is based, the short-term elasticity of oil supply is low but the longer-term elasticity is substantially higher. As a result, the main adjustment to price falls comes from changes in investment plans, which in turn affect production and supply in the longer term.

This analysis shows that ‘the lag in the supply response means that for a while, even after the initial price fall, supply continues to exceed demand, such that inventories continue to build.’ As the market balances and inventory levels fall back ‘the market tightens and prices begin to rise, encouraging supply to recover. But here too, there are noticeable lags – first, it will require a period of higher prices to encourage producers to commit to new investment, and geographical, geological and political issues mean that the lead time to new supply is relatively lengthy.’
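The hog-cycle (cobweb) mechanism described above can be sketched in a few lines: demand responds to today’s price, while supply responds only to prices several periods ago because investment takes time to come on stream. All parameters, the lag length and the shock size below are illustrative assumptions, not figures from the speech.

```python
# Toy linear cobweb model of the hog cycle.
# Demand: Q = a - b * P_t       (responds to today's price)
# Supply: Q = c + d * P_(t-lag) (fixed by investment decisions made 'lag' periods ago)

def cobweb_prices(periods=24, lag=4, a=10.0, b=1.0, c=1.0, d=0.5,
                  equilibrium_price=6.0, shock=-2.0):
    prices = [equilibrium_price] * lag
    prices[-1] += shock                       # a one-off negative price shock
    for t in range(lag, periods):
        supply = c + d * prices[t - lag]      # supply set by past investment decisions
        prices.append((a - supply) / b)       # price at which demand absorbs that supply
    return prices

for t, p in enumerate(cobweb_prices()):
    print(t, round(p, 2))
```

Because the lagged supply response (d) is weaker than the demand response (b), the resulting price cycle gradually damps back towards the original level – a stylised version of the drawn-out adjustment and eventual recovery described above.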

Ian suggests that ‘we can expect oil production to ease in the second half of the year’, while demand for oil should increase: the net positive boost to global demand from lower oil prices – estimated by Bank staff at around 0.8% – will in turn support greater demand for oil. ‘Overall, it is reasonable to assume that, by the end of 2015, supply and demand for oil will be coming back into balance, although inventories will remain high for a further period. This should translate into more stable yet still relatively low prices,’ though further out ‘prices might be expected to recover’.

The fall in oil prices, and their predicted persistence, has important implications for both the likely path of inflation and the appropriate response of monetary policy. While the immediate direct effect is clearly disinflationary, detracting ‘a bit more than half a percentage point from headline inflation for the rest of the year’, indirect effects could emerge in both directions. The fall in the oil price could generate inflationary pressure by boosting demand while having little effect on the economy’s potential supply. Conversely, the risk of falling inflation expectations feeding through to lower wage settlements could create further disinflationary pressure.

‘How should monetary policy respond to such a sharp oil price shock? As always in monetary policy, the answer depends on the source of the shock.’ As it is supply rather than demand that has ‘been the dominant factor behind the recent fall in the oil price… it should be treated primarily as a simple cost or price-level shock’. This would mean looking through the temporary impact on inflation, as the MPC has done previously when rising oil prices pushed inflation well above target.

‘But how temporary is temporary? Policymakers need to consider not just the source of the shock but also its persistence.’ In doing so they should, Ian suggests, refer to the ‘optimal policy rule’ which states that ‘looking through an undershoot of inflation, even a prolonged one, is more justified if the real economy is operating at or above full capacity’.

This, combined with the likely path for spare capacity set out in the Inflation Report and Ian’s view that ‘there may be less spare capacity left in the labour market’ than the MPC’s collective judgement, would suggest that it would be right to ‘look through’ the current low level of inflation.

This, however, is complicated by the potential for the persistent, depressing effect on annual inflation to constrain growth in pay by causing inflation expectations to become de-anchored. ‘Judging the scale of this downside risk is difficult. Some measures of inflation expectations have fallen but others suggest that inflation expectations remain well-anchored, and there are no signs at present that anything approaching deflationary psychology is likely to take hold.’ Nonetheless, it is not a risk the MPC can dismiss, ‘at least while inflation remains close to zero’. This, Ian concludes, is why he decided not to vote for an increase in Bank Rate at the January and February policy meetings.

The Post-Crisis Bank Capital Framework

David Rule, Executive Director, Prudential Policy at the Bank of England gave a good summary of the current issues surrounding capital, and commented specifically on issues surrounding internal (advanced) methods.

Six and a half years after the depths of the Great Financial Crisis, we know the shape of the future global bank capital framework. But important questions do remain. Today I want to focus on how regulators should measure risk in order to set capital requirements, with some final remarks on the particular case of securitisation. To start, though, a reminder of the key elements of the post-crisis, internationally-agreed framework:

  • Banks have minimum requirements of at least 4.5% of risk-weighted assets (RWAs) in core equity and 6% of RWAs in going-concern Tier 1 capital, including for the purpose of absorbing losses in insolvency or resolution. Basel III tightened the definition of capital significantly.
  • Systemically-important banks have further loss absorbing capacity so that they can be recapitalised in resolution without taxpayer support, ensuring the continuity of critical functions and minimising damage to financial stability.
  • In November 2014, the Financial Stability Board (FSB) proposed that total loss absorbing capacity (TLAC) for globally systemically-important banks (G-SIBs) should comprise at least 16-20% of RWAs.
  • Core equity buffers sit on top of this TLAC so that the banking system can weather an economic downturn without unduly restricting lending to the real economy; the Basel III capital conservation buffer for all banks is sized at 2.5% of RWAs.
    o Systemically-important banks hold higher buffers; and
    o Buffers can also be increased counter-cyclically when national authorities identify higher systemic risks.
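As a rough illustration of how these layers stack up, the sketch below adds together the headline figures quoted in the list; the G-SIB surcharge and countercyclical buffer values in the second example are hypothetical, not numbers from the speech.

```python
# Illustrative stack of core equity (CET1) requirements and buffers, in % of RWAs,
# using the headline Basel III figures quoted above. The G-SIB surcharge and
# countercyclical buffer values are hypothetical examples.

def required_cet1_ratio(cet1_minimum=4.5, conservation_buffer=2.5,
                        gsib_surcharge=0.0, countercyclical_buffer=0.0):
    """Total core equity requirement as a percentage of risk-weighted assets."""
    return cet1_minimum + conservation_buffer + gsib_surcharge + countercyclical_buffer

# Non-systemic bank in normal times: 4.5 + 2.5 = 7.0% of RWAs
print(required_cet1_ratio())

# Hypothetical G-SIB with a 2% surcharge and a 0.5% countercyclical buffer:
# 4.5 + 2.5 + 2.0 + 0.5 = 9.5% of RWAs
print(required_cet1_ratio(gsib_surcharge=2.0, countercyclical_buffer=0.5))
```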

The new bank capital framework will cause banks to hold significantly more capital than under the pre-crisis regime. Major UK bank capital requirements and buffers have increased at least seven-fold once you take account of the higher required quality of capital, regulatory adjustments to asset valuations and higher risk weights as well as the more obvious increases in headline ratio requirements and buffers. Small banks have seen a lesser increase than systemically-important banks, reflecting the important new emphasis since the crisis on setting capital buffers and TLAC in proportion to the impact of a bank’s distress or failure on the wider financial system and economy. In sum, the framework is now impact- as well as risk-adjusted. From a PRA perspective, this is consistent with our secondary objective to facilitate effective competition.

We are currently in transition to the final standards, with full implementation not due until 2019. Although the broad shape is clear, I want to highlight four areas where questions remain:

First, the overall calibration of TLAC. The FSB will finalize its ‘term sheet’ that specifies the TLAC standard for G-SIBs in light of a public consultation and findings from a quantitative impact study and market survey. It will submit a final version to the G-20 by the 2015 Summit. National authorities will also need to consider loss absorbing capacity requirements for banks other than G-SIBs. In the United Kingdom, the Financial Policy Committee (FPC) will this year consider the overall calibration of UK bank capital requirements and gone-concern loss absorbing capacity.

Second, the appropriate level of capital buffers, including how and by how much they increase as banks are more systemically important. The Basel Committee has published a method for bucketing G-SIBs by their global systemic importance, a mapping of buckets to buffer add-ons and a list of G-SIBs by bucket. This will be reviewed in 2017. Separately the US authorities have proposed somewhat higher add-ons. National authorities also have to decide buffer frameworks for domestically systemically-important banks or D-SIBs. In the UK, the FPC plans to consult on a proposal for UK D-SIBs in the second half of this year.

Third, the location of capital buffers, requirements and loss absorbing capacity within international banking groups. A number of such groups are moving towards ‘sibling’ structures in which operating banks are owned by a common holding company. This has advantages for resolution: first, loss absorbing capacity can be issued from a holding company so that statutory resolution tools only have to be applied to this ‘resolution entity’ – the operating subsidiaries that conduct the critical economic functions can be kept as going concerns; and second, the operating banks can be more easily separated in recovery or post-resolution restructuring. It also fits with legislation in countries such as the UK, requiring ring-fencing of core retail banking activities, and the US, requiring a foreign banking organization with a significant U.S. presence to establish an intermediate holding company over its U.S. subsidiaries. A ‘single point of entry’ approach to resolution might involve all external equity to meet buffers, and external equity and debt included in TLAC, being issued from the top-level holding company. An important question then is to what extent and on what terms that equity and debt is downstreamed from the top-level holding company to any intermediate holding companies and the operating subsidiaries. This will also be influenced by the final TLAC standard, which includes requirements on these intragroup arrangements.

Finally, I would like to spend more time on my fourth issue: how to measure a bank’s risk exposures in order to set TLAC and buffers – or, in other words, determining the denominator of the capital ratio. Here regulators have to balance multiple objectives:

  • An approach that is simple and produces consistent outcomes across banks. Basel I, based entirely on standardised regulatory estimates of credit risk, met this test.
  • An approach that is risk sensitive and minimises undesirable incentives that may distort market outcomes. Whether we like it or not, banks will evaluate their activities based on return on regulatory capital requirements. So if those requirements diverge from banks’ own assessments of risk, regulation will change market behaviour. Sometimes that may be intended and desirable. But often it will not be. Basel I, for example, led to distortions in markets, like the growth of commercial paper back-up lines, because under-one-year commitments had a zero capital requirement. Subsequent developments of the Basel capital framework sought to close the gap between regulatory estimates of risk and firms’ estimates of risk by allowing use of internal models for market, operational and credit risk.
  • An approach that is robust in the face of uncertainty about the future. Estimates of risk based on past outcomes may prove unreliable. We should be wary of very low capital requirements on the basis that assets are nearly risk free. And behavioural responses to the capital framework may change relative risks endogenously. For example, before the crisis, banks became dangerously over-exposed to AAA-rated senior tranches of asset backed securities partly because, wrongly, they saw the risks as very low and partly because the capital requirements were vanishingly small.

Ideally regulators would design a framework for measuring risk exposures that maximises each of these objectives. But trade-offs are likely to be necessary and, in my view, the rank ordering of objectives should be robustness followed by risk sensitivity and simplicity. Prioritising robustness points to combining different approaches in case any single one proves to be flawed. So the PRA uses three ways of measuring risk: risk weightings, leverage and stress testing. By weighting all assets equally regardless of risk, the leverage exposure measure provides a cross check on the possibility that risk weights or stress testing require too little capital against risks judged very low but which subsequently materialise.

In the United Kingdom, the FPC’s view is that the leverage ratio should be set at 35% of a bank’s applicable risk-weighted requirements and buffers. This is simple to understand and can be seen as setting a minimum average risk weight of 35%. So, for non-systemic banks with risk-weighted requirements and buffers of 8.5%, the minimum leverage ratio would be 3%. But a G-SIB, with a risk-weighted buffer add-on of, say, two percentage points, would have an additional leverage buffer of 0.7 percentage points. And all firms would be subject to a leverage buffer equal to 35% of any risk-weighted counter-cyclical buffer. Another key advantage of using the same scaling factor and mirroring the different elements of the risk-weighted framework is that it creates consistent incentives for different types of banks and over time. By contrast, for example, setting the same leverage ratio for all firms would amount to setting a lower minimum average risk weight for systemically-important banks than for other banks.
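The 35% scaling rule lends itself to a one-line calculation; the sketch below reproduces the figures in the paragraph above, with the countercyclical buffer in the final line a hypothetical example.

```python
# FPC proposal: each leverage requirement or buffer is 35% of the corresponding
# risk-weighted requirement or buffer (equivalently, a minimum average risk weight of 35%).

SCALING_FACTOR = 0.35

def leverage_component(risk_weighted_pct):
    """Convert a risk-weighted requirement/buffer (% of RWAs) into a leverage figure (% of exposures)."""
    return round(SCALING_FACTOR * risk_weighted_pct, 2)

print(leverage_component(8.5))   # non-systemic bank: ~3% minimum leverage ratio
print(leverage_component(2.0))   # 2pp G-SIB risk-weighted add-on -> 0.7pp leverage buffer
print(leverage_component(1.0))   # hypothetical 1% countercyclical buffer -> 0.35pp leverage buffer
```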

Stress testing complements risk weighted and leverage approaches by considering the impact of extreme but plausible forward-looking macroeconomic scenarios of current concern to policymakers. Because buffers are intended to absorb losses in an economic downturn, the natural role of stress testing in the capital framework is to assess the adequacy of the buffers based on the Basel risk-weighted and leverage measures. If an individual bank is shown to be an outlier in a stress test, with a particularly large deterioration in its capital position, supervisors may use Pillar II to increase its capital buffers. The PRA is currently consulting on its approach to Pillar II, including a ‘PRA buffer’ that would be used in this way to address individual bank risks. An advantage of concurrent stress testing across major banks is that policymakers can consider the wider systemic impact of the scenario. They can also test whether buffers are sufficient even if regulators prevent banks from modelling management actions that would be harmful to the wider economy: for example, if banks propose to reduce new lending in order to conserve capital. Used in this way, stress testing may inform calibration of the system-wide, countercyclical buffer if macro-prudential policymakers identify elevated systemic risks.

Leverage and stress testing are best seen as complements rather than alternatives to risk-weighted measures of capital, producing a more robust overall framework. Risk weightings will likely remain the binding constraint for most banks most of the time. A central priority of the Basel Committee over the next year or so is to restore confidence in risk weightings by designing a system that balances most effectively the three objectives of robustness, risk sensitivity and simplicity.

Risk sensitivity points to a continuing role for firms’ internal estimates and models. But that depends on finding solutions for problems with them. First, various studies by the Basel Committee have shown material variations in risk weights between banks for reasons other than differences in the riskiness of portfolios. Models appear to be producing excessive variability in capital outputs, undermining confidence in risk-weighted capital ratios and raising questions about gaming. Second, some models may produce low risk weights because the data underpinning them do not include stress events in the tail of the distribution. This is a particular concern in portfolios where the typical level of defaults is low but defaults may correlate in a systemic crisis: for example, exposures to other banks or high quality mortgages. For major global firms, average risk weights fell almost continuously from around 70% in 1993 to below 40% in 2008, since when they have remained around that level. Third, modelled capital requirements can be procyclical. For example, last year’s concurrent stress test of major UK banks by the Bank of England showed that some banks’ mortgage risk weights increased significantly in the test, particularly where banks took a ‘point in time’ approach whereby probability of default was estimated as a function of prevailing economic and financial conditions.

One solution would be to abandon use of banks’ own estimates and models entirely and use standardised regulatory risk weights. But standardised approaches have their own weaknesses. For example, finding simple and consistent techniques for measuring risk by asset class that work well across countries with different market structures and risk environments is not straightforward. Regulators typically face a trade-off between simplicity and risk sensitivity. An alternative approach is to find solutions for the problems with models. Some possible ideas might include:

  • Requiring banks to provide more transparency about their risk estimates and models. The work of the Enhanced Disclosure Task Force and Basel’s revised Pillar III templates are steps in this direction. Regular hypothetical portfolio exercises by supervisors can identify banks with more aggressive approaches.
  • Being more selective about where it makes sense to allow internal models and where standardised approaches may be more effective. In the case of credit risk, for example, models may be more robust in asset classes with longer and richer histories of default data; and the value-added of models for risk sensitivity is likely to be greater in asset classes where banks have significant private information about differences in risk.
  • Changing the specification of models to take greater account of potential losses if tail risks crystallise. The Basel Committee has already agreed to move from a value-at-risk to an expected shortfall approach to estimating market risk (a toy comparison of the two measures is sketched after this list). For credit risk, increasing the implied correlation of default in the model might be a simple way to produce higher risk weights in asset classes where banks are estimating low probabilities of default but regulators are concerned about tail risks.
  • Broadening the use of so-called ‘slotting’ approaches in which banks use their own estimates to rank order risks but regulators determine the risk weights for each ‘slot’. Slotting makes use of the better information banks have about relative risk within an asset class. But regulators decide the level of capital requirements. Slotting was one of the options considered when regulators first started thinking about use of internal models in the capital framework in the 1990s.
  • Putting floors on the level of modelled capital requirements. The Basel Committee has recently consulted on the design of a floor based on standardised risk weights to replace the existing transitional capital floor based on the Basel I framework. But it has not taken decisions on calibration: in other words, how often the floors would ‘bite’.
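To make the value-at-risk versus expected-shortfall distinction concrete, here is a toy calculation on a simulated loss distribution; the distribution and the 97.5% confidence level are assumptions for illustration and are not the Basel specification.

```python
# Toy comparison of value-at-risk (VaR) and expected shortfall (ES).
# The loss distribution is simulated: mostly small losses plus a fat tail of large ones.
import numpy as np

rng = np.random.default_rng(0)
losses = np.concatenate([rng.normal(0, 1, 9_900), rng.normal(8, 2, 100)])

confidence = 0.975
var = np.quantile(losses, confidence)      # loss exceeded only 2.5% of the time
es = losses[losses >= var].mean()          # average loss *given* that VaR is exceeded

print(f"VaR({confidence:.1%}): {var:.2f}")
print(f"ES({confidence:.1%}):  {es:.2f}")  # ES exceeds VaR because it weights the extreme tail
```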

The Basel Committee has said that it will consider the calibration of standardised floors alongside its work on finalising revised standardised approaches to credit risk, market risk and operational risk, and as part of a range of policy and supervisory measures that aim to enhance the reliability and comparability of risk-weighted capital ratios. Restoring confidence in risk weights will form a major part of the Committee’s agenda over the next year or so. Meanwhile, at a national level, supervisors can use Pillar II to address risks not adequately captured under internationally-standardised risk weightings. The PRA uses Pillar II actively to ensure banks have adequate capital to support all the risks in their businesses and has recently set out in a transparent way for consultation the methodologies it proposes using to inform its setting of Pillar II capital requirements.

Finally, I want to speak briefly about securitisation as an example of an area where regulators find it hard to measure risk. One reason is that part of the securitisation market grew up in order to exploit weaknesses in risk weightings, allowing banks to maximise the reduction in capital requirements while minimising the decrease in revenue. A lesson from the past is that the risk of unintended market consequences is high. Risk weighting approaches for securitisation have relied either on external tranche ratings or on regulatory formulae. Both have problems. Formulae may not capture all the key dimensions of risk; ratings agencies can, but their track record in the financial crisis was poor and authorities globally are seeking – and in the US case are required by law – to reduce reliance on rating agencies.

As well as the micro-prudential goal to ensure that banks measure securitisation risks appropriately and hold adequate capital against them, we also have a macro-prudential goal that the securitisation market develops in a sustainable way. These goals are aligned because, as we saw in the crisis, a market that develops in an unhealthy way can mean unexpectedly greater risks for banks. What are the characteristics of a sustainable securitisation market? One in which:

  • banks and other issuers can use securitisation to transfer risk and raise funding but not to manage capital requirements artificially;
  • investors are diverse and predominantly ‘real money’ as opposed to the fragile base of leveraged funds and bank treasuries that collapsed in Europe during the crisis;
  • issuers’ incentives are adequately aligned with those of investors; and
  • investors have the information they need to understand the risks they are taking.

If structured soundly in this way, securitisation markets can be an important channel for diversifying funding sources and allocating risk more efficiently. Overall, the development of a carefully structured securitisation market could enable a broader distribution of financial sector risk, allow institutional investors to diversify their portfolios and banks to obtain funding and potentially remove part of the risk from banks’ balance sheets to free up balance sheet capacity for further lending to the economy.

The Basel Committee published a revised securitisation framework in December last year. Jointly with IOSCO, it also published for consultation a set of criteria to help identify simple, transparent and comparable (STC) securitisations. This year, the Committee will consider how to incorporate such criteria into the securitisation capital framework. In my view, incorporating the STC criteria will serve both micro-prudential and macro-prudential objectives. First, it will add a measure of ‘structure’ risk into the capital framework, complementing existing inputs such as the underlying risk weights on the securitised portfolio, maturity and tranche seniority. That should improve risk sensitivity. And more transparency will help regulators as well as investors to measure risk. Second, such criteria will encourage the securitisation market to develop in a healthier and more sustainable way. Finally, and returning to my main theme, I conclude that post-crisis capital regulation for banks globally should be based on different ways of assessing risk, with leverage and stress testing complementing risk-weighted measures within an integrated framework. Such an approach is most likely to achieve the objectives of robustness followed by risk sensitivity and simplicity.

Reforms to the Bank of England’s Market Intelligence programme

Some central banks, including the Bank of England, have moved away from an era of ‘constructive ambiguity’ to greater openness and transparency. The RBA is less transparent, and the Australian regulatory system remains opaque, with much still done behind closed doors and in quiet whispers. This is because its regulators are too closely aligned with the major financial services incumbents, and are over-focussed on financial stability. In particular, the role and activity of the Council of Financial Regulators is completely opaque. So it is interesting to see where the Bank of England is headed.

The Bank of England today announced the outcome of a root-and-branch review of its Market Intelligence (MI) programme. In a speech at Warwick University, Minouche Shafik said the resulting changes – alongside progressive steps to make the Bank’s liquidity insurance framework more transparent – show clearly that the Bank is not just “open for business” but also “open about our business.”

MI is the ongoing process of discussion with financial market participants to identify insights relevant to policymaking.  The Bank’s Governors and Court of Directors endorsed all 11 recommendations stemming from the MI Review, which will make the gathering and use of MI more transparent, robust and effective.  The recommendations include:

  • An MI Charter which explains clearly the terms of the Bank’s engagement with financial market participants, and its rationale for gathering MI;
  • A strengthened set of policies that govern MI gathering, supported by expanded training for staff; and
  • A new executive-level committee to oversee the MI programme, to ensure it retains the necessary flexibility, focus and relevance to the policy challenges of today and tomorrow.

The Bank of England also today published its formal response to the recommendations of Lord Grabiner, following the publication of his Foreign Exchange Market Investigation Report in November 2014. At the time, the Bank endorsed the recommendations – which covered documentation, education, and the need for greater clarity over the Bank’s market intelligence role – and committed to implementing them in full and as quickly as possible.

In today’s response, the Bank outlined the actions that have been, or are being, taken to fulfil the recommendations. They will result in stronger systems and controls around the Bank’s engagement with market participants.

In a speech on Thursday – Goodbye ambiguity, hello clarity – Bank of England Deputy Governor Minouche Shafik explains why this greater clarity around the Bank of England’s interactions with financial markets is essential.

Central banks, including the Bank of England, have moved away from an era of ‘constructive ambiguity’ to greater openness and transparency.  For example, the Bank now has a well-defined set of facilities for the provision of liquidity to the financial system that have evolved to meet changing needs. The Bank’s dialogue with markets dates back to 1786 but the days of men in top hats and fireside chats are now a distant memory. The Bank’s MI function is a highly professional network of staff covering 23 different markets and sectors, providing first-hand insights on short-term moves and long-term trends relevant to all the Bank’s policy functions.

In the speech, Minouche says:

“The ability of the Bank’s MI function to provide insights to senior policymakers over the past 8 years, as the first waves of the crisis rushed onto the Bank’s doorstep, and then as solutions flowed back out across the system, has been vital to our effectiveness”.

Welcoming the changes to the MI programme announced today, Minouche said:

“The Bank has been at the centre of one of the world’s major financial centres for hundreds of years. Today the Bank has a broader role than ever before. A clear understanding of the root causes of developments in financial markets must underpin the decisions we make about monetary policy and regulation of financial markets. Aligning our Market Intelligence function closely to the Bank’s mission, so that its purpose is clear and its approach is transparent, will ensure we continue to seize that opportunity”.

Bank Of England’s Research Agenda

The UK Regulator has announced a broader research agenda, recognising the complexities of the current financial environment. Mark Carney, Governor of the Bank of England spoke at the Launch Conference. His remarks summarise the rationale behind the major research themes. They are worth reading, not least because he highlights that the traditional view of macroeconomic policy is changing.

Transformation of the Bank of England

The central challenge of macroeconomic policymaking in the late 1970s and 1980s was the fight against inflation. In no small part due to my predecessors, particularly Lord King, we have today a regime for maintaining monetary stability that is both democratically legitimate and highly effective. It rests on clear remits, delegated by Parliament, sound governance arrangements to support independence, and effective transparency of policymaking. And it provides valuable lessons for the conduct of other policy functions.

Despite these successes, in both theory and practice, a healthy focus on price stability had become a dangerous distraction. The financial crisis was a powerful reminder that price stability is not sufficient for macroeconomic stability. It exposed the convenient fiction that finance is a veil. And we were taught that the dynamics of lending markets are as important as those of labour markets for our shared prosperity.

In response to these painful lessons, the Bank of England has been bestowed with enormous new responsibilities by Parliament. They now span monetary policy, macroprudential policy, and microprudential supervision. They include responsibility for the United Kingdom’s bank notes; its payments systems; oversight of financial markets infrastructure and resolving failed institutions. To help fulfill its mission the Bank has core roles in Europe, at the G20 and at the Financial Stability Board.

Having monetary, macroprudential, and microprudential policy under one roof makes gains from trade possible. It is our duty to exploit complementarities, synergies and economies of scope to maximise our impact by working together. To do so, we need research. And to some extent, research needs us.

The need for research

The way central banks have sought to achieve their objectives – the practice of central banking – has frequently moved ahead of the theory of central banking. Theory, in turn, subsequently catches up, and enriches and refines practice. The history of central banking is replete with examples. Tacit knowledge has often been more instrumental in determining policy outcomes than insights from formal research. Sometimes this works, as in Bagehot’s ‘dictum’ (to lend freely at a penalty rate against good collateral); and sometimes it doesn’t. Research has meant that some modes of operation, like Norman’s, have rightly fallen by the wayside. It has helped nuance others, like Bagehot’s. And it guides us as to which practices to retain, reinforce and enhance, and which should be discarded.

The practice of monetary policy in the Great Moderation is another example. It informed theorising and research on Inflation Targeting. And, in turn, through trial, error and refinement, research has helped inform policy with empirical insights and workhorse models.

In the theoretical space, this process led to Woodford’s dictum that in modern central banking very little else matters beyond expectations. In contrast, the practice of central banking in a messy real world where people use various heuristics, including rational inattention, has shown the limitations of such logical extremes.

Research showed how central bank transparency and accountability make essential contributions to policy effectiveness, in a way that is complementary to ensuring central banks have democratic legitimacy. That remains an important insight in the era of enlarged and empowered central banking. Practice moved ahead of theory; theory caught up and refined practice. And the effectiveness of policy improved as a result. The crisis has meant practice has once again leapt ahead of theory. During its depths, the lessons of history and insights from psychology were arguably more valuable than the precision of dynamic programming. Our workhorse models didn’t have financial sectors, meaning questions of financial stability were not even asked, let alone answered. A great deal of improvisation was required to avoid a second Great Depression. It is vital that we draw on the experience from the crisis to rethink the way we understand the economy, the financial system, and the institutions we supervise.

To do so, we need not only to study recent history more deeply but also to apply formal methods to refine our depictions of economic dynamics, as well as the policy tools we have to shape those dynamics in socially desirable ways. We need to catalyse thinking on new approaches towards policies that have assumed greater importance since the crisis from macroprudential policy to bank resolution.

At the Bank of England, our enhanced research function, including a new Research Hub, will bring together staff and thinking from across the institution creating a two-way flow between research and policy. It will seek to foster a shared understanding of the frontier of policy possibilities amongst colleagues and ensure that the insights gained in one policy arena can benefit others. But these efforts will not succeed if they are confined to the corridors of Threadneedle and Moorgate. The Bank recognises that we need to do more to reach out to the wider research community.

Policymaking can benefit tremendously from advances in all fields of economics and finance; from psychology to epidemiology; from computer science to law. That is why I am pleased to see such a diverse range of discussants and attendees at the conference today. In order to focus the conversations we are being clear about the key questions that interest us, as policymakers, the most. Let me now turn to those. Our One Bank Research Agenda is structured around five themes which span all aspects of central banking. The themes are broad. That reflects the diversity of our agenda. They focus on the interactions and intersections between policy areas. They emphasise new challenges and new directions, while recognising that familiar questions facing central banks remain no less important. Today’s conference is organised around them.

The first theme covers ‘policy frameworks and interactions’.
The re-emergence of macroprudential instruments as part of the policy armoury raises fundamental questions about the interaction of monetary policy, macroprudential policy, and microprudential policy. It is essential to improve our understanding of the relationship between credit cycles and systemic risks. Since credit market developments both affect and reflect potential growth in the broader economy, they – and macroprudential measures to influence them – require careful study by financial and monetary policymakers alike. The advent of a new, enhanced policy toolkit raises vital questions about the effectiveness of individual policy tools; their joint operation; and how they interact domestically and across borders.

The second theme covers ‘evaluating regulation, resolution, and market structures’.
The financial crisis precipitated a radical overhaul of the approach towards regulation, supervision and resolution. Regulatory policies have shifted from a near-exclusive focus on microprudential resilience to a more balanced emphasis on minimising systemic risk. In the whirlwind of essential change, there has been, however, relatively little assessment of the overall effect of reform on the financial system as a whole. Moreover, our understanding of the ‘system’ must extend well beyond the banking sector to encompass the whole of market-based finance. The interplay between the reform process and the changing nature of financial intermediation also raises fundamental questions about how incentives and market structures might evolve and what policy might need to do to keep up.

The third theme covers ‘Policy operationalization and implementation’.
The practice of central banking constantly underscores that implementation and communication of policy can be as important as its design. During the crisis central banks around the world made extensive and imaginative use of their balance sheets in pursuit of their objectives. This extraordinary range of policy responses provides an unparalleled opportunity to take stock of what worked, and why. As Ben Bernanke observed, “the trouble with QE is it works in practice, but it doesn’t work in theory.” Understanding better the transmission mechanisms of QE, and the extent to which they are state dependent, can provide enormous insights into its effectiveness as a policy tool specifically, as well as into the formation of agents’ expectations and the functioning of financial markets more generally. Recent innovations have not been confined to new tools. As I noted previously, better communication and greater transparency have also made policy more effective. Transparency has taken centre stage as policymakers sought to restore confidence in the financial system, as they conducted and published stress test results to create more transparently resilient banks, and as they gave guidance to clarify their reaction functions. Recently, research informed the recommendations of the Warsh Review into what we could do to enhance monetary policy transparency here at the Bank of England. This research theme continues in that vein by asking what more might be done to enhance effective transparency in all areas of policy.

The fourth research theme covers ‘New data, methodologies and approaches’.
Increasing amounts of data – structured and unstructured, current and historical – are available on almost every aspect of the economy and the financial system. That holds great promise. Computing power has transformed our economies; it needs now to be harnessed to transform our understanding of them. Theoretical and methodological techniques continue to advance. In some cases that will mean measurement ahead of theory. That is one way to advance. In the short term, a black box could be better than none, and with time the patterns it reveals could prompt a greater understanding of the underlying forces. Recall that Kepler needed to uncover the empirics of planetary motion before Newton could conjure the theory of celestial mechanics. It is important to exploit developments in advanced analytics of large data sets to formulate better policy. They’ll improve our understanding of household and corporate behaviour, the macroeconomy and risks to the financial system. They need to be harnessed to enhance our forecasting and stress testing capabilities. To complement this theme, we will release historical data sets including detailed breakdowns of the Bank’s inflation expectations survey, our Agents’ company visit scores, and very long back-runs of key economic and financial variables. This is one of the ways we will look to increase the permeability of our research. We are seeking also new ways to visualise and analyse the increasingly rich information sets that are available.

The fifth and final research theme covers the ‘Response to fundamental changes’.
Fundamental technological and structural trends will have a significant bearing on economic dynamics. Although they are likely to play out over a period that is longer than the Bank’s typical policy horizon, these trends will have profound implications for central banks. They include changing demography, increasing longevity, inequality, climate change, the increasing importance of emerging economies and the development of digital currencies. By affecting a range of phenomena – from the evolution of real interest rates to risks to the financial sector to the future of money and banking itself – all of these trends have the potential to re-shape our policy challenges. We need research to set us on the front foot to face them.

The Impact of Low Interest Rates

In the UK, interest rates are artificially low; in Australia, rates were cut last month. What are the potential implications of a low interest rate policy? Is it good strategy, or a gamble?

In a speech given by Kristin Forbes, External MPC Member, Bank of England, the various impacts of ultra-low interest rates are outlined. It is worth reflecting on the significant range of implications.

By way of background, the UK recovery is now well in progress and self-sustaining. Despite continual headwinds from abroad, unemployment has fallen rapidly, from 8.4% about three years ago to 5.7% today, and is expected to continue to fall to reach its equilibrium rate within two years. Wage growth appears finally to be picking up, so that, when combined with lower oil prices, families will at long last see their real earnings increase. One piece of the economy that has not yet started this process of normalisation, however, is interest rates. Bank Rate – the main interest rate set by the Bank of England – remains at its emergency level of 0.5%. This near-zero interest rate made sense during the crisis and early stages of the faltering recovery. It continues to make sense today. But at what point will it no longer make sense? Low interest rates provide a number of benefits. For example, they make it easier for individuals, companies and governments to pay down debt. They make it more attractive for businesses to invest – stimulating production and job creation. They have helped allow the financial system to heal. They have played a key role in supporting the UK’s recent recovery. But increases in interest rates – especially after being sustained at low levels for so long – can also involve risks.

But keeping interest rates at emergency levels for a sustained period also carries costs and risks, especially as an economy returns to more normal functioning. We summarise the main arguments below.

(1) inflationary pressures
Low headline inflation and stable domestically-generated inflation are unlikely to persist if interest rates remain low. The output gap is closing and there is limited slack left in the economy. Wage growth is increasing: AWE total pay in the three months to December was 2.1% higher than a year earlier, but 5.4% higher (annualised) than three months earlier. Since wages are an important component of prices and their recent pick-up has not been matched by a corresponding increase in productivity, these wage increases will support a pick-up in inflation. If this pick-up is gradual, as expected, inflationary pressures should build only slowly over time, so that interest rates can be increased gradually as necessary.
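As a rough check on the annualised figure quoted above, a 3-month-on-3-month rise of roughly 1.3% compounds to about 5.4% over a year. The short sketch below is purely illustrative – the annualise helper and the 1.33% input are our own assumptions, not the underlying ONS data.

    # Illustrative only: compounding a sub-annual growth rate up to an annual rate.
    def annualise(growth_over_period: float, periods_per_year: int) -> float:
        return (1.0 + growth_over_period) ** periods_per_year - 1.0

    # A hypothetical 3-month-on-3-month rise of 1.33% compounds to roughly 5.4% a year,
    # in line with the annualised figure quoted in the speech.
    print(f"{annualise(0.0133, 4):.1%}")  # -> 5.4%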

(2) asset bubbles and financial vulnerabilities
As rates continue to be low, especially during a period of recovery, the risks to the financial system could grow. More specifically, when interest rates are low, investors may “search for yield” and shift funds to riskier investments that are expected to earn a higher return – from equity markets to high-yield debt markets to emerging markets. This could drive up prices in these other markets and potentially create “bubbles”. This can not only lead to an inefficient allocation of capital, but also leave certain investors with more risk than they appreciate. An adjustment in asset prices can bring about losses that are difficult to manage, especially if investments were supported by the higher leverage made possible by low rates. If these losses were widespread across an economy, or affected systemically important institutions, they could create substantial economic disruption.
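The role of leverage in amplifying such losses is easy to see with a stylised example; the figures below are hypothetical and chosen only to illustrate the mechanism.

    # Illustrative only: leverage turns a modest price fall into a much larger equity loss.
    def equity_loss(price_fall: float, leverage: float) -> float:
        # Assets are `leverage` times equity, so a given fall in asset prices
        # hits equity by that multiple (ignoring margin calls and funding costs).
        return price_fall * leverage

    # A hypothetical investor running 5x leverage: a 10% fall in the asset
    # wipes out half of their equity.
    print(f"{equity_loss(0.10, 5):.0%}")  # -> 50%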

(3) limited tools to respond to future challenges
There is less “firepower” to respond to future contingencies. There is no shortage of events that could cause growth to slow and inflation to fall in the future – and the first response is normally to reduce interest rates, an important tool for stabilizing an economy. If Bank Rate remains around its current level of 0.5%, however, there is clearly not room to lower it during the next recession to the degree that has typically occurred. Bank Rate could go a bit lower than 0.5%, but rates could not be lowered by the average 3.8 percentage points seen during past easing cycles without creating severe distortions to the financial system and the functioning of the economy. Policymakers could instead use other tools to loosen monetary policy – such as guidance on future rate changes or quantitative easing. These tools are certainly viable, but their impact is harder to predict and their effectiveness harder to assess than for changes in interest rates.

(4) an inefficient allocation of resources and lower productivity
Is there a chance that a prolonged period of near-zero interest rates is allowing less efficient companies to survive and curtailing the “creative destruction” that is critical to support productivity growth? Or, even within existing, profitable companies, could a prolonged period of low borrowing costs reduce the incentive to carefully assess and evaluate investment projects – leading to a less efficient allocation of capital within companies? Any of these effects of near-zero interest rates could play a role in explaining the UK’s unusually weak productivity growth since the crisis. These types of concerns gained attention in Japan during the 1990s after the collapse of the Japanese real estate and stock market bubbles. During this period, many banks followed a policy of “forbearance”, continuing to lend to companies that would otherwise have been insolvent. These unprofitable companies, kept alive by lenient banks, were often referred to by the colourful name of “zombies”.

(5) vulnerabilities in the structure of demand
A fifth possible cost of low interest rates is that they could shift the sources of demand in ways which make underlying growth less balanced, less resilient and less sustainable. This could occur through increases in consumption and debt, and decreases in savings and possibly the current account. Some of these effects of low interest rates on the sources of demand are not surprising and are important channels by which low interest rates are expected to stimulate growth. But if these shifts are too large – or if vulnerabilities related to over-consumption, over-borrowing, insufficient savings or large current account deficits persist for too long – they could create economic challenges.

(6) higher inequality
A final concern related to an extended period of ultra-accommodative monetary policy is how it might affect inequality. Changes in monetary policy always have distributional implications, but these concerns have recently received renewed attention – possibly because of increased concerns about inequality more generally, or possibly because quantitative easing has more immediate and apparent distributional implications. How a sustained period of low interest rates affects inequality, however, is far from clear cut.

There are some channels by which low interest rates – and especially quantitative easing – can aggravate inequality. As discussed above, lower interest rates tend to boost asset values. Recent episodes of quantitative easing have also appeared to increase asset prices – especially in equity markets – although the magnitude of this effect is hard to estimate precisely. Holdings of financial assets are heavily skewed by age and income group, with close to 80% of the household sector’s gross financial assets held by those over 45 years old and 40% held by the top 5% of households. As discussed in a recent BoE report, these older and higher-income groups will therefore see a bigger boost to their financial savings as a result of low interest rates and quantitative easing.

But counteracting these effects are also powerful channels by which lower interest rates (and quantitative easing) can reduce inequality and disproportionately harm older age groups. More specifically, one powerful effect of low rates is to reduce pension annuity rates, interest on savings, and other fixed-income payments. This disproportionately affects the older population (who rely on pensions and fixed income for a larger share of their income) and people in the middle of the income distribution (who have some savings, but less exposure to the more sophisticated investments that can increase in value when rates fall).

In addition to affecting the asset and earnings side of individuals’ balance sheets, there can also be distributional consequences on the liability and payment side. As interest rates and the cost of servicing debt fall, individuals with mortgages and other borrowing can benefit. These benefits tend to fall disproportionately on the middle class – for whom mortgage and debt payments are a higher share of total income – but can also benefit the wealthy if they have high levels of borrowing.
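To see why lower rates feed so directly into lower annuity incomes, the sketch below prices a simple level annuity at a flat discount rate. It is a minimal illustration using hypothetical figures, ignoring mortality, fees and inflation linkage, rather than how providers actually price annuities.

    # Illustrative only: annual income from a level annuity priced at a flat rate.
    def level_annuity_payment(pot: float, rate: float, years: int) -> float:
        if rate == 0:
            return pot / years
        return pot * rate / (1.0 - (1.0 + rate) ** -years)

    pot = 100_000  # hypothetical pension pot, paid out over 25 years
    for r in (0.01, 0.03, 0.05):
        print(f"rate {r:.0%}: annual income approx £{level_annuity_payment(pot, r, 25):,.0f}")
    # Falling from 5% to 1% cuts the annual payment from about £7,100 to about £4,500.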

The full PowerPoint presentation is available here.

UK PRA Sets Out How It Will Hold Senior Managers Accountable

The Prudential Regulation Authority (PRA) has today set out how it will hold senior managers in banks, building societies and designated investment firms to account if they do not take reasonable steps to prevent or stop breaches of regulatory requirements in their areas of responsibility.

In June 2013, the Parliamentary Commission on Banking Standards (PCBS) published its report “Changing Banking for Good”, setting out recommendations for legislative and other action to improve professional standards and culture in the UK banking industry. This was followed by legislation in the Banking Reform Act 2013.

The Banking Reform Act introduced new powers which allow the PRA and the Financial Conduct Authority (FCA) to impose regulatory sanctions on individual senior managers when a bank breaches a regulatory requirement, if the senior manager responsible for the area where the breach occurred cannot demonstrate that they took reasonable steps to avoid or stop it.

The PRA has today published guidance for banks clarifying how it will exercise this new power, including examples of the kinds of actions that may constitute reasonable preventive steps and how firms and individuals may evidence them.

The Banking Reform Act also creates a separate offence which could result in individual senior managers being held criminally liable for reckless decisions leading to the failure of a bank. This new criminal offence will, however, be subject to the usual standard of proof in criminal cases (‘beyond reasonable doubt’).

Andrew Bailey, Deputy Governor for Prudential Regulation, Bank of England, and CEO of the PRA, said:

“Senior managers will be held individually accountable if the areas they are responsible for fail to meet our requirements. Our new accountability regime will hold all senior managers, including non-executive directors, to a clear standard of behaviour and we will take action where they fail to meet this.”

Insurers

In November 2014, the PRA consulted on a parallel accountability regime for the insurance sector. The Senior Insurance Managers’ Regime is aligned with the banking regime but is not identical: the business model of insurers, the risks they pose to the PRA’s objectives and the legislative framework they operate under are different from those of banks. Specifically, neither the potential criminal sanctions nor the ‘presumption of responsibility’ in the banking regime will apply to senior insurance managers.

The new regime also takes account of the need to introduce measures relating to governance and the fitness and propriety of individuals as part of Solvency II.

Non-executive directors (NEDs)

In November, the PRA indicated that it would issue a further consultation confirming how it will apply the new Senior Managers’ Regime and Senior Insurance Managers’ Regime to NEDs in banks and insurers respectively.

The PRA has now confirmed that it will apply the Senior Managers’ Regime and Senior Insurance Managers’ Regime to the following NEDs:

  • Chairman;
  • Senior Independent Director;
  • Chair of the Risk Committee;
  • Chair of the Audit Committee; and
  • Chair of the Remuneration Committee.

The PRA’s Senior Managers’ Regime and Senior Insurance Managers’ Regime will therefore focus on those NEDs with specific responsibilities for areas or committees directly relevant to a firm’s safety and soundness. In addition to any collective responsibility they may have as members of the board, non-executives in scope of the Senior Managers’ Regime and Senior Insurance Managers’ Regime will be held individually accountable for their areas of responsibility. The PRA is also proposing to require firms to ensure that all board members are held to high standards of conduct.

The paper also includes details of the FCA’s approach to non-executive directors. Following the FCA’s decision to narrow the scope of its Senior Managers’ Regime to include a smaller group of NEDs, the PRA is also consulting on notification and assessment requirements for those NEDs who are not included in the regime. This will allow the UK to comply with its EU requirements to ensure the suitability of all members of a bank’s board.

Whistleblowing

The PCBS also recommended that banks put in place mechanisms to enable their employees to raise concerns internally, and that the PRA and FCA ensure these mechanisms are effective. The PRA and FCA have today set out a package of measures to formalise firms’ whistleblowing procedures. These proposals aim to ensure that all employees are encouraged to blow the whistle where they suspect misconduct, confident that their concerns will be considered and that there will be no personal repercussions.

How Does High-Frequency Trading Impact Market Efficiency?

The Bank of England has just published a research paper examining how high-frequency trading affects market efficiency. High-frequency trading (HFT), where automated computer traders interact at lightning-fast speed with electronic trading platforms, has become an important feature of many modern financial markets. The rapid growth and increased prominence of these ultrafast traders have given rise to concerns regarding their impact on market quality and market stability. These concerns have been fuelled by instances of severe and short-lived market crashes such as the 6 May 2010 ‘Flash Crash’ in the US markets. One concern about HFT is that, owing to the high rate at which HFT firms submit orders and execute trades, the algorithms they use could interact with each other in unpredictable ways and, in particular, in ways that could momentarily cause price pressure and price dislocations in financial markets.

‘Interactions among high-frequency traders’, by Evangelos Benos, James Brugler, Erik Hjalmarsson and Filip Zikes

Using a unique data set on the transactions of individual high-frequency traders (HFTs), we examine the interactions between different HFTs and the impact of such interactions on price discovery. Our main results show that, for trading in a given stock, HFT firm order flows are positively correlated at high frequencies. In contrast, when performing the same analysis on our control sample of investment banks, we find that their order flows are negatively correlated. Put differently, aggressive (market-“taking”) volume by an HFT will tend to lead to more aggressive volume, in the same direction of trade, by other HFTs over the next few minutes. For banks the opposite holds, and a bank’s aggressive volume will tend to lead to aggressive volume in the opposite direction by other banks. As far as activity across different stocks is concerned, HFTs also tend to trade in the same direction across different stocks to a significantly larger extent than banks.

We find that HFT order flow is more correlated over time than that of the investment banks, both within and across stocks. This means that HFT firms are more likely than their investment bank peers to be aggressively buying or selling the same stock at the same time. Also, a typical HFT firm tends to trade aggressively in multiple stocks at the same time to a greater extent than a typical investment bank. What does that mean for market quality? A key element of a well-functioning market is price efficiency, which characterises the extent to which asset prices reflect fundamental values. Dislocations of market prices are clear violations of price efficiency, since they happen in the absence of any news about fundamental values.
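For a concrete sense of what such a correlation measure involves, here is a minimal sketch in Python of how cross-firm order-flow correlations could be computed from trade-level data. The column names and the ten-second bucketing are our own assumptions for illustration; they are not the authors’ actual data set or methodology.

    import pandas as pd

    def order_flow_correlation(trades: pd.DataFrame, bucket: str = "10s") -> pd.DataFrame:
        # `trades` is assumed to have columns: 'time' (timestamp), 'firm' (identifier)
        # and 'signed_volume' (positive for aggressive buys, negative for aggressive sells).
        flow = (
            trades
            .assign(bucket=trades["time"].dt.floor(bucket))
            .pivot_table(index="bucket", columns="firm",
                         values="signed_volume", aggfunc="sum")
            .fillna(0.0)  # firms with no trades in a bucket have zero net flow
        )
        # Positive off-diagonal entries mean two firms tend to trade in the same
        # direction within the same short interval.
        return flow.corr()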

Given the apparent tendency towards commonality in trading activity and trading direction among HFTs, we further examine whether periods of high HFT correlation are associated with price impacts that are subsequently reversed. Such reversals might be interpreted as evidence of high trade correlations leading to short-term price dislocations and excess volatility. However, we find that instances of correlated trading among HFTs are associated with a permanent price impact, whereas instances of correlated bank trading are, in fact, associated with future price reversals. We view this as evidence that the commonality of order flows in the cross-section of HFTs is the result of HFTs’ trades being informed and, as such, having the same sign at approximately the same time. In other words, HFTs appear to be collectively buying and selling at the “right” time. The results are also in agreement with the conclusions of Chaboud, Chiquoine, Hjalmarsson and Vega (2014), who find evidence of commonality among the trading strategies of algorithmic traders in the foreign exchange market, but who also find no evidence that such commonality creates price pressures and excess volatility that would be detrimental to market quality.
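As a rough illustration of the distinction the authors draw between a permanent impact and a reversal, the sketch below measures the log-price move over an event window and how much of it is subsequently unwound. It is a simplified stand-in for the paper’s econometric approach, with hypothetical inputs, not a reproduction of it.

    import numpy as np

    def impact_and_reversal(mid_prices: np.ndarray, start: int, end: int, horizon: int):
        # Log-price move during the event window...
        impact = np.log(mid_prices[end] / mid_prices[start])
        # ...and the drift over the following `horizon` observations.
        follow_on = np.log(mid_prices[end + horizon] / mid_prices[end])
        reversal_share = -follow_on / impact if impact != 0 else np.nan
        return impact, reversal_share

    # A reversal share near 0 points to a permanent, information-driven impact;
    # a share near 1 means the move was largely unwound - a short-lived dislocation.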

A final caveat is in order. The time period we examine is one of relative calm in the UK equity market. This means that additional research on the behaviour of HFTs, particularly during times of severe stress in equity and other markets, would be necessary in order to fully understand their role and impact on price efficiency.

DFA’s perspective is a little different. The underlying assumption in the paper is that more transactions mean greater market efficiency, and therefore that HFT is fine. We are not so sure. First, the market efficiency assumption should be questioned. Second, it appears that those without HFT lose out and so become second-class market participants – those with more money to invest in market systems can make differentially more profit. This actually undermines the concept of a fair and open market. We think HFT needs to be better controlled to avoid an HFT arms race in search of ever swifter transaction times. To an extent, therefore, the paper missed the point.