The Disruptive Power of Digital Currencies

The BIS Committee on Payments and Market Infrastructures has just published a report on digital currencies. New digital platforms have the potential to disrupt traditional payment mechanisms. Much of the innovation is emerging from non-bank sectors and in a devolved, decentralised, peer-to-peer mode, without connection to sovereign currencies or authority. So how should central banks respond?

One option is to consider using the technology itself to issue digital currencies. In a sense, central banks already issue “digital currency” in that reserve balances now only exist in electronic form and are liabilities of the central bank. The question is whether such digital liabilities should be issued using new technology and be made more widely available than at present. This raises a wide range of questions, including:

  • the impact on the payments system;
  • the privacy of transactions;
  • the impact on private sector innovation;
  • the impact on deposits held at commercial banks;
  • the impact on financial stability of making a risk-free digital asset more widely available;
  • the impact on the transmission of monetary policy;
  • the technology which would be deployed in such a system and the extent to which it could be decentralised; and
  • what type of entities would exist in such a system and how they should be regulated.

Within the central bank community, the Bank of Canada and the Bank of England have begun research into a number of these topics.

Overall, the report considers the possible implications of these innovations that are of interest to central banks. First, many of the risks that are relevant for e-money and other electronic payment instruments are also relevant for digital currencies as assets being used as a means of payment. Second, the development of distributed ledger technology is an innovation with potentially broad applications. Wider use of distributed ledgers by new entrants or incumbents could have implications extending beyond payments, including their possible adoption by some financial market infrastructures (FMIs), and more broadly by other networks in the financial system and the economy as a whole. Because of these considerations, the report recommends that central banks continue monitoring and analysing the implications of these developments, both in digital currencies and distributed ledger technology. DFA questions whether this “watch and monitor” response is a sufficient strategy.

Central banks typically take an interest in retail payments as part of their role in maintaining the stability and efficiency of the financial system and preserving confidence in their currencies. Innovations in retail payments can have important implications for safety and efficiency; accordingly, many central banks monitor these developments. The emergence of what are frequently referred to as “digital currencies” was noted in recent reports by the Committee on Payments and Market Infrastructures (CPMI) on innovations and non-banks in retail payments. A subgroup was formed within the CPMI Working Group on Retail Payments to undertake an analysis of such “currencies” and to prepare a report for the Committee.

The subgroup has identified three key aspects relating to the development of digital currencies. The first is the assets (such as bitcoins) featured in many digital currency schemes. These assets typically have some monetary characteristics (such as being used as a means of payment), but are not typically issued in or connected to a sovereign currency, are not a liability of any entity and are not backed by any authority. Furthermore, they have zero intrinsic value and, as a result, they derive value only from the belief that they might be exchanged for other goods or services, or a certain amount of sovereign currency, at a later point in time. The second key aspect is the way in which these digital currencies are transferred, typically via a built-in distributed ledger. This aspect can be viewed as the genuinely innovative element within digital currency schemes. The third aspect is the variety of third-party institutions, almost exclusively non-banks, which have been active in developing and operating digital currency and distributed ledger mechanisms. These three aspects characterise the types of digital currencies discussed in this report.

A range of factors are potentially relevant for the development and use of digital currencies and distributed ledgers. Similar to retail payment systems or payment instruments, network effects are important for digital currencies, and there are a range of features and issues that are likely to influence the extent to which these network effects may be realised. It has also been considered whether there may be gaps in traditional payment services that are or might be addressed by digital currency schemes. One potential source of advantage, for example, is that a digital currency has a global reach by design. Moreover, distributed ledgers may offer lower costs to end users compared with existing centralised arrangements for at least some types of transactions. Also relevant to the emergence of digital currency schemes are issues of security and trust, as regards the asset, the distributed ledger, and the entities offering intermediation services related to digital currencies.

Digital currencies and distributed ledgers are an innovation that could have a range of impacts on many areas, especially on payment systems and services. These impacts could include the disruption of existing business models and systems, as well as the emergence of new financial, economic and social interactions and linkages. Even if the current digital currency schemes do not persist, it is likely that other schemes based on the same underlying procedures and distributed ledger technology will continue to emerge and develop.

The asset aspect of digital currencies has some similarities with previous analysis carried out in other contexts (eg there is analytical work from the late 1990s on the development of e-money that could compete with central bank and commercial bank money). However, unlike traditional e-money, digital currencies are not a liability of an individual or institution, nor are they backed by an authority. Furthermore, they have zero intrinsic value and, as a result, they derive value only from the belief that they might be exchanged for other goods or services, or a certain amount of sovereign currency, at a later point in time. Accordingly, holders of digital currency may face substantially greater costs and losses associated with price and liquidity risk than holders of sovereign currency.

The genuinely innovative element seems to be the distributed ledger, especially in combination with digital currencies that are not tied to money denominated in any sovereign currency. The main innovation lies in the possibility of making peer-to-peer payments in a decentralised network in the absence of trust between the parties or in any other third party. Digital currencies and distributed ledgers are closely tied together in most schemes today, but this close integration is not strictly necessary, at least from a theoretical point of view.

This report describes a range of issues that affect digital currencies based on distributed ledgers. Some of these issues may work to limit the growth of these schemes, which could remain a niche product even in the long term. However, the arrangements also offer some interesting features from both demand side and supply side perspectives. These features may drive the development of the schemes and even lead to widespread acceptance if risks and other barriers are adequately addressed.

The emergence of distributed ledger technology could present a hypothetical challenge to central banks, not through replacing a central bank with some other kind of central body but mainly because it reduces the functions of a central body and, in an extreme case, may obviate the need for a central body entirely for certain functions. For example, settlement might no longer require a central ledger held by a central body if banks (or other entities) could agree on changes to a common ledger in a way that does not require a central record-keeper and allows each bank to hold a copy of the (distributed) common ledger. Similarly, in some extreme scenarios, the role of a central body that issues a sovereign currency could be diminished by protocols for issuing non-sovereign currencies that are not the liability of any central institution.
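The settlement scenario above can be sketched as a toy replicated ledger: each bank holds its own hash-chained copy of the record, and a payment is applied only when all copies agree on the current head. This is a deliberately naive illustration under assumed rules (unanimous agreement, no networking, invented names), not any real distributed ledger protocol.

```python
import hashlib
import json

class Ledger:
    """A toy hash-chained ledger; each participant holds a full copy."""

    def __init__(self):
        self.entries = []  # list of (payload, hash) pairs

    def head(self):
        # Hash of the latest entry, or a sentinel for an empty ledger.
        return self.entries[-1][1] if self.entries else "genesis"

    def append(self, payload):
        # Chain each entry to the previous one by hashing (prev_head, payload).
        record = json.dumps({"prev": self.head(), "payload": payload}, sort_keys=True)
        digest = hashlib.sha256(record.encode()).hexdigest()
        self.entries.append((payload, digest))
        return digest

def settle(banks, payment):
    """Naive 'consensus': apply the payment only if every copy has the same head."""
    heads = {bank.head() for bank in banks.values()}
    if len(heads) != 1:
        raise ValueError("ledger copies have diverged")
    for bank in banks.values():
        bank.append(payment)

banks = {name: Ledger() for name in ("A", "B", "C")}
settle(banks, {"from": "A", "to": "B", "amount": 100})
settle(banks, {"from": "B", "to": "C", "amount": 40})
# All copies now share the same head hash -- no central record-keeper involved.
assert len({bank.head() for bank in banks.values()}) == 1
```

The point of the sketch is the structural one made in the text: once every participant can hold and verify an identical copy, the central record-keeper's bookkeeping function disappears, even though real systems need a far more robust agreement mechanism than the unanimity check used here.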

There are different ways in which these systems might develop: either in isolation, as an alternative to existing payment systems and schemes, or in combination with existing systems or providers. These approaches would have different implications, but both could have significant effects on retail payment services and potentially on FMIs. There could also be potential effects on monetary policy or financial stability. However, for any of these implications to materialise, a substantial increase in the use of digital currencies and/or distributed ledgers would need to take place. Central banks could consider – as a potential policy response to these developments – investigating the potential uses of distributed ledgers in payment systems or other types of FMIs.

Proposed Basel Market Risk Framework Will Demand More Capital

Trading banks will find their capital requirements rising by more than 2%, according to the Basel Committee on Banking Supervision, which today published the results of its interim impact analysis of its fundamental review of the trading book. The report assesses the impact of proposed revisions to the market risk framework set out in two consultative documents published in October 2013 and December 2014. Further revisions to the market risk rules have since been made, and the Committee expects to finalise the standard around year-end.

The analysis was based on a sample of 44 banks (including 2 from Australia) that provided usable data for the study and assumed that the proposed market risk framework was fully in force as of 31 December 2014. It shows that the change in market risk capital charges would produce a 4.7% increase in the overall Basel III minimum capital requirement. When the bank with the largest value of market risk-weighted assets is excluded from the sample, the change in total market risk capital charges leads to a 2.3% increase in overall Basel III minimum regulatory capital.

Compared with the current market risk framework, the proposed standard would result in a weighted average increase of 74% in aggregate market risk capital. When measured as a simple average, the increase in the total market risk capital requirement is 41%. For the median bank in the same sample, the capital increase is 18%.
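These figures can be reconciled with a little arithmetic: a 74% rise in market risk capital moves total minimum capital by only 4.7% because market risk is a small slice of the total. The implied share computed below is a back-of-the-envelope inference from the reported numbers, not a figure the report states.

```python
# Back-of-the-envelope reconciliation of the reported figures.
# Assumption (not stated in the report): market risk is a minor share of
# total minimum capital, which is what lets a 74% rise in market risk
# charges translate into only a 4.7% rise overall.

overall_increase = 0.047      # reported rise in total Basel III minimum capital
market_risk_increase = 0.74   # reported weighted-average rise in market risk capital

# Implied share of market risk charges in current total minimum capital:
implied_share = overall_increase / market_risk_increase
print(f"implied market risk share of total capital: {implied_share:.1%}")
# -> roughly 6%, consistent with market risk being a small part of total requirements
```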

Compared with the current internally modelled approaches for market risk, the capital requirement under the proposed internally modelled approaches would result in an increase of 54%. For the median bank, the capital requirement under the proposed internally modelled approaches is 13% higher.

Compared with the current standardised approach for market risk, the capital requirement under the proposed standardised approach is 128% higher. For the median bank, the capital requirement under the proposed standardised approach is 51% higher.

Revisiting Three Intellectual Pillars of Monetary Policy Received Wisdom

In a really excellent speech at the Cato Institute, Claudio Borio, Head of the Monetary and Economic Department of the BIS, questions three deeply held beliefs that underpin current monetary policy received wisdom, citing Mark Twain: “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

The speech draws two conclusions. First, the well-known trend decline in real interest rates is, at least in part, a disequilibrium phenomenon, not consistent with lasting financial, macroeconomic and monetary stability, and one to which unusually easy monetary policy, spreading globally, has contributed.

Second, there is a need to adjust current monetary policy frameworks so that monetary policy plays a more active role in preventing systemic financial instability and its huge macroeconomic costs. This calls for taking financial booms and busts more systematically into account. Financial booms sap productivity by misallocating resources.

One slide in particular caught my attention. It is Table 1: Early warning indicators for banking distress – risks ahead.

The three beliefs he questions are worth thinking about:

  1. Is it appropriate to define equilibrium (or natural) rates as those consistent with output at potential and with stable prices (inflation)?
  2. Is it appropriate to think of money (monetary policy) as neutral, ie as having no impact on real outcomes, over medium- to long-term horizons relevant for policy – 10-20 years or so, if not longer?
  3. Is it appropriate to set policy on the presumption that deflations are always very costly?

He finds that there are good reasons to question these three deeply held beliefs underpinning monetary policy received wisdom.

First, defining equilibrium (or natural) rates purely in terms of the equality of actual and potential output and price stability in any given period is too narrow an approach. An equilibrium rate should also be consistent with sustainable financial and macroeconomic stability – two sides of the same coin. Here, he highlighted the role of financial booms and busts, or financial cycles.

Second, money (monetary policy) is not neutral over medium- to long-term horizons relevant for policy – 10–20 years or so, if not longer. This is precisely because it contributes to financial booms and busts, which give rise to long-lasting, if not permanent, economic costs. Here he highlighted the neglected impact of resource misallocations on productivity growth.

Finally, deflations are not always costly in terms of output. The evidence indicates that the link comes largely from the Great Depression and, even then, it disappears if one controls for asset price declines. Here he highlighted the costs of asset price, especially property price, declines and the distinction between supply-driven and demand-driven deflations.

Therefore, the long-term decline in real interest rates since at least the 1990s may well be, in part, a disequilibrium phenomenon, not consistent with lasting financial, macroeconomic and monetary stability. Here he highlighted the asymmetrical monetary policy response to financial booms and busts, which induces an easing bias over time.

There is a need to adjust monetary policy frameworks to take financial booms and busts systematically into account. This, in turn, would avoid that easing bias and the risk of a debt trap. Here he highlighted that it is imprudent to rely exclusively on macroprudential measures to constrain the buildup of financial imbalances. Macroprudential policy must be part of the answer, but it cannot be the whole answer.

Basel III Implementation In Australia: Slow But Sure?

A progress report (the ninth) on adoption of the Basel regulatory framework was issued today by BIS.  This report sets out the adoption status of Basel III regulations for each Basel Committee on Banking Supervision (BCBS) member jurisdiction as of end-September 2015. It updates the Committee’s previous progress reports, which have been published on a semiannual basis since October 2011.

Regarding the consistency of regulatory implementation, the Committee has published its assessment reports on 22 members – Australia, Brazil, Canada, China, nine members of the European Union, Hong Kong SAR, India, Japan, Saudi Arabia, Mexico, Singapore, South Africa, Switzerland and the United States – regarding their implementation of Basel III risk-based capital regulations, which are available on the Committee’s website. This includes all members that are home jurisdictions of global systemically important banks (G-SIBs). The Committee has also published five assessment reports (Hong Kong SAR, India, Saudi Arabia, Mexico and South Africa) on the domestic adoption of the Basel LCR standards. The assessments of Russia, Turkey, South Korea and Indonesia are under way, covering the consistency of implementation of both the risk-based capital and LCR standards. Further, preparatory work for the assessment of G-SIB standards started in mid-2015, and the assessment work itself will begin later this year. By September 2016, the Committee aims to have assessed the consistency of risk-based capital standards across all 27 member jurisdictions and the consistency of G-SIB standards across all five member jurisdictions that are home jurisdictions of G-SIBs.

The Basel III framework builds on and enhances the regulatory framework set out under Basel II and Basel 2.5. The structure of the attached table has been revamped (effective from October 2015) to monitor the adoption progress of all Basel III standards, which will come into effect by 2019. The monitoring table no longer includes the reporting columns for Basel II and 2.5, as almost all BCBS member jurisdictions have completed their regulatory adoption. The attached table therefore reviews members’ regulatory adoption of the following standards:

  • Basel III Capital: In December 2010, the Committee released Basel III, which set higher levels for capital requirements and introduced a new global liquidity framework. Committee members agreed to implement Basel III from 1 January 2013, subject to transitional and phase-in arrangements.
  1. Capital conservation buffer: The capital conservation buffer will be phased in between 1 January 2016 and year-end 2018, becoming fully effective on 1 January 2019.
  2. Countercyclical buffer: The countercyclical buffer will be phased in parallel to the capital conservation buffer between 1 January 2016 and year-end 2018, becoming fully effective on 1 January 2019.
  3. Capital requirements for equity investment in funds: In December 2013, the Committee issued the final standard for the treatment of banks’ investments in the equity of funds that are held in the banking book, which will take effect from 1 January 2017.
  4. Standardised approach for measuring counterparty credit risk exposures (SA-CCR): In March 2014, the Committee issued the final standard on SA-CCR, which will take effect from 1 January 2017. It will replace both the Current Exposure Method (CEM) and the Standardised Method (SM) in the capital adequacy framework, while the IMM (Internal Model Method) shortcut method will be eliminated from the framework.
  5. Securitisation framework: The Committee issued revisions to the securitisation framework in December 2014 to strengthen the capital standards for securitisation exposures held in the banking book, which will come into effect in January 2018.
  6. Capital requirements for bank exposures to central counterparties: In April 2014, the Committee issued the final standard for the capital treatment of bank exposures to central counterparties, which will come into effect on 1 January 2017.
  • Basel III leverage ratio: In January 2014, the Basel Committee issued the Basel III leverage ratio framework and disclosure requirements. Implementation began with bank-level reporting of the leverage ratio to national supervisors; public disclosure requirements started on 1 January 2015. The Committee will carefully monitor the impact of these disclosure requirements. Any final adjustments to the definition and calibration of the leverage ratio will be made by 2017, with a view to migrating to a Pillar 1 (minimum capital requirements) treatment on 1 January 2018 based on appropriate review and calibration.
  • Basel III liquidity coverage ratio (LCR): In January 2013, the Basel Committee issued the revised LCR. It came into effect on 1 January 2015 and is subject to a transitional arrangement before reaching full implementation on 1 January 2019.
  • Basel III net stable funding ratio (NSFR): In October 2014, the Basel Committee issued the final standard for the NSFR. In line with the timeline specified in the 2010 publication of the liquidity risk framework, the NSFR will become a minimum standard by 1 January 2018.
  • G-SIB framework: In July 2013, the Committee published an updated framework for the assessment methodology and higher loss absorbency requirements for G-SIBs. The requirements will be introduced on 1 January 2016 and become fully effective on 1 January 2019. To enable their timely implementation, national jurisdictions agreed to implement by 1 January 2014 the official regulations/legislation that establish the reporting and disclosure requirements.
  • D-SIB framework: In October 2012, the Committee issued a set of principles on the assessment methodology and the higher loss absorbency requirement for domestic systemically important banks (D-SIBs). Given that the D-SIB framework complements the G-SIB framework, the Committee believes it would be appropriate if banks identified as D-SIBs by their national authorities were required to comply with the principles in line with the phase-in arrangements for the G-SIB framework, ie from January 2016.
  • Pillar 3 disclosure requirements: In January 2015, the Basel Committee issued the final standard for revised Pillar 3 disclosure requirements, which will take effect from end-2016 (ie banks will be required to publish their first Pillar 3 report under the revised framework concurrently with their year-end 2016 financial report). The standard supersedes the existing Pillar 3 disclosure requirements first issued as part of the Basel II framework in 2004 and the Basel 2.5 revisions and enhancements introduced in 2009.
  • Large exposures framework: In April 2014, the Committee issued the final standard that sets out a supervisory framework for measuring and controlling large exposures, which will take effect from 1 January 2019.
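The conservation and countercyclical buffer phase-ins described above follow four equal annual steps. A minimal sketch of that schedule, using the standard Basel III transitional figures of 0.625% per year up to the 2.5% end-point:

```python
# Sketch of the capital conservation buffer phase-in described above.
# The 0.625% annual steps to a 2.5% end-point are the Basel III transitional
# figures; the countercyclical buffer is phased in on the same schedule
# (up to a jurisdiction-set maximum).

FULL_BUFFER = 0.025  # 2.5% of risk-weighted assets, fully effective 1 Jan 2019

def conservation_buffer(year):
    """Transitional capital conservation buffer applying from 1 January of `year`."""
    if year < 2016:
        return 0.0
    step = min(year - 2015, 4)  # four equal annual steps: 2016..2019
    return FULL_BUFFER * step / 4

for year in range(2015, 2020):
    print(year, f"{conservation_buffer(year):.3%}")
# 2016 -> 0.625%, 2017 -> 1.250%, 2018 -> 1.875%, 2019 -> 2.500%
```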

They published an assessment of Australia’s progress:

Still a long way to go; highly complex, and this is before Basel IV arrives. Is more complexity better?

Digital Darwinism and the financial industry

In a speech entitled Digital Darwinism and the financial industry – a supervisor’s perception, Dr Andreas Dombret, Member of the Executive Board of the Deutsche Bundesbank, says that, in the context of digital disruption, there is an “open end” to the evolution of the digital financial sector, and that if banks don’t think “digitally”, they will find it difficult to compete for digital customers.

Dinosaurs are an often-used means of illustrating how “Darwinism” works. We still don’t really know why those creatures – which ruled the earth for millions of years – suddenly became extinct. Some hold volcanic eruptions responsible; others cite meteorites or a sharp drop in sea level as a possible explanation. In any case, the assumption is that dinosaurs ceased to exist because they could not cope with their new environment or adapt quickly enough. Applying this diagnosis to the financial sector, where banks have reigned throughout the last few centuries, and supposing that the digital transition indeed constitutes a new environment for banks, one may pose a rather provocative question: are banks dinosaurs that will one day become extinct? You may guess that I do not share this doom scenario, so let me start out by describing my views on the evolution of the banking business.

The digital era may indeed be considered a new environment for banks. Digitalisation of the financial sector is an irreversible change that came about due to several factors.

First, the digitalisation of the financial sector has been fuelled by the development of highly effective, state-of-the-art technologies like broadband networks, advances in data processing and the ubiquity of smartphones. And there is a premise that is common to virtually all technological advances in market economies in the last few centuries: when a product becomes available, sooner or later it creates its own demand and puts market forces into action.

That means technological and social changes are intertwined. Bank customers are becoming increasingly open to digital banking. Think of innovative concepts such as online video consultation services, digital credit brokerage and the incorporation of social media into banking. Banking is still a “people business”, but it’s no longer reduced to proximate and personal relationships. So there is plenty of demand for use of the technological potential of digital banking: cheap and quick automated processes, solutions for complex financial issues, service tailored to customers’ individual needs.

A fundamental challenge that banks now face is that in some business fields, we may expect a sudden and rapid change of the game that is being played. One rather obvious case in point is that of payment services. Service providers such as PayPal or Apple have implemented payment systems geared to consumers in a digital environment. Once customers become used to a new way of paying, competitors offering similar products will certainly have difficulties trying to convince customers to switch providers. The pioneer may have a decisive advantage.

Now, in evolutionary terms, the question is whether banks can adapt quickly enough. Banks have used IT for decades, but these fast-moving times present wholly new possibilities for its use: P2P lending becomes feasible, internet and mobile applications are sprouting up and internet giants such as Google or Facebook are cultivating “big data” methods. These enterprises have grown up with – at times – entirely new perceptions of business, work and life. And they have the appropriate staff. That may be crucial. It is one thing to build new ideas, but quite another to incorporate them into the company DNA.

Traditional banks, on the other hand, typically do not have a digital DNA. Theirs is an analogue world in which they have refined their knowledge about banking over decades and built up a customer relationship based on trust. Think of areas like investment advice and corporate finance as well as banks’ own business of generating synergies between business strands. The question now is: what part of their knowledge is still valuable and what part do banks need to reframe?

However, we cannot predict how the financial sector will look in ten years’ time. There are just too many “unknown unknowns”. Still, there is a recurring fallacy that reduces evolution to a narrow one-way street: the idea that, if there are new market entrants whose businesses are well adjusted to the digital environment, banks should simply imitate their behaviour. But – to be clear – there is no one-size-fits-all strategy for digital banking. As in other industries, there will always be demand for more differentiated strategies, for example individual and personalised services as opposed to algorithm-based advice. Also, we should not be surprised to see the focus return to a key component of the banking business: establishing safety and trust.

Furthermore, the digital age does not simply redistribute market shares of a fixed revenue pie: there are also entirely new opportunities for desirable businesses.

Convenient banking is valuable. Banks could benefit from this, either through greater customer loyalty or through additional business volumes resulting from extra services. Win-win schemes are also conceivable in credit markets. “Big data” methods can generate highly informative individual risk profiles. This could enable banks to extend loans to private customers and small businesses which would otherwise not receive any financing. Even investment counselling could benefit. Video-based consulting, for example, does more than reflect the modern lifestyle of customers; it may also reduce costs for banks by rendering some branch offices unnecessary.
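As a toy illustration of the kind of granular risk profiling alluded to above, consider a simple logistic score over behavioural features. The features, weights and lending cut-off are all invented for the sketch; no real scoring model is implied.

```python
import math

# Toy "big data" risk profile: score a borrower from granular behavioural
# features rather than a thin credit file. All feature names, weights and
# the cut-off below are invented for illustration only.

WEIGHTS = {
    "months_of_stable_income": -0.08,  # longer income stability lowers risk
    "missed_payments_12m": 0.9,        # recent missed payments raise risk sharply
    "avg_account_balance_k": -0.05,    # higher average balances lower risk
}
BIAS = 0.5

def default_probability(features):
    """Logistic (sigmoid) link from a weighted feature sum to a probability."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

# A borrower with little credit history but good behavioural data:
thin_file_borrower = {
    "months_of_stable_income": 18,
    "missed_payments_12m": 0,
    "avg_account_balance_k": 4,
}
p = default_probability(thin_file_borrower)
# A lender might extend credit below some cut-off it sets for itself:
print(f"estimated default probability: {p:.1%}",
      "-> lend" if p < 0.3 else "-> decline")
```

The point is the one made in the speech: behavioural data can produce an individual risk estimate for borrowers a traditional file-based process would simply reject for lack of history.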

Other keywords of digital openings are “co-creation”, where customers participate in the development of products, and “multichannel banking”. But my aim today is not to present an all-encompassing overview on digital bank business ideas. Instead, let us move one step back and look at the bigger picture. Reshaping the financial sector doesn’t need to be left to new market entrants. This creative challenge can also be taken up by established banks. Supervisors, too, have an interest in seeing banks engage in innovation if this enhances the functionality of the financial sector and stabilises profitability in the medium and long run.

To sum up, there is an “open end” to the evolution of the digital financial sector. If you ask diehard evolutionists for a forecast of the future, they will merely point to a trial-and-error process that should eventually give us an answer. For an individual company, that is of course not helpful. As a banking supervisor, I am not inclined to attempt a market forecast. Still, there is a bottom line for banks from the line of thought I outlined earlier, namely that it is appropriate neither to blindly imitate nor to stick to old habits. The message to every player in the financial industry is simple: rather than being caught off-guard, banks have to participate actively in shaping future banking services. A new game is being played, and new strategies need to be developed and executed decisively.

3. Cyber risks – an evolutionary attachment to the digital bank

Along with the digitalisation of industries, there is another evolution that warrants our attention. It is a development that is neither intended by the visionaries and trailblazers of the digital world nor beneficial. I am referring to the evolution of cybercrime.

While we cannot predict how banking will look in ten or twenty years’ time, we can be almost certain that risks of fraud, theft and manipulation in banks through cyberspace will continue to rise. The reason is straightforward: digital channels can be used to steal a lot of assets with comparatively little effort today.

Nowadays, a large proportion of banks’ assets and value-generating capability is stored on hard drives and servers. The technical infrastructure facilitates the managing of bank accounts and grants access to money. But it also provides access to vast sources of data. There have been several incidents recently of truly large-scale data theft. Company secrets, too, are at stake. If, for example, the trading algorithms of your bank became known to others through illegal activities, they could be exploited in the market, causing huge losses to banks. In the same way, politically motivated acts of sabotage jeopardise trust in financial functions and integrity.

Looking at those on the other side of cybercrime – the potential attackers – they often have far more powerful weapons than before, along with convenient access through the internet. Targeted attacks on IT systems can originate from anywhere in the world. Hackers often need little more than a laptop with internet access.

Why do we have to expect a continuous evolution in this field? Attack vehicles like computer viruses differ widely and may target any chink in a bank’s defence, much as human viruses attack biological systems. The logic is that of the arms race between criminals and law enforcers that can be traced through human history, but which is now being fought with digital weapons. What makes this evolution more dangerous still is that we now face a highly complex digital world where progress is constantly being made in technologies and innovations. But, crucially, you cannot risk a trial-and-error process here. Once an easy point of attack is identified in the IT infrastructure, word will quickly spread and criminals from all over the world will try to exploit the weakness.

On top of this, we need to bear in mind that cyber and general IT risks are not only of a technical nature. The human factor often plays a crucial part. Employees may act in gross negligence, or they may be tricked by a Trojan horse or a phishing mail. In complex IT systems, even small system errors can quickly cause enormous damage. The error-prone human factor can only be eliminated by installing an appropriate system of controls and incentives. In today’s world, this is an important management task.

4. Adaptation as a managerial task

Before IT-related problems came to affect the very core of the digital economy, they were commonly shifted to the IT department. But this approach to IT risks is outdated. Awareness of digital risks and setting up a strategy are now a leader’s duties. If your business crucially relies on digital processing, any strategic decision at the company level requires knowledge and understanding of risks. Besides, we frequently observe that banks find it difficult to reorganise their IT systems. While a complete, “big bang” overhaul may be preferable, it often meets with resistance from many parts of the company. To avoid being locked into more and more outdated structures, banks should not just consider the expected short-term benefits when designing their IT strategy.

Furthermore, the digital world demands from banks’ managers something I would describe as unbiased attentiveness towards new technologies. If banks don’t think “digitally”, they’re going to find it difficult to compete for digital customers.

They have to reassess their client relations and even rethink different lifestyles and social trends. The individual needs and wishes of customers are more pivotal than ever before. Take a look at social networks, at online shopping or even at information research – consumers are already used to having their own needs catered for. Consequently, banks will have to get into the habit of looking at things from the customer’s perspective.

Let us also bear in mind that competition is becoming more global and more transparent, the competitors more diverse. In addition to FinTech companies, other industries with a strong IT focus are only one step away from the banking world. This means that the lines between industries are becoming blurred. Now more than ever, banks need to be aware of what the competition is doing so that they can review and refine their own strategies.

From an evolutionary perspective, adaptability is another essential attribute. The digital world welcomes experimentation, is prone to sudden trends, and is constantly changing. Although the banking industry may not always be subject to all of this constant movement, adaptability is definitely becoming more important. So a flexible IT infrastructure that supports adaptability, for example, will be vital. Business models can also be more open and flexible in structure. Just think of the “digital ecosystem” strategies banks are now deploying.

5. Towards a resilient sector

As a banking supervisor, I am wholeheartedly in favour of the goal of a stable sector. But this should not be understood as adopting a static view of stability. For a workable financial industry, what matters is not whether services are provided offline or online, by humans or by automated systems. Our yardstick needs to measure whether the sector continues to fulfil its duty towards the real economy, which is to transform risks and provide payment and other financial services. That is what is meant by a resilient financial sector.

To that end, we have to ensure there are no “dead ends” to the digital evolution in the financial industry. I refer to IT-related risks in particular. If we rely on computers and digitalised processes, we have to make sure that they are reliable and trustworthy. Sector-wide reputation and functionality are at stake. Nowadays, a customer’s personal payment information is stored not only at the bank but at a multitude of service providers and retailers as well. How can a bank ensure the safety of its payment services against a cyber-attack on a retailer’s network or on that of a third-party vendor? Combined efforts should be seen as insurance. You never know who will be the next victim of an attack. And attacks don’t stop at borders, so cooperation of this kind is also needed at the global level. In an interconnected and therefore interdependent financial sector, strengthening the common defence should also be in the banks’ very own interest.

6. Conclusion

Let me restate my views on “digital Darwinism”. Adaptation to a digitalised financial world does not simply require banks to develop new and ground-breaking ideas. It has more to do with a well-adjusted strategy – which means that it’s not just a race between development departments, but between leaders. As a supervisor, I therefore urge that we do not interpret digital competition as a race merely for the most advanced technologies, but for the right mix. This is why I am not in favour of comparing banks to dinosaurs. Traditional banks may typically have a pre-digital DNA, but they are capable of learning, adapting to a digital landscape and cooperating with technological pioneers. And each bank needs to find its own strategy. Banking business itself is as irreplaceable as ever before.

So what will not change? Business success will continue to hinge on entrepreneurial skills. In an increasingly digital finance sector, the role of banking will still be to serve the real economy. And banking is based on trust. To keep this in mind will be key to ensuring a thriving and stable financial sector.

Latest Global Basel III Monitoring Results Announced Today

The Bank for International Settlements has just released its latest Basel III Monitoring Report. This included five banks from Australia: four Group 1 banks and one Group 2 bank. The report presents the results of the Basel Committee’s latest Basel III monitoring exercise. The study is based on the rigorous reporting process set up by the Committee to periodically review the implications of the Basel III standards for banks. The results of previous exercises in this series were published in March 2015, September 2014, March 2014, September 2013, March 2013, September 2012 and April 2012.

Data have been provided for a total of 221 banks, comprising 100 large internationally active banks (“Group 1 banks”, defined as internationally active banks that have Tier 1 capital of more than €3 billion) and 121 Group 2 banks (ie representative of all other banks).

The results of the monitoring exercise assume that the final Basel III package is fully in force, based on data as of 31 December 2014. That is, they do not take account of the transitional arrangements set out in the Basel III framework, such as the gradual phase-in of deductions from regulatory capital. No assumptions were made about bank profitability or behavioural responses, such as changes in bank capital or balance sheet composition. For that reason, the results of the study are not comparable to industry estimates.

Data as of 31 December 2014 show that all large internationally active banks meet the Basel III risk-based capital minimum requirements as well as the Common Equity Tier 1 (CET1) target level of 7.0% (plus the surcharges on global systemically important banks – G-SIBs – as applicable). Between 30 June and 31 December 2014, Group 1 banks reduced their capital shortfalls relative to the higher Tier 1 and total capital target levels; the additional Tier 1 capital shortfall has decreased from €18.6 billion to €6.5 billion and the Tier 2 capital shortfall has decreased from €78.6 billion to €40.6 billion. As a point of reference, the sum of after-tax profits prior to distributions across the same sample of Group 1 banks for the six-month period ending 31 December 2014 was €228.1 billion.

Under the same assumptions, there is no capital shortfall for Group 2 banks included in the sample for the CET1 minimum of 4.5%. For a CET1 target level of 7.0%, the shortfall narrowed from €1.8 billion to €1.5 billion since the previous period.

The average CET1 capital ratios under the Basel III framework across the same sample of banks are 11.1% for Group 1 banks and 12.3% for Group 2 banks.

Basel III’s Liquidity Coverage Ratio (LCR) came into effect on 1 January 2015. The minimum requirement is set initially at 60% and will then rise in equal annual steps to reach 100% in 2019. The weighted average LCR for the Group 1 bank sample was 125% on 30 June 2014, up from 121% six months earlier. For Group 2 banks, the weighted average LCR was 144%, up from 140% six months earlier. For banks in the sample, 85% reported an LCR that met or exceeded 100%, while 98% reported an LCR at or above 60%.
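The phase-in arithmetic described above is straightforward: starting at 60% in 2015 and rising in equal annual steps to 100% in 2019 implies steps of 10 percentage points per year. A quick sketch of the schedule (derived only from the figures stated in the text):

```python
# LCR minimum phase-in: 60% on 1 January 2015, rising in equal annual
# steps to 100% in 2019 -- i.e. 10 percentage points per year.
start, end = 0.60, 1.00
years = list(range(2015, 2020))
step = (end - start) / (len(years) - 1)
schedule = {year: round(start + i * step, 2) for i, year in enumerate(years)}
print(schedule)  # {2015: 0.6, 2016: 0.7, 2017: 0.8, 2018: 0.9, 2019: 1.0}
```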

Basel III also includes a longer-term structural liquidity standard – the Net Stable Funding Ratio (NSFR) – which was finalised by the Basel Committee in October 2014. The weighted average NSFR for the Group 1 bank sample was 111% while for Group 2 banks the average NSFR was 114%. As of December 2014, 75% of the Group 1 banks and 85% of the Group 2 banks in the NSFR sample reported a ratio that met or exceeded 100%, while 92% of the Group 1 banks and 93% of the Group 2 banks reported an NSFR at or above 90%.

How Much Income is Used for Debt Payments?

Australian households have some of the highest debt service ratios (DSRs) in the world, according to a new database from the Bank for International Settlements. In this post we review the BIS analysis and discuss some of the results. They confirm earlier analysis that households here are highly leveraged and therefore at risk should interest rates rise, especially while incomes are static or falling in real terms.

We have charted the raw outputs for the main countries, focusing on households. Comparing the relative positions of Australia, the UK, Canada and the USA, Australia has a higher DSR than the other three, not least because we have so far not experienced a significant drop in house prices and mortgage lending is very high. This is consistent with previous analysis; the DSR is also the measure recommended for macroprudential purposes.

Looking more broadly at the 17 countries with comparable data, Australia sits fourth behind the Netherlands, Sweden and Denmark. Note also the significant gap between these four and the rest of the set.

By way of background, DSRs provide important information about the interactions between debt and the real economy, as they measure the amount of income used for interest payments and amortisations. Given this pivotal role, the BIS has started to produce and release aggregate DSRs for the total private non-financial sector for 32 countries from 1999 onwards. For the majority of countries, DSRs for the household and the non-financial corporate sectors are also available.

DSR is important, as it captures the share of income used for interest payments and amortisations. These debt-related flows are a direct result of previous borrowing decisions and often move slowly as they depend on the duration and other terms of credit contracts. They have a direct impact on borrowers’ budget constraints and thus affect spending. Despite this, in Australia there is no comprehensive reporting, just gross household debt and household repayments.

Since the DSR captures the link between debt-related payments and spending, it is a crucial variable for understanding the interactions between debt and the real economy. For instance, during financial booms, increases in asset prices boost the value of collateral, making borrowing easier. But more debt means higher debt service ratios, especially if interest rates rise. This constrains spending, which offsets the boost from new lending, and the boom runs out of steam at some point. After a financial bust, it takes time for debt service ratios, and thus spending, to normalise even if interest rates fall, as principal still needs to be paid down. In fact, the evolution of debt service burdens can explain the dynamics of US spending in the aftermath of the Great Financial Crisis fairly well. In addition, DSRs are also highly reliable early warning indicators of systemic banking crises.

BIS has developed a methodology to enable comparisons to be made across countries.  The DSR is defined as the ratio of interest payments plus amortisations to income. As such, the DSR provides a flow-to-flow comparison – the flow of debt service payments divided by the flow of income.
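As a minimal illustration of this flow-to-flow definition (the figures below are made up for illustration, not drawn from the BIS database):

```python
def debt_service_ratio(interest_payments: float, amortisations: float, income: float) -> float:
    """Flow-to-flow DSR: (interest payments + amortisations) / income."""
    return (interest_payments + amortisations) / income

# Hypothetical household: 25,000 in interest plus 15,000 in principal
# repayments during the year, against an annual income of 200,000.
dsr = debt_service_ratio(25_000, 15_000, 200_000)
print(f"DSR = {dsr:.1%}")  # DSR = 20.0%
```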

At the individual level, it is straightforward to determine the DSR. Households and firms know the amount of interest they pay on all their outstanding debts, how much debt they have to amortise per period and how much income they earn. But even so, difficulties can arise. Many contracts can be rolled over, so the effective period for repaying a particular loan can be much longer than the contractual maturity of the specific contract. Equally, some contracts allow for early repayments, so households or firms can amortise ahead of schedule. Given this, deriving aggregate DSRs from individual-level data does not necessarily lead to good estimates. And such data are rarely comprehensive, if available at all. For this reason, the BIS derives aggregate DSRs directly from aggregate data.

While interest payments and income are recorded in the national accounts, amortisation data are generally not available and hence present the main difficulty in deriving aggregate DSRs. To overcome this problem, the BIS follows an approach used by the Federal Reserve Board to construct debt service ratios for the household sector, which measures amortisations indirectly. It starts with the basic assumption that, for a given lending rate, debt service costs – interest payments and amortisations – on the aggregate debt stock are repaid in equal portions over the maturity of the loan (instalment loans). The justification for this assumption is that the differences between the repayment structures of individual loans tend to cancel out in the aggregate. The BIS also makes a range of assumptions about average loan maturities. You can read about the full methodology here.
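The instalment-loan assumption reduces to a standard annuity formula. A minimal sketch under that assumption (the function name and the example figures are mine, not the BIS’s; see the methodology for the exact treatment of rates and maturities):

```python
def aggregate_dsr(debt: float, income: float, rate: float, maturity: float) -> float:
    """Instalment-loan approximation of the aggregate DSR.

    Assumes the debt stock D is repaid in equal instalments over the
    remaining maturity s at average lending rate i, so annual debt
    service (interest + amortisation) is the annuity payment:

        DSR = D * i / (Y * (1 - (1 + i) ** -s))
    """
    annual_debt_service = debt * rate / (1 - (1 + rate) ** -maturity)
    return annual_debt_service / income

# Hypothetical aggregate: household debt at 150% of income, an average
# lending rate of 5%, and an assumed average remaining maturity of 18 years.
print(round(aggregate_dsr(debt=1.5, income=1.0, rate=0.05, maturity=18), 3))  # 0.128
```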


New report examines payment aspects of financial inclusion

The Committee on Payments and Market Infrastructures (CPMI) and the World Bank Group today issued a consultative report on Payment aspects of financial inclusion. The report examines demand- and supply-side factors affecting financial inclusion in the context of payment systems and services, and suggests measures to address these issues.

Financial inclusion efforts – from a payment perspective – should aim at achieving a number of objectives. Ideally, all individuals and businesses should have access to and be able to use at least one transaction account operated by a regulated payment service provider, to: (i) perform most, if not all, of their payment needs; (ii) safely store some value; and (iii) serve as a gateway to other financial services.

Benoît Cœuré, member of the Executive Board of the European Central Bank (ECB) and CPMI Chairman, says that, “With this report, the Committee on Payments and Market Infrastructures and the World Bank Group make an important contribution to improving financial inclusion. Financial inclusion efforts are beneficial not only for those that have no access to financial services, but also for the national payments infrastructure and, ultimately, the economy.”

Gloria M. Grandolini, Senior Director, Finance and Markets Global Practice of the World Bank Group, comments that, “This report will help us better understand how payment systems and services promote access to and effective usage of financial services. It provides an essential tool to meeting our ambitious goal of universal financial access for working-age adults by 2020.”

The report outlines seven guiding principles designed to assist countries that want to advance financial inclusion in their markets through payments: (i) commitment from public and private sector organisations; (ii) a robust legal and regulatory framework underpinning financial inclusion; (iii) safe, efficient and widely reachable financial and ICT infrastructures; (iv) transaction accounts and payment product offerings that effectively meet a broad range of transaction needs; (v) availability of a broad network of access points and interoperable access channels; (vi) effective financial literacy efforts; and (vii) the leveraging of large-volume and recurrent payment streams, including remittances, to advance financial inclusion objectives.

In summary, the CPMI-World Bank Group Task Force on the Payment Aspects of Financial Inclusion (PAFI) started its work in April 2014. The task force was mandated to examine demand- and supply-side factors affecting financial inclusion in the context of payment systems and services, and to suggest measures that could be taken to address these issues. This report is premised on two key points: (i) efficient, accessible, and safe retail payment systems and services are critical for greater financial inclusion; and (ii) a transaction account is an essential financial service in its own right and can also serve as a gateway to other financial services. For the purposes of this report, transaction accounts are defined as accounts (including e-money accounts) held with banks or other authorised and/or regulated payment service providers (PSPs), which can be used to make and receive payments and to store value.

The report is structured into five chapters. The first chapter provides an introduction and general overview, including a description of the PAFI Task Force and its mandate, a brief discussion of transaction accounts, and the barriers to the access and usage of such accounts. The second chapter gives an overview of the retail payments landscape from a financial inclusion perspective. The third chapter forms the core analytical portion of the report and outlines a framework for enabling access and usage of payment services by the financially excluded. Each component of this framework is discussed in detail in the report.

The fourth chapter of the report describes the key policy objectives when looking at financial inclusion from a payments perspective, and formulates a number of suggestions in the form of guiding principles and key actions for consideration. In this context, financial inclusion efforts undertaken from a payments angle should be aimed at achieving a number of objectives. Ideally, all individuals and micro- and some small-sized businesses – which are more likely to lack some of the basic financial services or be financially excluded than larger businesses – should be able to have access to and use at least one transaction account operated by a regulated payment service provider:
(i) to perform most, if not all, of their payment needs;
(ii) to safely store some value; and
(iii) to serve as a gateway to other financial services.

The guiding principles for achieving these objectives of improved access to and usage of transaction accounts are the following:
• Commitment from public and private sector organisations to broaden financial inclusion is explicit, strong and sustained over time.
• The legal and regulatory framework underpins financial inclusion by effectively addressing all relevant risks and by protecting consumers, while at the same time fostering innovation and competition.
• Robust, safe, efficient and widely reachable financial and ICT infrastructures are effective for the provision of transaction accounts services, and also support the provision of broader financial services.
• The transaction account and payment product offerings effectively meet a broad range of the target population’s transaction needs, at little or no cost.
• The usefulness of transaction accounts is augmented by a broad network of access points that also achieves wide geographical coverage, and by offering a variety of interoperable access channels.
• Individuals gain knowledge, through financial literacy efforts, of the benefits of adopting transaction accounts, how to use those accounts effectively for payment and store-of-value purposes, and how to access other financial services.
• Large-volume and recurrent payment streams, including remittances, are leveraged to advance financial inclusion objectives, namely by increasing the number of transaction accounts and stimulating the frequent usage of these accounts.

Finally, the fifth chapter of the report addresses a number of issues in connection with measuring the effectiveness of financial inclusion efforts in the context of payments and payment services, with a particular emphasis on transaction account adoption and usage.


Does Macroprudential Limit Risky Lending?

An interesting BIS working paper, “Higher Bank Capital Requirements and Mortgage Pricing: Evidence from the Countercyclical Capital Buffer (CCB)”, examines the impact of implementing the CCB on the mortgage market in Switzerland. Does the CCB have the potential to shift lending from less resilient to more resilient banks, and from riskier to less risky borrowers? The paper looks beyond simply controlling total credit growth. The authors conclude that the CCB does affect the composition of mortgage supply and raises the prices of riskier loans; in fact, banks try to pass on the extra capital costs of previously issued mortgages to new customers. However, it does not stop riskier lending, because the link between borrower risk characteristics (here, loan-to-value (LTV) ratios) and capital requirements is too weak to actively discourage banks from offering mortgages to high-LTV borrowers after the CCB is activated.

Macroprudential policies have recently attracted considerable attention. They aim both at strengthening the resilience of the financial system to adverse aggregate shocks and at actively limiting the build-up of financial risks, in the sense of “leaning against the financial cycle”. One reason for the appeal of such policies is that, by explicitly taking a system-wide perspective, they complement macroeconomic and prudential measures in seeking to address systemic risks arising from externalities (such as joint failures and procyclicality) that are not easily internalised by financial market participants themselves. Against this background, the new Basel III regulatory standards feature the Countercyclical Capital Buffer (CCB) as a dedicated macroprudential tool designed to protect the banking sector from the detrimental effects of the financial cycle.

We provide the first empirical analysis of the CCB based on data from Switzerland, which became the first country to activate such a buffer, on 13 February 2013. To reinforce banks’ defences against the build-up of systemic vulnerabilities, the activation of the CCB raised their regulatory capital requirements, thereby contributing to the sector’s overall resilience. However, little is known about the CCB’s contribution towards the second macroprudential objective: higher requirements might slow bank lending or alter the quality of loans during the boom and thereby enable policymakers to “lean against the financial cycle”. Up to now, policy debates have focused mainly on the quantity of aggregate credit growth. We aim to shift the focus of the debate towards quality – namely the composition of lenders and how tighter capital requirements interact with borrower risk characteristics. Does the CCB have the potential to shift lending from less resilient to more resilient banks, and from riskier to less risky borrowers? Our analysis advances the understanding of the mortgage supply-side channels through which the CCB can contribute to this second objective of macroprudential policy.

To answer these questions, we examine how the CCB affects the pricing of mortgages. Our unique dataset, obtained from an online mortgage platform, allows us to separate mortgage supply from demand: each mortgage request receives binding offers from several different banks, and each bank can offer mortgages to many different households with distinct borrower risk characteristics. To identify the CCB effect on mortgage supply, we exploit lagged bank balance sheet characteristics that might render a bank more sensitive to the regulatory design of the CCB. To examine whether risk-weighting schemes that link borrower risk characteristics to capital requirements do, in fact, amplify the CCB effect, we use the comprehensive information specified in the mortgage request. The procedures of the online mortgage platform ensure that banks submit independent offers that draw on precisely the same set of anonymised hard information observed by their competitors (and available to us), undistorted by any private or soft information.

Two sets of results stand out. First, the CCB affects the composition of mortgage supply. Once the activated CCB imposes higher capital requirements, capital-constrained banks with low capital cushions raise their mortgage rates relatively more than their competitors. Further, after the CCB is activated, specialised banks that operate a very mortgage-intensive business model also raise their mortgage rates to a greater degree in relative terms. In fact, the CCB applies to new mortgages as well as to the stock of all mortgages held on a bank’s balance sheet. Our results for specialised mortgage lenders thus suggest that banks try to pass on the extra capital costs of previously issued mortgages to new customers. Both insights are indicative of changes in the composition of mortgage supply. Based on the assumption that, ceteris paribus, households prefer lower mortgage rates over higher ones, we conclude that the CCB tends to shift new mortgage lending from relatively less well capitalised banks to relatively better capitalised ones, and from relatively more to relatively less mortgage-exposed banks. Both changes in the composition of mortgage supply are therefore broadly supportive of the second macroprudential objective, in that they tend to allocate new mortgage lending to banks that are more resilient.

Our second set of core findings incorporates the borrower side and the effectiveness of common risk-weighting schemes that translate borrower risk into bank capital requirements. We find that banks generally claim extra compensation for granting riskier mortgages (ie, by charging higher mortgage rates). However, these risk-weighting schemes do not appear to amplify the effect of the CCB on mortgage rates or mortgage creation. Apparently, the link between borrower risk characteristics (here, loan-to-value (LTV) ratios) and capital requirements is too weak to actively discourage banks from offering mortgages to high-LTV borrowers after the CCB is activated.

Our paper contributes to the literature in three different respects. First, our empirical setup allows us to advance the understanding of the effects of the CCB as a macroprudential policy tool, particularly in the context of Basel III. More generally, our insights also contribute to a better understanding of how higher capital requirements affect the pricing of loans to private households. Second, our dataset allows us to disentangle mortgage supply from mortgage demand. By merging bank-level information with the respective offers, we can attribute changes in the composition of mortgage supply to distinct bank balance sheet characteristics that shape a bank’s pricing of mortgages. These dimensions of our dataset set our approach apart from standard analyses based on mortgage contracts, which have a blind spot with respect to the spectrum of all offered (but non-concluded) rates. Third, our analysis informs the debate on the effectiveness of risk-weighting schemes, a standard concept in bank regulation.

Note that BIS Working Papers are written by members of the Monetary and Economic Department of the Bank for International Settlements, and from time to time by other economists, and are published by the Bank. The papers are on subjects of topical interest and are technical in character. The views expressed in them are those of their authors and not necessarily the views of the BIS.

Strengthening Macroprudential Policy in Europe

Speech by Mr Vítor Constâncio, Vice-President of the European Central Bank, at the Conference on “The macroprudential toolkit in Europe and credit flow restrictions.”

The current euro area environment, with policy rates required to stay low for a prolonged period of time and an apparent disconnect between the business and financial cycle, clearly points to a situation where monetary policy cannot deviate from price stability objectives to influence the financial cycle. This is the task of macroprudential policy. While acknowledged in principle, this fact has not yet been fully reflected in our policy frameworks.

In summary, two major moves are required. First, macroprudential policy must place greater emphasis on preventing large fluctuations in the financial cycle, rather than simply increasing resilience to shocks when they occur. In addition to the bank-side, capital-based measures that enhance banks’ resilience, borrower-based instruments (such as LTV or DSTI limits), which have proved more effective in curtailing excessive credit growth and can also be applied in a time-varying fashion, should gain more prominence. In particular, borrower-based measures should be properly embedded in European legislation, which is not the case at present. Second, a broader macroprudential toolkit is needed to address risks stemming from the shadow banking sector, given its increasing role in credit intermediation. This could involve measures such as redemption gates and loading fees to provide additional safeguards. Guided stress tests can provide comparable assessments of the health of individual institutions and of the resilience of the financial system as a whole. Appropriate policy responses to mitigate growing risks need, however, to be calibrated to ensure a contained impact on the supply of credit to the real economy.