
Thursday, May 5, 2011

Mapping and disseminating the risk topography

It is now widely acknowledged that arguably the biggest contributor to the sub-prime financial market crisis was the failure to anticipate and act on the build-up of risks across the sector. This was a failure at two levels - having access to information on how various cross-sectional risks evolve over time, and then acting on that information to mitigate the emerging risks.

Robert Shiller points to a 2010 paper by Donald L. Kohn, Matthew J. Eichner and Michael G. Palumbo, who argue that the underlying themes were similar to those of previous crises,

"Although the instruments and transactions most closely associated with the financial crisis of 2008 and 2009 were novel, the underlying themes that played out in the crisis were familiar from previous episodes: Competitive dynamics resulted in excessive leverage and risktaking by large, interconnected firms, in heavy reliance on short-term sources of funding to finance long-term and ultimately terribly illiquid positions, and in common exposures being shared by many major financial institutions."


Taking a cue from this, they point to the need for policy makers to have "better and earlier indications regarding these critical, and apparently recurring, core vulnerabilities in the financial system". In the case of the sub-prime crisis in particular, they point to two information failures. One was the failure to anticipate "the underlying credit risk associated with the rapid growth of home mortgages and a consequent increase in the vulnerability of borrowers to a downturn in home prices or incomes". The other was the inability to assess the growth of financial vulnerability outside the traditional banking sector because of "a greater reliance on short-term funding for longer-term financial instruments". They emphasise the need to fill these data gaps and to have real-time information on them, so that a comprehensive early warning system is in place.
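
The indicators they have in mind could be as simple as the two sketched below in Python - one tracking mortgage credit growth relative to household incomes, the other tracking the reliance of long-term assets on short-term funding. The data and thresholds are invented for illustration and are not drawn from the paper.

# Illustrative early-warning indicators; all data and thresholds are invented.

mortgage_credit = [100, 112, 128, 149, 175]    # index of outstanding home mortgages, yearly
household_income = [100, 103, 106, 109, 112]   # index of household income, yearly

credit_growth = mortgage_credit[-1] / mortgage_credit[-2] - 1
income_growth = household_income[-1] / household_income[-2] - 1
if credit_growth - income_growth > 0.05:       # assumed 5-percentage-point gap threshold
    print(f"warning: mortgage credit growth ({credit_growth:.1%}) "
          f"far outpaces income growth ({income_growth:.1%})")

# Maturity-mismatch indicator: share of long-term, illiquid assets financed
# with short-term funding outside the traditional banking sector.
long_term_assets = 800.0       # illustrative aggregate, $bn
short_term_funding = 560.0     # illustrative aggregate, $bn
mismatch_ratio = short_term_funding / long_term_assets
if mismatch_ratio > 0.5:       # assumed threshold
    print(f"warning: {mismatch_ratio:.0%} of long-term assets rely on short-term funding")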

The authors also argue that merely collecting data, or even analysing them, would not serve much purpose. It is critical that the analysis focus on the relevant areas - specific instruments and their transactions - and that its findings be presented in a way that evokes the desired response from all the relevant stakeholders. In this context, as a large number of researchers have argued, automatic stabilizing mechanisms (like, say, dynamic capital buffers and reserve requirements; a stylized sketch of one such rule follows the excerpt below) can eliminate the risk of stakeholders not acting (for whatever reason) even when faced with information on emerging risks. They write,

"More fundamental, in our view, is the need to use data in a different way — in a way that integrates the ongoing analysis of macro data to identify areas of interest with the development of highly specialized information to illuminate those areas, including the relevant instruments and transactional forms... We can easily imagine specifying ex ante a program of data collection that would look for vulnerabilities in the wrong place, particularly if the actual act of looking by macro- or microprudential supervisors causes the locus of activity to shift into a new shadow somewhere else."


In this context, in a recent working paper, Markus K. Brunnermeier, Gary Gorton, and Arvind Krishnamurthy have sought to identify the kinds of leverage and liquidity risk measurements that should be collected and how they should be interpreted in terms of modern financial theory to provide real-time decision support for financial market participants. They conceptualize and design a risk topography that outlines a data acquisition and dissemination process to inform policymakers, researchers and market participants about systemic risk. They write,

"Our approach emphasizes that systemic risk (i) cannot be detected based on measuring cash instruments, e.g., balance sheet items and income statement items; (ii) typically builds up in the background before materializing in a crisis; and (iii), is determined by market participants’ response to various shocks. We propose that regulators elicit from market participants their (partial equilibrium) risk as well as liquidity sensitivities with respect to major risk factors and liquidity scenarios. General equilibrium responses and economy-wide system effects can be calibrated using this panel data set."


One such excellent data representation technique for mapping the build-up of correlated risks is the heat map. HSBC researchers use heat maps to identify changes in the correlations between different asset classes.
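
A minimal sketch of such a heat map, using synthetic daily returns and the numpy and matplotlib libraries, might look like the following - the asset-class labels and the injected co-movement are purely illustrative and say nothing about HSBC's actual methodology.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
assets = ["Equities", "Credit", "Commodities", "EM FX", "Govt bonds"]
returns = rng.normal(size=(250, len(assets)))    # stand-in for daily returns
returns[:, 1] += 0.6 * returns[:, 0]             # inject some co-movement
returns[:, 3] += 0.4 * returns[:, 0]

window = returns[-60:]                           # most recent 60 trading days
corr = np.corrcoef(window, rowvar=False)         # pairwise correlation matrix

fig, ax = plt.subplots()
im = ax.imshow(corr, vmin=-1, vmax=1, cmap="RdYlGn_r")
ax.set_xticks(range(len(assets)))
ax.set_xticklabels(assets, rotation=45, ha="right")
ax.set_yticks(range(len(assets)))
ax.set_yticklabels(assets)
fig.colorbar(im, label="60-day return correlation")
plt.tight_layout()
plt.show()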
