Tuesday, March 20, 2018

The economic efficiency Vs social stability trade-off

The challenge posed by the rise of robots and the resultant automation is well known. Here is a nice summary of the evidence.
Multiple studies suggest this is just the beginning and that more pain lies ahead. A study by the real estate firm CBRE suggests 50% of today's occupations will be gone by 2020. Then there is one by Oxford in 2013 that forecasts 47% of jobs will be automated by 2034. Yet another study has found that only 13% of manufacturing job losses were due to trade; the rest happened due to automation. And to make things worse, a McKinsey study reckons 45% of knowledge work activity can be automated.
It is not hyperbole to describe automation as the apotheosis of the search for economic efficiency. Robots make no mistakes, are more adaptable, can work 24x7, are much cheaper than labour, pose no threat of unionisation, and so on. Robots are super-efficient.

But this pursuit of efficiency in the modern economy sits uneasily with another trend - declining productivity in the services sectors, those presumably most amenable to technological disruption and automation. 
Many of these services have seen an increase in their share of the US labor force. Noah Smith has a very good article which captures the dilemma posed by this apparent weakness of the services sectors,  
The U.S. economy is sending more and more people into the sectors where productivity is either growing slowly or even falling... Is the stampede of American workers into unproductive industries really a bad thing?

Most economists would answer “yes” -- if construction, health care, education and the rest became more productive, workers would be freed up to go do other, more productive things, perhaps in industries that don’t even exist yet. But it’s also possible that some of these workers would otherwise just choose not to work -- to sit in their parents’ basements and play video games -- or to try to strike it rich in black-market sectors like drug sales. It’s also possible that the economic pressures of automation and trade, combined with the difficulty of retraining for new careers, would be sending some of these workers onto the welfare rolls instead of into new, better jobs.
And this sobering conclusion is rarely discussed amid the mindless pursuit of efficiency,
So it’s possible that construction, health care, education and other industries are now functioning partly as giant make-work programs. Instead of giving a few people obviously useless jobs, this make-work system hides little bits of useless work in everybody’s job. That could be preserving the dignity of work for thousands, or even millions, of men and women standing around on construction sites, filling out paperwork in hospitals, or filing briefs for frivolous lawsuits. And that dignity, in turn, could be saving the U.S. from greater social unrest than it’s already experiencing.
In the efficiency- and evidence-maximising world-view that has gripped the worlds of business and academia respectively, the aforementioned would constitute an inefficient and therefore bad equilibrium. But when we step back and take a more comprehensive view, this may actually be a desirable situation. 

Change, especially by way of technological and social progress, is generally good. But such changes have an appropriate pace. Expedite the change and stresses will develop that disrupt the system, especially one with too many moving parts. There is no escape from the law of unintended consequences. Despite all its struggles, an organic evolution without too many mutations is the best response in such situations. The role of public policy should be confined to facilitating the process and mitigating the adverse consequences. 

This applies as much to a plunge towards automation and efficiency, as with the journey to become formal and shrink informality, or embrace digital technologies to reduce corruption. Press the pedal too early and too fast, and breakdowns or crashes are inevitable. 

Sunday, March 18, 2018

Weekend reading links

1. The Economist on the decline of publicly listed companies in the US,
According to Jay Ritter of the University of Florida, the number of publicly listed companies peaked in 1997 at 8,491 (see chart). By 2017 it had slumped to 4,496... Mr Ritter attributes much of the decline in the number of companies that are listed to the difficulty of being a small public company... listing requirements have become more burdensome over time. For example, he notes that the prospectus for Apple Computer’s public offering in 1980 ran to a mere 47 pages and listed no risk factors, despite its novel product, inexperienced leaders and formidable competitors. The prospectus for Blue Apron, a meal-delivery company that listed last year, weighed in at 219 pages, with 33 devoted to risks, presumably intended to pre-empt litigation. One of those risks was the possibility that Blue Apron would not “cost-effectively acquire new customers”.
2. Staying with the decline of public markets, Craig Doidge, Kathleen M. Kahle, G. Andrew Karolyi, and René M. Stulz have a paper which analyses the trends in US equity markets. Their findings are striking,
Since reaching a peak in 1997, the number of listed firms in the U.S. has fallen in every year but one. During this same period, public firms have been net purchasers of $3.6 trillion of equity (in 2015 dollars) rather than net issuers. The propensity to be listed is lower across all firm size groups, but more so among firms with less than 5,000 employees. Relative to other countries, the U.S. now has abnormally few listed firms. Because markets have become unattractive to small firms, existing listed firms are larger and older. We argue that the importance of intangible investment has grown but that public markets are not well-suited for young, R&D-intensive companies. Since there is abundant capital available to such firms without going public, they have little incentive to do so until they reach the point in their lifecycle where they focus more on payouts than on raising capital.
But this trend may be unique to the Wall Street capitalism that the US follows. 

The challenge posed by intangible assets intersects with both the limitations of prevailing accounting practices and the excessive transparency of disclosure requirements, 
Public markets are better suited for firms with mostly tangible assets than for firms with mostly intangible assets. This is especially true when the usefulness of the intangible assets has yet to be proven on a large scale. Sometimes the market is extremely optimistic about some intangible assets, which confers a window of opportunity on firms with such assets to go public. But otherwise, firms with unproven intangible assets may very well be better off to fund themselves privately. Accounting information conveyed by U.S. GAAP for such firms is of limited use because GAAP treats investments in intangible assets mostly as expenses, so that these assets may very well not show up on firms’ balance sheets. Private funding allows firms to convey information about intangible assets more directly to potential investors who often have specialized knowledge, something that they could not convey publicly... The issue with disclosure of intangible assets is not what firms have to disclose. Rather, it has to do with the nature of the intangible assets they need to disclose. Once an idea is made public it becomes possible for other firms to use it... Investment in intangible assets is highly sensitive to the legal environment in which a firm operates and to the pace of financial development it experiences. A plant is hard to steal. A new idea is not...
As intangible assets continue to increase in importance, it should not surprise us to see a further eclipse of public markets. This stalling of public equity market development should be more pronounced in a country like the U.S., where intangible assets are relatively more important for the corporate sector... this evolution also reflects that U.S. financial development has evolved in such a way that some types of firms can be financed more efficiently through private sources than through public capital markets because the intrinsic properties of intangible assets make it harder for them to be financed in public markets. No deregulatory action is likely to restore the public markets in this case. Instead, we should focus on creating a fertile ground for investment in intangible assets by having appropriate laws, appropriate financing mechanisms, and maybe new types of exchange markets, as these assets appear to be the way of the future for corporations.
3. Another paper by René M. Stulz, with Söhnke M. Bartram and Gregory W. Brown, explores another consequence of the reduction in listed companies - correlatedness among stocks,
Since 1965, average idiosyncratic risk (IR) has never been lower than in recent years. In contrast to the high IR in the late 1990s that has drawn considerable attention in the literature, average market-model IR is 44% lower in 2013-2017 than in 1996-2000. Macroeconomic variables help explain why IR is lower, but using only macroeconomic variables leads to large prediction errors compared to using only firm-level variables. As a result of the dramatic change in the number and composition of listed firms since the late 1990s, listed firms are larger and older. Larger and older firms have lower idiosyncratic risk. Models that use firm characteristics to predict firm-level idiosyncratic risk estimated over 1963-2012 can largely or completely explain why IR is low over 2013-2017. The same changes that bring about historically low IR lead to unusually high market-model R-squareds.
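For readers unfamiliar with the measure in the abstract, market-model idiosyncratic risk is simply the volatility of the residual left over after regressing a stock's returns on the market's; the market-model R-squared is the share of variance the market explains. A minimal sketch (the function name and the synthetic return series are my own illustration, not the paper's data or code):

```python
import numpy as np

def market_model_ir(stock_ret, market_ret):
    """Idiosyncratic risk (IR): the standard deviation of the residuals
    from an OLS regression of a stock's returns on the market's returns."""
    X = np.column_stack([np.ones_like(market_ret), market_ret])
    beta, *_ = np.linalg.lstsq(X, stock_ret, rcond=None)
    resid = stock_ret - X @ beta
    ir = resid.std(ddof=2)                   # residual (firm-specific) volatility
    r2 = 1 - resid.var() / stock_ret.var()   # market-model R-squared
    return ir, r2

# Synthetic daily returns: a beta-1.2 stock with modest firm-specific noise.
# Low firm-specific noise implies low IR and a high R-squared, which is the
# paper's point about today's larger, older listed firms.
rng = np.random.default_rng(0)
mkt = rng.normal(0.0005, 0.01, 1000)
stock = 0.0002 + 1.2 * mkt + rng.normal(0, 0.005, 1000)
ir, r2 = market_model_ir(stock, mkt)
```

With these inputs the recovered IR is close to the injected noise volatility of 0.005, and the R-squared is well above 0.8, since the market factor dominates.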
4. The Times reports on California's aggressive embrace of transit-oriented development (TOD) by way of a legislative Bill to allow eight-storey buildings around transit stations even if local communities object. The Bill proposes to allow apartment buildings of up to 85 feet within a half-mile of train stations and a quarter-mile of high-frequency bus stops. It would overcome one of the biggest stumbling blocks to increased densification in the region, entrenched local opposition.

5. Noah Smith points again to the rising business concentration in the US economy.
He describes the resultant dynamics as a "toxic cycle",
As industries grow more concentrated, dominant companies become a bigger piece of the stock market, and their profit margins push stock valuations higher. Politicians naturally will be less willing to take steps to make markets more competitive, allowing superstar companies to become even more powerful. All the while, retirement accounts do OK, but workers’ wages and the economy suffer from decreasing competition.
6. Alec Schierenbeck makes the case for a progressive approach to the imposition of all forms of financial penalties. In simple terms, people should be made to pay fines based on their respective income levels. He argues that scaled fines, like those in Finland and Argentina, which have had them for more than 100 years, are more equitable and have greater deterrence value.
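The mechanics of such a "day-fine" system are simple enough to sketch. The structure below follows the general Finnish approach (a daily rate tied to income net of a living-cost deduction, multiplied by a severity-based number of days), but every parameter value is purely illustrative and not any country's actual statute:

```python
def scaled_fine(monthly_net_income, severity_days,
                deduction=255.0, divisor=60, floor=6.0):
    """Stylised day-fine: the daily rate scales with income (net of a
    living-cost deduction), while offence severity fixes the number of
    day-fine units. Parameter values are illustrative only."""
    daily_rate = max((monthly_net_income - deduction) / divisor, floor)
    return round(daily_rate * severity_days, 2)

# The same offence (say, 12 day-fine units for speeding) costs a low
# earner far less in absolute terms than a high earner, while pinching
# both proportionately.
low = scaled_fine(1500, 12)    # modest earner
high = scaled_fine(15000, 12)  # high earner, same offence
```

The point of the design is that deterrence is held roughly constant across income levels: a flat fine that stings a low earner is a rounding error for a rich one.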

7. One more consequence of quantitative easing - rising property prices in the world's largest cities. FT writes,

Over the past 10 years, the life-cycles of global cities such as London, New York and Sydney start to look very similar. They begin with central banks cutting rates; then foreign buyers are welcomed in, prices go up, high-end homes are built, capital appreciation drops and then cities are left with a lot of stock which is too expensive to sell.

8. Finally, from a nice FT essay on long-haul flights, to put air transportation in perspective,
The world has never been smaller, as it spins beneath a web of flight paths; at any one time, there are an average of almost 10,000 aeroplanes in the air, carrying 1.2m people between countries and continents at more than 500mph.
Yes, 1.2 million people in the air at any time! 

Thursday, March 15, 2018

Two graphs on India's credit market

It is true that the share of incremental credit flows to non-financial corporates is nowadays more or less equally split between bank and non-bank sources.

But, among all its peers, the share of the stock of private non-financial sector credit coming from banks is the highest in India. In fact, just about 5% of it comes from non-bank sources. No other major country has such a skewed distribution.

Further, the stock of credit to non-financial sectors from all sectors at market value as a share of GDP has hardly changed in India over the past 16 years and is well below the average for emerging economies.

Wednesday, March 14, 2018

Do we need evidence on the efficacy of rural roads and electricity?

This paper on a rural electrification program in Kenya finds,
We do not find meaningful medium-term impacts on economic, health, and educational outcomes nor evidence of spillovers to unconnected households. These results suggest that current efforts to increase residential electrification in rural Kenya may reduce social welfare. 
This paper on India's rural roads construction program finds,
There are no major changes in consumption, assets or agricultural outcomes, and nonfarm employment in the village expands only slightly. Even with better market connections, remote areas may continue to lag in economic opportunities.
The fact that we are trying to generate evidence on rural roads and electricity baffles me. What is it that we take away from such papers? What is it that the "headline readers" among development professionals would take away?

Is it that the high upfront investments required for any kind of rural infrastructure (since it cannot leverage economies of scale) are social surplus reducing, and therefore undesirable (in econ-speak)? Or is it that grid electrification/BT roads are not cost-effective compared to alternatives? Or that rural infra works have leakages which make them social surplus reducing or engender incentive distortions? Or that the measurement approach the researchers take is limited in that it cannot capture all the potential general equilibrium effects - after all, without electricity and roads you cannot have a life of modernity, which I guess is a salient material objective of development? Or is it that developing countries should make choices between roads and electricity for their villages on the one hand, and nifty innovations like deworming, nudges, shiny technology apps, micro-insurance, self-help groups, cash transfers, and so on, on the other? Or is it that the short-term benefits of roads and power are small - and if so, what about the long-term benefits? 

Note that neither paper even qualifies its findings with such a preface.

In fact, a cursory reading of the abstract, which is what happens most often, can easily leave one with the takeaway that rural roads and rural electrification are a bad use of public money. We only need to look at how much this one work contributed to misleading the fiscal austerity debate. Clearly, this time is no different. No pun intended.

A more appropriate response to such papers comes from Lant Pritchett's description of this kind of work as "kinky development",
What the World Bank chose to highlight in its official publicity about Dr. Jim Kim’s visit (to Somalia) was that it had figured out a way to use mobile phone surveys to track the poverty status of people in Somalia on a quarterly basis. Imagine the joy and celebration among Somalis to know that the World Bank was going to promote Somalia’s national development not with a port upgrade, or a road or electricity or water, or even a school or a clinic, but by being able to track and tell them every quarter just how poor they really are—something I suspect they know quite well already...
Perhaps promoting energy source diversification is why President Obama, while touring a power plant in Africa, thought it politically expedient to promote the Soccket ball. For those of you who still have not been introduced to this technological marvel, the Soccket ball is a soccer ball containing a battery that is charged by the kinetic energy of being kicked. This contraption is perhaps one of the best illustrations of the gap between development realities (the average Ethiopian consumes 52 kwh of electricity and the average American 13,246 kwh) and the “solutions” being proposed by the world’s elite: ban coal and limit hydro and if Africans want power, let them kick some soccer balls round.
This nails it,
In another 2013 Center for Global Development study, Benjamin Leo used the authoritative Afrobarometer and Latinobarometer surveys to document the discrepancy between poor country citizen preferences and U.S. foreign assistance allocations. In Africa, surveys asked the question “In your opinion, what are the most important problems facing this country that government should address?” and grouped the responses into eight broad categories. In Africa, 71 percent mention jobs/income among their top three problems, 52 percent mention infrastructure, and 63 percent name either jobs/income, infrastructure or economic/financial policies as their priority. Independently of what we may think African priorities ought to be, only seven percent named health, four percent education, and one percent governance as among the top three problems the government should address. Yet of American assistance to Africa from all agencies (e.g. USAID, PEPFAR, and MCC), only six percent goes to jobs/income and only 16 percent to jobs/income and infrastructure. Fully 60 percent of American assistance goes to areas that the Africans surveyed think are distinctly lower-tier priorities.

Monday, March 12, 2018

The implementation validity problem with RCTs

Randomised Control Trials (RCTs) are extensively used in development today. Funding decisions by multilaterals and other donors are swayed by evidence from RCTs. In fact, in certain rarefied confines of international development, RCTs have become the definition of evidence itself. 

A typical RCT is a small experimental pilot, with the smallest sample size consistent with statistical requirements, done under the supervision of some principal investigators and with the field support of smart and committed research and field assistants. This poses a problem.

How do we separate these two effects?
  1. The immense energies of reputed researchers and their committed and passionate young RAs invested in protecting the integrity/fidelity of the experiment
  2. The contribution of Sl No 1 (which would be absent in business-as-usual implementation) to the experiment's effective implementation. 
In other words, how do we evaluate the treatment (or the innovation being proposed) in a business-as-usual environment?

An RCT typically establishes the efficacy of the treatment. But it does not tell much about its effectiveness, a measure of both the efficacy and implementation fidelity.

This assumes great significance since the same innovation or treatment is generally implemented through government systems, which are notoriously enfeebled. In fact, given the weak state capacity, trying out innovations whose efficacy has been established through RCTs is akin to pumping all kinds of exotically engineered liquids through pipes which leak everywhere.

It is also the reason why practitioners are lukewarm to many headline RCT findings which generate intense interest among academics and the global development talkshops.

Should we call this the implementation validity problem?

It is surprising that while papers and books have been written about the internal and external validity problems associated with RCTs, this arguably more important challenge, given the weak state capacity in all the implementation environments, hardly gets a mention. 

Saturday, March 10, 2018

Weekend reading links

1. Is there a bigger endorsement of the death of PPPs and the return to nationalisation than this,
A recent poll by the Legatum Institute found that 83 per cent of respondents favoured renationalising the water industry that Margaret Thatcher, then prime minister, sold in 1989. For energy and the railways, 77 per cent and 76 per cent respectively backed the reversal of their privatisations.
Another interesting snippet from the article is that three-quarters of the UK's train operating companies, across its 20 franchises, are foreign state-owned firms.
Talk about the privatisation of rail in one country to the public operators of other countries! Are the most efficient rail operators state-owned?

2. Rajan Govil's assessment of the Indian economy is not very promising,
The government’s expenditure policy does not appear to be conducive to increasing investment or potential GDP in the near or medium term. The central government’s overall capital expenditure declined from 1.9% of GDP to 1.6% of GDP in 2017-18 and is expected to stay at 1.6% in 2018-19 at a time when investment has stagnated. Additionally, not all of the capital expenditures are for investment—some of these are for bank recapitalization. Moreover, expenditure on “social services”, which include education, public health, water supply and sanitation, has been reduced progressively from 0.61% of GDP in 2016-17 to 0.59% of GDP in 2017-18 (revised estimate) and finally to 0.57% of GDP in 2018-19. Current expenditure for the Central government increased in 2017-18 by 0.5% to 11.6% of GDP from 2016-17 and it would be very difficult in an election year for this to be reduced even to 11.4% of GDP in 2018-19 as per budget estimates. At the same time, the government wants to provide a higher subsidy to the farmers as well that might prove to be too costly.
3. Underlining the importance of public subsidy for urban mass transit, the FT reports that Transport for London (TfL) - the public body formed in 2000 which oversees Tube, overground trains, and buses - will run an operating deficit of close to £1 billion in 2018-19.

4. Lucas Chancel busts three myths of globalisation.

Globalisation has increased inequality,
The top 1% income share rose from 7% to 22% in India, and 6% to 14% in China between 1980 and 2016... Between 1980 and 2016, inequality between the world’s citizens increased, despite strong growth in emerging markets. Indeed, the share of global income accrued by the richest 1%, grew from 16% in 1980 to 20% by 2016. Meanwhile the income share of the poorest 50% hovered around 9%. The top 1% – individuals earning more than $13,500 per month – globally captured twice as much income growth as the bottom 50% of the world population over this period.
Income doesn't trickle down - high growth at the top is necessary to achieve some growth at the bottom,
When we compare Europe with the U.S., or China with India, it is clear that countries that experienced a higher rise in inequality were not better at lifting the incomes of their poorest citizens. Indeed, the U.S. is the extreme counterargument to the myth of trickle down: while incomes grew by more than 600% for the top 0.001% of Americans since 1980, the bottom half of the population was actually shut off from economic growth, with a close to zero rise in their yearly income. In Europe, growth among the top 0.001% was five times lower than in the U.S., but the poorest half of the population fared much better, experiencing a 26% growth in their average incomes. Despite having a consistently higher growth rate since 1980, the rise of inequality in China was much more moderate than in India. As a result, China was able to lift the incomes of the poorest half of the population at a rate that was four times faster than in India, enabling greater poverty reduction.
Policy, and not technology, is responsible for inequality,
The U.S. and Europe, for instance, had similar population size and average income in 1980 — as well as analogous inequality levels. Both regions have also faced similar exposure to international markets and new technologies since, but their inequality trajectories have radically diverged. In the U.S., the bottom 50% income share decreased from 20% to 10% today, whereas in Europe it decreased from 24% to 22%... After the neoliberal policy shift of the early 1980s, Europe resisted the impulse to turn its market economy into a market society more than the US — evidenced by differences on key policy areas concerning inequality. The progressivity of the tax code — how much more the rich pay as a percentage — was seriously undermined in the U.S., but much less so in continental Europe. The U.S. had the highest minimum wage of the world in the 1960s, but it has since decreased by 30%, whereas in France, the minimum wage has risen 300%. Access to higher education is costly and highly unequal in the U.S., whereas it is free in several European countries. Indeed, when Bavarian policymakers tried to introduce small university fees in the late 2000s, a referendum invalidated the decision. Health systems also provide universal access to good-quality healthcare in most European countries, while millions of Americans do not have access to healthcare plans.
5. Fascinating Businessweek essay on Chinese outsourcing of textile manufacturing to Ethiopia. Buoyed by fiscal concessions in newly established industrial estates, Chinese companies have been flocking to invest in the country. China gave the country $10.7 bn in loans over the 2010-15 period.

6. I have been a China pragmatist, not believing the bears despite all the economic pressure points building up. But I am inclined to agree with Nitin Pai about the risk that a difficult period for China is beginning in the aftermath of Xi Jinping's ascension as a dictator. There is some inherent strength to institutional consensus of whatever kind. It can hold systems together against centrifugal forces. But once the institutional glue is replaced by that of an individual, howsoever impregnable he may look at present, the system is one crisis away from threatening to unravel.

7. Latin American bureaucracies are going through their version of decision paralysis. The trigger has been the Lava Jato investigations involving political influence buying by large corporates, especially infrastructure contractors,  
A big setback was the Lava Jato (Car Wash) investigation, which began as a money-laundering case in Brazil and has engulfed the governments of a dozen Latin American countries. Odebrecht, a Brazilian firm that built highways, dams, power plants and sanitation facilities across the region, admitted to paying $788m in bribes. Its money financed political campaigns, including those of Colombia’s president, Juan Manuel Santos and Juan Carlos Varela, now Panama’s president. Pedro Pablo Kuczynski, Peru's President, has admitted that companies linked to him have taken (legal) payments from Odebrecht. The scandal has left a trail of unfinished projects, frightened politicians and bureaucrats, and wary bankers. A $7bn contract with Odebrecht to build a pipeline to transport natural gas from the Amazon basin across the Andes to Peru’s coast has been annulled and work has been suspended. Ruta del Sol 2, a 500km (300-mile) stretch of highway to help connect Bogotá to Colombia’s Caribbean coast, has stalled. Panama’s government cancelled a contract with Odebrecht for a $1bn hydroelectric project. Mexico’s biggest scheme, a new airport near the capital, has been plagued by corruption allegations. Andrés Manuel López Obrador, the front-runner in Mexico’s presidential election, scheduled for July 1st, has threatened to scrap it.
And the modus operandi of PPPs remains the same,
Public-private partnerships (PPPs) are open to abuse by construction firms such as Odebrecht, which make low bids to secure contracts and then renegotiate them to push up the cost, often by bribing a politician or two. More than three-quarters of Latin American PPP contracts in transport have been renegotiated within about three years of signing, according to José Luis Guasch, a professor of economics at the University of California.

Friday, March 9, 2018

The psychology of the financial markets

Consider this. Someone scares you by inflating the likelihood of an imminent disaster. If the disaster does not materialise, you are relieved and happy. But does the happiness have any real basis?

The same applies to financial markets. They respond favourably to positive news about the economy. But it is perhaps not incorrect to say that they respond even more positively, in relief, when an anticipated negative development does not materialise. Consider the following instances,
  1. The collapse of global financial markets in the aftermath of the GFC and its impact on the economy in the form of a repeat of the Great Depression
  2. Catastrophe in Europe with the Greek and Irish crisis spilling over to Italy and Spain, thereby causing the unravelling of the EU itself
  3. The debt-bomb ticking in China would bring down the entire economy
  4. The end of commodity cycle, global economic slowdown, and imminent collapse of China would herald the end of the emerging market story
  5. The exit from quantitative easing would lead to a rise in rates and devastate debt-laden governments, corporate sector and households
  6. World economy has entered a deflationary loop and negative rates are here to stay
  7. In the aftermath of Brexit, far-right parties will emerge as important players in the political scene across Europe
  8. The Trump Presidency will lead to protectionism and trade wars, exit from NAFTA, American isolationism, and global economic collapse
It cannot be denied that there was a likelihood of each one of the above, and the consequences could have been bad. But what is debatable is whether they were as grave and imminent as they were made out to be by public commentators and academic scholars. I am inclined to believe that these scenarios were painted as doomsday prophecies by the financial press and opinion makers.

These narratives shaped expectations, and prayers that they not materialise. In the circumstances, once the likelihood of their incidence declined, markets responded with relief. In fact, once the danger passed, markets rebounded excessively. 

Over the last few years, each one of these dangers has receded, thereby boosting market confidence and the associated animal spirits. The extended bond and equity market booms owe a lot to the animal spirits engendered by the market's relief at having avoided these dangers. But do they have any real basis?

Tuesday, March 6, 2018

The Great Indian Banking Cleanup?

Tamal Bandyopadhyay has an outstanding chronicle of India's banking sector saga. What stands out is the progressive evolution and tightening of the process of recognition, 
the banks have been directed to do many things—ranging from integrating the core banking system with SWIFT to checking all bad loans worth Rs50 crore and more for possible frauds to consolidating their foreign operations, among others—to get their house in order... (under) the 12-February midnight directive of RBI... all existing frameworks for addressing stressed assets have been withdrawn and the joint lenders’ forum (JLF), an institutional mechanism that was overseeing them, has been dismantled. Now, the banks have no choice but to classify all large loans worth at least Rs2,000 crore as non-performing assets (NPAs) immediately when they restructure it. The clock started ticking from 1 March 2018. Such an NPA should be resolved within 180 days, failing which the account gets referred to the Insolvency and Bankruptcy Code (IBC) court. Simply put, when a borrower fails to pay a bank loan in time, it becomes a defaulter, unlike in the past when the account was classified as “stressed” – often an excuse for the banks to postpone the inevitable...

The war against NPAs started with the so-called asset quality review, or AQR, in the second half of 2015 under which RBI inspectors checked the books of all banks and identified bad assets. Bankers were directed to come clean and provide for all bad assets by March 2017. On top of that, the central bank started forcing banks to disclose the divergence between RBI’s assessment of the loan books and the banks’ recognition of bad assets in the notes to accounts to their annual financial statements to depict “a true and fair view of the financial position” of each bank. An ordinance was promulgated in 2017, amending the Banking Regulation Act, 1949, giving powers to the central bank to push the banks hard to deal with bad assets. It authorized RBI to direct the banks to invoke the IBC against the loan defaulters... Also, from now on, banks need to report all default cases involving at least Rs5 crore every week (at the close of business hours every Friday) to a central repository of information.
When you step back and see the banking sector cleanup being played out, assisted by the Bankruptcy Code, you cannot but feel that this is perhaps a paradigm shift in India's banking sector - it beggars belief that the regulator did not have even basic reporting requirements on the different types of stressed assets till recently. If our banking regulation stood on such shaky foundations, what confidence can we have that the other, arguably less competent and more compromised, regulators can effectively regulate the capital and other financial markets?

Anyway, this is one massive achievement for this government. When the history books are written, this may well count among the two or three biggest achievements of the government. It should shout from the rooftops, and I would not mind...

After all, the muck within the banking system - the evergreening and gold-plating of loans, the diversion of funds to other purposes, political cronyism in loan advances etc - was well known to insiders for long, and could have been addressed by previous governments too. I believe the regulator would have resisted greater reporting and disclosure requirements, transparency, and intervention (like erasing the distinction between restructuring and NPA, and aggressive provisioning requirements), arguing that they would shake confidence in the banking system. And the government, like the Bootlegger to the RBI's Baptist, would have happily acceded. Did this government have something going differently - its anti-corruption commitment, the arrival of the IBC etc?

For sure, several things can (and likely will) still slow down the process - bankers becoming risk-averse and thereby slowing down lending; the IBC-led resolution process getting stuck in a few high-profile corruption scandals; the capital markets not being able to absorb the deluge of assets hitting the market; the courts intervening to slow down the process, and so on. But that would be par for the course with such transformational changes. It cannot detract from the achievements till date.

It would be great to hear someone, preferably an insider, write the story of the evolution of the Great Indian Banking Cleanup - how much of it was the government's contribution and how much the RBI's; how much happened despite the government and how much despite the RBI; what role specific individuals - in government and the RBI - played; how the corporate vested interests were overcome; whether the consequences of each step of tightening oversight were discussed in detail, and how a risk-averse system pulled itself together to bite the bullet at each stage; how much of the clean-up momentum owed to initiatives like the anti-corruption drive and demonetisation; how much of the momentum owed to public pressure from high-profile cases like Vijay Mallya and now Nirav Modi; how much of it was shaped by various less-known opportunistic shifts and how much was part of conscious, comprehensive planning; and whether there is a path dependency in where we are now - did we necessarily have to go through CDR, S4A, SDR, and AQR, and erase the distinction between restructuring and NPA, before going for broke with blanket disclosure and transparency...

I am sure this would be one hell of a chronicle of how public policy emerges.

Google and the anti-trust challenge

Charles Duhigg has an excellent longform chronicle in the Times on the rise and rise of Google. And it is a disturbing story, symptomatic of the pervasive trend of regulatory capture and business concentration.

This is the anti-trust challenge that is being faced,
Standard Oil controlled 64 percent of the market for refined petroleum when the Supreme Court broke it into dozens of pieces. Google and Facebook today control an estimated 60 to 70 percent of the U.S. digital advertising market.
Anti-trust lawyer Gary Reback has this comparison about the tactics of Standard Oil and Google,
“They don’t need dynamite or Pinkertons to club their competitors anymore. They just need algorithms and data.”
Are algorithms and data any less morally and legally reprehensible than dynamite and snooping?

The legal problem, especially in the US, is the narrow definition of what warrants anti-trust action. US courts have been more inclined to interpret anti-trust in terms of consumer interest than competition - to what extent are consumers, rather than competitors, being harmed by Google? But such narrow framing poses problems,
Antitrust has never been just about costs and benefits or fairness. It’s never been about whether we love the monopolist. People loved Standard Oil a century ago, and Microsoft in the 1990s, just as they love Google today. Rather, antitrust has always been about progress. Antitrust prosecutions are part of how technology grows. Antitrust laws ultimately aren’t about justice, as if success were something to be condemned; instead, they are a tool that society uses to help start-ups build on a monopolist’s breakthroughs without, in the process, being crushed by the monopolist. And then, if those start-ups prosper and make discoveries of their own, they eventually become monopolies themselves, and the cycle starts anew... Put differently, if you love technology — if you always buy the latest gadgets and think scientific advances are powerful forces for good — then perhaps you ought to cheer on the antitrust prosecutors. Because there is no better method for keeping the marketplace constructive and creative than a legal system that intervenes whenever a company, no matter how beloved, grows so large as to blot out the sun. If you love Google, you should hope the government sues it for antitrust offenses — and you should hope it happens soon, because who knows what wondrous new creations are waiting patiently in the wings.
And this counterfactual is very difficult to establish,
If Microsoft had crushed Google two decades ago, no one would have noticed. Today we would happily be using Bing, unaware that a better alternative once existed. Instead, we’re lucky a quixotic antitrust lawsuit helped to stop that from happening. We’re lucky that antitrust lawyers unintentionally guaranteed that Google would thrive.
The popular narrative, resonant with pervasive anti-government and pro-market ideological beliefs, holds that the decade-long anti-trust pursuit of Microsoft in the nineties failed to achieve anything beyond token gains, and that it finally required a more innovative competitor, Google, to keep Microsoft from dominating the search business.
it was companies like Google, rather than government lawyers, that humbled Microsoft
This narrative is repeated ad nauseam by opponents of regulatory action to support their arguments in favour of the market mechanism. But a less discussed counter-narrative claims that the anti-trust actions "made all the difference" and that "condemning Microsoft as a monopoly is why Google exists today". There was a transformation in Microsoft's internal culture and business practices,
In the days when federal prosecutors were attacking Microsoft day and night, the company might have publicly brushed off the salvos, insiders say. But within the workplace, the attitude was totally different. As the government sued, Microsoft executives became so anxious and gun-shy that they essentially undermined their own monopoly out of terror they might be pilloried again... In public, Bill Gates was declaring victory, but inside Microsoft, executives were demanding that lawyers and other compliance officials — the kinds of people who, previously, were routinely ignored — be invited to every meeting. Software engineers began casually dropping by attorneys’ desks and describing new software features, and then asking, in desperate whispers, if anything they’d mentioned might trigger a subpoena... Every time a programmer detailed a new idea, the executive turned to the official, who would point his thumb up or down like a capricious Roman emperor... as Microsoft lived under government scrutiny, employees abandoned what had been nascent internal discussions about crushing a young, emerging competitor — Google... Microsoft was so powerful, and Google so new, that the young search engine could have been killed off, some insiders at both companies believe. “But there was a new culture of compliance, and we didn’t want to get in trouble again, so nothing happened,” Burrus said. The myth that Google humbled Microsoft on its own is wrong. The government’s antitrust lawsuit is one reason that Google was eventually able to break Microsoft’s monopoly.
And this echoes earlier episodes where anti-trust action helped newer technologies spread,
In 1969 the Justice Department started a lawsuit against IBM for antitrust violations that lasted 13 years. The government eventually surrendered, but in an earlier attempt to mollify prosecutors, IBM eliminated its practice of bundling hardware and software, a shift that essentially created the software industry. Suddenly, new start-ups could get a foothold simply by writing programs rather than building machines. Microsoft was founded a few years later and soon outpaced IBM.
Or consider AT&T, which was sued by the government in 1974, fought in court for eight years and then slyly agreed to divest itself of some businesses if it could keep its most valuable assets. Critics complained AT&T was getting the deal of a lifetime. But then start-ups like Sprint and MCI made millions building on technologies AT&T championed, and AT&T found itself struggling to compete. It’s completely wrong to say that antitrust doesn’t matter, Reback argues. “The internet only exists because we broke up AT&T. The software industry exists because Johnson sued IBM.”

Friday, March 2, 2018

Liberal hypocrisy and the rise of Trump

At a time when the rise and rise of the tech giants has engendered serious and immediate concerns about a host of proximate issues - privacy, inequality, political capture, stifling of competition, tax avoidance, fake news, social addictions etc - Anne-Marie Slaughter has this article which can, even charitably, only be described as a cop-out.

Ignoring all these pressing and immediate issues, she labours painfully and incoherently over the dehumanising aspects of technology. When contrasted with honourable and insightful articles like this, Ms Slaughter's piece looks more like an exercise in digression from the real issues.

And what on earth does this mean?
Going local will also be an important way to recover a belief in truth. With the decline of traditional trusted intermediaries, and the discovery that social media account holders may well be bots, we will crave verifiability. Blockchain technologies can help.
Note that there is not even a disclosure in her signature indicating that Google is a major funder of New America Foundation, of which she is the President, and which was at the centre of this disreputable incident which revealed people's true allegiances and convictions. 

What else can be expected from this poster child of liberalism, feminism, and a host of other feel-good isms?

When you have such people as role models representing the elite, why should we be surprised by the rise of Trump?