John Cochrane on finance
John Cochrane has an excellent article in the Journal of Economic Perspectives, discussing a wide range of finance problems. Here’s a small sample:
The period after a news announcement often features high price volatility and trading volume, in which markets seem to be fleshing out what the news announcement actually means for the value of the security. For example, Lucca and Moench (2012, Figure 6) show a spike in stock-index trading volume and price volatility in the hours just after the Federal Reserve announcements of its interest rate decisions. The information is perfectly public. But the process of the market digesting its meaning, aggregating the opinions of its traders, and deciding what value the stock index should be with the new information, seems to need actual shares to trade hands. Perhaps the common model of information (essentially, we all agree on the deck of cards, we just don’t know which one was picked) is wrong.
That is something I’ve noticed as well. Here’s a proposed solution to high frequency trading:
Suppose that an exchange operated on a discrete clock, as a computer does in order to let signals settle down before processing them. The exchange could run a once-per-second, or even once-per-minute, matching process, with all orders received during the period treated equally. If there are more buy than sell at the crossing price, orders are filled proportionally. Such an exchange would eliminate extremely high-frequency trading, because there would be no gain or loss from acting faster than a minute.
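Cochrane’s batch-auction rule (one crossing per interval, with the heavier side filled proportionally) can be sketched in a few lines. This is only a toy illustration of the allocation rule he describes: the crossing-price computation is omitted, and the order sides and sizes are hypothetical.

```python
def batch_match(orders):
    """Match one interval's orders at a single crossing.

    orders: list of (side, qty) received during the interval. Arrival
    time within the interval is deliberately ignored, so nothing is
    gained by being microseconds faster than anyone else.
    Returns fill quantities aligned with the input list; the heavier
    side is rationed pro rata.
    """
    buy_qty = sum(q for s, q in orders if s == "buy")
    sell_qty = sum(q for s, q in orders if s == "sell")
    crossed = min(buy_qty, sell_qty)  # total quantity that can trade
    fills = []
    for side, qty in orders:
        same_side = buy_qty if side == "buy" else sell_qty
        fills.append(qty * crossed // same_side if same_side else 0)
    return fills
```

With buys of 100 and 300 against sells of 200, each buyer gets exactly half its order filled, regardless of which order reached the exchange first.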
Here Cochrane discusses whether finance is too big:
Demand that shifts out can shift back again. Demand for financial services evaporated with the decline in housing and asset values in the 2008 recession and subsequent period of sclerotic growth. Much of the “shadow banking system” has disappeared. For example, asset-backed commercial paper outstanding rose from $600 billion in 2001 to $1.2 trillion in 2007, and now stands at $300 billion. Financial credit market debt outstanding in the flow of funds rose from $8.6 trillion in 2000 to $17.1 trillion in 2008, and now stands at $13.8 trillion. Employment in financial activities rose from 7.7 million in 2000 to 8.4 million in 2007, and is now back to 7.7 million (according to the Bureau of Labor Statistics). Study of “why is finance so big,” using data that stops in 2007, may soon take its place alongside studies of “why are Internet stocks so high” in 1999 or studies of “why is there a Great Moderation” in 2006. . . .
. . . It is possible that there are far too few resources devoted to price discovery and market stabilization. In the financial crisis, we surely needed more pools of cash prepared to pounce on fire sales, and more opportunities for negative long-term views to express themselves.
Surveying the current economic literature on these issues, it is certain that we do not understand very well the price-discovery and trading mechanism, nor the economic forces that allowed high-fee active management to survive so long.
Note that economic theory predicts that society will devote too few resources to ferreting out useful information about corporate values.
PS. I thought David Henderson made a very good point in this critique of Krugman on waste in finance:
Now to the three possibilities:
1. If the payments made by Thomson-Reuters and others who get the information earlier are needed to give U. of M. the appropriate incentives to gather quality information, then the payments are not wasted.
2. If the payments made by Thomson-Reuters and others who get the information earlier are not needed to give U. of M. the appropriate incentives to gather quality information, then the payments are producer surplus to the U. of M. and there is no social loss from the payments–it’s just a transfer.
3. If in case #2 above, the producer surplus is used for low-value uses at U. of M. (this is both a non-profit university and a government university, after all), then there is a waste.

So only in case 3 above is it “unproductive finance.” I’m pretty sure Krugman isn’t going with case 3.
14. June 2013 at 06:33
“Suppose that an exchange operated on a discrete clock, as a computer does in order to let signals settle down before processing them”
So Cochrane doesn’t know how HFT works…. There is a clock: that clock is your subscription to price data. The most common subscription in the HFT space is once per second. This is the fastest reasonably priced service. When people discuss HFT they actually mean low-latency trading. The price data is transmitted on the second from the exchange, then must reach traders; traders must compute what to do and respond with their orders. Orders are filled in FIFO priority at each price level. You can see this effect in volume graphs: most of the volume is squished up against these one-second ticks.
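The FIFO rule mentioned above (time priority within each price level) can be sketched as follows; the order IDs and sizes are invented for illustration:

```python
from collections import deque

def fill_fifo(level, incoming_qty):
    """Fill resting orders at one price level in strict arrival order.

    level: deque of [order_id, qty] in time priority.
    Returns a list of (order_id, filled_qty). Earlier arrivals at the
    same price fill first, which is why shaving latency pays off under
    this matching rule.
    """
    fills = []
    while incoming_qty > 0 and level:
        oid, qty = level[0]
        take = min(qty, incoming_qty)
        fills.append((oid, take))
        incoming_qty -= take
        if take == qty:
            level.popleft()               # order fully filled
        else:
            level[0] = [oid, qty - take]  # partial fill keeps its place
    return fills
```

An incoming 120-lot against resting orders of 100 and 50 fills the first arrival completely before the second gets anything.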
14. June 2013 at 06:38
What do you think about the last post of David Beckworth?
14. June 2013 at 06:46
Why isn’t there a race for more sub-second data? Too much data to follow and act on is one reason. HFT is about picking up pennies, so you need to be dealing in many, many stocks if you have even a modest-sized fund. Second, there have been some nice papers showing that common trading algorithms work better in volume-time, not real time. That is, an algorithm gains an information advantage only by trading once every N transactions. There is no advantage to trading more frequently as measured by wall time, because there is no actionable signal in the data. This is known as the volume clock.
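The volume clock idea described above can be sketched quickly: bar the trade tape by traded quantity instead of wall time, and act once per bar. The bucket size and trade sizes below are made up for illustration:

```python
def volume_bars(trade_sizes, bucket):
    """Group a tape of trade sizes into bars of roughly equal volume.

    A strategy on a volume clock acts once per completed bar, i.e. once
    every ~bucket shares traded, no matter how much wall-clock time
    passes between trades.
    """
    bars, current, vol = [], [], 0
    for size in trade_sizes:
        current.append(size)
        vol += size
        if vol >= bucket:      # bar complete: emit and reset
            bars.append(current)
            current, vol = [], 0
    if current:                # unfinished trailing bar
        bars.append(current)
    return bars
```

A quiet hour and a frantic minute can produce the same number of bars if they carry the same volume, which is the point of measuring in volume-time.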
Very accessible slide deck from O’Hara’s group at Cornell:
http://www.orie.cornell.edu/engineering2/customcf/iws_events_calendar/files/cfem_20120314_0.pdf
14. June 2013 at 07:00
Eugene Fama, the founder of EMH and obviously one of the foremost experts on finance doesn’t agree with Cochrane. He says that markets can adjust to new information through bid-ask spreads without a single trade.
14. June 2013 at 07:30
Finance is big because money has been tight since the early 1980s – debt growth and financial-sector growth track below-trend NGDP almost exactly.
When base money is tight, NGDP slows. When NGDP slows, incomes stagnate and yields fall: cash flows become scarce. Falling yields give a present-value boost to cash-generating asset values, and lower the cost of indebtedness, which produces more debt. Savers stretch into risk for yield, and substitute debt for income as incomes stagnate.
Lower yields, slower incomes produce more debt – intermediated and traded by financials – in a cash-flow-poor economy.
The price of information is simply the cost of maintaining the fair relative value of the debt stock, and it is a competitive process. Not much cost there, and there are arguably positive externalities.
Insofar as tight money – which maintains the absolute high price of cash flow financial instruments – results in wasted resources (unemployment) or malinvestment, this is the major cost to society.
14. June 2013 at 07:58
Jon, Thanks for that info.
Paul, I’ll do a post.
John. You said;
“Eugene Fama, the founder of EMH and obviously one of the foremost experts on finance doesn’t agree with Cochrane. He says that markets can adjust to new information through bid-ask spreads without a single trade.”
I agree with both, I don’t see a conflict.
jknarr, I don’t see tight money as the cause of the long 30 year downtrend in real rates. It has certainly played a role in the recent sharp fall in rates, but finance has not done well during this period, as Cochrane points out.
Lower inflation expectations have probably helped finance, compared to 1978, but not compared to 1958.
14. June 2013 at 08:01
Here’s the part I was saying that Fama would disagree with.
“But the process of the market digesting its meaning, aggregating the opinions of its traders, and deciding what value the stock index should be with the new information, seems to need actual shares to trade hands.”
Fama was arguing that actual shares didn’t need to trade hands. That seems like a contradiction to me.
14. June 2013 at 08:26
Scott, fair enough, but I don’t wholly understand the focus on real rates — debt volumes and interest service/refinancing, and financial sector size are nominal phenomena, not real.
I’d argue that finance has done very well in the crisis. They took the risks, and government bailed them out when they were wrong.
The financial sector has since been handed the largest profit margins on record, and low rates/tight money is the backbone of these profits — if money eases, NGDP accelerates and yields rise, you will see very quickly how the financial sector shrinks, I am certain.
14. June 2013 at 08:28
The overshoot from news announcements — there does seem to be a process as different players “argue” over the meaning of new news; the price swings until eventually the consensus expresses itself. There is also the fact that people’s perception of the news can change in a matter of minutes, moving from “that is shocking” to “that is not such a big deal.” To use a non-finance example, if you watched Game of Thrones a couple of weeks ago, you may have gone from “how could they do that, I am never watching this again” to “that was shocking, but I have to find out what happens now.”
High-frequency trading — why is it a problem? If you really wanted it to go away, they could increase the tick size. Congress mandated decimal pricing several years ago. The theory was that it would benefit the small player vs. the institutional players. Instead, it created incentives for dark pools and HF traders to form. It killed the PCOES.
John, subscribing to data that refreshes faster than one per second isn’t that expensive.
“Note that economic theory predicts that society will devote too few resources to ferreting out useful information about corporate values.”
Can you explain? EMH says that people will work hard enough to exploit an inefficiency, to make the inefficiency small enough that any potential gains from further exploitation, would equal the cost of the effort to exploit it.
14. June 2013 at 08:41
There are a bunch of different issues with HFT. The one I often complain about is unequal access. Suppose I submit a limit buy order (and the best bid) on a public exchange at $25.25. A seller then issues a marketable sell. I should get filled since I’m the best bid, right? Nope. Often the sell will suddenly get executed at $25.26, or $25.2501, or even $25.25 (utilizing a different ECN from my order).
Some combination of flash trading, exchange fragmentation, and low latency allows computer systems to intercept the order and get in front of my best bid — essentially cutting in line. In theory, the HFT bot is still offering the best price, but they didn’t take the adverse selection risk of a standing limit order.
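The queue-jump described above can be shown with a toy order router: the marketable sell goes to the best displayed bid across venues, so a bot quoting 1/100 of a cent better than the resting public bid takes the trade despite arriving later. Venue names and prices here are hypothetical.

```python
def route_sell(venues):
    """Send a marketable sell to the venue showing the highest bid.

    venues: list of (name, bid_price) tuples. Price priority across
    venues beats time priority within one venue, so a late sub-penny
    bid can step in front of an earlier public quote.
    """
    return max(venues, key=lambda v: v[1])
```

The public bid never fills, even though it was posted first and bore the adverse-selection risk of standing in the book.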
Why is this a problem? Because it drives true liquidity providers out of the market, i.e., people who post public bids and offers. The adverse selection costs are too high, so these people get replaced by robots pinging 100 share microsecond bids and offers based on their private reserve prices.
The end result is a market prone to flash crashes, because all liquidity is held in private reserve, rather than publicly displayed, so a cascade of orders can crash the market before new private reserve orders can get submitted.
This is why I ridiculed the NYSE’s decision to expunge trades from the consolidated data feed if they feel those trades occur at “aberrant prices”. The market structure is designed to produce aberrant prices on purpose. It’s also why the 10% single-stock circuit breaker idea is a scam; it’s an attempt to create bounded dysfunction, so machines can rip people off, but never enough to make the headline news.
14. June 2013 at 08:53
This is good;
‘When outcomes seem puzzling…we embark on a three-pronged investigation. First, we work harder to find how supply and demand might really operate, in the humble knowledge that initially puzzling institutions and outcomes have often taken us years to comprehend. Second, maybe there is a “market failure” (an externality, public good, natural monopoly, asymmetric information situation, or missing market) that explains our puzzle. Third, we often discover a “government failure,” that the puzzling aspect of our world is a consequence of laws or regulation, either unintended or the result of capture.’
14. June 2013 at 08:56
The issue with fast trading around news releases is slightly different.
Normally, when a stock has a material news release, it gets halted pending public dissemination. This doesn’t happen with macro news, however. Rational traders usually cancel their bids and asks around macro news releases — you see this in a slowing of volume and a widening of bid-asks as the event draws near, especially in high information markets like the options exchanges.
However, the stock market doesn’t halt prior to macro news, and there are enough traders who don’t understand game theory (an odd combination of retail investors and institutions with big program orders) who leave standing orders out through news releases. These people get picked off by those who pay for early access. Rationally, if people knew a news release was pending, especially one with selective disclosure like the UMich Consumer Confidence, they would cancel all orders prior to 9:54:58, and the value of selling that access would also go away.
As for the ethics of U Mich? That’s complicated. They have a right to sell market research data if they want, that’s absolutely clear. The question is whether they chirp like they are providing a public good, while profiting in secret. Anyone can do research, position themselves in front of that research and then publicize their opinions. Eventually that book-talking approach loses credibility, though, as people begin to question the integrity of the research.
UMich needs to convince people that the research is still a public good, even though they seek to profit privately from selling advanced access to it. It’s a slippery slope in terms of credibility.
14. June 2013 at 09:41
I completely disagree with attempting the venture of stopping High-Frequency Trading. So long as traders are profitable, they are providing a valuable service of smoothing out price adjustments and providing liquidity.
Cochrane’s proposal will slow price discovery and cause big jumps in prices that have the potential to cause additional panic selling. It would make the problem of instability he is trying to solve worse.
Problems like the “flash crash” or even the crash of 1987 happen because traders don’t buy low and sell high. For instance, in both 1987 and 2010, a decline in stocks triggered sell signals which overloaded the market and caused rapid declines totally out of line with fundamentals. The portfolio insurance of 1987 had traders sell S&P indexes on declines and wait until prices recovered to buy back in. You don’t have to be Warren Buffett to know that this isn’t a good investing strategy (Buffett criticized it heavily).
Over time, HFTers will iron out unprofitable mistakes like portfolio insurance, and as they do, high-profile debacles like the flash crash will stop and markets will work better. Trying to stop all panic selling is a pointless venture. Prices need to adjust quickly to new information and, for better or worse, people can sometimes have a herd mentality.
Markets are about price discovery and resource allocation not fairness. The fact that HFT discourages day trading or frequent buying and selling in general among the casual investing public is a virtue and not a vice. Long term investors should absolutely not be buying and selling every time news breaks. They’d probably be better off just sending their money directly to Wall Street.
14. June 2013 at 09:59
John, I think Cochrane was saying you need shares to change hands to get the final equilibrium price. That’s because the trades themselves convey information. I agree. I think Fama is saying prices can change to the expected new equilibrium, without any shares changing hands. I agree. But when trades occur, a still better equilibrium is achieved.
But I see why it looks like a contradiction.
Jknarr, Do you think the data in Cochrane’s paper is incorrect? He has data that finance has been shrinking in recent years.
Bailouts helped, but that’s different from low interest rates.
Doug, It’s not my area of expertise, but I believe people point to the “public good” nature of information. Much of the gains from new information goes to others.
Steve, I had the same reaction when they announced that they would not execute trades during flash crashes. That seemed crazy to me. You want to punish people buying and selling at stupid prices.
Patrick, Yes, there’s lots of good stuff in the paper.
14. June 2013 at 10:08
Jon, I don’t know where the heck you’re getting the once-per-second limit on market data from. It’s wildly incorrect.
In any case, I grow tired of HFT complaints for a number of reasons. It’s such a non-issue.
With regards to the specific suggestions above, some products are already matched irrespective of arrival time; those matching algorithms have their own problems. For example, Eurodollar STIR futures are matched by price THEN order size, similar to the suggestion. It’s not obviously better. It gives people incentive to submit wildly oversized orders.
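The size-priority point above can be illustrated with a minimal pro-rata allocator (the resting sizes below are invented). Because each order’s fill scales with its displayed size, posting a bigger order than you actually want increases your share of every fill:

```python
def pro_rata_fill(resting, incoming_qty):
    """Split an incoming order across resting orders at one price
    in proportion to displayed size, ignoring arrival time.

    resting: dict of order_id -> displayed qty.
    Returns order_id -> allocated qty (floor division; any rounding
    remainder is left unallocated in this sketch).
    """
    total = sum(resting.values())
    return {oid: qty * incoming_qty // total for oid, qty in resting.items()}
```

A 900-lot against a 100-lot captures 90% of an incoming 100; inflating the display to 1,900 captures 95%, which is the over-sizing incentive.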
Also, if you really think the “arms race” for speed is such a problem, congratulations! It’s almost over! You can never go below zero seconds, and we’re almost there. Why would Cochrane think that 1 second is so much better than 0 seconds and so much better than 10 seconds or 10 minutes?
14. June 2013 at 10:16
Finance has shrunk — but it should have shrunk by much, much, more in 2008. At the zero bound, demand for new debt is all tapped out — their business models die there, if a new source of leverage is not found (they found it with the Fed).
The bailouts kept them as going concerns, and then they tapped the ultra-low rates for massive profit margins and carry. Not bad for a dead-to-dying business model at the zero bound.
Why do you think they keep 25 bps on FF and IOR? To keep money market funds from imploding. QED: the Fed favors tight money to keep the size of the financial sector (relatively) large.
14. June 2013 at 11:03
I’m very much with David Henderson on this one, and I’d take it many notches further – http://ashokarao.com/2013/06/14/insiders-rents-and-reuters/
I’m also skeptical of adding finance to DSGEs. It seems like a flavor-of-the-month without any extra theoretical validity.
14. June 2013 at 17:31
Jason braswell,
See the data in the paper “Low-Latency Trading,” Hasbrouck and Saar (2011).
There is an extremely strong one-second periodicity in order volume. You can get quote data faster, but most do not buy that level of service. It isn’t necessary for most strategies.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1695460
18. June 2013 at 10:56
Scott,
I found this article and I wondered if your views on money agree with this author. I found the concept easy to understand, but it seems too simplistic somehow…
http://neweconomicperspectives.org/2013/06/let-it-be-done-an-alternative-narrative-for-building-what-america-needs.html
20. June 2013 at 20:55
All free trade is productive.
———
Is this post an unstated apologia for why long term rates rose after the Fed’s tightening announcement? That “volatility” is the get out of jail card? “That is something I’ve noticed as well”…was that before or after you berated the Cantillon effect fans?