 Remarks by Counselor to the Secretary Antonio Weiss at The Brookings Institution


8/3/2015
Thank you for that kind introduction, and thank you Doug for organizing this event.  As you know, Treasury staff, working with staff from the New York Fed, Federal Reserve Board, SEC and CFTC, recently released a report on the volatility in Treasury markets on October 15th of last year.  I would like to begin by acknowledging the teams at all five agencies for their hard work and close collaboration.  The final report is the result of their rigorous analysis, allowing us, for the first time, to have a fact-based conversation about the events of that day and, importantly, the evolution of Treasury markets.  
 
The Treasury market is the deepest, most liquid bond market in the world.  There are over $12.5 trillion in marketable securities outstanding, with an average of roughly $500 billion traded every day.  Treasuries are the uncontested safe haven during periods of turbulence for investors all across the globe and an essential element for primary and secondary market functioning.  October 15th remains an anomaly, but our analysis of that day underscores the profound changes occurring in the structure of the Treasury market, in particular the effects of technology.  
 
Today, I will provide a short description of what happened on October 15th and discuss the factors that contributed to the volatility that day.  I will then highlight some questions regarding market structure and suggest a preliminary framework for evaluating potential risks and policy responses.  
 
The Events of October 15
 
Let’s start with the events of October 15th.  As you all surely recall, that day was highly unusual for Treasury markets.  The 10-year yield experienced a 37 basis point swing, only to finish 6 basis points lower, with a startling 16 basis point round trip during a 12-minute interval.  There have been only three other days since 1998 when the 10-year has traded in such a wide range, and each resulted from a significant monetary policy announcement. There was nothing of the kind on, or around, October 15th.
 
At 8:30 that morning, September retail sales were released, modestly below expectations.  Ordinarily, a slight “miss” might drive yields down by a couple of basis points.  Instead, over the course of the next hour, Treasury yields declined 20 basis points.  Then, at 9:33 a.m., yields fell precipitously.  Over the next six minutes, the 10-year yield dropped an additional 16 basis points, only to retrace the entire move in the following six minutes.  This 16 basis point round trip is what many have dubbed the “flash rally,” and what the report refers to as the event window.
 
At that time, the head of one trading desk described conditions as panicked.  
 
In the days and weeks that followed, many explanations for what happened were put forward.  Some suggested that an algorithm had “misfired” or that a human trader had accidentally submitted an order to purchase a large number of Treasuries. Others suggested that algorithmic traders had simply shut off their machines.   
 
All of these suggestions, however, were speculative and, as it happens, inaccurate. At the time, no one – including official institutions – had examined detailed trade and order book data to test these hypotheses. The staff report represents the most comprehensive study of trading in the U.S. Treasury markets since a 1992 report following the Salomon Brothers bidding scandal and a regulatory report to Congress in 1998.
 
Report Findings
 
We find that a confluence of events laid the groundwork for the volatility that day. 
 
As confidence in the economic recovery improved earlier in the year, bets that U.S. interest rates would rise had become popular.  In fact, by the end of September, levered short positions in near-term interest rate futures reached record levels.
 
But instead of increasing, interest rates marched steadily lower.  In the first two weeks of October, growth and deflation risks in the Eurozone, accompanied by uncertainty about the ECB’s response, generated real doubt on the part of investors about the prospects for global growth.  The pessimistic tone of the annual IMF/World Bank meetings the weekend prior to October 15th dampened sentiment further.  Investors, and the public at large, were also concerned about the unprecedented outbreak of Ebola. In response to these risks, investors turned to Treasuries, driving up prices and lowering yields.  Levered funds, with record levels of short positions, were caught on the wrong side of the trade and began to unwind the bets, which further compressed yields.
 
All of this set the stage for the initial volatility on October 15th – the first 20 basis points. But none of it explains either the timing or the distinctive pattern of Treasury yields during the event window.  In those twelve minutes, trading volumes exploded and market depth, the amount available for purchase or sale in the order book, collapsed.  Volumes in the futures market surged to nine times typical levels, while market depth thinned by up to 80 percent.  
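As an aside, the notion of market depth can be made concrete with a toy order book. The following Python sketch uses entirely hypothetical price levels and quantities, not data from the report, to show how depth is measured and how it can thin even while the best price stays quoted:

```python
# Toy illustration of "market depth": the quantity resting at the top
# price levels of one side of an order book. All numbers are made up.

def top_depth(levels, n=3):
    """Sum the quantity available at the top n price levels (listed best-first)."""
    return sum(qty for _, qty in levels[:n])

# Hypothetical normal-day bid side: (price, quantity in $ millions), best first
normal = [(99.50, 50), (99.49, 40), (99.48, 35)]
# The same side after participants shrink their resting order sizes
stressed = [(99.50, 10), (99.49, 8), (99.48, 7)]

drop = 1 - top_depth(stressed) / top_depth(normal)
print(f"Depth thinned by {drop:.0%}")  # prints "Depth thinned by 80%"
```

Note that in this toy book the best bid price never moves, yet the amount that can actually be traded there falls sharply, which is the distinction between quoted prices and depth.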
 
Much of this trading activity was conducted by firms that engage almost exclusively in algorithmic trading – labeled Principal Trading Firms or “PTFs” in the report.  In other words, instead of turning off their machines en masse, as some had claimed, PTFs not only stayed in the market but increased both their level and proportion of volume.  In fact, during the round trip in prices, PTFs accounted for 70 to 75 percent of total trading in both the cash and futures markets, up from about 50 percent on “normal” days.   
 
At the same time, bank dealers responded to the extreme volatility by widening their bid-ask spreads and, at times, withdrawing altogether from the offer side of the order book.  Both PTFs and bank dealers responded rationally in an economic sense on October 15th. It made sense for PTFs, who avoid accumulating large intraday positions, to reduce order size. It also made sense for bank dealers to increase bid-ask spreads to compensate them for the positions they took in the midst of the volatility. But the result of these rational individual decisions was a dramatic reduction in overall market depth.  
 
While prices moved at extraordinary speed over those 12 minutes, price action remained continuous. Trading occurred all the way down in yield and then all the way up.  There were no gaps.  Indeed, many PTFs with whom Treasury spoke (unlike the traditional broker-dealers) noted that in their view the market continued to function well throughout the day. It was not broken like the equity markets during the May 6, 2010 “flash crash.” 
 
In addition to debunking the notion that PTFs turned off their machines, the analysis did not find any evidence of a “fat finger mistake” – an accidental large order – or a runaway algorithm. There was also no cascade of stop-loss orders, as is often the case with “mini” flash crashes observed in other markets.  
 
Key Questions 
 
So what was the cause? From what we know, there was no direct causal link between any one player and the activity we saw on October 15th. But a large part of the story of what happened between 9:33 and 9:45 a.m. appears to turn on the interaction of complex algorithms generating massive amounts of trading activity at speeds far too fast for human beings to track. High-speed trading algorithms, which are employed by many bank dealers and hedge funds, as well as by PTFs, appeared to respond to an extraordinarily one-sided trading environment, with far more willing buyers than sellers, by generating a rapid rally in Treasuries. And human traders took several minutes to react. 
 
One active market participant contrasted this dynamic to trading in Boeing stock in mid-July 2013, when the price suddenly dropped by 7 percent after an unoccupied Dreamliner caught fire at Heathrow.  On October 15th, this trader spent five minutes or so scanning news headlines, checking Twitter feeds, making phone calls.  Only when he was able to confirm that there was no identifiable “event,” i.e., no fire in this case, was he willing to step in.
 
The conclusions of the October 15th report were unsatisfying to some, because we did not uncover a single "smoking gun."  As is often the case in complex systems, it is difficult to reduce outcomes down to simple causes. Moreover, the broader findings of the report were equally, if not more, significant. The evolution of Treasury markets, in particular cash markets, represents a fundamental, technology-driven shift in market structure. We have all been aware of this shift for some time, but the report has cast a brighter light on it. 
Algorithmic trading has been well established in equities and futures since the late 1990s, and now accounts for a majority of trading in most standardized, liquid securities, including more than half of activity on inter-dealer trading platforms for cash Treasuries. 
 
High-frequency trading, which is a subset of algorithmic trading, is applied in a number of strategies by an array of firms, ranging from owner-operated start-ups to the largest hedge funds and broker-dealers. It is, quite simply, a disruptive technological innovation, which has reshaped an entire industry structure. It has exerted competitive pressure on traditional players by tightening pricing parameters and creating informational advantages.  As with any major technology disruption, there are potential benefits and risks.  The key for policymakers is to recognize that the technology is here to stay, and we need to be forward thinking about its effects on market functioning. 
 
At Treasury, we are engaging with market participants, academics and other policy makers to better understand the new market structure. Our view is that any policy proposals should be carefully tailored to the specific risks they are designed to address.  Today, I will share a few thoughts on four broad categories of risk that will help to guide our future inquiry: operational, oversight, fair dealing, and market resiliency.
 
First – operational risk. Operational risk is, of course, inherent in any financial transaction. But the extraordinary speed and automated nature of execution mean these risks may be heightened in the case of high-frequency trading and the race for ever-faster technology. The constant drive to save one more millisecond not only consumes resources potentially better invested elsewhere, but also increases the pressure on the plumbing of the system to handle ever-increasing speeds and message traffic.  So we need to consider whether the race for speed, at this already advanced stage, helps or hurts market functioning.  
 
A second, related category is oversight and risk management. Many of the large new entrants in Treasury markets are not subject to regular oversight, and may not be sufficiently capitalized to withstand unexpected losses. As we saw with the collapse of Knight Capital in 2012, untested software can wreak havoc in markets and put one of the largest trading firms out of business in a matter of hours. We need to assess how minimum standards are set for testing new systems and introducing new algorithms.  And there is the question of margin requirements, which may only be assessed a couple of times per day, whereas thousands of trades are now executed in a fraction of a second.   
 
The third category might broadly be described as “fair dealing.” While trading practices that are manipulative or fraudulent are not new to financial markets, automated trading may provide traders with additional tools to beat the system.  The report discusses “self-trading,” where one legal entity is on both sides of a trade.  During the “flash rally,” the amount of “self-trading” was elevated, especially during the first half of the event window. Not all self-trading is illegal, and I want to be clear that I am making no assumptions regarding the legality of this activity that day or any other day. We need to learn much more about the reasons for self-trading, and should consider whether there are benefits that outweigh the potential appearance of impropriety. 
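To make the concept concrete, here is a hypothetical sketch, not the report's actual methodology: identifying self-trades reduces to checking whether the same legal entity appears as both buyer and seller on an execution. The firm names and trades below are invented for illustration.

```python
# Hypothetical sketch: flag "self-trades", executions where one legal
# entity is on both sides. All firms and quantities are made up.
trades = [
    {"buyer": "Firm A", "seller": "Firm B", "qty": 5},
    {"buyer": "Firm C", "seller": "Firm C", "qty": 2},  # same entity on both sides
    {"buyer": "Firm B", "seller": "Firm A", "qty": 4},
]

self_trades = [t for t in trades if t["buyer"] == t["seller"]]
share = sum(t["qty"] for t in self_trades) / sum(t["qty"] for t in trades)
print(f"Self-trading share of volume: {share:.0%}")  # prints "Self-trading share of volume: 18%"
```

In practice the hard part is the data, not the check: it requires trade records that identify the legal entity behind each side, which, as discussed below, regulators do not consistently have for the cash Treasury market.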
 
The Treasury Market Practices Group recently adopted a revised set of best practices encouraging firms to submit bona fide offers and avoid potentially manipulative practices, such as submitting a barrage of orders to increase latency – in effect, clogging the pipes – or submitting orders with the intent to cancel them before execution. Best practices are an important complement to an appropriate regulatory structure, but they are not universally adopted.  Moreover, many participants in the Treasury market are not even required to register with regulatory authorities.  So additional work is required to determine best practices and to identify and close regulatory gaps.
 
The fourth category, the broadest, is market resiliency. Perhaps the key question asked by the report is whether an improvement in average liquidity conditions may have come at the cost of rare but severe bouts of volatility.  Numerous factors are changing the way our markets function and the way liquidity is provided.  The persistence of low bid-ask spreads on October 15th, together with record volumes and severe deterioration in market depth, may be an indication that we need to rethink and modernize our measures of liquidity. 
 
Underlying all of these questions is the essential role of access to data, and coordination across markets and agencies. 
 
The only readily available data are futures market transaction reports provided to the CFTC. Order book data are available to the CFTC on request. Information about activity in cash Treasury markets is not readily accessible, and authorities have virtually no visibility into dealer-to-customer activity, which by some estimates is well over 40% of the cash market.  Moreover, regulatory authority over Treasury cash and futures markets is fragmented, making cooperation essential, but also slowing response times considerably.  
 
Put simply, we cannot get the information we need to analyze risk across Treasury markets in anything that approaches real time, and that has to change. 
 
Conclusion
 
The report on October 15th and the work to follow – including today’s event – represent an opportunity to undertake the most comprehensive review of the Treasury market since 1998 – years before algorithmic trading even began in Treasuries.   
 
It is not clear what the end-state for Treasury markets will be. In this period of transition, policy makers and market participants have an opportunity to shape events, and we should learn from our experience in other markets as we do so. 
 
As we undertake this important work, we will be deliberate in our analysis, diligent in gathering the best ideas, and fact-based in our policy prescriptions. And policy should ideally be tailored to the specific risks.  But the Treasury market is evolving, and we need not only to keep pace with events but to plan wisely for the future.
 
Thank You.
