By Adam Nunes, Head of Business Development, Hudson River Trading
For a while now, there has been a lot of debate and handwringing about the relative slowness of the Securities Information Processor (“SIP”). It has been identified as a single point of failure and blamed for creating a two-speed marketplace, since SIP data moves more slowly than exchange direct-feed data. Nasdaq and NYSE have been working closely with the industry to ensure the resiliency of the SIP, but the SIP is still not as fast as direct feeds. It’s time to move the discussion forward and enact concrete proposals to fundamentally update this critical piece of market infrastructure, which is mandated by regulatory directives such as Reg NMS and the vendor display rule.
Last year, BlackRock wrote a letter to the Securities and Exchange Commission (SEC) suggesting “exchanges should make the necessary investments in technology to reduce the latency between the SIP and private data feeds to market acceptable standards.” Two months prior, SIFMA recommended “the central SIP structure should be eliminated and replaced with commercially competitive Market Data Aggregators.” And nearly 15 years ago, the SEC’s venerable Seligman Committee recommended “competing consolidators to evolve from the current unitary consolidator model.”
I believe these goals can be reached with a simple change in the technical structure of the SIP that would eliminate the latency difference between direct market data feeds and the SIP, remove the SIP as a single point of failure and create a platform that makes the market better for all investors.
Here is how the SIP currently works for a firm trading NYSE stocks on Nasdaq using the SIP data:
• Nasdaq receives an order to trade an NYSE-listed stock from a trading firm located in the same data center
• That order leads to an update in that stock’s quote that must be reported to the NYSE SIP in NYSE’s data center
• Nasdaq sends the update to the NYSE SIP in a data center several miles away
• The NYSE SIP processes the update
• The NYSE SIP sends the updated information back to the trading firm in the Nasdaq data center
• The trading firm in the Nasdaq data center can then process and use the data
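The round trip above can be sketched as a simple latency model. The numbers below are hypothetical assumptions chosen for illustration only, not measured figures; the point is that the SIP path pays two cross-data-center legs plus processing time, while a direct feed is consumed where it is produced.

```python
# Illustrative latency model for the SIP round trip described above.
# All figures are hypothetical assumptions, not measurements.

CROSS_DC_ONE_WAY_US = 200.0  # assumed one-way transit between the Nasdaq and NYSE SIP data centers (microseconds)
SIP_PROCESSING_US = 50.0     # assumed time for the SIP to process the quote update
DIRECT_FEED_LOCAL_US = 5.0   # assumed dissemination within the same data center


def sip_path_latency_us() -> float:
    """Update travels to the remote SIP, is processed, and travels back."""
    return CROSS_DC_ONE_WAY_US + SIP_PROCESSING_US + CROSS_DC_ONE_WAY_US


def direct_feed_latency_us() -> float:
    """Update is read where it is produced, in the same data center."""
    return DIRECT_FEED_LOCAL_US


if __name__ == "__main__":
    print(f"SIP path:    {sip_path_latency_us():.0f} us")
    print(f"Direct feed: {direct_feed_latency_us():.0f} us")
```

Whatever the actual figures, the structural penalty is the same: the SIP path always includes two data-center crossings that the direct feed avoids entirely.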
If this strikes you as extremely inefficient, it’s because it is. It’s like calling someone on a cell phone when they are standing close enough to talk to in person. The lag of your voices pinging a cell tower and bouncing back to your phones is far less efficient than simply communicating directly. Think of your live voice within earshot as the way direct feeds are disseminated, and the cell-phone lag as the way the SIP is disseminated. It doesn’t have to be this way.
A better method is the “competing consolidator” model. Under this proposal: 1) firms would order the SIP data as they do today, by contacting their vendor or the SIP administrator, 2) the firm/vendor connecting to the SIP would get a connection to each exchange to listen to their data where the data is produced (rather than getting the data from a central location) and 3) the firm would receive and process the data similarly to how it handles direct market data feeds.
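The core of step 2 is that each consolidator merges per-exchange quotes locally rather than waiting on a central processor. A minimal sketch of that consolidation step follows; the exchange names, prices, and data shapes are hypothetical and purely illustrative.

```python
# Minimal sketch of the "competing consolidator" idea: a consolidator
# listens to each exchange's feed where the data is produced and merges
# the quotes locally into a best bid and offer. All values are
# hypothetical, for illustration only.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Quote:
    exchange: str
    bid: float
    ask: float


def consolidate(quotes: List[Quote]) -> Tuple[Quote, Quote]:
    """Return the quotes carrying the best (highest) bid and best (lowest) ask."""
    best_bid = max(quotes, key=lambda q: q.bid)
    best_ask = min(quotes, key=lambda q: q.ask)
    return best_bid, best_ask


# Example: three per-exchange quotes for one symbol.
quotes = [
    Quote("Nasdaq", bid=10.01, ask=10.04),
    Quote("NYSE", bid=10.02, ask=10.05),
    Quote("BATS", bid=10.00, ask=10.03),
]
best_bid, best_ask = consolidate(quotes)
```

Because every consolidator runs this merge itself, close to where it consumes the data, there is no single central process whose failure or slowness affects everyone at once.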
A minority of members on the Seligman Committee noted the technical challenges associated with moving to this model. But given the technical advances in the 15 years since, those challenges have been solved. And surely there would be a host of vendors competing to consolidate the data far faster than the current SIP model allows. By moving to this more open architecture, we could substantially improve the performance of the SIP and introduce competition into the process. In addition, because there would be multiple consolidators using different code bases, we would remove the single point of failure that exists in the current SIP structure.
Instead of focusing on just increasing the speed of the SIP, I think it is time to consider changing its structure to be quicker, more resilient and better positioned to deliver data that is critical to the efficiency of the securities markets. In the process, it would alleviate the concentration risk of having one processor solely responsible for all Nasdaq stock quotes and a separate processor solely responsible for all NYSE and Tape B quotes, and it would help bolster investor confidence in the markets. Unleashing these competitive forces would be an important first step in improving investors’ access to market data.