Now that the Dodd-Frank Wall Street Reform and Consumer Protection Act has been signed into law, a mountain of work lies ahead for regulators. Making sense of the 2,000-plus-page document and turning it into workable recommendations and regulations will be an arduous process.
The Commodity Futures Trading Commission's second Technology Advisory Committee meeting, titled “Technology: Achieving the Statutory Goals and Regulatory Objectives of the Dodd-Frank Act,” will be held on October 12, 2010 at 1:00 p.m. in Washington, D.C. (http://tinyurl.com/2vfdp4n). At the meeting, my committee colleagues and I will discuss some of these goals and objectives. Specifically, in light of the SEC and CFTC's joint report on the May 6th flash crash, CFTC Commissioner Scott O'Malia has said that he wants to examine whether algorithms that cause disruption in markets - rogue algorithms - should be treated as if they were rogue traders.
Commissioner O’Malia said in the announcement of the October 12 meeting: “While I do not believe that the flash crash was the direct result of reckless misconduct in the futures market, I question what the CFTC could have done if the opposite were true. When does high frequency or algorithmic trading cross the line into being disruptive to our markets? And, along those same lines, who is responsible when technology goes awry? Do we treat rogue algorithms like rogue traders?”
This is an interesting topic. When does an algorithm 'go bad'? Is it the algorithm's fault? Of course not; an algorithm does not decide to go rogue. The fault lies in human error - either in the programming or in the execution thereof. In the case of the flash crash, a mutual fund chose a 'dumb' execution algorithm, preset with inappropriate parameters, to execute a large futures sell order in a market that was - by all accounts - ready to plummet. This illustrates how rogue algorithms can emerge as an unintended consequence of circumstance and human misjudgment.
When a trader goes rogue, it is more deliberate. It can be because he is losing money and hiding it, as in the case of Jerome Kerviel at SocGen, or because he had too much to drink at lunchtime and was feeling invincible, like Steve Perkins at PVM. The former lost his bank over $6bn; the latter lost his brokerage $10m. These were very human errors, effectively the work of scoundrels.
What rogue traders and rogue algorithms have in common is that both can, in many circumstances, be detected early - or even prevented - through the use of better technology. Comprehensive pre-trade analysis, including backtesting algorithms under a wide range of market conditions, could have prevented the 'dumb' May 6th algo from having its way with the market. Thorough real-time risk management and monitoring could have spotted Kerviel's limit-busting trading patterns and his attempts to hide the trades. Pre-trade risk controls would have kicked the PVM trader out of the system before he got in too deep.
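To illustrate the idea (this is a minimal sketch, not any specific vendor's or exchange's system - the limit names and thresholds are invented for the example), a pre-trade risk control is essentially a gate that every order must pass before it reaches the market: it checks order size, the resulting net position, and the rate at which orders are being submitted, and rejects anything that would breach a limit.

```python
from dataclasses import dataclass

@dataclass
class RiskLimits:
    max_order_qty: int       # largest single order allowed
    max_position: int        # cap on absolute net position
    max_orders_per_min: int  # throttle on order submission rate

def pre_trade_check(order_qty: int, current_position: int,
                    orders_this_minute: int, limits: RiskLimits):
    """Return (accepted, reason) before the order ever reaches the market."""
    if abs(order_qty) > limits.max_order_qty:
        return False, "order size exceeds limit"
    if abs(current_position + order_qty) > limits.max_position:
        return False, "would breach position limit"
    if orders_this_minute >= limits.max_orders_per_min:
        return False, "order rate throttle hit"
    return True, "ok"
```

A check like this would have stopped a runaway algorithm (or an intoxicated trader) after the position limit was hit, regardless of how many further orders the source tried to send - the point being that the control sits in the order path, not in after-the-fact reporting.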
It is no longer acceptable to blame rogues and scoundrels for market anomalies or for banks, brokers and buyside firms losing money. The technology is there, it simply needs to be used.
View all posts from The Progress Team on the Progress blog.