Last week we published some predictions for how capital markets would evolve in 2010. I’d like to say a bit more about them.
Firstly, we predict there will be a big uptake in the use of technology for regulatory compliance and enforcement. Whilst the turmoil of the last 18 months was primarily caused by the over-the-counter credit derivatives market, the effects of increased regulatory scrutiny are being felt throughout the trading industry. The power and authority of regulators have been bolstered, exchanges and alternative trading venues understand there is a greater need to monitor trading activity, and brokers and buy-side firms want to monitor and control their own and their clients’ trading activities more closely. In the last few months there has been significant debate in the media on the merits of high-frequency trading and its variants. It started in the specialist trade press, then reached mainstream news outlets such as the BBC, and it has been deemed sufficiently important to be discussed by members of the US Congress and the British government. The result is pressure on market participants to really up their game as far as trade monitoring is concerned.

So how will technology be used to better enforce regulation and control over trading activity? Let’s start with the liquidity venues – the exchanges, the MTFs, the ECNs and the dark pools. Regulated exchanges in advanced markets generally already monitor trading activity in real time to spot patterns of market-abusive behaviour. They will need to continue to invest to ensure that their technology can scale and is flexible enough to evolve with changing patterns of trading behaviour. In contrast, exchanges in emerging markets often do not have adequate monitoring systems. This will change – we’re seeing substantial interest in Apama for exchange surveillance in less developed markets. At the other end of the liquidity spectrum, the regulation around dark pools will change.
It is likely that limits will be imposed on the proportion of stock that can be traded through dark pools, and operators will need to disclose more information. Furthermore, regulators will insist that dark pool operators prove they have adequate monitoring systems in place – it won’t be just a paper exercise; they’ll have to prove it. Brokers will be in a similar position. Each participant in the trading cycle in the UK, for example, has a responsibility to ensure that the market is working fairly. The UK regulator, the FSA, is putting pressure on brokers to show that they have proper trade monitoring technology in place so that customer and internal order flow can be understood better.
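Purely as an illustration of the kind of real-time monitoring rule discussed above – this is not Apama’s actual event-processing language, and real surveillance systems use far more sophisticated patterns – a crude check for one abusive pattern (a trader cancelling an unusually high share of recent orders) could be sketched like this:

```python
from collections import defaultdict, deque

WINDOW = 10              # number of recent events per trader to consider
CANCEL_THRESHOLD = 0.8   # fraction of cancels that triggers an alert

class SurveillanceMonitor:
    """Toy rolling-window monitor: alert when a trader's recent order
    flow is dominated by cancellations (a crude spoofing proxy)."""

    def __init__(self, window=WINDOW, threshold=CANCEL_THRESHOLD):
        self.window = window
        self.threshold = threshold
        self.events = defaultdict(lambda: deque(maxlen=window))

    def on_event(self, trader, action):
        """Feed one order event ('new' or 'cancel'); return alert text or None."""
        history = self.events[trader]
        history.append(action)
        if len(history) == self.window:
            cancel_ratio = sum(1 for a in history if a == "cancel") / self.window
            if cancel_ratio >= self.threshold:
                return (f"ALERT: {trader} cancelled {cancel_ratio:.0%} "
                        f"of last {self.window} orders")
        return None

monitor = SurveillanceMonitor()
alerts = []
for action in ["new"] + ["cancel"] * 9:
    alert = monitor.on_event("trader-42", action)
    if alert:
        alerts.append(alert)
```

The point of the sketch is the architecture, not the rule: events stream in, state is kept per participant, and alerts fire in real time rather than in an overnight batch – which is what distinguishes the monitoring systems described above from a paper exercise.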
Let’s move on to hosted services. “Cloud computing” is certainly a term du jour. What does it mean for capital markets? The first thing to say is that cloud computing, in the sense of hosted software and computing resources, is a very familiar concept in capital markets, even though until recently people may not have used the term “cloud computing”. Anyone using a Reuters market data feed, passing orders over FIX to a broker, or accessing a single-bank foreign exchange portal is accessing services in the cloud. In fact, electronic trading relies to a great extent upon cloud services. In 2010, however, more hosted services of a richer functional nature are going to become available. Instead of just building blocks – the market data, the DMA access and so on – more services will become available for algorithmic trading and risk management. Brokers do offer hosted algo services already, but they are broker-specific. An example of a hosted algo service is one we launched with CQG recently. These will mature and broaden in scope. Such services are invaluable to mid-sized trading organisations that can’t, or don’t want to, build a whole range of systems themselves.
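Since passing orders over FIX comes up above as one of the familiar “cloud” building blocks, here is a minimal sketch of what a FIX 4.2 NewOrderSingle message looks like on the wire. The session identifiers, symbol and order details are invented for illustration; a real session would also carry timestamps and sequence management:

```python
SOH = "\x01"  # FIX field delimiter

def fix_checksum(msg: str) -> str:
    """FIX checksum: sum of all bytes modulo 256, as a 3-digit string."""
    return f"{sum(msg.encode('ascii')) % 256:03d}"

def build_new_order(sender, target, seq, symbol, side, qty, price):
    # Body fields: 35=D is NewOrderSingle; 54=1 buy / 54=2 sell; 40=2 limit
    body = SOH.join([
        "35=D", f"49={sender}", f"56={target}", f"34={seq}",
        f"55={symbol}", f"54={side}", f"38={qty}", f"44={price}", "40=2",
    ]) + SOH
    # Header: protocol version (8) and body length (9) precede the body;
    # the checksum (10) trails the whole message.
    header = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"
    msg = header + body
    return msg + f"10={fix_checksum(msg)}{SOH}"

order = build_new_order("BUYSIDE", "BROKER", 1, "VOD.L", "1", 1000, "135.50")
```

The simplicity of the tag=value format is part of why FIX connectivity became such a commodity service – and why, as argued above, hosted offerings are now moving up the stack from plain order routing towards algorithmic trading and risk management.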
Lastly, our prediction about emerging markets. We’ve seen significant growth this year in demand for Apama in Brazil, India and China. Brazil in particular, because of continued economic growth and market liberalisation, has led the way (for example, Progress now has 15 customers using Apama in Brazil). India and China are getting there. They have further to go in market liberalisation to fuel the demand for algorithmic trading, but to attract extra investment and liquidity to their domestic markets they’ll be left with little choice. Hong Kong is an exception – algorithmic trading is used extensively by both global and regional players, and it provides a window onto developed markets from which mainland China can learn.
Capital markets will evolve quickly in 2010, as in every year. That's what makes it such an interesting area to work in.
View all posts from The Progress Guys on the Progress blog.