The Algorithmic Arms Race
One risk of using algorithms for trading is that other traders can take advantage of your algorithms if they decipher how they work. Consider a wave-trading algorithm that places large waves of orders into the market every 30 minutes. If a human or another algorithm spots the pattern, they could front-run the orders; that is, they could buy at a discount before the next order pushes the price up. The more common the trading strategy, the easier it is to reverse engineer. As the uptake of algorithmic trading has increased in recent years, firms have come under growing pressure to ensure their strategies don't fall victim to this practice. During 2006, algorithmic trading entered the mainstream: algorithmic techniques, and the technology that powers them, are now highly influential in the way that financial instruments are traded, in both exchange and OTC markets. Before this widespread adoption, firms could gain a competitive advantage simply by using algorithmic techniques. Today, it is the way in which they use trading algorithms that gives them a competitive edge.
Trading algorithms become interesting, and more effective, when traders combine different techniques in new and complex ways to create unique algorithms that are harder to reverse engineer. The markets change every day, and new opportunities continually emerge. It is in traders' interest to create these new algorithms as quickly as possible, to capitalize on opportunities before their competitors do. In this way, algorithmic trading has formed its own "arms race," where the slightest technological advantage can make the difference between substantial profit and loss.
Algorithm Creation Using Complex Event Processing
As already discussed, a key advantage can be gained by shortening the time between algorithm conception and algorithm implementation. Using traditional approaches, such as development in C++ or Java, algorithms can take a long time to implement: it can be weeks or months before an algorithm is integrated with the markets, tested, and put into production. To stay ahead of their competitors, firms need an approach that shortens the time to develop, test, and deploy algorithms.
One promising approach is that of Complex Event Processing (CEP). CEP is a new paradigm of computing that allows organizations to quickly respond to data that is continuously changing. CEP allows firms to monitor, analyze, and act on events in milliseconds. This can have a profound impact on the operations of many businesses, as, in today's market, organizations must deal with exploding volumes of fast-moving data, driven by new modes of connectivity and the demands that this connectivity brings.
Traditional data processing is typically database driven: data must be stored and indexed before it can be queried. That can be time consuming, particularly for applications where responsiveness is crucial to effectiveness. CEP, in effect, lets you determine the queries in advance by setting certain parameters and then "stream" the data through them, so that the relevant data is selected as it arrives. This makes it possible to monitor, analyze, and act on rapidly moving events more quickly, without dependence on a database, providing a much more time-sensitive response: in effect, responding to events as they happen.
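The "query first, data second" idea can be sketched in a few lines. This is an illustrative sketch, not a real CEP engine: the symbols, field names, and thresholds are invented, and the point is only that the query is built in advance and events flow through it without any storage or indexing step.

```python
# Define the query up front; stream events through it as they arrive.
# All names and values here are illustrative.

def make_query(symbol, threshold):
    """Build a predicate in advance from fixed parameters."""
    def query(event):
        return event["symbol"] == symbol and event["price"] > threshold
    return query

def stream_matches(events, query):
    """Yield matching events as they flow past, with no database involved."""
    for event in events:
        if query(event):
            yield event

# A toy stream of price-update events.
ticks = [
    {"symbol": "XYZ", "price": 99.5},
    {"symbol": "XYZ", "price": 101.2},
    {"symbol": "ABC", "price": 50.0},
]

over_100 = make_query("XYZ", 100.0)
for match in stream_matches(ticks, over_100):
    print(match)
```

In a production system the event source would be a live market-data feed rather than a list, but the structure is the same: the parameters of interest are fixed before any data arrives.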
In trading, CEP offers a new approach to constructing algorithms. It is particularly suited to algorithmic trading, though it also suits many other types of event-driven algorithms. CEP algorithms are structured as sets of event-based rules. These rules monitor items in incoming data streams, termed "events" (see Figure 2). Each event represents an update within the system. An example of a CEP rule is: "When the spread between the price of Shell and Exxon exceeds level x, then buy Shell and sell Exxon." Using this approach, complex algorithms can be constructed quickly; an example of a complex CEP rule is shown in Figure 3. CEP rules are hosted inside CEP engines, which efficiently monitor and execute them. CEP engines can be permanently connected to a wide range of trading venues, so algorithms can be injected into the engine and immediately begin monitoring real-time data streams and taking real-time actions on remote services.
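The spread rule quoted above can be expressed as a small event-based rule object. This is a hedged sketch under assumed conventions: the ticker labels, the event shape, the fixed threshold, and the returned action tuples are all illustrative, and no real CEP engine or broker API is used.

```python
# Sketch of the rule: "when the spread between the price of Shell and
# Exxon exceeds level x, buy Shell and sell Exxon."
# Symbols, event format, and actions are illustrative assumptions.

class SpreadRule:
    def __init__(self, buy_sym, sell_sym, level):
        self.buy_sym = buy_sym    # instrument to buy when the rule fires
        self.sell_sym = sell_sym  # instrument to sell when the rule fires
        self.level = level        # the "level x" spread threshold
        self.prices = {}          # latest price seen per symbol

    def on_event(self, event):
        """Update state on each price event; return actions when the rule fires."""
        self.prices[event["symbol"]] = event["price"]
        if self.buy_sym in self.prices and self.sell_sym in self.prices:
            spread = self.prices[self.sell_sym] - self.prices[self.buy_sym]
            if spread > self.level:
                return [("BUY", self.buy_sym), ("SELL", self.sell_sym)]
        return None

rule = SpreadRule("SHELL", "EXXON", level=2.0)
actions = None
for tick in [{"symbol": "SHELL", "price": 60.0},
             {"symbol": "EXXON", "price": 63.5}]:
    actions = rule.on_event(tick) or actions
# The second tick makes the spread 3.5, which exceeds level 2.0,
# so the rule fires.
print(actions)
```

A CEP engine plays the role of the loop here: it feeds each incoming event to every hosted rule and routes the resulting actions to the relevant trading venue.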