Algorithmic Trading

In the algorithmic trading space, an "algorithm" describes a sequence of steps by which patterns in real-time market data can be recognized and responded to.


March 09, 2007
URL:http://www.drdobbs.com/parallel/algorithmic-trading/197801615

John is the founder and Vice President of Apama Products, Progress Software.


Anyone with a computer science background is familiar with the concept of algorithms for tasks such as the searching or sorting of data. However, what about algorithms that specify a sequence of steps to make money in the capital markets? That's exactly what the latest excitement around "algorithmic trading" is all about.

In the algorithmic trading space, an "algorithm" describes a sequence of steps by which patterns in real-time market data can be recognized and responded to in order to detect trading opportunities and place and manage orders in the market. The term "algorithmic trading" has only become commonly used within the financial sector over the past few years—although trading algorithms have been around for longer. Historically, large investment banks have deployed armies of Ph.D.s to custom build trading algorithms. Now, an advanced technology approach called "Complex Event Processing" (CEP) is making it much quicker and easier to build, deploy, and manage trading algorithms, with fewer personnel necessary.

What Do Algorithms Replace?

Before the days of automated algorithms within the financial markets, traders manually carried out the process of building and managing a trading strategy. Sitting at specialized trading stations with four or eight screens, traders watched as real-time market data changed. By manually tracking analytics and patterns, possibly in a spreadsheet, traders worked out when and where to place orders into the market, then managed those orders to see if they were fulfilled. The trader understood the workings of the algorithm, but each step was a manual process. Now, apart from initiating a particular algorithm, the trader does not have to be involved at all; in most cases, the trader monitors the algorithms using a graphical dashboard. In fact, a trader can now initiate and manage hundreds, or even thousands, of independent algorithms, as opposed to doing one thing at a time manually, making each trader far more productive. However, as discussed later in this article, the algorithm doesn't replace the trader. It is the trader and his or her team of quantitative analysts who devise new algorithms and tailor existing ones.

What Makes Trading Algorithms Possible?

Trading algorithms have been made possible by the open availability of electronic APIs, to enable connectivity to exchanges and other electronic trading venues. In equities and some futures trading, exchanges provide a centralized venue to buy and sell stocks and futures. Foreign exchange is similar, but there are many more independent electronic venues, rather than a centralized exchange (this is called an over-the-counter, or OTC, market). Streaming data can be received by connecting directly to the trading venue or through an information provider, such as Reuters. The streaming data represents the changing prices and availability of instruments on the venue's order book. It is also possible to send orders into the venue's order book, thus enabling buying and selling at an available price, or registering a bid or offer at a certain limit.

Giving an algorithm access to these multiple APIs enables it to watch the changing market data and place orders when certain desirable levels are met.

What Do Trading Algorithms Do?

There are a variety of algorithms in common use within the financial industry. However, the battle for supremacy in algorithmic trading exists in the creation of new and bespoke algorithms. The aim is to develop the most profitable algorithm at the expense of all others.

The two main parts of a trading algorithm are sequences of steps determining when to trade and how to trade.

Determining when to trade is a decision that revolves around watching the changing market data and detecting opportunities within the market. This is the analytic part of the strategy. As an example, consider a "pairs trading" strategy (see Figure 1). This strategy examines pairs of instruments that are known to be statistically correlated. For example, consider Shell and Exxon. Both are oil stocks and so, to a large degree, are likely to move together. Knowledge of this trend creates an opportunity for profit, as on the occasions when these stocks break correlation for an instant, the trader may buy one and sell the other at a premium. This is what a pairs-trading strategy is all about. Here, the algorithm involves monitoring for any changes in the price of either instrument and then recalculating various analytics to detect a break in correlation. This can be calculated, for example, by identifying that the spread between the two instruments has exceeded certain standard deviations (so-called "Bollinger Bands").
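The spread-monitoring step described above can be sketched in a few lines. This is an illustrative toy, not a production model: the window length, band width, and signal names are assumptions, and real strategies would use hedged position sizes and more robust statistics.

```python
from collections import deque
from statistics import mean, stdev

class PairsSignal:
    """Watch the spread between two correlated instruments and flag a
    break in correlation when the spread leaves its Bollinger Bands."""

    def __init__(self, window=20, num_std=2.0):
        self.window = window                 # look-back period for the bands
        self.num_std = num_std               # band width in standard deviations
        self.spreads = deque(maxlen=window)  # rolling history of the spread

    def on_prices(self, price_a, price_b):
        """Call on every price update; returns a trading signal or None."""
        spread = price_a - price_b
        signal = None
        if len(self.spreads) == self.window:
            mid = mean(self.spreads)
            band = self.num_std * stdev(self.spreads)
            if spread > mid + band:
                signal = "SELL_A_BUY_B"      # A looks rich relative to B
            elif spread < mid - band:
                signal = "BUY_A_SELL_B"      # A looks cheap relative to B
        self.spreads.append(spread)
        return signal
```

Each price update recomputes the rolling mean and standard deviation of the spread; a signal fires only when the current spread breaks outside the bands, which is the "break in correlation" the text describes.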


Figure 1: Pairs trading strategy (statistical arbitrage).

Determining how to trade centers on placing and managing orders in the market. As an example, consider a "wave-trading" strategy, which breaks up a large order into smaller orders and places them sequentially into the market over time. The benefit of this is that large orders can get a poor price and can also have a major impact in moving the market overall. Smaller orders are more likely to flow under the market's radar, and subsequently have fewer consequences at a higher level. The wave-trading algorithm simply calculates a number of smaller slices based on trader input and then at prescribed intervals, it places the next wave into the market, barely creating a ripple.
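The slicing step of such a wave-trading algorithm can be sketched as follows. This is a minimal illustration assuming the trader supplies the parent order size and the number of waves; the timing logic that places each wave at prescribed intervals is omitted.

```python
def wave_slices(total_quantity, num_waves):
    """Split a large parent order into smaller child orders ("waves").
    Returns a list of slice sizes that sum exactly to the parent quantity."""
    base = total_quantity // num_waves
    remainder = total_quantity % num_waves
    # Spread any remainder over the first few waves so no shares are dropped.
    return [base + (1 if i < remainder else 0) for i in range(num_waves)]

# Example: a 10,000-share parent order placed in six waves.
slices = wave_slices(10000, 6)   # [1667, 1667, 1667, 1667, 1666, 1666]
```

A real implementation would also randomize wave sizes and intervals, precisely to avoid the predictable 30-minute pattern the next section warns about.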

The Algorithmic Arms Race

One risk of using algorithms for trading is that other traders can take advantage of your algorithms if they decipher how they work. Consider a wave-trading algorithm that places large waves into the market every 30 minutes. If a human or algorithm realizes the pattern, they could front-run the orders; that is, they could buy at a discount before the next order causes the price to go up. The more common the trading strategy, the easier it is to reverse engineer. As the uptake of algorithmic trading has increased in recent years, firms are under increased pressure to take measures to ensure their strategies don't fall victim to this practice. During 2006, algorithmic trading entered the mainstream—algorithmic techniques and the technology that powers them are now highly influential in the way that financial instruments, both in exchange and OTC markets, are traded. Prior to this widespread use, firms could gain competitive advantage just by using algorithmic techniques. Today, it is the way in which they use trading algorithms that gives them a competitive edge.

Trading algorithms become interesting, and more effective, when traders combine different techniques in new and complex ways to create unique algorithms that are more difficult to reverse engineer. The markets change every day and new opportunities continually emerge. It is in the interest of traders to be able to create these new algorithms as quickly as possible, to capitalize on opportunities before their competitors. In this way, algorithmic trading is forming its own "arms race," where the slightest technological advantage can make the difference between substantial profit and loss.

Algorithm Creation Using Complex Event Processing

As already discussed, key advantages can be gained by shortening the time between algorithm conception and algorithm implementation. Using traditional approaches, such as development in C++ or Java, algorithms can take a long time to implement; weeks or months can pass before algorithms are integrated with the markets, tested, and put into production. To stay ahead of their competitors, firms require an approach that shortens this time to develop, test, and deploy algorithms.

One promising approach is that of Complex Event Processing (CEP). CEP is a new paradigm of computing that allows organizations to quickly respond to data that is continuously changing. CEP allows firms to monitor, analyze, and act on events in milliseconds. This can have a profound impact on the operations of many businesses, as, in today's market, organizations must deal with exploding volumes of fast-moving data, driven by new modes of connectivity and the demands that this connectivity brings.

Traditional data processing is typically database driven and requires you to store and index the data prior to query. That can be time consuming, particularly for applications where responsiveness is crucial to effectiveness. CEP allows you to, in effect, determine the queries in advance by setting certain parameters and then "stream" the data through them, so the relevant data may be selected. This makes it possible to monitor, analyze, and act on those rapidly moving events more quickly, without dependence on a database. This provides a much more time-sensitive response to the events, in effect responding as they happen.

In trading, CEP takes a new approach to constructing algorithms. It is particularly suited to algorithmic trading; however, it is also suited to many other types of event-driven algorithms. CEP algorithms are structured as sets of event-based rules. These rules monitor items in incoming data streams—termed "events" (see Figure 2). Each event represents an update within the system. An example of a CEP rule is: "When the spread between the price of Shell and Exxon exceeds level x, then buy Shell and sell Exxon." Using this approach, complex algorithms can be constructed quickly. An example of a complex CEP rule is shown in Figure 3. CEP rules are hosted inside CEP engines, which efficiently monitor and execute rules. CEP engines can be permanently connected to a wide range of trading venues, so algorithms can be injected into the engine and immediately start monitoring real-time data streams, and take real-time actions on remote services.
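The rule style described above can be illustrated with a toy engine in Python. This is a sketch of the concept, not any real CEP engine's language or API; the symbols, the threshold, and the engine structure are assumptions for illustration.

```python
class CEPEngine:
    """Toy event-processing engine: rules are (condition, action) pairs
    evaluated against every incoming event, with no database involved."""

    def __init__(self):
        self.rules = []
        self.state = {}   # latest price per symbol, updated as events arrive

    def add_rule(self, condition, action):
        self.rules.append((condition, action))

    def on_event(self, symbol, price):
        """Each event is an update within the system; fire matching rules."""
        self.state[symbol] = price
        for condition, action in self.rules:
            if condition(self.state):
                action(self.state)

# The rule from the text: "When the spread between the price of Shell and
# Exxon exceeds level x, then buy Shell and sell Exxon." Symbols and the
# threshold (2.0) are illustrative.
orders = []
engine = CEPEngine()
engine.add_rule(
    condition=lambda s: "SHELL" in s and "EXXON" in s
                        and s["SHELL"] - s["EXXON"] > 2.0,
    action=lambda s: orders.append(("BUY SHELL", "SELL EXXON")),
)
engine.on_event("EXXON", 80.0)   # rule cannot fire: only one price known
engine.on_event("SHELL", 83.5)   # spread is 3.5 > 2.0, so the rule fires
```

The point of the structure is that the query (the rule) is registered first and the data streams through it, rather than the data being stored first and queried later.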


Figure 2: Complex Event Processing (CEP).


Figure 3: An example of a complex rule and the concepts of CEP.

Graphical Algorithm Modeling

One interesting innovation that is being employed in conjunction with CEP platforms is the ability to implement new algorithms graphically. Graphical programming has always been a challenging area: using graphical development environments to build programs on top of traditional languages can demand as much time and knowledge as simply writing the code by hand. However, graphical modeling tools have been used very successfully in conjunction with CEP platforms. Modeling state flow and rules in an event-based system is well suited to graphical abstractions (see Figure 4).


Figure 4: Graphical algorithm modeling using CEP.

As well as graphically modeling the logic inside their algorithms, today's tools give the traders the ability to visualize, in real time, all runtime activity once their algorithm is running. Real-time "dashboards" can display representations of the changing real-time variables within the algorithms, with automatic alerts when complex conditions or exceptions are detected. Dashboard design studios and runtime rendering frameworks act as a complete design and deployment environment with a wide range of visual objects, including meters, scales, tables, grids, bar and pie charts, along with trend and x-y charts—all of which change dynamically as events occur in real time (Figure 1 shows an example of a deployed dashboard). Elements are accessible through a design palette from which the objects can be selected, placed on a visual canvas, and parameterized. This capability removes the reliance on the technical development team traditionally required for the creation and adaptation of trading strategies.

The Future

One question that is occupying the minds of many with an interest in algorithmic trading is: "Will this ultimately replace the trader?" The answer is no—for now. Algorithms have expanded the capabilities of the trader, making each trader much more productive. It still falls to humans to devise new algorithms by analyzing, with computer help, opportunities in the changing market.

Algorithmic trading technology will only begin to replace humans if algorithms are actually devised, developed, tuned, and managed by other algorithms. There are already some techniques being deployed to this end.

One approach is the automatic selection of an appropriate algorithm to use in a particular circumstance, based on events occurring in the market at that point.

Another approach is the use of "genetic" algorithms, whereby a large number (potentially thousands) of variants of an algorithm are created, each with slightly different operating parameters. Each variant can be fed with real market data but, rather than actually trading, can calculate the profit or loss it would be making if it were live in the market. Thus, the most profitable algorithm variants can be swapped live into the market on a continuing basis.
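The variant-selection idea can be sketched as follows. Both the one-parameter "algorithm" and its paper-trading score are hypothetical stand-ins; a real system would mutate and recombine many parameters and score variants against live market data.

```python
import random

def paper_pnl(threshold, price_moves):
    """Hypothetical paper trade: when a price move exceeds the threshold,
    follow it and book the next move as profit or loss. Illustrative only."""
    pnl = 0.0
    for move, nxt in zip(price_moves, price_moves[1:]):
        if abs(move) > threshold:
            pnl += nxt if move > 0 else -nxt
    return pnl

def best_variant(price_moves, num_variants=1000, seed=42):
    """Create many variants of one algorithm, differing only in their
    operating parameter, paper-trade each on the same data, and keep
    the most profitable one."""
    rng = random.Random(seed)
    variants = [rng.uniform(0.1, 5.0) for _ in range(num_variants)]
    return max(variants, key=lambda t: paper_pnl(t, price_moves))
```

Run continuously, the winning variant would be the one "swapped live into the market" while the rest keep paper-trading in the background.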

In all of these approaches, Complex Event Processing offers a compelling platform for the creation and management of trading algorithms. The promise of CEP is in providing a powerful platform to enable even the nonprogrammer to encode an event-based algorithm. This year, we will see increased adoption of this approach.

Algorithmic trading is just the first of many exciting applications of CEP—in the financial markets, use in risk management and compliance are the obvious next steps. As we move into 2007, CEP will continue to revolutionize trading on the capital markets as we know it.

Algorithmic Trading and Stream Processing by William Hobbib

Bill is vice president of marketing for StreamBase. He can be contacted at [email protected].


On Wall Street and other global exchanges, electronic-trading data feeds can generate tens of thousands of messages per second, and latencies of even one second are unacceptable. Consequently, technologies such as "stream processing" have been developed to address the challenges of processing high-volume, real-time data. Stream processing enables parallel processing of a specified series of operations on multiple data streams with high levels of efficiency and performance. It is being implemented on applications ranging from financial trading to computer gaming.

StreamBase Systems (www.streambase.com), for instance, provides a Stream Processing Engine leveraged by investment firms and hedge funds in areas like algorithmic/automated trading, risk management, transaction cost analysis, and compliance management.

One requirement for streaming applications is that they be capable of storing and accessing current or historical state information, preferably using familiar SQL-type commands. Storage of state data is almost universally desired. In addition, for many situations, events of interest depend partly on real-time data and partly on history. An extension of this requirement comes from firms that want to test a trading algorithm on historical data to see how it performs, then test alternative scenarios. When the algorithm works well on historical data, it can be seamlessly switched over to a live feed without application modification.
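One way to picture the stateful streaming query the sidebar describes is a sliding-window aggregate maintained incrementally as events arrive. This is a hand-rolled sketch, not StreamBase's actual API; the window size and the replay helper are assumptions.

```python
from collections import deque

class StreamVWAP:
    """Sliding-window volume-weighted average price over the last N trades,
    updated on every incoming event rather than queried from a database."""

    def __init__(self, window=100):
        self.trades = deque(maxlen=window)   # the query's stored state

    def on_trade(self, price, size):
        """Process one trade event and return the current windowed VWAP."""
        self.trades.append((price, size))
        notional = sum(p * s for p, s in self.trades)
        volume = sum(s for _, s in self.trades)
        return notional / volume

def replay(query, historical_trades):
    """Run the same query over stored history before switching it to a
    live feed, with no change to the application logic."""
    return [query.on_trade(p, s) for p, s in historical_trades]
```

Because the live feed and the historical replay drive the identical `on_trade` path, an algorithm validated on history can be pointed at a live feed without modification, which is the switchover the sidebar describes.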
