Algorithmic Trading and Stream Processing

By William Hobbib

Bill is vice president of marketing for StreamBase. He can be contacted at email@example.com.
On Wall Street and other global exchanges, electronic-trading data feeds can generate tens of thousands of messages per second, and latencies of even one second are unacceptable. Consequently, technologies such as "stream processing" have been developed to address the challenges of processing high-volume, real-time data. Stream processing applies a specified series of operations to multiple data streams in parallel, with high efficiency and performance. It is being used in applications ranging from financial trading to computer gaming. StreamBase Systems (www.streambase.com), for instance, provides a Stream Processing Engine used by investment firms and hedge funds in areas such as algorithmic/automated trading, risk management, transaction cost analysis, and compliance management.

One requirement for streaming applications is that they be able to store and access current or historical state information, preferably using familiar SQL-type commands. Storage of state data is almost universally desired, and in many situations the events of interest depend partly on real-time data and partly on history. An extension of this requirement comes from firms that want to test a trading algorithm on historical data to see how it performs, then test alternative scenarios. Once the algorithm works well on historical data, it can be switched over to a live feed without modifying the application.
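To make these ideas concrete, here is a minimal sketch (not StreamBase's actual API) of the two patterns described above: a streaming operator that combines live ticks with recent history via a bounded window, and a driver that can feed the same algorithm either recorded historical data or a live source. The function name `moving_average_alerts`, the tick format, and all parameter values are illustrative assumptions.

```python
from collections import deque

def moving_average_alerts(ticks, window=5, threshold=0.02):
    """Hypothetical streaming operator: consume an iterable of
    (symbol, price) ticks and yield an alert whenever a price
    deviates from its recent moving average by more than
    `threshold` (a fraction). Per-symbol state is a bounded
    window of recent prices -- the "history" the text refers to."""
    history = {}  # symbol -> deque of recent prices (bounded window)
    for symbol, price in ticks:
        prices = history.setdefault(symbol, deque(maxlen=window))
        if len(prices) == window:
            avg = sum(prices) / window
            if abs(price - avg) / avg > threshold:
                yield (symbol, price, avg)
        prices.append(price)

# Historical replay: recorded ticks flow through the same operator
# that a live feed would -- switching sources requires no change
# to the algorithm itself, only to the iterable passed in.
historical = [("IBM", 100.0), ("IBM", 100.2), ("IBM", 99.9),
              ("IBM", 100.1), ("IBM", 100.0), ("IBM", 104.0)]
for symbol, price, avg in moving_average_alerts(historical):
    print(f"{symbol}: price {price} deviates from average {avg:.2f}")
```

Because the operator accepts any iterable, the same code runs unmodified over a list of historical ticks or a generator wrapping a live socket, which is the "seamless switch" property the article describes.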