
The History of Algorithmic Trading

Last Updated on 23 July, 2024 by Trading System


According to analysts, about 70% of US equity trading in 2013 was executed via algorithmic trading, and the use of trading algorithms has continued to grow since then. But algorithmic trading didn’t just emerge out of nowhere, so it’s worth taking a look at its history.

Algorithmic trading emerged with the advent of the internet in the late 1980s and early 1990s. However, it wasn’t until 1998, when the U.S. Securities and Exchange Commission (SEC) authorized electronic exchanges, that computerized high-frequency trading became mainstream.

In this post, we will take a look at the following:

  • What is algorithmic trading?
  • The events that paved the way for algo trading
  • The early phase
  • Period of refinement and growth
  • The boom
  • What it’s like today
  • Social and sentiment integration

What is algorithmic trading?

Algorithmic trading is a method of executing orders using trading algorithms, which are automated computer programs with trading instructions that account for timing, price, and volume. This type of trading converts profitable strategies into codes that instruct the computer when to buy and sell a security, the right price to trade, and the volume to trade.

Also known as automated trading, algorithmic trading (or algo trading for short) leverages the speed and 24/7 availability of computers compared to human traders. It uses specialized software to automatically implement a variety of trading strategies and complex mathematical models. Some of these strategies follow trends, while others trade mean reversion.

They can also be spread-betting or arbitrage strategies, many of which fall into the category of high-frequency trading (HFT) and are characterized by high turnover and high order-to-trade ratios. For these, computer algorithms are set up to make elaborate decisions and initiate orders based on information received electronically, before human traders are capable of processing what they observe.

Algorithmic trading has been gaining traction with both retail and institutional traders since the turn of the century. In fact, a 2019 study showed that around 92% of trading in the Forex market was performed by trading algorithms rather than humans. Most investment banks, pension funds, mutual funds, and hedge funds use algorithmic trading to spread out the execution of their larger orders.

The events that paved the way for algo trading

Before algo trading became popular, many events happened in the financial markets that paved the way for it. These are some of the most important ones:

  • The first trading rule-based fund was launched in 1949: An American trader, Richard Donchian, launched Futures, Inc., a publicly-held commodity fund trading the futures markets. The fund was the first to use a set of predetermined rules to generate actual trading buy and sell signals. It used a mathematical system based on moving averages of commodity market prices. Since there was no internet to support it, the developers had to manually chart the markets using data from ticker tapes. With its rule-based system, this can be seen as the earliest attempt to automate trading.
  • Harry Max Markowitz introduced the Markowitz Model in 1950: Markowitz introduced computational finance to address the issue of portfolio selection and it became the basis of the Modern Portfolio Theory or MPT, which was featured in The Journal of Finance in 1952. Markowitz is known as the father of quantitative analysis.
  • The first arbitrage trade using computers happened in 1960: Hedge fund managers Ed Thorp and Michael Goodkin partnered with Harry Markowitz to create a system that used computers for arbitrage trading. With the introduction of personal computers in the late 1970s and early 1980s, many computational finance applications were developed, and signal processing methods, such as time series analysis and optimization, became commonplace.
  • The NYSE’s MDS I and MDS II were launched in 1965: The New York Stock Exchange’s trade reporting system, the Market Data System I (MDS I), was launched to provide automated quotes. The success of MDS I led to the development of MDS II, which was roughly three times more capable than its predecessor and became fully operational by July 1972.
  • The creation of Instinet Trading System in 1967: Jerome M. Pustilnik and Herbert R. Behrens, in 1967, created Instinet, the oldest electronic communications network on Wall Street. The introduction of Instinet enabled large institutional investors to trade pink sheet or over-the-counter securities directly with one another in an electronic set-up, which made Instinet a big competitor of the NYSE.
  • The formation of Nasdaq: Nasdaq was formed in 1971 to offer fully automated over-the-counter (OTC) trading. Initially offering only quotations, Nasdaq later started providing electronic trading, making it the first to offer online trading.
  • The launch of the Intermarket Trading System in 1978: The Intermarket Trading System (ITS), managed by the Securities Industry Automation Corporation (SIAC), was a major game-changer. It was an electronic network that linked the trading floors of various exchanges and allowed real-time communication and trading between them. With the network, any broker on the floor of a participating exchange could respond to real-time price changes and place an order through the ITS.
  • The launch of Renaissance Technologies in 1982: Renaissance Technologies is a quant fund founded by Jim Simons. The fund leveraged mathematical models to predict price fluctuations of financial instruments through black-box algorithmic program trading, eventually running around $10 billion, and relied solely on quantitative analysis to pick its trades.
  • The launch of the NYSE computerized order flow in 1984: Although the computerization of the order flow began in the 1970s, the turning point came in 1984, when the New York Stock Exchange launched the “designated order turnaround” system (DOT), which later became the SuperDOT. The DOT routed orders electronically to the proper trading post, which executed them manually. The SuperDOT let a member firm transmit a market order directly to the NYSE trading floor for execution; once the order was executed on the floor, the firm received a confirmation. The SuperDOT system marked a significant step in equities trade execution in both speed and volume, as it allowed orders of up to 2,000 shares to be electronically routed to a specialist.
  • Interactive Brokers was founded in 1993: Founded by Thomas Peterffy in 1993, Interactive Brokers was a pioneer in digital trading. The firm commercialized technology developed at Peterffy’s earlier firm, Timber Hill, including the first handheld computers used for trading, to offer electronic network and trade-execution services to customers. Before founding Interactive Brokers, Peterffy had, in 1987, created the first fully automated algorithmic trading system, which used an IBM computer that could extract data from a connected Nasdaq terminal and carry out trades fully automatically.
  • The launch of the Island ECN in 1996: The electronic communication network (ECN) known as Island was launched in 1996. The network enabled subscribed traders to receive information on stocks through an electronic feed and constantly provided real-time execution price and volume information.

The early developments

It was in 1998 that the US Securities and Exchange Commission (SEC) allowed alternative trading systems, which made it possible to have the electronic exchanges that paved the way for computerized high-frequency trading.

The SEC’s newly adopted rules and rule amendments made it possible for alternative trading systems to decide whether to register as broker-dealers or national securities exchanges and comply with additional requirements under Regulation ATS, depending on their activities and trading volume. The regulation brought credibility and transparency to the newly emerging field of algorithmic trading and helped it to grow faster.

Another development that helped the mass adoption of algo trading was the completion of the US Decimalization process in 2001. This process, which changed the minimum tick size from 1/16 of a dollar (US$0.0625) to US$0.01 per share, brought new changes to the market structure by allowing smaller differences between the bid and offer prices.

While the switch was made to comply with standard international trading practices, it also benefited investors in several ways. For example, it made it easier for investors to identify and respond to changing price quotes, and it tightened spreads thanks to the smaller incremental price movements.

The phase-in period for the US Decimalization process began on August 28, 2000, and was completed by April 9, 2001, with the NYSE and the American Stock Exchange switching to decimalization on January 29, 2001. Since that time, all price quotes have been expressed in decimal format instead of fractions.
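The impact of the tick-size change is easy to quantify. The sketch below uses an illustrative 1,000-share order (an assumption for the example, not a figure from the text) to compare the minimum possible spread cost before and after decimalization:

```python
# Why decimalization tightened spreads: the minimum quoted spread fell
# from 1/16 of a dollar to one cent, shrinking the smallest possible
# cost of crossing the spread.
pre_decimal_tick = 1 / 16   # $0.0625, the old minimum price increment
post_decimal_tick = 0.01    # $0.01, the new minimum price increment

shares = 1_000  # hypothetical order size for illustration
cost_before = pre_decimal_tick * shares   # minimum spread cost pre-2001
cost_after = post_decimal_tick * shares   # minimum spread cost post-2001

print(f"Before: ${cost_before:.2f}, after: ${cost_after:.2f}")
```

A sixth-fold-plus reduction in the minimum spread made it economical for algorithms to compete for tiny per-share edges at high volume, one reason decimalization is often cited as a catalyst for HFT.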

Period of refinement and growth

Apart from the 1998 regulation that allowed alternative trading systems, more regulations were brought in to refine and modernize electronic trading. For example, the Regulation National Market System (Reg NMS) was adopted in 2005 but not fully implemented until 2007.

Reg NMS was a series of initiatives to refine and strengthen the national market system for equities. Its Order Protection Rule (the “Trade-Through Rule”) mandates that exchanges transmit real-time data to a centralized entity and requires exchanges and brokers to accept the most competitive offer when matching buyers and sellers. The rule changed how firms routed orders and tended to favor high-frequency algo trading.

Another key development was the creation of pandas at AQR Capital Management in 2008. pandas is a data manipulation and analysis library for the Python programming language that makes quantitative research on financial data fast and convenient.
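To give a flavor of the kind of workflow pandas made easy for quant researchers, here is a short illustrative snippet computing daily returns and a rolling mean. The prices are made-up sample data, not real quotes:

```python
# A typical quant-research micro-task with pandas: derive daily returns
# and a 3-day moving average from a price series. Prices are fabricated
# for illustration.
import pandas as pd

prices = pd.Series(
    [100.0, 101.0, 99.0, 102.0, 104.0],
    index=pd.date_range("2008-01-02", periods=5, freq="B"),  # business days
    name="close",
)

daily_returns = prices.pct_change()              # day-over-day % change
rolling_mean = prices.rolling(window=3).mean()   # 3-day moving average

print(daily_returns.round(4))
print(rolling_mean)
```

Operations like these, vectorized over millions of rows, are what made pandas a staple of algorithmic strategy research.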

Similarly, in August 2010, another private company, Spread Networks, launched a dark fiber service between the greater New York and greater Chicago metropolitan areas. Its route was the shortest between the two cities, covering only 825 fiber miles with a latency of just 13.3 milliseconds. This greatly improved the speed of algo trading and boosted the growth of the HFT ecosystem.

The boom

Algorithmic trading witnessed a great boom in the late 2000s. In the early 2000s, algo trading accounted for less than 10% of equity orders, but it grew so rapidly that by the end of 2009, algorithmic traders had captured 70% of the US securities markets. According to the NYSE, between 2005 and 2009 alone, algo trading volume grew by 164%.

The boom in algo trading was also accompanied by a significant decrease in trade execution time. For instance, in 2001, HFT trades had an execution time of several seconds, but by 2010 this had shrunk to milliseconds and even microseconds, and subsequently to nanoseconds by 2012.

The high volume and ultrafast execution speed associated with algorithmic trading pose a risk of fast market crashes known as flash crashes. In fact, in May 2010, a flash crash was triggered by an algo-executed sale worth $4.1 billion. The crash wiped out nearly a trillion dollars in market value, with the Dow Jones Industrial Average losing about 1,000 points in a single trading day, including a drop of 600 points within a five-minute window, before recovering moments later.

Following the flash crash, the SEC introduced circuit breakers to temporarily suspend trading during such high volatility periods.

Similarly, in 2015, the US Commodity Futures Trading Commission (CFTC) approved proposed rules for increased regulation of automated trading on US designated contract markets (DCMs). This was intended to reduce potential risks from algorithmic trading by implementing risk controls, such as maximum order message and maximum order size parameters, and to improve transparency by establishing standards for the development, testing, and monitoring of automated trading systems (ATSs).

As technology improved, execution speeds got even faster. In 2011, a London-based tech company, Fixnetix, launched nano trading technology with a microchip that could execute trades in nanoseconds. The microchip, called iX-eCute, was a field-programmable gate array (FPGA) chip for ultra-low-latency trading that could conduct more than 20 pre-trade risk checks in under 100 nanoseconds. Low-latency trading of this kind spans both the algorithmic trading systems and the network routes that financial institutions use to connect to stock exchanges and electronic communication networks (ECNs) for fast execution of transactions.

Another factor that boosted the growth of algo trading was the emergence of Quantopian, founded in 2011 by John Fawcett and Jean Bredeche. The company offered open-source resources, including data sources and tools built in Python, that let algorithm developers build and test their own trading algorithms for free.

The idea was to crowdsource trading systems and strategies that Quantopian could add to its offerings to institutional investors. To that end, the company organized contests called the “Quantopian Open,” in which participation was open regardless of education or work background. Quantopian also attracted institutional investors who placed assets under the management of the winning algorithms, and the developers who created those algorithms earned a royalty or commission from the profits. However, Quantopian shut down in 2020.

Social media analysis and integration into algorithmic trading

In September 2012, a New York-based start-up, Dataminr, launched a new service, backed by $30 million in investment, that turned social media communications into actionable trading signals. The aim was to report the latest business news up to 54 minutes faster than traditional news media. The platform could identify distinct “micro-trends” that helped clients gain unique insights and predict what was likely to happen in the near future.

But business news reports were just a part of the discussions on social media. Other important social media signals include consumer product reactions, discussion shifts in niche online communities, and growth and decay patterns in public attention.

Some other companies have also developed AI tools to detect linguistic and propagation patterns across the more than 340 million messages shared on Twitter daily and perform real-time analysis to spot noteworthy signals. Initially, however, regulators were wary of the nearly instant impact of social media on securities, and on April 2, 2013, the SEC and CFTC placed restrictions on public company announcements made through social media.

But the use of Twitter by corporate bodies continued to grow. In fact, two days later, on April 4, 2013, Bloomberg incorporated live Tweets into its terminals’ economic data service, with the Bloomberg Social Velocity feature tracking abnormal spikes in chatter about specific companies.

A good example of abnormal news affecting stock markets came on April 23, 2013, when a false Tweet sent from the Associated Press’s hacked account stated that the White House had been hit by two explosions. The news caused widespread panic on Wall Street, and the Dow Jones dropped about 1%, or 143 points (from 14,699 to 14,555), in three minutes.

What it’s like today

Algo trading has developed tremendously and now uses AI and machine learning technologies to improve its efficiency. But as it has always been with algo trading, speed is key. To reduce trade execution latency, institutions and prop trading firms try to locate their computers as close as possible to where an exchange’s servers are housed (New York) or where vital market-significant information originates (Washington). This lets these firms access stock prices and market-moving information a split second before the rest of the investing public.

But given the impact of HFT on market stability, the relevant authorities closely monitor what happens in the algorithmic trading world.

FAQ

When did algorithmic trading emerge?

Algorithmic trading emerged in the late 1980s and early 1990s with the advent of the internet. It gained mainstream popularity in 1998, when the U.S. Securities and Exchange Commission (SEC) authorized electronic exchanges, paving the way for computerized high-frequency trading.

How does algorithmic trading work?

Algorithmic trading uses specialized software to implement various trading strategies and mathematical models. These strategies can include trend following, mean reversion, spread betting, or arbitrage. The algorithms make trading decisions based on electronic information before human traders can process it.

How did the US Decimalization process impact algorithmic trading?

The completion of the US Decimalization process in 2001, which changed the minimum tick size from fractions to decimals, brought changes to market structure. It allowed smaller differences between bid and offer prices, making it easier for investors to respond to changing price quotes and tightening spreads.

