High frequency trading: how much of our lives is ruled by robots and algorithms?


The microeconomics and internal workings of our securities markets and exchanges are incredibly complex, and few – even the most sophisticated of today’s investors – understand them beyond the surface. If the world of investing were isolated, its obscurity would not be especially concerning. But the reality is that much of society’s infrastructure – literally and figuratively – is tied up in the value of securities that change hands millions of times per day. Retirees depend on a diversified portfolio, insurance firms rely on investment gains to cover payouts, and retail and institutional investors alike require liquidity to have the confidence to part with their money in the first place.

It may not surprise many to learn that computers – more precisely, algorithms – are responsible for the majority of trading globally, but the extent of the implications may. The chief concern is that securities can be priced at a value detached from any underlying, tangible value. Most market participants presume that the liquidity and price of a stock are a direct function of its economic value. The problem with algorithmic trading is that the programs often take no account of the fundamentals of the instruments they are programmed to trade. A rule as simple as “if X goes below Y, do Z” means the price of a security can drift away from its underlying value, because the program is deliberately designed to ignore fundamentals.
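
To make the point concrete, a rule of that kind can be written in a few lines. The sketch below (in Python) is a hypothetical illustration only; the threshold, order size and send_order function are invented for the example and do not describe any real firm’s strategy.

# Hypothetical sketch of a threshold rule of the form "if X goes below Y, do Z".
# Nothing here references earnings, cash flow, or any other fundamental of the
# underlying company -- the only input is the price itself crossing a line.

THRESHOLD = 100.00   # Y: an arbitrary price level (assumed for illustration)
ORDER_SIZE = 500     # Z: how many shares to sell (assumed for illustration)

def on_price_update(symbol, price, send_order):
    """React to each new quote for X; fundamentals never enter the decision."""
    if price < THRESHOLD:
        send_order(side="sell", symbol=symbol, quantity=ORDER_SIZE)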

Granted, this may be more of a philosophical concern than a legal one, but the expectations market participants have long held are no longer the reality in today’s electronic markets. The law and regulations may have a role to play when it comes to fair and equal access to the markets. The largest and most profitable high frequency trading firms pay stock exchanges large sums to place their computers and servers as physically close to the exchange as possible in order to reduce latency and trade ahead of slower investors. The New York Stock Exchange has gone as far as leasing server space in its basement to the highest bidder. The speed advantage allows high frequency trading algorithms to see which trades are being made and place their own orders fractions of a second earlier, earning mere pennies on each trade. The volume adds up, however, and these firms have become profitable to the tune of billions of dollars per year.
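
The arithmetic behind that kind of scale is simple, even with invented numbers. The figures below are assumptions chosen purely to illustrate how pennies per share compound; they are not reported results from any firm.

# Back-of-envelope sketch with assumed, purely illustrative figures.
profit_per_share = 0.005        # half a cent captured per share (assumed)
shares_per_day = 200_000_000    # shares handled per day (assumed)
trading_days = 252              # trading days in a year

annual_profit = profit_per_share * shares_per_day * trading_days
print(f"${annual_profit:,.0f} per year")   # about $252,000,000 on these inputs

A firm handling several times that volume, across many markets, reaches the billions of dollars per year described above.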

As financial markets have evolved, regulators have not been able to keep pace. In 2010 the Securities and Exchange Commission’s budget was just over a billion dollars, while high frequency trading giant Citadel Securities’ founder personally took home $900 million. At a 2010 Senate Banking Committee hearing, Democrat Carl Levin said, “Traders today are equipped with the latest, fastest technology. Our regulators are riding the equivalent of mopeds going 20 miles per hour chasing traders whose cars are going 100.” The problems associated with algorithmic trading are numerous, and include technical failures such as the 2010 flash crash, in which nearly a trillion dollars in market value was wiped out in a mere thirty minutes, only to return shortly after. These problems exist in Canada as well, and in any case Canadians invest billions in American securities every year. It is an issue that investors, securities lawyers and lawmakers should keep at the forefront of their thoughts when considering the future of our capital markets.

About the author

Brandon Orr

News Editor
