The stock market is a marketplace where a company’s stock can be bought and sold; the corporation issues the stock as shares in order to raise capital. The people who buy these shares are shareholders, and the term can refer to an individual or an organization. The stock market also involves the trading of bonds, a debt security stipulating that the issuer owes the holders a debt. A bond is essentially a loan in the form of a security. Bonds are traded over-the-counter, which means they are traded directly between two parties. With shares, instead of promising your money back, the company gives you a share of ownership: if there are a million shares and you own 1,000 of them, you own 0.1% of the company. The stock market is distinct from a stock exchange, which is primarily concerned with bringing together buyers and sellers of stocks and securities. There are two types of exchanges where stocks can be traded. One type has a physical location where trading takes place verbally on a floor.
The other type of exchange is the virtual kind, where traders deal electronically through computer terminals. Computerization of the order flow in financial markets began in the early 1970s; one landmark was the introduction of the New York Stock Exchange’s “designated order turnaround” system (DOT, and later SuperDOT), which routed orders electronically to the proper trading post, where they were executed manually. The “opening automated reporting system” (OARS) aided the specialist in determining the market-clearing opening price.
Program trading is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over US$1 million in total. In practice this means that all program trades are entered with the aid of a computer. In the 1980s, program trading became widely used in trading between the S&P 500 equity and futures markets.
In stock index arbitrage a trader buys (or sells) a stock index futures contract such as the S&P 500 futures and sells (or buys) a portfolio of up to 500 stocks (can be a much smaller representative subset) at the NYSE matched against the futures trade. The program trade at the NYSE would be pre-programmed into a computer to enter the order automatically into the NYSE’s electronic order routing system at a time when the futures price and the stock index were far enough apart to make a profit. At about the same time portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black–Scholes option pricing model. Both strategies, often simply lumped together as “program trading”, were blamed by many people (for example by the Brady report) for exacerbating or even starting the 1987 stock market crash. Yet the impact of computer driven trading on stock market crashes is unclear and widely discussed in the academic community. Financial markets with fully electronic execution and similar electronic communication networks developed in the late 1980s and 1990s.
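A rough illustration of this trigger logic follows. It is a minimal sketch with made-up parameters, not an actual production system; the `cost_threshold` parameter stands in for transaction costs and execution risk, and the function names are our own.

```python
# A minimal sketch of the index-arbitrage trigger described above: trade
# only when the futures price drifts far enough from the fair value
# implied by the cash index (cost-of-carry model) to cover assumed costs.
import math

def fair_futures_price(spot_index: float, risk_free_rate: float,
                       dividend_yield: float, years_to_expiry: float) -> float:
    """Cost-of-carry fair value of an index futures contract."""
    return spot_index * math.exp((risk_free_rate - dividend_yield) * years_to_expiry)

def arbitrage_signal(spot_index: float, futures_price: float,
                     risk_free_rate: float, dividend_yield: float,
                     years_to_expiry: float, cost_threshold: float) -> str:
    """Return the program trade to fire, if the mispricing exceeds costs."""
    fair = fair_futures_price(spot_index, risk_free_rate,
                              dividend_yield, years_to_expiry)
    mispricing = futures_price - fair
    if mispricing > cost_threshold:
        return "SELL futures / BUY stock basket"   # futures rich vs. cash
    if mispricing < -cost_threshold:
        return "BUY futures / SELL stock basket"   # futures cheap vs. cash
    return "NO TRADE"

# Example: index at 1300, futures at 1312, three months to expiry
print(arbitrage_signal(1300.0, 1312.0, 0.05, 0.02, 0.25, cost_threshold=2.0))
```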
In the U.S., decimalization, which changed the minimum tick size from 1/16 of a dollar (US$0.0625) to US$0.01 per share, may have encouraged algorithmic trading as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers’ trading advantage, thus increasing market liquidity. This increased market liquidity led to institutional traders splitting up orders according to computer algorithms so they could execute orders at a better average price. These average price benchmarks are measured and calculated by computers by applying the time-weighted average price or more usually by the volume-weighted average price. A further encouragement for the adoption of algorithmic trading in the financial markets came in 2001 when a team of IBM researchers published a paper at the International Joint Conference on Artificial Intelligence where they showed that in experimental laboratory versions of the electronic auctions used in the financial markets, two algorithmic strategies (IBM’s own MGD, and Hewlett-Packard’s ZIP) could consistently out-perform human traders.
MGD was a modified version of the “GD” algorithm invented by Steven Gjerstad & John Dickhaut in 1996/7; the ZIP algorithm had been invented at HP by Dave Cliff (professor) in 1996. In their paper, the IBM team wrote that the financial impact of their results showing MGD and ZIP outperforming human traders “…might be measured in billions of dollars annually”; the IBM paper generated international media coverage. As more electronic markets opened, other algorithmic trading strategies were introduced. These strategies are more easily implemented by computers, because machines can react more rapidly to temporary mispricing and examine prices from several markets simultaneously. Examples include Stealth (developed by Deutsche Bank), Sniper and Guerrilla (developed by Credit Suisse), arbitrage, statistical arbitrage, trend following, and mean reversion. This type of trading is what is driving the new demand for low-latency proximity hosting and global exchange connectivity. It is imperative to understand latency when putting together a strategy for electronic trading. Latency refers to the delay between the transmission of information from a source and the reception of the information at a destination. Latency has the speed of light as its lower bound, which corresponds to about 3.3 milliseconds per 1,000 kilometers; any signal-regenerating or routing equipment introduces latency beyond this speed-of-light baseline.
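The latency floor quoted above is easy to verify with a back-of-the-envelope calculation. The sketch below also assumes a typical fibre refractive index of about 1.5, which is why signals over real optical-fibre routes travel roughly a third slower than the vacuum figure.

```python
# A back-of-the-envelope check of the latency floor quoted above: light
# in vacuum covers 1,000 km in about 3.3 ms; light in optical fibre
# (refractive index ~1.5, an assumption here) is roughly 50% slower.
C_VACUUM_KM_PER_MS = 299_792.458 / 1000      # km per millisecond
FIBRE_REFRACTIVE_INDEX = 1.5                 # typical value, assumed

def one_way_latency_ms(distance_km: float, in_fibre: bool = True) -> float:
    speed = C_VACUUM_KM_PER_MS / (FIBRE_REFRACTIVE_INDEX if in_fibre else 1.0)
    return distance_km / speed

print(f"1,000 km in vacuum: {one_way_latency_ms(1000, in_fibre=False):.2f} ms")  # ~3.34 ms
print(f"1,000 km in fibre:  {one_way_latency_ms(1000):.2f} ms")                  # ~5.00 ms
```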
Introduction
Investment decisions have historically been made by employing one of two strategic approaches: one based upon the economic fundamentals of the company issuing the equity, and the other on the behavioral patterns of the buyers and sellers of that equity. Until recently, it was almost impossible to act upon technical anomalies in a timely manner and at a large enough scale to create tangible benefits. The digitization of trading activities has fundamentally altered the landscape of financial markets. Research conducted by the Aite Group revealed that in 2006, a third of all European Union and United States stock trades were driven by algorithms. In 2009, the Tabb Group estimated that high-frequency trading firms accounted for as much as 73% of all US equity trading volume.
Moreover, on May 6, 2010, U.S. stock market indices, stock-index futures, options, and exchange-traded funds experienced a sudden price drop of more than five percent, followed by a rapid rebound. This brief period of extreme intraday volatility, commonly referred to as the “Flash Crash”, raises a number of questions about the structure and stability of U.S. financial markets. A survey conducted by Market Strategies International in June 2010 revealed that over 80 percent of U.S. retail advisors believe that “overreliance on computer systems and high-frequency trading” were the primary contributors to the volatility observed on May 6. It is evident that with such widespread adoption and use, these algorithms impact company strategies, investment practices and the very markets in which they are deployed. The study of the various forms of algorithmic trading and their impact on the market forms the basis for this paper.
What is algorithmic trading?
Algorithmic trading (AT) is any trading activity carried out with the help of algorithms. It can be formally defined as “placing a buy or sell order of a defined quantity into a quantitative model that automatically generates the timing of orders and the size of orders based on goals specified by the parameters and constraints of the algorithm”. The rules built into the model attempt to determine the optimal time for an order to be placed that will cause the least amount of impact on the price of the financial instrument. Algorithmic trading is a way to codify a trader’s execution strategy; it also cuts down transaction costs and allows fund managers to take control of their own trading processes. Algorithmic trading is widely used by pension funds, mutual funds, and other buy-side (investor-driven) institutional traders to divide large trades into several smaller trades in order to manage market impact and risk. Buy-side firms are gravitating toward such rules-based systems. For example, instead of placing a 100,000-share order, an algorithmic trading strategy may push 1,000 shares out every 30 seconds and incrementally feed small amounts into the market over the course of several hours or the entire day, as the sketch below illustrates.
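A minimal sketch of that slicing behaviour (not any vendor’s actual algorithm) might look as follows; `send_order` is a hypothetical stand-in for a real broker or exchange API.

```python
# A minimal sketch of the slicing idea in the example above: a hypothetical
# scheduler that feeds a 100,000-share parent order into the market as
# 1,000-share child orders every 30 seconds.
import time

def send_order(symbol: str, side: str, quantity: int) -> None:
    # Stand-in for a real order-entry API call.
    print(f"child order: {side} {quantity} {symbol}")

def slice_order(symbol: str, side: str, total_qty: int,
                child_qty: int = 1_000, interval_s: float = 30.0) -> None:
    remaining = total_qty
    while remaining > 0:
        qty = min(child_qty, remaining)
        send_order(symbol, side, qty)
        remaining -= qty
        if remaining > 0:
            time.sleep(interval_s)   # wait before releasing the next slice

# 100 slices of 1,000 shares, released over roughly 50 minutes
slice_order("INFY", "BUY", 100_000)
```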
By breaking their large orders into smaller chunks, buy-side institutions are able to disguise their orders and participate in a stock’s trading volume across an entire day or for a few hours. The time frame depends on the trader’s objective, how aggressive they want to be, and constraints such as size, price, order type, and the liquidity and volatility of the stock and its industry group. More sophisticated algorithms allow buy-side firms to fine-tune the trading parameters in terms of start time, end time, and aggressiveness. Algorithmic trading is appealing to buy-side firms because they can measure their trading results against industry-standard benchmarks such as the volume-weighted average price (VWAP) or the S&P 500 and Russell 3000 indices. Sell-side traders, such as market makers and some hedge funds, provide liquidity to the market, generating and executing orders automatically.
In brokers’ never-ending quest to please their customers, being the first to innovate can give a firm a significant advantage over the competition, both in capturing the order flow of early adopters and in building a reputation as a thought leader. It is possible to create an algorithm and enjoy a significant time window ahead of the competition if that algorithm addresses a truly unique execution strategy. A special class of algorithmic trading is “high-frequency trading” (HFT), in which computers make elaborate decisions to initiate orders based on information that is received electronically, before human traders are capable of processing the information they observe. This has resulted in a dramatic change in market microstructure, particularly in the way liquidity is provided. Algorithmic trading may be used in any investment strategy, including market making, intermarket spreading, arbitrage, or pure speculation (including trend following).
The investment decision and implementation may be augmented at any stage with algorithmic support or may operate completely automatically.
Evolution of Algorithmic trading
Nearly 40 years ago, when exchanges first contemplated switching to fully automated trading platforms, Fischer Black surmised that regardless of market structure, liquid markets exhibit price continuity only if trading is characterized by a large volume of small individual trades. Large order executions would always exert an impact on price, irrespective of the method of execution or technological advances in market structure. At that time, stock market “specialists” were officially designated market makers, obligated to maintain the order book and provide liquidity. In the trading pits of the futures markets, many floor traders were unofficial, but easily identifiable market makers. Both the stock market specialists and futures market floor traders enjoyed a proximity advantage compared to traders who participated away from the trading floor. This advantage allowed specialists and floor traders to react more quickly to incoming order flow compared to other traders.
As markets became electronic, a rigid distinction between market makers and other traders became obsolete. Securities exchanges increasingly adopted a limit order market design, in which traders submit orders directly into the exchange’s electronic systems, bypassing both designated and unofficial market makers. In today’s electronic markets, high-frequency traders enjoy a latency advantage which allows them to react to changes in order flow more quickly than other traders. This came about because of advances in technology and regulatory changes, and also because large institutional investors have been allowed to place their servers in the same buildings as the stock exchanges, letting them glimpse transactions a few milliseconds before other market players do.
Types of algorithmic trades
The algorithms employed in automated trading can be broadly categorized into two “families” viz. Execution Algorithms and Alpha-Generating Algorithms. Execution Algorithms: These programs execute stock market trades in such a manner that the prices aren’t influenced by momentary swings in the market. Two of the common execution algorithms are the Volume Weighted Average Price (VWAP) and Time Weighted Average Price (TWAP).
The VWAP is calculated by weighting a stock’s price quotes through the trading session by the volumes traded at each price. The algorithm’s objective is to execute the order at a price that is as close as possible to this weighted average. The reason this algorithm is useful can be seen from the following example. Say that Stock A traded in a range of Rs. 1,090 to Rs. 1,110 on a particular day and its VWAP was Rs. 1,100. An investor selling the stock at the VWAP instead of at the day’s low would have gained as much as Rs. 10 per share. Institutional investors that measure their returns based on end-of-day prices may also use the VWAP for the last 30 minutes to buy or sell their holdings in order to reduce deviations from the benchmark.
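The calculation itself is straightforward. The sketch below computes a VWAP from made-up price/volume pairs chosen to be consistent with the Rs. 1,090–1,110 example above.

```python
# A minimal sketch of the VWAP calculation described above: weight each
# traded price by the volume done at that price. The trade data are
# illustrative numbers, not real quotes.
trades = [  # (price in Rs., shares traded)
    (1_090.0, 4_000),
    (1_100.0, 10_000),
    (1_110.0, 6_000),
]

def vwap(trades):
    total_value = sum(price * qty for price, qty in trades)
    total_volume = sum(qty for _, qty in trades)
    return total_value / total_volume

print(f"VWAP: Rs. {vwap(trades):,.2f}")  # Rs. 1,101.00 for this data
```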
The TWAP strategy simply breaks up a large order into equal parts and then dribbles buy or sell orders into the market evenly over the trading session. This basic strategy is called “iceberging”. It ensures that the price at which the investor buys or sells is not distorted by momentary blips in the market. Using a TWAP is much like using a mutual fund systematic investment plan (SIP), only compressed into minutes rather than months. Alpha-Generating/Seeking Algorithms: Unlike execution algorithms, alpha-seeking algorithms actively try to make money. They track historical relationships between securities, assets or markets and then exploit minor deviations for quick gains. A few of the common alpha-seeking strategies are given below. Arbitrage algorithms: As used by academics, an arbitrage is a transaction that involves no negative cash flow at any probabilistic or temporal state and a positive cash flow in at least one state; in simple terms, it is the possibility of a risk-free profit at zero cost. Much like traditional arbitrageurs, arbitrage algorithms earn a spread from trading on anomalies between securities, trading venues or asset classes. For example, a simple arbitrage algorithm may earn a ‘spread’ by buying a stock at Rs. 100 on the BSE and simultaneously selling it at Rs. 100.50 on the NSE (a sketch of this cross-venue check follows the list of conditions below). Arbitrage profits can also be earned by exploiting differentials between futures and cash markets. Arbitrage is possible when one of three conditions is met: 1. The same asset does not trade at the same price on all markets (the “law of one price”).
2. Two assets with identical cash flows do not trade at the same price. 3. An asset with a known price in the future does not today trade at its future price discounted at the risk-free interest rate (or, the asset does not have negligible costs of storage; as such, for example, this condition holds for grain but not for securities).
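A sketch of the cross-venue case from the BSE/NSE example above (with illustrative quotes and an assumed per-share cost) reduces to comparing one venue’s bid with the other venue’s ask:

```python
# A minimal sketch of cross-venue arbitrage: buy at one exchange's ask,
# sell at the other's bid, only if the spread clears an assumed cost
# threshold. Quotes and costs here are illustrative.
def cross_venue_arbitrage(ask_venue_a: float, bid_venue_b: float,
                          costs_per_share: float) -> float | None:
    """Return expected profit per share, or None if no opportunity."""
    edge = bid_venue_b - ask_venue_a - costs_per_share
    return edge if edge > 0 else None

edge = cross_venue_arbitrage(ask_venue_a=100.00,   # buy on BSE at Rs. 100
                             bid_venue_b=100.50,   # sell on NSE at Rs. 100.50
                             costs_per_share=0.10)
if edge is not None:
    print(f"simultaneous buy/sell locks in Rs. {edge:.2f} per share")
```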
Arbitrage is not simply the act of buying a product in one market and selling it in another for a higher price at some later time. The transactions must occur simultaneously to avoid exposure to market risk, or the risk that prices may change on one market before both transactions are complete.
In practical terms, this is generally only possible with securities and financial products which can be traded electronically, and even then, when each leg of the trade is executed the prices in the market may have moved. Missing one of the legs of the trade (and subsequently having to trade it soon after at a worse price) is called ‘execution risk’ or more specifically ‘leg risk’. In the simplest example, any good sold in one market should sell for the same price in another. Traders may, for example, find that the price of wheat is lower in agricultural regions than in cities, purchase the good, and transport it to another region to sell at a higher price. This type of price arbitrage is the most common, but this simple example ignores the cost of transport, storage, risk, and other factors.
“True” arbitrage requires that there be no market risk involved. Where securities are traded on more than one exchange, arbitrage occurs by simultaneously buying on one and selling on the other. A variation is ‘event’ arbitrage, which exploits money-making opportunities arising from mergers, buyouts or restructurings. An algorithmic trader, alerted instantly to an acquisition, could buy the target’s stock and short the acquirer’s. Trend-following algorithms: These algorithms use techniques common among technical analysts to identify the emergence or reversal of a trend, and then piggyback on it at an early stage to benefit from the momentum. They may track technical indicators such as the 50- or 200-day moving averages or the relative strength index to bet on stocks on the verge of breaking out or breaking down. Traders who subscribe to a trend-following strategy do not aim to forecast or predict specific price levels; they initiate a trade when a trend appears to have started, and exit the trade once the trend appears to have ended.
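A minimal sketch of such a signal, using a moving-average crossover on a toy price series (the window lengths here are shortened for illustration, not the 50- and 200-day windows named above):

```python
# A minimal sketch of a moving-average trend signal: go long when the
# short average crosses above the long average, stay flat otherwise.
def moving_average(prices, window):
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def crossover_signal(prices, short_w=5, long_w=20):
    short_ma = moving_average(prices, short_w)[-1]
    long_ma = moving_average(prices, long_w)[-1]
    if short_ma > long_ma:
        return "LONG"    # a trend appears to have started
    return "FLAT"        # no trend, or the trend appears to have ended

prices = [100 + 0.5 * i for i in range(40)]   # steadily rising toy series
print(crossover_signal(prices))               # LONG
```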
Scalping: Scalping is a method of arbitraging small price gaps created by the bid-ask spread. Scalpers attempt to act like traditional market makers or specialists. To make the spread means to buy at the bid price and sell at the ask price, gaining the bid/ask difference. This procedure allows for profit even when the bid and ask do not move at all, as long as there are traders who are willing to take market prices. It normally involves establishing and liquidating a position quickly, usually within minutes or even seconds. The role of a scalper is effectively that of market makers or specialists, who maintain the liquidity and order flow of a product in a market; a market maker is basically a specialized scalper. The volume a market maker trades is many times that of the average individual scalper, and a market maker has a sophisticated trading system to monitor trading activity.
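The arithmetic of spread capture is simple; the sketch below uses hypothetical quotes and an assumed daily fill count.

```python
# A minimal sketch of the spread-capture arithmetic described above: a
# scalper who buys at the bid and sells at the ask earns the spread on
# each round trip, as long as counterparties keep hitting the quotes.
bid, ask = 25.01, 25.03        # hypothetical quotes; spread = $0.02
round_trips_per_day = 500      # assumed fill rate
shares_per_trip = 1_000

daily_pnl = (ask - bid) * shares_per_trip * round_trips_per_day
print(f"gross P&L: ${daily_pnl:,.2f}")   # $10,000.00 before fees and adverse moves
```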
A market maker, however, is bound by strict exchange rules, while the individual trader is not. For instance, NASDAQ requires each market maker to post at least one bid and one ask at some price level, so as to maintain a two-sided market for each stock represented. Double hedging: Double hedging describes a strategy that monitors pairs of assets whose prices are statistically correlated. Changes in the market prices of the given pair of assets are tracked, and once a change occurs the algorithm computes how many standard deviations apart the two assets have moved, and then executes the hedging procedure. For example, “Shell” and “Exxon” shares are both in the energy sector and are therefore correlated. Following this pair of stocks allows the algorithm to determine whether future profits can be made, and the opportunity for profit grows as the correlation between the two assets temporarily weakens. Thus, “double hedging” occurs as the correlation value goes down, presenting the opportunity to simultaneously buy one share and sell the other, thereby locking in a profit.
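A minimal sketch of this pair-monitoring logic, with a hypothetical spread history and an assumed two-standard-deviation entry threshold:

```python
# A minimal sketch of the pairs logic described above: track the spread
# between two historically correlated stocks and trade when it diverges
# by more than an assumed number of standard deviations.
import statistics

def pairs_signal(spread_history, current_spread, entry_z=2.0):
    mean = statistics.mean(spread_history)
    stdev = statistics.stdev(spread_history)
    z = (current_spread - mean) / stdev
    if z > entry_z:
        return "SELL A / BUY B"   # A rich relative to B
    if z < -entry_z:
        return "BUY A / SELL B"   # A cheap relative to B
    return "NO TRADE"

# Hypothetical price-difference history between two correlated energy stocks
history = [2.0, 2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 2.1]
print(pairs_signal(history, current_spread=2.9))   # SELL A / BUY B
```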
High Frequency Trading
The very essence of algorithmic trading lies in the fact that the machines running these algorithms are capable of processing a huge volume of information in a relatively short period of time. This fact is exploited in high-frequency trading (HFT), a special class of algorithmic trading in which computers make elaborate decisions to initiate orders based on information that is received electronically, before human traders are capable of processing the information they observe. This has resulted in a dramatic change in market microstructure, particularly in the way liquidity is provided. HFT is typically characterized by several distinguishing features:
1. It is highly quantitative, employing computerized algorithms to analyze incoming market data and implement proprietary trading strategies.
2. An investment position is held only for very brief periods of time, from seconds to hours, and the strategy rapidly trades into and out of positions, sometimes thousands or tens of thousands of times a day.
3. At the end of a trading day there is no net investment position.
4. It is mostly employed by proprietary firms or on proprietary trading desks in larger, diversified firms.
5. It is very sensitive to the processing speed of markets and of the traders’ own access to the market.
6. Many high-frequency traders provide liquidity and price discovery to the markets through market-making and arbitrage trading; high-frequency traders also take liquidity to manage risk or lock in profits.
High-frequency traders compete on the basis of speed with other high-frequency traders, not with long-term investors (who typically look for opportunities over a period of weeks, months, or years), and compete for very small, consistent profits. As a result, high-frequency trading has been shown to have a potential Sharpe ratio (a measure of reward per unit of risk) thousands of times higher than that of traditional buy-and-hold strategies. Aiming to capture just a fraction of a penny per share or currency unit on every trade, high-frequency traders move in and out of such short-term positions several times each day. Fractions of a penny accumulate fast to produce significantly positive results at the end of every day. High-frequency trading firms do not employ significant leverage, do not accumulate positions, and typically liquidate their entire portfolios on a daily basis. Recently, HFT has become more prominent and controversial.
These algorithms or techniques are commonly given names such as “Stealth” (developed by Deutsche Bank), “Iceberg”, “Dagger”, “Guerrilla”, “Sniper”, “BASOR” (developed by Quod Financial) and “Sniffer”, yet they are quite simple mathematical constructs at their core. Dark pools are alternative electronic stock exchanges where trading takes place anonymously, with most orders hidden or “iceberged.” Gamers or “sharks” sniff out large orders by “pinging” small market orders to buy and sell. When several small orders are filled the sharks may have discovered the presence of a large iceberged order. “Now it’s an arms race,” said Andrew Lo, director of the Massachusetts Institute of Technology’s Laboratory for Financial Engineering. “Everyone is building more sophisticated algorithms, and the more competition exists, the smaller the profits.”
Analyzing the Impact
One of the major criticisms leveled at this form of trading is that, from the point of view of how capital is allocated, these strategies are inherently self-referential. That is, they represent the financial markets looking inward at their own dynamics and investing capital to create more capital without ever attending to the externalities of this behavior. As long as the algorithmic trading account creates sufficient risk-adjusted returns, a financial manager has nothing to worry about, and there seems to be no reason for anyone to embrace the ambiguities of fundamental investing if one can generate returns in this more direct way.
The result of algorithmic trading’s success, predictably, has been that more and more capital has been attracted to this investment focus. The resulting increase in crowdedness has meant that the anomalies now available for exploitation are both much more short-lived (hence driving the aforementioned “arms race” toward microsecond execution windows) and much more granular, meaning that more capital has to be applied during these windows to create material results, in turn driving the need for higher and higher leverage ratios. Market Volatility: The consequence of these developments has been an escalation in market volatility, an increase in unhedgeable catastrophic risk, and thus a decrease in risk-adjusted returns (if one truly accounted for catastrophic risk), all resulting in an unsustainable overweight in the overall market’s portfolio away from fundamental investing into short-term trading. Worst of all, this volatility has made it much harder for fundamental investors to enter the market. An article in The New York Times gave an example of how HFT affects share prices: on a $1.4 million order, slow-moving investors had to pay about $7,800 extra because of high-frequency traders, roughly half a percent more on the trade than they would otherwise have paid.
The Stock Market Crash of 1987: It has been said that the October 1987 stock market crash was caused in part by something called dynamic portfolio insurance, another approach based on algorithms. Dynamic portfolio insurance is a way of protecting your portfolio of shares so that if the market falls you can limit your losses to an amount you stipulate in advance. As the market falls, you sell some shares. By the time the market falls by a certain amount, you will have closed all your positions so that you can lose no more money. A proper implementation of the idea required some knowledge of option theory as developed by the economists Fischer Black of Goldman Sachs, Myron S. Scholes of Stanford and Robert C. Merton of Harvard. You type into some formula the current stock price, and this tells you how many shares to hold. The market falls and you type the new price into the formula, which tells you how many to sell. By 1987, however, the problem was the sheer number of people following the strategy and the market share that they collectively controlled.
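A minimal sketch of such a formula, under the Black–Scholes assumptions the text cites: replicating a protective put amounts to holding roughly an N(d1) fraction of the portfolio in stock, so the rule mechanically sells shares as the price falls. The volatility, rate and floor used below are illustrative, not taken from the source.

```python
# A minimal sketch of the "formula" described above, in the Black-Scholes
# framework: a synthetic protective put holds a fraction N(d1) of the
# portfolio in stock, so the model sells shares as the price falls.
# Parameters (volatility, rate, floor) are illustrative assumptions.
import math

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def stock_fraction(price, floor, rate, vol, years):
    """Fraction of the portfolio to hold in stock under synthetic-put replication."""
    d1 = (math.log(price / floor) + (rate + 0.5 * vol**2) * years) / (vol * math.sqrt(years))
    return norm_cdf(d1)

for price in (100, 95, 90, 85):   # as the market falls...
    frac = stock_fraction(price, floor=90, rate=0.05, vol=0.25, years=1.0)
    print(f"price {price}: hold {frac:.0%} in stock")  # ...the rule sells more stock
```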
If a fall in the market leads to people selling according to some formula, and if there are enough of these people following the same algorithm, then it will lead to a further fall in the market, and a further wave of selling, and so on, until the S&P 500 index loses over 20 percent of its value in a single day: Oct. 19, 1987, Black Monday. Dynamic portfolio insurance caused the very thing it was designed to protect against. This is the sort of feedback that occurs between a popular strategy and the underlying market, with a long-lasting effect on the broader economy. A rise in price begets a rise and a fall begets a fall, and as a result volatility rises and the market is destabilized. All that’s needed is for a large number of people to be following the same type of strategy. The “Flash Crash” of 2010: As stated at the beginning of this paper, most U.S. retail advisors believe that HFT had a major part to play in the 2010 “Flash Crash”. A paper by Kirilenko, Kyle, Samadi and Tuzun uses audit-trail data to describe the events that took place on that day and how the HFT responses to the unusually large selling pressure exacerbated market volatility.
According to the paper, the events on May 6 unfolded as follows: Financial markets, already tense over concerns about the European sovereign debt crisis, opened to news concerning the Greek government’s ability to service its sovereign debt. As a result, premiums rose for buying protection against default on sovereign debt securities of Greece and a number of other European countries. In addition, the S&P 500 volatility index (“VIX”) increased, and yields of ten-year Treasuries fell as investors engaged in a “flight to quality.” By mid-afternoon, the Dow Jones Industrial Average was down about 2.5%. Sometime after 2:30 p.m., Fundamental Sellers began executing a large sell program. Typically, such a large sell program would not be executed at once, but rather spread out over time, perhaps over hours. The magnitude of the Fundamental Sellers’ trading program began to significantly outweigh the ability of Fundamental Buyers to absorb the selling pressure. HFTs and Intermediaries were the likely buyers of the initial batch of sell orders from Fundamental Sellers, thus accumulating temporary long positions.
Thus, during the early moments of this sell program’s execution, HFTs and Intermediaries provided liquidity to this sell order. However, just like market intermediaries in the days of floor trading, HFTs and Intermediaries had no desire to hold their positions over a long time horizon. A few minutes after they bought the first batch of contracts sold by Fundamental Sellers, HFTs aggressively sold contracts to reduce their inventories. As they sold contracts, HFTs were no longer providers of liquidity to the selling program. In fact, HFTs competed for liquidity with the selling program, further amplifying the price impact of this program. Furthermore, total trading volume and trading volume of HFTs increased significantly minutes before and during the Flash Crash. Finally, as the price of the E-mini rapidly fell and many traders were unwilling or unable to submit orders, HFTs repeatedly bought and sold from one another, generating a “hot-potato” effect.
Yet Fundamental Buyers, who might have realized significant profits from this large decrease in price, did not seem to be willing or able to provide ample buy-side liquidity. As a result, between 2:45:13 and 2:45:27, prices of the E-mini fell about 1.7%. At 2:45:28, a 5-second trading pause was automatically activated in the E-mini. Opportunistic and Fundamental Buyers then aggressively executed trades, which led to a rapid recovery in prices. HFTs continued their strategy of rapidly buying and selling contracts, while about half of the Intermediaries closed their positions and got out of the market. In light of these events, a few fundamental questions arise. Why did it take so long for Fundamental Buyers to enter the market, and why did the price concessions have to be so large? It seems possible that some Fundamental Buyers could not distinguish between macroeconomic fundamentals and market-specific liquidity events. It also seems possible that opportunistic buyers had already accumulated a significant positive inventory earlier in the day as prices were steadily declining. Furthermore, it is possible that they could not quickly find opportunities to hedge additional positive inventory in other markets, which also experienced significant volatility and higher latencies.
The Knight Capital Case: In a recent case eerily similar to the Flash Crash of 2010, a malfunction in Knight Capital’s trading system flooded the market with erroneous trades on August 1, 2012. The troubles resulted in some $440 million in trading losses — now estimated at roughly $275 million after taxes — which left the firm scrambling for a financial rescue. While that bailout came, there was a steep cost beyond just the trading losses, namely control of the firm. Knight’s new investors control nearly three-quarters of the company, and they got it for a song; existing shareholders have seen their stakes significantly diluted, and the company faces the daunting task of retaining clients. Knight is a trading firm that takes orders from big brokers like E-Trade and TD Ameritrade, routing them to the exchanges where the stocks are traded. One key role Knight Capital has played is that of a designated market maker, where it is responsible for maintaining orderly trades in the stocks it oversees.
Designated market makers are particularly important whenever there is a lot of market volatility; Knight is responsible for the trading of 524 NYSE-listed stocks, a sizable chunk of the roughly 2,300 total corporate issuers. Until its blunder, Knight was a respected top-level player, a distinction that makes its slip more alarming, because the worry is that if this could happen at Knight, it can happen anywhere. Increase in Liquidity: Although we’ve analyzed a lot of cases which reflect negatively on the idea of Algorithmic trading, it’s important not to neglect some of the benefits that algorithmic trading brings to the table. According to a research paper on the topic by Hendershott, Jones, and Menkveld, algorithmic trading “improves liquidity and enhances the informativeness of quotes.”
The authors of the paper emphasize that automated trading now runs through the markets at every level and that there are many different algorithms, used by many different types of market participants. Some hedge funds and broker-dealers supply liquidity using algorithms, competing with designated market-makers and other liquidity suppliers. For assets that trade on multiple venues, liquidity demanders often use smart order routers to determine where to send an order. Statistical arbitrage funds use computers to quickly process large amounts of information contained in the order flow and price moves in various securities, trading at high frequency based on patterns in the data. Last but not least, algorithms are used by institutional investors to trade large quantities of stock gradually over time. One very important observation made in the paper is that it is not at all obvious that algorithmic trading should improve market liquidity. If algorithms are cheaper and/or better at supplying liquidity, then AT may result in more competition in liquidity provision, thereby lowering the cost of immediacy. However, the effects could go the other way if algorithms are used mainly to demand liquidity.
Limit order submitters grant a trading option to others, and if algorithms make liquidity demanders better able to identify and pick off an in-the-money trading option, then the cost of providing the trading option increases, and spreads must widen to compensate. In fact, AT could actually lead to an unproductive arms race, where liquidity suppliers and liquidity demanders both invest in better algorithms to try to take advantage of the other side, with measured liquidity the unintended victim. This is the kind of thing most participants in algorithmic trading do not emphasize when raving about the obvious benefits it brings to markets. One of the highlights of the paper is the effort to track the rise of algorithmic trading (over roughly a five-year period, 2001-2006) and to compare this to changes in liquidity.
This isn’t quite as easy as it might seem, because algorithmic trading is just trading and is not obviously distinct in market records from other trading. One cannot directly observe whether a particular order is generated by a computer algorithm. For cost and speed reasons, most algorithms do not rely on human intermediaries but instead generate orders that are sent electronically to a trading venue. Therefore, one can use the rate of electronic message traffic as a proxy for the amount of algorithmic trading taking place. The paper plots this data for stocks with differing market capitalization (sorted into quintiles, Q1 being the largest fifth), and the amount of electronic traffic in the trading system clearly increased by a factor of at least five over a period of five years.
The paper then compares this to data on the effective bid-ask spread for the same set of stocks, again organized by quintile, over the same period. The resulting figure shows a more or less steady decrease in the spread, a measure of improving liquidity.
So, there is a clear correlation. The next question, of course, is whether this correlation reflects a causal process or not. What perhaps sets this study apart from others (see, for example, any number of reports by the Tabb Group, which monitors high-frequency markets) is an effort to get at this causal link. The authors do this by studying a particular historical event that increased the amount of algorithmic trading in some stocks but not others. The results suggest that there is a causal link. The conclusion, then, is that algorithmic trading (at least in the time period studied, in which stocks were generally rising) does improve market efficiency in the sense of higher liquidity and better price discovery. But there is a further caveat.
While the authors have accounted for share price levels and volatility in their study, it remains an open question whether algorithmic trading and algorithmic liquidity supply are equally beneficial in more turbulent or declining markets. Like Nasdaq market makers refusing to answer their phones during the 1987 stock market crash, algorithmic liquidity suppliers may simply turn off their machines when markets spike downward. This resonates with a general theme across all finance and economics. When markets are behaving “normally”, they seem to be more or less efficient and stable. When they go haywire, all the standard theories and accepted truths go out the window. Unfortunately, “haywire” isn’t as unusual as many theorists would like it to be.
Expert View
We were honored to have an opportunity to interview Mr. Sanket Kapse, an analyst at a large hedge fund. We gained the following insights from the interview.
Please tell us a bit about your company, profile and designation.
I work at DE Shaw and Company as a Finance and Operations Generalist. My profile is that of a Back Office Analyst.
Could you elaborate on the nature of your work at DE Shaw?
I take care of the back office functions for the Algorithmic Trading (AT) portfolio, which include PNL generation, corporate actions processing, performing various reconciliations and end-of-day management reporting (MIS).
Are you directly involved in the Algorithmic Trading activities of your company?
There is no direct role of AT in my work, but my firm uses it extensively for trading in various geographies, including the Indian markets.
Can you please throw some light on the subject?
AT, put simply, is using the power of computers to implement certain mathematical models to discover anomalies (arbitrage opportunities) in the prices of financial instruments and make a quick return. A simple example would be trading bonds.
Bonds can be valued using parameters like current market interest rates, the coupon rate of the bond and the risk profile of the bond issuer. Computers can be fed a mathematical model and made to value hundreds of bonds each day to spot opportunities where a bond is trading either cheap or expensive. The algorithms can also trade in and out of the bond to make a quick profit. The same can be done for trading stocks, commodities, currencies, interest rates, etc.
In your opinion, is it ethical to employ AT?
AT is just a way of trading that is faster and smarter than how human traders can trade. It also needs a lot of investment in technology and requires hiring math and science experts, which can only be done by players who have deep pockets. It thus puts less privileged traders at a disadvantage. AT is also used by large market players like hedge funds and investment banks for better risk management, as they trade in and out of a position very quickly and hence take on very little market-related risk.
For the above two reasons many have termed AT unethical. However, on the positive side, it adds a lot of depth and liquidity to the market.
1. Since AT is done at high frequency, it creates better liquidity in the markets. This ensures better price discovery and prevents the huge volatility in prices that can be caused by scanty liquidity.
2. The high volumes generated by AT keep bid-ask spreads very low. For example, if there is only one foreign exchange shop in your city, a tourist would have to buy dollars at, say, Rs. 50 per dollar for his trip, but on returning, any residual dollars would not fetch more than Rs. 45. The spread here is Rs. 5, which is a loss to the tourist; this would not have happened if there were 100 foreign exchange shops.
3. AT is a battle between sophisticated players that operate in an ultra-short-term time frame. It does not negatively impact long-term investors, as these players do not compete with them for long-term returns.
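The bond-valuation workflow the interviewee describes can be sketched as a simple present-value model; the face value, coupon, market rate and quoted price below are illustrative assumptions, not figures from the interview.

```python
# A minimal sketch of the bond valuation the interviewee describes:
# discount the coupons and principal at the current market rate, then
# compare the model value with the quoted price to flag cheap/rich bonds.
def bond_value(face, coupon_rate, market_rate, years):
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + market_rate) ** t for t in range(1, years + 1))
    pv_face = face / (1 + market_rate) ** years
    return pv_coupons + pv_face

model = bond_value(face=1_000, coupon_rate=0.08, market_rate=0.06, years=5)
quoted = 1_050.0                      # hypothetical market price
print(f"model {model:.2f} vs quoted {quoted:.2f}:",
      "cheap" if quoted < model else "rich")
```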
Recent developments
Financial market news is now being formatted by firms such as Need To Know News, Thomson Reuters, Dow Jones, and Bloomberg, to be read and traded on via algorithms. “Computers are now being used to generate news stories about company earnings results or economic statistics as they are released. And this almost instantaneous information forms a direct feed into other computers which trade on the news.” The algorithms do not simply trade on simple news stories but also interpret more difficult to understand news. Some firms are also attempting to automatically assign sentiment (deciding if the news is good or bad) to news stories so that automated trading can work directly on the news story. “Increasingly, people are looking at all forms of news and building their own indicators around it in a semi-structured way,” as they constantly seek out new trading advantages said Rob Passarella, global director of strategy at Dow Jones Enterprise Media Group. His firm provides both a low latency news feed and news analytics for traders.
Passarella also pointed to new academic research being conducted on the degree to which frequent Google searches on various stocks can serve as trading indicators, the potential impact of various phrases and words that may appear in Securities and Exchange Commission statements and the latest wave of online communities devoted to stock trading topics. “Markets are by their very nature conversations, having grown out of coffee houses and taverns”, he said. So the way conversations get created in a digital society will be used to convert news into trades, as well, Passarella said. “There is a real interest in moving the process of interpreting news from the humans to the machines” says Kirsti Suutari, global business manager of algorithmic trading at Reuters.
“More of our customers are finding ways to use news content to make money.” An example of the importance of news-reporting speed to algorithmic traders was an advertising campaign by Dow Jones (appearances included page W15 of The Wall Street Journal on March 1, 2008) claiming that its service had beaten other news services by two seconds in reporting an interest rate cut by the Bank of England. In July 2007, Citigroup, which had already developed its own trading algorithms, paid $680 million for Automated Trading Desk, a 19-year-old firm that trades about 200 million shares a day. Citigroup had previously bought Lava Trading and OnTrade Inc. In late 2010, the UK Government Office for Science initiated a Foresight project investigating the future of computer trading in the financial markets, led by Dame Clara Furse, ex-CEO of the London Stock Exchange; in September 2011 the project published its initial findings in the form of a three-chapter working paper available in three languages, along with 16 additional papers providing supporting evidence. All of these findings are authored or co-authored by leading academics and practitioners, and were subjected to anonymous peer review. The Foresight project is set to conclude in late 2012. In September 2011, RYBN launched “ADM8”, an open-source trading bot prototype already active on the financial markets.
References
1. Terrence Hendershott, Charles M. Jones, and Albert J. Menkveld, “Does Algorithmic Trading Improve Liquidity?,” The Journal of Finance, Vol. LXVI, No. 1, 2011.
2. Andrei Kirilenko, Albert S. Kyle, Mehrdad Samadi, and Tugkan Tuzun, “The Flash Crash: The Impact of High Frequency Trading on an Electronic Market,” Available at SSRN: http://ssrn.com/abstract=1686004, 2011.
3. Markus Gsell, “Assessing the Impact of Algorithmic Trading on Markets: A Simulation Approach,” Available at SSRN: http://ssrn.com/abstract=1134834.
4. Alain Chaboud, Benjamin Chiquoine, Erik Hjalmarsson, and Clara Vega, “Rise of the Machines: Algorithmic Trading in the Foreign Exchange Market,” Available at SSRN: http://ssrn.com/abstract=1501135.
5. Álvaro Cartea and Sebastian Jaimungal, “Modeling Asset Prices for Algorithmic and High Frequency Trading,” Available at SSRN: http://ssrn.com/abstract=1722202.
6. S. Gjerstad and J. Dickhaut, “Price Formation in Double Auctions,” Games and Economic Behavior, Volume 22, Issue 1, p. 1, January 1998.
7. Michael Mackenzie, “SEC runs eye over high-speed trading,” The Financial Times, July 29, 2010, p. 21.
8. Aline van Duyn, “If you’re reading this, it’s too late: a machine got here first,” The Financial Times, April 16, 2007, p. 1.
9. Charles Duhigg, “Stock Traders Find Speed Pays, in Milliseconds,” The New York Times, July 23, 2009.
10. Paul Wilmott, “Hurrying Into the Next Panic?,” The New York Times, July 28, 2009.
11. Caroline Valetkevitch and Chuck Mikolajczak, “Error by Knight Capital rips through stock market,” Reuters, August 1, 2012.