Market data
In finance, market data refers to quote- and trade-related data associated with equity, fixed-income, financial derivatives, currency, and other investment instruments. The term market data traditionally refers to numerical price data, reported from trading venues such as stock exchanges. The price data is attached to a ticker symbol and to additional data about the trade.
The latest price quotes moving in LED ribbons around the walls of trading floors, or along the bottom of the screen on financial TV shows, are familiar sights. This price data is not only used in real time to make on-the-spot decisions about buying or selling; historical market data is also used to project pricing trends and to calculate market risk on portfolios of investments that may be held by an individual or an institutional investor.
A typical equity market data message or business object furnished by NYSE, TSX, or Nasdaq might look something like this:
Ticker symbol | IBM
Bid | 89.02
Ask | 89.08
Bid size | 300
Ask size | 1000
Last sale | 89.06
Last size | 200
Quote time | 14:32:45
Trade time | 14:32:44
Exchange | NYSE
Volume | 7808
In actuality, the above example is an aggregation of different sources of data, as quote data (bid, ask, bid size, ask size) and trade data (last sale, last size, volume) are often generated over different data feeds.
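To illustrate that aggregation, the following sketch (with hypothetical field names, not any vendor's actual schema) merges a quote-feed message and a trade-feed message for the same symbol into one consolidated record like the table above.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    """Quote-feed message: the current best bid/ask and their sizes."""
    symbol: str
    bid: float
    ask: float
    bid_size: int
    ask_size: int
    quote_time: str

@dataclass
class Trade:
    """Trade-feed message: the last sale reported by the exchange."""
    symbol: str
    last_sale: float
    last_size: int
    trade_time: str
    exchange: str
    volume: int

def consolidate(quote: Quote, trade: Trade) -> dict:
    """Merge the two feeds into one market data record like the table above."""
    assert quote.symbol == trade.symbol
    return {**vars(quote), **vars(trade)}

print(consolidate(
    Quote("IBM", 89.02, 89.08, 300, 1000, "14:32:45"),
    Trade("IBM", 89.06, 200, "14:32:44", "NYSE", 7808),
))
```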
Delivery of price data from exchanges to users, such as traders, is highly time-sensitive, approaching real time. Specialized software systems called ticker plants (increasingly combined with field-programmable gate array processors) are designed to handle the collection and throughput of massive data streams, displaying prices for traders and feeding computerized trading systems fast enough to capture opportunities before markets change. When stored, historical market data is also called time-series data, because it requires a specialized type of database that enables retrieval of a series of prices over time for a single instrument.
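As a minimal sketch of that time-series idea, the example below uses a plain in-memory structure (standing in for a dedicated time-series database) to store timestamped prices per instrument and retrieve the series for one symbol over a time range.

```python
import bisect
from collections import defaultdict

# One time-ordered list of (timestamp, price) ticks per instrument symbol.
ticks = defaultdict(list)

def record(symbol, ts, price):
    """Append a tick; feeds arrive in time order, so appending keeps the list sorted."""
    ticks[symbol].append((ts, price))

def series(symbol, start, end):
    """Retrieve the price series for a single instrument between two timestamps."""
    rows = ticks[symbol]
    lo = bisect.bisect_left(rows, (start,))
    hi = bisect.bisect_right(rows, (end, float("inf")))
    return rows[lo:hi]

record("IBM", "14:32:44", 89.06)
record("IBM", "14:32:45", 89.07)
record("IBM", "14:33:01", 89.05)
print(series("IBM", "14:32:00", "14:32:59"))  # ticks within one minute
```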
While market data generally refers to real-time or delayed price quotations, the term increasingly includes static or reference data, i.e., any type of data related to securities that does not change in real time; in other words, anything other than streaming prices.
Reference data includes identifier codes (e.g. CUSIP), the exchange a security trades on, end-of-day pricing, the name and address of the issuing company, the terms of the security (such as the interest rate and maturity on a bond), and outstanding corporate actions (such as pending stock splits or proxy votes) related to the security. This type of data can be maintained in a relational database. Databases that maintain the reference data for holdings in a portfolio are known as "securities master" files.
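A minimal sketch of such a securities master table, using SQLite and hypothetical column names for the reference data fields described above:

```python
import sqlite3

# In-memory relational database standing in for a securities master file.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE securities_master (
        cusip          TEXT PRIMARY KEY,   -- identifier code
        symbol         TEXT,
        exchange       TEXT,               -- where the security trades
        issuer_name    TEXT,
        issuer_address TEXT,
        coupon_rate    REAL,               -- bond terms (NULL for equities)
        maturity_date  TEXT,
        eod_price      REAL                -- end-of-day price
    )
""")
# Illustrative values only.
conn.execute(
    "INSERT INTO securities_master VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    ("459200101", "IBM", "NYSE", "International Business Machines",
     "Armonk, NY", None, None, 89.06),
)
for row in conn.execute("SELECT symbol, exchange, eod_price FROM securities_master"):
    print(row)
```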
While price data generally originates from the exchanges, reference data generally originates from the issuer. However, before it arrives in the hands of investors or traders, it usually passes through the hands of vendors or "aggregators" that may reformat it, organize it, and attempt to clear obvious anomalies on a real-time basis. Today, the business of market data aggregation and reselling is changing rapidly, due to advances in communications in financial markets as well as the technologies available to integrate multiple streams of inbound financial data.
For consumers of market data, which are primarily the financial institutions and industry utilities serving the capital markets realm, the complexity of managing market data has risen with the increasing numbers of issued securities and the globalization of capital markets. Beyond the rising volume of data, the continuing evolution of complex derivatives and indices, along with new regulations designed to contain risk and protect markets and investors, all create more operational demands on market data management.
In the not-too-distant past, individual data vendors provided data for software applications in financial institutions that were specifically designed for one data feed, thus giving that data vendor a lock on that area of operations. Today, particularly in the larger investment banks and asset management firms, the concept of centralized or enterprise data management is driving investment in large-scale enterprise data management systems that collect, normalize, and integrate feeds from multiple data vendors, with the goal of building one "golden copy" of data supporting every kind of operation throughout the institution. Beyond the operational efficiency gained, this data consistency is increasingly necessary to enable compliance with regulatory requirements, such as Sarbanes-Oxley, Reg NMS and the Basel II accord.
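As a toy sketch of that normalization step (the vendor names and field names below are invented), the example maps records from two vendor feeds onto one common layout and keeps a single "golden copy" per security identifier.

```python
# Map each vendor's field names onto one common ("golden copy") layout.
VENDOR_FIELD_MAPS = {
    "vendor_a": {"id": "cusip", "px_close": "eod_price"},
    "vendor_b": {"cusip_code": "cusip", "listing": "exchange"},
}

golden_copy = {}  # one normalized record per security identifier

def ingest(vendor, record):
    """Normalize a vendor record and merge it into the golden copy."""
    mapping = VENDOR_FIELD_MAPS[vendor]
    normalized = {mapping[k]: v for k, v in record.items() if k in mapping}
    golden_copy.setdefault(normalized["cusip"], {}).update(normalized)

ingest("vendor_a", {"id": "459200101", "px_close": 89.06})
ingest("vendor_b", {"cusip_code": "459200101", "listing": "NYSE"})
print(golden_copy)  # one record combining both vendors' fields
```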
Technology Solutions
The business of providing technology solutions to financial institutions for data management has grown over the past decade, as market data management has evolved from a little-known discipline for specialists into a high-priority issue for the entire capital markets industry and its regulators. Providers range from middleware and messaging vendors to vendors of cleansing and reconciliation software and services, and vendors of highly scalable solutions for managing the massive loads of incoming and stored reference data that must be maintained for daily trading, accounting, settlement, risk management and reporting to investors and regulators.
Market data distribution platforms are designed to transport large amounts of data from financial markets over a network. They are intended to respond to the fast changes in the financial markets, compressing or encoding data using specially designed protocols to increase throughput and/or reduce latency.
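As a minimal sketch of a compact wire encoding (an invented fixed-width layout, not any real exchange protocol), the example below packs one price update into a fixed-size binary record instead of sending it as text.

```python
import struct

# Invented layout: 8-byte symbol, float64 price, uint32 size, uint64 timestamp (ns).
UPDATE = struct.Struct("!8sdIQ")  # network byte order, 28 bytes per update

def encode(symbol, price, size, ts_ns):
    """Pack one price update into a fixed-size binary record."""
    return UPDATE.pack(symbol.encode().ljust(8, b"\0"), price, size, ts_ns)

def decode(buf):
    symbol, price, size, ts_ns = UPDATE.unpack(buf)
    return symbol.rstrip(b"\0").decode(), price, size, ts_ns

msg = encode("IBM", 89.06, 200, 1_700_000_000_000_000_000)
print(len(msg), decode(msg))  # 28 bytes per update, versus a much longer text message
```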
Most market data servers target Solaris or Linux as their main platforms; however, some also have versions for Windows.
Feed Handlers
A typical usage is a "Feed Handler" solution. Applications (Sources) receive data from a specific feed and connect to a server (Authority), which accepts connections from clients (Destinations) and redistributes the data further. When a client (Destination) wants to subscribe to an instrument (to open an instrument), it sends a request to the server (Authority); if the server does not have the information in its cache, it forwards the request to the Source(s). Each time the server (Authority) receives an update for an instrument, it sends it to all clients (Destinations) subscribed to it.
Notes:
1. A client (Destination) can unsubscribe from an individual instrument (close the instrument), after which no further updates for it are sent. When the connection between the Authority and a Destination breaks, all requests made by that client are dropped.
2. A server (Authority) can handle a large number of client connections, though usually a relatively small number of clients connect to the same server at the same time.
3. A client (Destination) usually has a small number of open instruments, though larger numbers are also supported.
4. The server has two levels of access permission:
- Logon permission – whether the client is allowed to connect to the server.
- Information permission – whether the client is allowed to view information about a given instrument (this check is usually made against the contents of the instrument).
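A minimal sketch of the Authority role described above (a simplified, single-process illustration with invented class and method names): it caches instrument data, forwards cache misses to a Source, fans updates out to subscribed Destinations, and applies a logon permission check before accepting a subscription. The information permission check is omitted for brevity.

```python
class Source:
    """Stub upstream feed: returns an initial image for an instrument on request."""
    def request(self, instrument):
        return {"last_sale": None}

class Authority:
    """Simplified feed-handler server: cache, subscriptions, and permission checks."""
    def __init__(self, source, permitted_clients):
        self.source = source                # upstream Source queried on cache misses
        self.permitted = permitted_clients  # logon permission: client ids allowed to connect
        self.cache = {}                     # instrument -> latest known data
        self.subscribers = {}               # instrument -> set of Destination callbacks

    def subscribe(self, client_id, instrument, callback):
        """A Destination opens an instrument."""
        if client_id not in self.permitted:            # logon permission check
            raise PermissionError(f"{client_id} may not connect")
        self.subscribers.setdefault(instrument, set()).add(callback)
        if instrument not in self.cache:               # cache miss: forward to the Source
            self.cache[instrument] = self.source.request(instrument)
        callback(instrument, self.cache[instrument])   # send the initial image

    def unsubscribe(self, instrument, callback):
        """A Destination closes an instrument; it receives no further updates."""
        self.subscribers.get(instrument, set()).discard(callback)

    def on_update(self, instrument, data):
        """Update from a Source: refresh the cache and fan out to subscribers."""
        self.cache[instrument] = data
        for cb in self.subscribers.get(instrument, ()):
            cb(instrument, data)

auth = Authority(Source(), permitted_clients={"desk_1"})
auth.subscribe("desk_1", "IBM", lambda sym, d: print(sym, d))
auth.on_update("IBM", {"last_sale": 89.06})
```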
See also
- Ticker
- Stock exchange
- Financial quote
- Security
- Regulation NMS
- Real Time Markets