Colocation centre

A room in the Telecity colocation centre in Aubervilliers, a suburb of Paris

A colocation centre (also spelled collocation center; informally "colo") or carrier hotel is a type of data centre where multiple customers locate network, server and storage gear and interconnect to a variety of telecommunications and other network service providers with a minimum of cost and complexity.

Increasingly, organizations are recognizing the benefits of colocating their mission-critical equipment within a data centre. Colocation is becoming popular because of the time and cost savings a company can realize by using shared data centre infrastructure. Significant economies of scale in power and mechanical systems favour large colocation facilities, typically 4,500 to 9,500 square metres (roughly 50,000 to 100,000 square feet). With their IT and communications facilities hosted securely, telecommunications, Internet, ASP and content providers, as well as enterprises, benefit from lower latency and the freedom to focus on their core business.

Additionally, customers reduce their traffic back-haul costs and free up their internal networks for other uses. Moreover, by outsourcing network traffic to a colocation service provider with greater bandwidth capacity, website access speeds should improve considerably.

Major types of colocation customers are:

  • Web commerce companies, who use the facilities for a safe environment and cost-effective, redundant connections to the Internet
  • Major enterprises, who use the facility for disaster avoidance, offsite data backup and business continuity
  • Telecommunication companies, who use the facilities to exchange traffic with other telecommunications companies and to gain access to potential clients

Most network access point facilities provide colocation.

Building

A typical server rack, commonly seen in colocation.

Colocation facility buildings and their infrastructure typically include:
  • Fire protection systems, including passive and active design elements, as well as implementation of fire prevention programs in operations. Smoke detectors are usually installed to provide early warning of a developing fire by detecting particles generated by smoldering components prior to the development of flame. This allows investigation, interruption of power, and manual fire suppression using hand-held fire extinguishers before the fire grows to a large size. A fire sprinkler system is often provided to control a full-scale fire if it develops. Gaseous clean-agent fire suppression systems are sometimes installed to suppress a fire earlier than the fire sprinkler system would. Passive fire protection elements include the installation of fire walls around the space, so a fire can be restricted to a portion of the facility for a limited time if the active fire protection systems fail or are not installed.
  • 19-inch racks for data equipment and servers, 23-inch racks for telecom equipment.
  • Cabinets and cages for physical access control over tenants' equipment.
  • Overhead cable rack (tray) and fiber guide, with power cables usually run on a separate rack from data cables.
  • Air conditioning is used to control the temperature and humidity in the space. ASHRAE recommends a temperature range of 20–25 °C and a humidity range of 40–60% as optimal for electronic equipment.[citation needed] The electrical power used by the electronic equipment is converted to heat, which is rejected to the ambient air in the data centre space. Unless this heat is removed, the ambient temperature rises, eventually causing electronic equipment to malfunction. By controlling the space air temperature, the server components at the board level are kept within the manufacturer's specified temperature/humidity range. Air conditioning systems help control humidity by cooling the return air below the dew point; if humidity is too high, water may begin to condense on internal components. In a dry atmosphere, ancillary humidification systems may add water vapor, since excessively low humidity can cause static electricity discharge problems that damage components. (A simple monitoring check against these ranges is sketched after this list.)
  • Low-impedance electrical ground.
  • Few, if any, windows.
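
As a rough illustration of the kind of monitoring check implied by these ranges, the Python sketch below tests one sensor reading against the recommended limits quoted in the air-conditioning item above; the sensor values, alarm wording and function name are hypothetical.

    # Hypothetical check against the ASHRAE-style ranges quoted above
    # (20-25 degC dry-bulb temperature, 40-60% relative humidity).
    TEMP_RANGE_C = (20.0, 25.0)
    RH_RANGE_PCT = (40.0, 60.0)

    def check_environment(temp_c: float, rh_pct: float) -> list[str]:
        """Return out-of-range warnings for a single sensor reading."""
        warnings = []
        if not TEMP_RANGE_C[0] <= temp_c <= TEMP_RANGE_C[1]:
            warnings.append(f"temperature {temp_c:.1f} degC outside {TEMP_RANGE_C}")
        if rh_pct < RH_RANGE_PCT[0]:
            warnings.append(f"humidity {rh_pct:.0f}% low: static discharge risk")
        elif rh_pct > RH_RANGE_PCT[1]:
            warnings.append(f"humidity {rh_pct:.0f}% high: condensation risk")
        return warnings

    print(check_environment(26.5, 35.0))
    # ['temperature 26.5 degC outside (20.0, 25.0)',
    #  'humidity 35% low: static discharge risk']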

Physical security

Most colocation centres have high levels of physical security and may be guarded continuously. They may employ closed-circuit television cameras.

Some colocation facilities require that employees escort customers, especially if there are not individual locked cages/cabinets for each customer. In other facilities, a card access system may allow customers access into the building, and individual cages/cabinets have locks.

Power

Colocation facilities generally have generators that start automatically when utility power fails, usually running on diesel fuel. These generators may have varying levels of redundancy, depending on how the facility is built.

Generators do not start instantaneously, so colocation facilities usually have battery backup systems. In many facilities, the operator provides large inverters to supply AC power from the batteries. In other cases, customers may install smaller uninterruptible power supplies (UPSes) in their racks.
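
As a rough illustration of the sizing arithmetic involved, the Python sketch below estimates how long an assumed battery bank can carry the critical load before the generators must accept it; every figure is an assumption, not data from any real facility.

    # Battery "bridge" sizing: the batteries only need to carry the
    # critical load from utility failure until the generators have
    # started and accepted the load. All figures are illustrative.
    critical_load_kw = 800.0     # IT load the batteries must carry
    battery_energy_kwh = 400.0   # usable energy in the battery strings
    generator_start_s = 60.0     # assumed start-and-transfer time

    bridge_time_s = battery_energy_kwh / critical_load_kw * 3600.0
    margin = bridge_time_s / generator_start_s
    print(f"bridge time: {bridge_time_s:.0f} s ({margin:.0f}x start time)")
    # bridge time: 1800 s (30x start time)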

Some customers choose to use equipment that is powered directly by 48 VDC (nominal) battery banks. This may provide better energy efficiency and may reduce the number of parts that can fail.
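
One reason direct DC distribution can save energy is that it removes conversion stages, and the overall efficiency of a power chain is the product of its stage efficiencies. The Python sketch below illustrates the arithmetic with assumed stage values only.

    # Overall efficiency is the product of stage efficiencies, so
    # removing a stage raises the total. Stage values are assumptions.
    from math import prod

    ac_chain = [0.95, 0.94, 0.92]  # e.g. rectifier, inverter, server PSU
    dc_chain = [0.96, 0.92]        # e.g. rectifier, server DC-DC stage

    print(f"AC chain: {prod(ac_chain):.1%}")  # AC chain: 82.2%
    print(f"DC chain: {prod(dc_chain):.1%}")  # DC chain: 88.3%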

An alternative to batteries is a rotary UPS: a motor-generator coupled to a flywheel, with a diesel engine to drive it during extended outages.

Many colocation facilities can provide A and B power feeds to customer equipment, and high-end servers and telecommunications equipment can often have two power supplies installed.
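
A consequence of dual feeds is that either feed must be able to carry the combined load by itself if the other fails; in practice each feed is therefore kept well below its rating. The Python sketch below shows that check with assumed figures.

    # With A and B feeds, a single-feed failure shifts the entire load
    # onto the surviving feed. All figures here are illustrative.
    def survives_feed_failure(load_a_kw: float, load_b_kw: float,
                              feed_rating_kw: float) -> bool:
        """True if one feed alone can carry the combined load."""
        return load_a_kw + load_b_kw <= feed_rating_kw

    # 30 kW drawn on each feed, feeds rated at 80 kW each: the
    # surviving feed would carry 60 kW, which is within its rating.
    print(survives_feed_failure(30.0, 30.0, 80.0))  # True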

Colocation facilities are sometimes connected to multiple sections of the utility power grid for additional reliability.

Cooling

The operator of a colocation facility generally provides air conditioning for the computer and telecommunications equipment in the building. The cooling system generally includes some degree of redundancy.

In older facilities, the capacity of the cooling system, more so than the available floor space, often limits the amount of equipment that can operate in the building.
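
As a rough illustration of why cooling rather than floor space becomes the binding constraint, the Python sketch below compares the number of racks the floor plan could hold with the number the cooling plant can support; all figures are assumptions.

    # Nearly all electrical power drawn by the equipment is rejected
    # as heat, so cooling capacity caps the deployable load.
    cooling_capacity_kw = 600.0  # heat the air conditioning can remove
    rack_load_kw = 8.0           # assumed average draw (= heat) per rack
    racks_by_space = 120         # racks the floor plan could hold

    racks_by_cooling = int(cooling_capacity_kw // rack_load_kw)
    print(min(racks_by_space, racks_by_cooling))  # 75: cooling limits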

Internal connections

Colocation facility owners have differing rules regarding cross connects between their customers. These rules may allow customers to run such connections at no charge, or allow customers to order such connections for a significant monthly fee. They may allow customers to order cross connects to carriers, but not to other customers.

Some colocation centres feature a "meet-me room" where the different carriers housed in the centre can efficiently exchange data.

Most peering points sit in colocation centres.

Because of the high concentration of servers inside larger colocation centres, most carriers will be interested in bringing direct connections to such buildings.

In many cases, a larger Internet exchange is hosted inside a colocation centre, where customers can connect for peering.

External connections

Colocation facilities generally have multiple locations for fiber optic cables to enter the building, to provide redundancy so that communications can continue if one bundle of cables is damaged.
