Colocation centre

For the numerical method for solving differential equations, see Collocation method. For the concept in corpus linguistics, see collocation.

A colocation centre or colocation center (also spelled co-location, collocation, colo, or coloc) is a type of data centre where equipment, space, and bandwidth are available for rental to retail customers. Colocation facilities provide space, power, cooling, and physical security for the server, storage, and networking equipment of other firms—and connect them to a variety of telecommunications and network service providers—with a minimum of cost and complexity.

Benefits

Colocation has become a popular option for companies with midsize IT needs, especially those in Internet-related businesses, because it allows the company to focus its IT staff on the actual work being done instead of the logistical support needs that underlie that work. Significant economies of scale in power and mechanical systems favour large colocation facilities, typically 4,500 to 9,500 square metres (roughly 50,000 to 100,000 square feet).

Claimed benefits of colocation include:[1]

As a retail rental business, usually on a term contract, colocation facilities provide:

They also usually provide redundant systems for all of these features, to mitigate problems when any one of them fails.

The economies of scale that result from grouping many small-to-midsized customers together in one facility include:

Major types of colocation customers are:

Configuration

“Multi-tenant [colocation] providers sell to a wide range of customers, from Fortune 1000 enterprises to small- and medium-sized organizations.”[3] “Typically the facility provides power and cooling to the space, but the IT equipment is owned by the customer. The value proposition of retail multi-tenant is that customers can retain full control of the design and management of their servers and storage, but turn over the daily task of managing data center and facility infrastructure to their multi-tenant provider.”[4]

Building features

Buildings with data centres inside them are often easy to recognize due to the amount of cooling equipment located outside or on the roof.[8]

Colocation facilities have many other special characteristics:

A typical server rack, commonly seen in colocation

Colocation data centres are often audited to prove that they live up to certain standards and levels of reliability; the most commonly seen frameworks are SSAE 16 SOC 1 Type I and Type II (formerly SAS 70 Type I and Type II) and the tier system of the Uptime Institute. For service organizations today, SSAE 16 calls for a description of the organization's "system", which is far more detailed and comprehensive than SAS 70's description of "controls".[10] Other data centre compliance standards include the Health Insurance Portability and Accountability Act (HIPAA) and the PCI DSS standards.

Physical security

Most colocation centres have high levels of physical security; in the most extreme cases this includes on-site security guards with anti-terrorism training, while other facilities are simply guarded continuously. They may also employ CCTV.

Some colocation facilities require that employees escort customers, especially if there are not individual locked cages or cabinets for each customer. In other facilities, a PIN code or proximity card access system may allow customers access into the building, and individual cages or cabinets have locks. Biometric security measures, such as fingerprint recognition, voice recognition and "weight matching", are also becoming more commonplace in modern facilities. 'Man-traps' are also used, where a hallway leading into the data centre has a door at each end and both cannot be open simultaneously; visitors can be seen via CCTV and are manually authorized to enter.

Power

Colocation facilities generally have generators that start automatically when utility power fails, usually running on diesel fuel. These generators may have varying levels of redundancy, depending on how the facility is built.

Generators do not start instantaneously, so colocation facilities usually have battery backup systems. In many facilities the operator provides large inverters that supply AC power from central batteries; in other cases, customers may install smaller UPSes in their own racks.
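
As a rough illustration of the bridging role the batteries play, the following minimal sketch estimates the stored energy needed to ride through a generator start; the load, start-up time, and inverter efficiency are assumed example figures, not taken from the article.

    # Sketch: stored energy a UPS battery needs to carry the load between a
    # utility failure and the standby generator taking over.
    # All figures below are hypothetical assumptions, not from the article.

    def bridge_energy_kwh(it_load_kw: float, bridge_seconds: float,
                          inverter_efficiency: float = 0.95) -> float:
        """Battery energy (kWh) needed to ride through the generator start-up gap."""
        hours = bridge_seconds / 3600.0
        return it_load_kw * hours / inverter_efficiency

    if __name__ == "__main__":
        load_kw = 500.0   # assumed critical IT load
        gap_s = 60.0      # assumed worst-case generator start and transfer time
        print(f"{bridge_energy_kwh(load_kw, gap_s):.1f} kWh of usable battery capacity needed")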

Some customers choose to use equipment that is powered directly by 48VDC (nominal) battery banks. This may provide better energy efficiency, and may reduce the number of parts that can fail, though the reduced voltage greatly increases necessary current, and thus the size (and cost) of power delivery wiring.
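
To make the effect on current concrete, the following minimal arithmetic sketch compares an assumed 5 kW rack load across common feed voltages (the figures are illustrative, not from the article). For a fixed power P, the current is I = P / V, so the same load at 48 V DC draws several times the current of a 208 V or 230 V AC feed.

    # Illustration of why low-voltage DC distribution needs heavier wiring:
    # for a fixed power P, the current scales as I = P / V.
    # The load and voltages below are assumed example values.

    power_w = 5000.0                    # assumed rack load of 5 kW
    for volts in (230.0, 208.0, 48.0):  # common AC feeds vs nominal 48 V DC
        amps = power_w / volts
        print(f"{power_w / 1000:.0f} kW at {volts:.0f} V -> {amps:.0f} A")
    # About 104 A at 48 V versus roughly 22-24 A on the AC feeds, so the
    # conductors (and their cost) must be sized accordingly.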

An alternative to batteries is a motor generator connected to a flywheel and diesel engine.

Many colocation facilities can provide redundant A and B power feeds to customer equipment, and high-end servers and telecommunications equipment can often be fitted with two power supplies.

“Redundancy in IT is a system design in which a component is duplicated so if it fails there will be a backup.”[11]

N+1, also referred to as “parallel redundant”: “The number of UPS modules that are required to handle an adequate supply of power for essential connected systems, plus one more.”[12]

2N+1, also referred to as “system plus system”: “2 UPS systems feeding 2 independent output distribution systems.”[13] Offers complete redundancy between sides A and B. “2(N+1) architectures fed directly to dual-corded loads provide the highest availability by offering complete redundancy and eliminating single points of failure.”[14]
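
As a rough illustration of how these configurations scale, the following sketch counts UPS modules under each scheme for an assumed module rating and critical load (neither figure comes from the cited sources).

    import math

    # Sketch of UPS module counts under common redundancy schemes.
    # The module rating and load are assumed example figures.

    def modules_needed(load_kw: float, module_kw: float) -> int:
        """N: the minimum number of UPS modules that can carry the load."""
        return math.ceil(load_kw / module_kw)

    load_kw, module_kw = 800.0, 250.0
    n = modules_needed(load_kw, module_kw)  # N = 4 modules of 250 kW for 800 kW
    print("N      :", n)                    # no redundancy
    print("N+1    :", n + 1)                # parallel redundant: one spare module
    print("2N     :", 2 * n)                # two independent systems (A and B sides)
    print("2(N+1) :", 2 * (n + 1))          # system plus system, each side N+1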

Colocation facilities are sometimes connected to multiple sections of the utility power grid for additional reliability.

Cooling

The operator of a colocation facility generally provides air conditioning for the computer and telecommunications equipment in the building. The cooling system generally includes some degree of redundancy.

In older facilities, the cooling system capacity often limits the amount of equipment that can operate in the building, more so than the available square footage.
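
As a rough illustration of why cooling rather than floor space can be the binding constraint, the following sketch compares the two limits for an assumed cooling capacity, floor area, and per-rack load (all figures are illustrative, not from the article).

    # Sketch: cooling capacity vs. floor space as the limit on installed equipment.
    # All figures are assumed example values, not from the article.

    cooling_capacity_kw = 1200.0  # assumed heat the cooling plant can remove
    floor_area_m2 = 2000.0        # assumed equipment floor area
    rack_footprint_m2 = 2.5       # assumed per rack, including aisle space
    rack_load_kw = 6.0            # assumed average power (and heat) per rack

    racks_by_space = int(floor_area_m2 // rack_footprint_m2)     # 800 racks
    racks_by_cooling = int(cooling_capacity_kw // rack_load_kw)  # 200 racks
    print("Limited by floor space :", racks_by_space)
    print("Limited by cooling     :", racks_by_cooling)
    print("Effective limit        :", min(racks_by_space, racks_by_cooling))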


Internal connections

Colocation facility owners have differing rules regarding cross connects between their customers, some of whom may be carriers. These rules may allow customers to run such connections at no charge, or require them to be ordered for a significant monthly fee; some facilities allow cross connects to carriers but not to other customers.

Some colocation centres feature a "meet-me-room" where the different carriers housed in the centre can efficiently exchange data.

Most peering points sit in colocation centres.

Because of the high concentration of servers inside larger colocation centres, most carriers are interested in bringing direct connections to such buildings.

In many cases, a larger Internet exchange is hosted inside a colocation centre, where customers can connect for peering.

External connections

Colocation facilities generally have multiple locations for fibre optic cables to enter the building, to provide redundancy so that communications can continue if one bundle of cables is damaged. Some also have wireless backup connections, for example via satellite.

See also

References

  1. Rachel A. Dines, Sophia I. Vargas, Doug Washburn, and Eric Chi, "Build Or Colocate? The ROI Of Your Next Data Center", Forrester, August 2013.
  2. "Miami data center to protect Latin American e-commerce from fraud", Datacenter Dynamics.
  3. Jeff Paschke, "Multi-Tenant Datacenter Global Providers - 2014", 451 Research, August 2014.
  4. David Freeland, "Colocation and Managed Hosting", FOCUS Telecom, Winter 2012.
  5. "Colocation Benefits And How To Get Started", Psychz Networks. Retrieved 18 February 2015.
  6. DCD Intelligence, "Assessing the Cost: Modular versus Traditional Build", October 2013.
  7. John Rath, "DCK Guide To Modular Data Centers: The Modular Market", Data Center Knowledge, October 2011.
  8. Examples can be seen at http://www.datacentermap.com/blog/data-centers-from-the-sky-174.html
  9. Thermal Guidelines for Data Processing Environments, 3rd Ed., ASHRAE.
  10. "SSAE 16 (SOC 1) Overview".
  11. Clive Longbottom, "How to plan and manage datacentre redundancy", Computer Weekly, August 2013.
  12. Margaret Rouse, "N+1 UPS", TechTarget, June 2010.
  13. Emerson Network Power, "Powering Change in the Data Center".
  14. Kevin McCarthy and Victor Avelar, "Comparing UPS System Design Configurations", Schneider Electric.

External links