SciNet Consortium

"SciNet" redirects here. For the high-performance network built at the SC Conference, see SCinet.
First wave of SciNet computer installation process
SciNet CTO Chris Loken (rightmost) at a data centre discussion panel.

SciNet is a consortium of the University of Toronto and affiliated Ontario hospitals. It has received funding from the federal and provincial governments, faculties at the University of Toronto, and the affiliated hospitals.

It is one of seven regional high-performance computing (HPC) consortia across Canada. As of November 2008, the partially constructed systems already ranked #53 on the TOP500 list, making SciNet the only Canadian HPC system in the top one hundred and the most powerful university HPC system outside the United States. The parallel systems were anticipated to rank around #50 and #25 upon completion in June 2009; in the event, the June 2009 TOP500 list ranked the GPC iDataPlex system at #16, while the TCS dropped to #80.

The SciNet offices are based on the St. George Street campus; however, to accommodate the large floor-space and power requirements, the data centre facility is housed in a warehouse about 30 km north of campus in Vaughan.

At the core of SciNet research are six key areas of study: Astronomy and Astrophysics, Aerospace and Biomedical Engineering, High Energy Particle Physics, Integrative Computational Biology, Planetary Physics, and Theoretical Chemical Physics.

History

SciNet was initially formed in the fall of 2004, following an agreement within the Canadian high-performance computing community to develop a coordinated response to the newly created National Platform Fund (NPF). The community felt that NPF funding would enable the development of a collective national capability in HPC. The Canadian HPC community was successful in its NPF proposal, and SciNet was awarded a portion of that funding.

SciNet finalized its contract with IBM to build the system in July 2008, and the formal announcement was made on August 14, 2008.[1] On Thursday, June 18, 2009, the most powerful supercomputer in Canada went online; it would have ranked as the twelfth most powerful computer worldwide had it been completed six months earlier.[2]

Specifications

SciNet has two compute clusters that are optimized for different types of computing:

General Purpose Cluster

The General Purpose Cluster consists of 3,780 IBM System x iDataPlex dx360 M3 nodes, each with two quad-core Intel Nehalem (Xeon E5540) processors running at 2.53 GHz, totaling 30,240 cores in 45 racks. (An iDataPlex rack cabinet provides 84 rack units of space.[3]) All nodes are connected with Gigabit Ethernet, and 864 nodes additionally use DDR InfiniBand to provide high-speed, low-latency communication for message-passing applications.[4] The water-cooled system consumes roughly as much energy as four thousand homes. To take advantage of the cold Canadian climate, the system monitors the outside air temperature; when it drops below a set threshold, the chiller switches over to "free-air" cooling. SciNet, IBM Corp. and Compute Canada collaborated on the supercomputer venture.[2][5][6] The new computer system at U of T's SciNet is the largest Intel-processor-based IBM installation globally.[7]
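The quoted figures are internally consistent, as a back-of-the-envelope check shows. The node, socket and clock numbers below come from the article; the assumption of 4 double-precision FLOPs per core per cycle is the standard figure for Nehalem-class Xeons, not something stated in the source.

```python
# Sanity check of the GPC figures quoted above.
nodes = 3780
sockets_per_node = 2
cores_per_socket = 4          # quad-core Nehalem (Xeon E5540)
clock_ghz = 2.53

total_cores = nodes * sockets_per_node * cores_per_socket
print(total_cores)            # 30240, matching the quoted core count

# Assumption: Nehalem retires 4 double-precision FLOPs per core per cycle.
peak_tflops = total_cores * clock_ghz * 4 / 1000
print(round(peak_tflops))     # 306, consistent with the roughly
                              # "300 trillion calculations per second"
                              # figure cited later in the article
```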

Data center

The computer room itself is 3,000 square feet (280 m2) of raised floor. It has a 735-ton chiller and cooling towers for "free-air" cooling. A significant research area to be addressed with the SciNet machines is climate change and global warming, which is why creating one of the greenest data centres in the world was of key importance in this project. A traditional data centre typically spends about 33% of its incoming energy on cooling and other non-computing overhead; SciNet and IBM have built a centre that spends less than 20% on these areas.
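Those overhead fractions can be restated as a Power Usage Effectiveness (PUE) ratio, the usual data-centre efficiency metric (total facility energy divided by energy delivered to the computing equipment). The 33% and 20% figures are from the article; expressing them as PUE is my framing, sketched here:

```python
# Convert a non-computing overhead fraction into a PUE value.
# PUE = total facility energy / IT (computing) energy.
def pue_from_overhead(overhead_fraction):
    it_fraction = 1.0 - overhead_fraction
    return 1.0 / it_fraction

print(round(pue_from_overhead(0.33), 2))  # 1.49 for a traditional centre
print(round(pue_from_overhead(0.20), 2))  # 1.25 for the SciNet facility
```

A PUE of 1.25 in 2009 would indeed have placed the facility among the more efficient data centres of its era.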

Partners

Founding Institution
Affiliated Hospitals

Common uses


The U of T supercomputer, which can perform 300 trillion calculations per second, will be used for highly calculation-intensive tasks such as problems in quantum mechanics, weather forecasting, climate research and climate-change modelling, molecular modelling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), and physical simulations. One such simulation concerns the Big Bang, in conjunction with the Large Hadron Collider (LHC) at CERN in Geneva: the LHC will produce cataclysmic conditions that mimic the beginning of time, and the U of T supercomputer will examine the resulting particle collisions.[2][5][6] Part of the collaboration with the LHC will be to answer questions about why matter has mass and what makes up the mass of the Universe. Additional areas of research will include models of greenhouse-gas-induced global warming and its effect on Arctic sea ice. The new supercomputer will also support the international ATLAS project in exploring the forces that govern the universe.[7]

References

This article is issued from Wikipedia (version of Tuesday, October 27, 2015). The text is available under the Creative Commons Attribution/Share-Alike licence, but additional terms may apply for the media files.