A General Circulation Model (GCM) is a mathematical model of the general circulation of a planetary atmosphere or ocean. It is based on the Navier–Stokes equations on a rotating sphere with thermodynamic terms for various energy sources (radiation, latent heat). These equations are the basis for complex computer programs commonly used for simulating the atmosphere or ocean of the Earth. Atmospheric and oceanic GCMs (AGCMs and OGCMs) are key components of global climate models, along with sea ice and land-surface components. GCMs and global climate models are widely applied for weather forecasting, understanding the climate, and projecting climate change. Versions designed for decade-to-century climate applications were originally created by Syukuro Manabe and Kirk Bryan at the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey.[1] These computationally intensive numerical models are based on the integration of a variety of fluid-dynamical, chemical, and sometimes biological equations.
In 1956, Norman Phillips developed a mathematical model which could realistically depict monthly and seasonal patterns in the troposphere, which became the first successful climate model.[2][3] Following Phillips's work, several groups began working to create general circulation models.[4] The first general circulation climate model that combined both oceanic and atmospheric processes was developed in the late 1960s at the NOAA Geophysical Fluid Dynamics Laboratory.[5] By the early 1980s, the United States' National Center for Atmospheric Research had developed the Community Atmosphere Model; this model has been continuously refined into the 2000s.[6] In 1996, efforts began to initialize and model soil and vegetation types, which led to more realistic forecasts.[7] Coupled ocean-atmosphere climate models such as the Hadley Centre for Climate Prediction and Research's HadCM3 model are currently being used as inputs for climate change studies.[4] The importance of gravity waves was neglected within these models until the mid 1980s. Now, gravity waves are required within global climate models in order to properly simulate regional and global scale circulations, though their broad spectrum makes their incorporation complicated.[8]
There are both atmospheric GCMs (AGCMs) and oceanic GCMs (OGCMs). An AGCM and an OGCM can be coupled together to form an atmosphere-ocean coupled general circulation model (CGCM or AOGCM). With the addition of other components (such as a sea ice model or a model for evapotranspiration over land), the AOGCM becomes the basis for a full climate model. Within this structure, different variations can exist, and their varying response to climate change may be studied (e.g., Sun and Hansen, 2003).
A recent trend in GCMs is to apply them as components of Earth System Models, e.g. by coupling to ice sheet models for the dynamics of the Greenland and Antarctic ice sheets, and one or more chemical transport models (CTMs) for species important to climate. Thus a carbon CTM may allow a GCM to better predict changes in carbon dioxide concentrations resulting from changes in anthropogenic emissions. In addition, this approach allows accounting for inter-system feedback: e.g. chemistry-climate models allow the possible effects of climate change on the recovery of the ozone hole to be studied.[9]
Climate prediction uncertainties depend on uncertainties in chemical, physical, and social models (see IPCC scenarios below).[10] Progress has been made in incorporating more realistic chemistry and physics in the models, but significant uncertainties and unknowns remain, especially regarding the future course of human population, industry, and technology.
Note that many simpler levels of climate model exist; some are of only heuristic interest, while others continue to be scientifically relevant.
Three-dimensional (more properly four-dimensional) GCMs discretise the equations for fluid motion and integrate these forward in time. They also contain parameterisations for processes – such as convection – that occur on scales too small to be resolved directly. More sophisticated models may include representations of the carbon and other cycles.
A simple general circulation model (SGCM), a minimal GCM, consists of a dynamical core that relates material properties such as temperature to dynamical properties such as pressure and velocity. Examples are programs that solve the primitive equations, given energy input into the model, and energy dissipation in the form of scale-dependent friction, so that atmospheric waves with the highest wavenumbers are the ones most strongly attenuated. Such models may be used to study atmospheric processes within a simplified framework but are not suitable for future climate projections.
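As a rough illustration of such scale-dependent friction, the Python sketch below applies hyperdiffusion-style damping to spectral coefficients so that the highest wavenumbers decay fastest; the damping coefficient, diffusion order, time step, and truncation wavenumber are illustrative assumptions, not values from any particular SGCM.

```python
import numpy as np

def hyperdiffusion_damping(spectral_coeffs, wavenumbers, nu=1e-4, order=4, dt=600.0):
    """Damp spectral coefficients with a scale-dependent (del^4-like) friction.

    The damping rate grows as wavenumber**order, so the smallest resolved
    scales (highest wavenumbers) are attenuated most strongly.
    nu, order and dt are illustrative values, not taken from a real model.
    """
    rate = nu * (wavenumbers / wavenumbers.max()) ** order
    return spectral_coeffs * np.exp(-rate * dt)

# Example: coefficients of a one-dimensional field truncated at wavenumber 42
k = np.arange(1, 43)
coeffs = np.ones_like(k, dtype=float)
damped = hyperdiffusion_damping(coeffs, k)
print(damped[0], damped[-1])  # lowest wavenumber barely changed, highest most damped
```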
Atmospheric GCMs (AGCMs) model the atmosphere (and typically contain a land-surface model as well) and impose sea surface temperatures (SSTs). A large amount of information including model documentation is available from AMIP.[11] They may include atmospheric chemistry.
A GCM contains a number of prognostic equations that are stepped forward in time (typically winds, temperature, moisture, and surface pressure) together with a number of diagnostic equations that are evaluated from the simultaneous values of the variables. As an example, pressure at any height can be diagnosed by applying the hydrostatic equation to the predicted surface pressure and the predicted values of temperature between the surface and the height of interest. The pressure diagnosed in this way then is used to compute the pressure gradient force in the time-dependent equation for the winds.
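A minimal sketch of such a diagnostic step is given below, assuming a simple model column in which pressure at a given height is diagnosed from the predicted surface pressure and layer-mean temperatures via the hydrostatic equation; the layer boundaries and values are illustrative, not taken from any particular GCM.

```python
import math

G = 9.80665      # gravitational acceleration, m s^-2
R_DRY = 287.05   # specific gas constant for dry air, J kg^-1 K^-1

def pressure_at_height(p_surface, layer_tops, layer_temps, z):
    """Diagnose pressure at height z (m) by integrating the hydrostatic
    equation dp/dz = -g p / (R T) upward through isothermal layers."""
    p, z_base = p_surface, 0.0
    for z_top, temp in zip(layer_tops, layer_temps):
        dz = min(z, z_top) - z_base
        if dz <= 0:
            break
        p *= math.exp(-G * dz / (R_DRY * temp))
        z_base = z_top
    return p

# Illustrative column: predicted surface pressure (Pa) and two layer-mean temperatures (K)
print(pressure_at_height(101325.0, [2000.0, 10000.0], [280.0, 250.0], 5000.0))
```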
Oceanic GCMs (OGCMs) model the ocean (with fluxes from the atmosphere imposed) and may or may not contain a sea ice model. For example, the standard resolution of HadOM3 is 1.25 degrees in latitude and longitude, with 20 vertical levels, leading to approximately 1,500,000 variables.
Coupled atmosphere-ocean GCMs (AOGCMs) (e.g. HadCM3, GFDL CM2.X) combine the two models. They thus have the advantage of removing the need to specify fluxes across the interface of the ocean surface. These models are the basis for sophisticated model predictions of future climate, such as are discussed by the IPCC.
AOGCMs represent the pinnacle of complexity in climate models and internalise as many processes as possible. They are the only tools that could provide detailed regional predictions of future climate change. However, they are still under development. The simpler models are generally susceptible to simple analysis and their results are generally easy to understand. AOGCMs, by contrast, are often nearly as hard to analyse as the real climate system.
The fluid equations for AGCMs are discretised using either the finite difference method or the spectral method. For finite differences, a grid is imposed on the atmosphere. The simplest grid uses constant angular grid spacing (i.e., a latitude/longitude grid); however, more sophisticated non-rectangular grids (e.g., icosahedral) and grids of variable resolution[12] are more often used.[13] The "LMDz" model can be arranged to give high resolution over any given section of the planet. HadGEM1 (and other ocean models) use an ocean grid with higher resolution in the tropics to help resolve processes believed to be important for ENSO. Spectral models generally use a Gaussian grid, because of the mathematics of transformation between spectral and grid-point space. Typical AGCM resolutions are between 1 and 5 degrees in latitude or longitude: the Hadley Centre model HadCM3, for example, uses 3.75 degrees in longitude and 2.5 degrees in latitude, giving a grid of 96 by 73 points (96 by 72 for some variables), and has 19 levels in the vertical. This results in approximately 500,000 "basic" variables, since each grid point has four variables (u, v, T, Q), though a full count would give more (clouds; soil levels). HadGEM1 uses a grid of 1.875 degrees in longitude and 1.25 in latitude in the atmosphere; HiGEM, a high-resolution variant, uses 1.25 x 0.83 degrees respectively.[14] These resolutions are lower than is typically used for weather forecasting.[15] Ocean resolutions tend to be higher; for example, HadCM3 has 6 ocean grid points per atmospheric grid point in the horizontal.
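The variable count quoted for HadCM3 can be checked with a little arithmetic; the sketch below simply multiplies grid points by vertical levels and by the four "basic" prognostic variables (clouds, soil levels, and so on would add more).

```python
# Approximate count of "basic" variables for a HadCM3-like atmospheric grid:
# 96 longitudes x 73 latitudes, 19 vertical levels, 4 prognostic variables (u, v, T, Q).
n_lon, n_lat, n_levels, n_vars = 96, 73, 19, 4
basic_variables = n_lon * n_lat * n_levels * n_vars
print(basic_variables)  # 532,608 -- roughly the "approximately 500,000" quoted above
```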
For a standard finite difference model, uniform gridlines converge towards the poles. This would lead to computational instabilities (see CFL condition) and so the model variables must be filtered along lines of latitude close to the poles. Ocean models suffer from this problem too, unless a rotated grid is used in which the North Pole is shifted onto a nearby landmass. Spectral models do not suffer from this problem. There are experiments using geodesic grids[16] and icosahedral grids, which (being more uniform) do not have pole-problems. Another approach to solving the grid spacing problem is to deform a Cartesian cube such that it covers the surface of a sphere.[17]
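The pole problem can be seen from the way east-west grid spacing shrinks with the cosine of latitude, which (through the CFL condition) shrinks the stable time step of an explicit scheme. The sketch below is a rough estimate only, assuming a fixed signal speed rather than any real model's fastest wave.

```python
import math

EARTH_RADIUS = 6.371e6  # m

def max_stable_timestep(lat_deg, dlon_deg, signal_speed=300.0):
    """Rough CFL-limited time step for an explicit scheme on a lat-lon grid:
    dt <= dx / c, where dx is the zonal grid spacing at the given latitude.
    signal_speed is an assumed fast-wave speed in m/s, for illustration only."""
    dx = EARTH_RADIUS * math.cos(math.radians(lat_deg)) * math.radians(dlon_deg)
    return dx / signal_speed

for lat in (0, 60, 85, 89):
    print(f"lat {lat:2d}: dt <= {max_stable_timestep(lat, 2.5):7.1f} s")
# The allowed step collapses near the pole, which is why polar filtering or
# alternative (rotated, icosahedral, cubed-sphere) grids are used.
```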
Some early incarnations of AOGCMs required a somewhat ad hoc process of "flux correction" to achieve a stable climate (not all model groups used this technique). This resulted from separately prepared ocean and atmospheric models each having a different implicit flux from the other component than the other component could actually provide. If uncorrected this could lead to a dramatic drift away from observations in the coupled model. However, if the fluxes were 'corrected', the problems in the model that led to these unrealistic fluxes might be unrecognised and that might affect the model sensitivity. As a result, there has always been a strong disincentive to use flux corrections, and the vast majority of models used in the current round of the Intergovernmental Panel on Climate Change do not use them. The model improvements that now make flux corrections unnecessary are various, but include improved ocean physics, improved resolution in both atmosphere and ocean, and more physically consistent coupling between atmosphere and ocean models.
Moist convection releases latent heat and is important to the Earth's energy budget. Convection occurs on too small a scale to be resolved by climate models, and hence must be parameterised. This has been done since the earliest days of climate modelling, in the 1950s. Akio Arakawa did much of the early work, and variants of his scheme are still used, although a variety of different schemes are now in use. The behaviour of clouds is still poorly understood and is parameterised.
Most models include software to diagnose a wide range of variables for comparison with observations or study of atmospheric processes. An example is the 1.5 metre temperature, which is the standard height for near-surface observations of air temperature. This temperature is not directly predicted from the model but is deduced from the surface and lowest-model-layer temperatures. Other software is used for creating plots and animations.
Coupled ocean-atmosphere GCMs use transient climate simulations to project/predict future temperature changes under various scenarios. These can be idealised scenarios (most commonly, CO2 increasing at 1%/yr) or more realistic (usually the "IS92a" or more recently the SRES scenarios). Which scenarios should be considered most realistic is currently uncertain, as the projections of future CO2 (and sulphate) emission are themselves uncertain.
The 2001 IPCC Third Assessment Report (figure 9.3) shows the global mean response of 19 different coupled models to an idealised experiment in which CO2 is increased at 1% per year. Figure 9.5 shows the response of a smaller number of models to more realistic forcing. For the seven climate models shown there, the temperature change to 2100 varies from 2 to 4.5 °C with a median of about 3 °C.
Future scenarios do not include unknowable events – for example, volcanic eruptions or changes in solar forcing. These effects are believed to be small in comparison to GHG forcing in the long term, but large volcanic eruptions, for example, are known to exert a temporary cooling effect.
Human emissions of GHGs are an external input to the models, although it would be possible to couple in an economic model to provide these as well. Atmospheric GHG levels are usually supplied as an input, though it is possible to include a carbon cycle model including land vegetation and oceanic processes to calculate GHG levels.
For the six SRES marker scenarios, IPCC (2007:7–8) gave a "best estimate" of global mean temperature increase (2090–2099 relative to the period 1980–1999) that ranged from 1.8 °C to 4.0 °C. Over the same time period, the "likely" range (greater than 66% probability, based on expert judgement) for these scenarios was for a global mean temperature increase of between 1.1 and 6.4 °C.[18]
Pope (2008) described a study where climate change projections were made using several different emission scenarios.[19] In a scenario where global emissions start to decrease by 2010 and then decline at a sustained rate of 3% per year, the likely global average temperature increase was predicted to be 1.7 °C above pre-industrial levels by 2050, rising to around 2 °C by 2100. In a projection designed to simulate a future where no efforts are made to reduce global emissions, the likely rise in global average temperature was predicted to be 5.5 °C by 2100. A rise as high as 7 °C was thought possible but less likely.
Sokolov et al. (2009) examined a scenario designed to simulate a future where there is no policy to reduce emissions. In their integrated model, this scenario resulted in a median warming over land (2090–2099 relative to the period 1980–1999) of 5.1 °C. Under the same emissions scenario but with different modeling of the future climate, the predicted median warming was 4.1 °C.[20]
AOGCMs internalise as many climate processes as possible, but they are still under development and uncertainties remain. They may be coupled to models of other processes, such as the carbon cycle, so as to better model feedback effects. Most recent simulations show "plausible" agreement with the measured temperature anomalies over the past 150 years, when forced by observed changes in greenhouse gases and aerosols, but better agreement is achieved when natural forcings are also included.[21][22]
No model – whether a wind tunnel model for designing aircraft, or a climate model for projecting global warming – perfectly reproduces the system being modeled. Such inherently imperfect models may nevertheless produce useful results. In this context, GCMs are capable of reproducing the general features of the observed global temperature over the past century.[21]
A debate over how to reconcile climate model predictions that upper-air (tropospheric) warming should be greater than surface warming with observations, some of which appeared to show otherwise,[23] now appears to have been resolved in favour of the models, following revisions to the data: see satellite temperature record.
The effects of clouds are a significant area of uncertainty in climate models. Clouds have competing effects on the climate: they cool the surface by reflecting sunlight back into space, and they warm it by increasing the amount of infrared radiation emitted from the atmosphere to the surface.[24] In the 2001 IPCC report on climate change, possible changes in cloud cover were highlighted as one of the dominant uncertainties in predicting future climate change.[25][26]
Thousands of climate researchers around the world use climate models to understand the climate system. Thousands of papers about model-based studies are published in peer-reviewed journals, and part of this research is work improving the models. Improvement has been difficult but steady (most obviously, state-of-the-art AOGCMs no longer require flux correction), and progress has sometimes led to the discovery of new uncertainties.
In 2000, a comparison between measurements and dozens of GCM simulations of ENSO-driven tropical precipitation, water vapour, temperature, and outgoing longwave radiation found similarity between measurements and simulations for most factors. However, the simulated change in precipitation was about one-fourth less than what was observed. Errors in simulated precipitation imply errors in other processes, such as errors in the evaporation rate that provides moisture to create precipitation. The other possibility is that the satellite-based measurements are in error. Either indicates that progress is required in order to monitor and predict such changes.
A more complete discussion of climate models is provided in the IPCC's Third Assessment Report.[27]
The precise magnitude of future changes in climate is still uncertain;[28] for the end of the 21st century (2071 to 2100), for SRES scenario A2, the change in global average surface air temperature (SAT) from AOGCMs relative to 1961 to 1990 is +3.0 °C (+5.4 °F) and the range is +1.3 to +4.5 °C (+2.3 to +8.1 °F).
The global climate models used for climate projections are very similar in structure to (and often share computer code with) numerical models for weather prediction but are nonetheless logically distinct.
Most weather forecasting is done on the basis of interpreting the output of numerical model results. Since forecasts are short—typically a few days or a week—such models do not usually contain an ocean model but rely on imposed SSTs. They also require accurate initial conditions to begin the forecast—typically these are taken from the output of a previous forecast, with observations blended in. Because the results are needed quickly, the predictions must be run in a few hours; but because they only need to cover a week of real time, these predictions can be run at higher resolution than in climate mode. Currently the ECMWF runs at 40 km (25 mi) resolution, as opposed to the 100-to-200 km (62-to-124 mi) scale used by typical climate models. Often nested models are run, forced by the global models for boundary conditions, to achieve higher local resolution: for example, the Met Office runs a mesoscale model with an 11 km (6.8 mi) resolution covering the UK, and various agencies in the U.S. also run nested models such as the NGM and NAM models. Like most global numerical weather prediction models, such as the GFS, global climate models are often spectral models instead of grid models. Spectral models are often used for global models because some computations can be performed faster, reducing the time needed to run the model simulation.
Climate models use quantitative methods to simulate the interactions of the atmosphere, oceans, land surface, and ice. They are used for a variety of purposes from study of the dynamics of the climate system to projections of future climate.
All climate models take account of incoming energy as short wave electromagnetic radiation, chiefly visible and short-wave (near) infrared, as well as outgoing energy as long wave (far) infrared electromagnetic radiation from the earth. Any imbalance results in a change in temperature.
The most talked-about models of recent years have been those relating temperature to emissions of carbon dioxide (see greenhouse gas). These models project an upward trend in the surface temperature record, as well as a more rapid increase in temperature at higher altitudes.[29]
Three-dimensional (more properly four-dimensional, since time is also considered) GCMs discretise the equations for fluid motion and energy transfer and integrate these over time. They also contain parametrisations for processes—such as convection—that occur on scales too small to be resolved directly.
Atmospheric GCMs (AGCMs) model the atmosphere and impose sea surface temperatures as boundary conditions. Coupled atmosphere-ocean GCMs (AOGCMs, e.g. HadCM3, EdGCM, GFDL CM2.X, ARPEGE-Climat[30]) combine the two models.
Models can range from relatively simple to quite complex:
This is not a full list; for example "box models" can be written to treat flows across and within ocean basins. Furthermore, other types of modelling can be interlinked, such as land use, allowing researchers to predict the interaction between climate and ecosystems.
Box models are simplified versions of complex systems, reducing them to boxes (or reservoirs) linked by fluxes. The boxes are assumed to be mixed homogeneously. Within a given box, the concentration of any chemical species is therefore uniform. However, the abundance of a species within a given box may vary as a function of time due to the input to (or loss from) the box or due to the production, consumption or decay of this species within the box.
Simple box models, i.e. box model with a small number of boxes whose properties (e.g. their volume) do not change with time, are often useful to derive analytical formulas describing the dynamics and steady-state abundance of a species. More complex box models are usually solved using numerical techniques.
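A minimal sketch of such a one-box model, assuming a single well-mixed reservoir with a constant input flux and first-order loss; the steady-state abundance follows analytically as the source divided by the loss-rate constant, and the numbers used are arbitrary illustrative values.

```python
import numpy as np

def one_box_model(source, loss_rate, c0=0.0, dt=0.1, n_steps=1000):
    """Integrate dC/dt = source - loss_rate * C for a single well-mixed box.
    The analytic steady state is C* = source / loss_rate."""
    c = np.empty(n_steps + 1)
    c[0] = c0
    for i in range(n_steps):
        c[i + 1] = c[i] + dt * (source - loss_rate * c[i])
    return c

# Illustrative values (arbitrary units): constant source, first-order sink.
trajectory = one_box_model(source=2.0, loss_rate=0.5)
print(trajectory[-1], "vs analytic steady state", 2.0 / 0.5)
```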
Box models are used extensively to model environmental systems or ecosystems and in studies of ocean circulation and the carbon cycle.[31]
A very simple model of the radiative equilibrium of the Earth is

(1 − a) S πr² = 4πr² εσT⁴

where

- the left hand side represents the incoming solar energy intercepted over the Earth's cross-sectional area,
- the right hand side represents the outgoing energy from the Earth, calculated from the Stefan–Boltzmann law assuming an equilibrium temperature T that is to be found,

and

- S is the solar constant (the incoming solar radiation per unit area), about 1367 W·m⁻²,
- a is the Earth's average albedo, about 0.3,
- r is the Earth's radius,
- π is the familiar mathematical constant,
- σ is the Stefan–Boltzmann constant, about 5.67×10⁻⁸ W·m⁻²·K⁻⁴,
- ε is the effective emissivity of the Earth, about 0.612.

The constant πr² can be factored out, giving

(1 − a) S = 4εσT⁴.

Solving for the temperature,

T = [(1 − a) S / (4εσ)]^(1/4).
This yields an average earth temperature of 288 K (15 °C; 59 °F).[34] This is because the above equation represents the effective radiative temperature of the Earth (including the clouds and atmosphere). The use of effective emissivity and albedo accounts for the greenhouse effect.
This very simple model is quite instructive. For example, it easily determines the effect on average earth temperature of changes in solar constant, albedo, or effective earth emissivity. Using the simple formula, the percent change in each parameter, considered independently, required to cause a one degree Celsius change in steady-state average earth temperature can be computed directly, as in the sketch below.
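The following Python sketch evaluates these sensitivities numerically, assuming the representative parameter values quoted in this section (solar constant ≈ 1367 W·m⁻², albedo 0.3, effective emissivity ≈ 0.612); the values and the one-at-a-time perturbation approach are for illustration only.

```python
SIGMA = 5.67e-8                                # Stefan-Boltzmann constant, W m^-2 K^-4
S0, ALBEDO, EMISSIVITY = 1367.0, 0.3, 0.612    # assumed representative values

def equilibrium_temperature(s=S0, a=ALBEDO, eps=EMISSIVITY):
    """Steady-state temperature from (1 - a) S = 4 eps sigma T^4."""
    return ((1.0 - a) * s / (4.0 * eps * SIGMA)) ** 0.25

t0 = equilibrium_temperature()
print(f"baseline temperature: {t0:.1f} K")     # roughly 288 K

def percent_change_for_one_kelvin(param):
    """Percent change in one parameter (others fixed) giving a 1 K warmer steady state."""
    target = t0 + 1.0
    if param == "solar constant":              # T scales as S^(1/4)
        factor = (target / t0) ** 4
    elif param == "emissivity":                # T scales as eps^(-1/4)
        factor = (t0 / target) ** 4
    elif param == "albedo":                    # T^4 scales as (1 - a)
        factor = (1.0 - (1.0 - ALBEDO) * (target / t0) ** 4) / ALBEDO
    else:
        raise ValueError(param)
    return (factor - 1.0) * 100.0

for p in ("solar constant", "albedo", "emissivity"):
    print(f"{p}: {percent_change_for_one_kelvin(p):+.1f} %")
```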
The average emissivity of the earth is readily estimated from available data. The emissivities of terrestrial surfaces are all in the range of 0.96 to 0.99[35][36] (except for some small desert areas which may be as low as 0.7). Clouds, however, which cover about half of the earth’s surface, have an average emissivity of about 0.5[37] (which must be reduced by the fourth power of the ratio of cloud absolute temperature to average earth absolute temperature) and an average cloud temperature of about 258 K (−15 °C; 5 °F).[38] Taking all this properly into account results in an effective earth emissivity of about 0.64 (earth average temperature 285 K (12 °C; 53 °F)).
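This effective emissivity figure can be checked with a back-of-the-envelope calculation along the lines just described; the sketch below assumes 50% cloud cover, a surface emissivity of about 0.95 (an assumed mid-range terrestrial value), and the cloud emissivity and temperature quoted above.

```python
# Rough area-weighted estimate of Earth's effective emissivity: the cloudy half
# of the sky emits with emissivity ~0.5 at ~258 K, which must be scaled by
# (T_cloud / T_earth)^4; the clear half emits from the surface at ~0.95.
cloud_fraction = 0.5
cloud_emissivity = 0.5
t_cloud, t_earth = 258.0, 285.0
surface_emissivity = 0.95

effective_emissivity = (
    (1.0 - cloud_fraction) * surface_emissivity
    + cloud_fraction * cloud_emissivity * (t_cloud / t_earth) ** 4
)
print(round(effective_emissivity, 2))  # about 0.64, as quoted in the text
```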
This simple model readily determines the effect of changes in solar output, earth albedo, or effective earth emissivity on average earth temperature. It says nothing, however, about what might cause these things to change. Zero-dimensional models do not address the temperature distribution on the earth or the factors that move energy about the earth.
The zero-dimensional model above, using the solar constant and given average earth temperature, determines the effective earth emissivity of long wave radiation emitted to space. This can be refined in the vertical to a one-dimensional radiative-convective model, which considers two processes of energy transport: upwelling and downwelling radiative transfer through atmospheric layers that both absorb and emit infrared radiation, and upward transport of heat by convection.
The radiative-convective models have advantages over the simple model: they can determine the effects of varying greenhouse gas concentrations on effective emissivity and therefore the surface temperature. But added parameters are needed to determine local emissivity and albedo and address the factors that move energy about the earth.
The zero-dimensional model may be expanded to consider the energy transported horizontally in the atmosphere. This kind of model may well be zonally averaged. This model has the advantage of allowing a rational dependence of local albedo and emissivity on temperature – the poles can be allowed to be icy and the equator warm – but the lack of true dynamics means that horizontal transports have to be specified.
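A sketch of a zonally averaged energy balance model in this spirit is given below; the diffusion-like horizontal transport coefficient, the simple albedo rule (icy bands reflect more), and the linearised outgoing longwave radiation are all illustrative assumptions rather than a published parameter set.

```python
import numpy as np

def zonal_energy_balance(n_bands=18, n_steps=20000, dt=0.01):
    """Toy zonally averaged energy balance model.

    Each latitude band gains absorbed sunlight, loses linearised outgoing
    longwave radiation (A + B*T), and exchanges heat with its neighbours
    through a specified diffusion-like transport term. All coefficients
    are illustrative, not taken from a published model.
    """
    lat = np.linspace(-85.0, 85.0, n_bands)                          # band-centre latitudes, deg
    s = 1367.0 / 4.0 * (1.0 - 0.48 * np.sin(np.radians(lat)) ** 2)   # crude annual-mean insolation
    A, B, D = 203.3, 2.09, 0.6   # OLR intercept (W m^-2), slope (W m^-2 K^-1), transport strength
    T = np.full(n_bands, 10.0)   # initial temperature, deg C

    for _ in range(n_steps):
        albedo = np.where(T < -10.0, 0.62, 0.30)                     # icy bands reflect more
        transport = D * (np.roll(T, 1) + np.roll(T, -1) - 2.0 * T)
        transport[0] = D * (T[1] - T[0])                             # no flux across the poles
        transport[-1] = D * (T[-2] - T[-1])
        T = T + dt * ((1.0 - albedo) * s - (A + B * T) + transport)

    return lat, T

lat, T = zonal_energy_balance()
print(f"equator ~{T[len(T)//2]:.1f} C, poles ~{T[0]:.1f} C")
```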
Depending on the nature of questions asked and the pertinent time scales, there are, on the one extreme, conceptual, more inductive models, and, on the other extreme, general circulation models operating at the highest spatial and temporal resolution currently feasible. Models of intermediate complexity bridge the gap. One example is the Climber-3 model. Its atmosphere is a 2.5-dimensional statistical-dynamical model with 7.5° × 22.5° resolution and time step of 1/2 a day; the ocean is MOM-3 (Modular Ocean Model) with a 3.75° × 3.75° grid and 24 vertical levels.
A climate modeller is a person who designs, develops, implements, tests, maintains or exploits climate models. There are three major types of institutions where a climate modeller may be found.