The IPCC Data Distribution Centre: Frequently Asked Questions (FAQs)
The DDC can provide scientific and technical support regarding the datasets provided through this web site. We have compiled a list of Frequently Asked Questions (FAQs) which may help you interpret and apply these data efficiently and consistently. The questions are organised into a number of sections. Some questions are common to more than one section and are, therefore, repeated.
Data Download and Availability
Q. Are there regional DDCs containing duplicate data?
A. At present the DDC web site is housed between Hamburg (DKRZ - Yellow Pages) and Norwich (CRU - Green Pages). It is the intention in due course to establish DDC mirror sites around the world, which will provide access to identical information and data. By locating your nearest mirror site you may be able to speed up access and download times. Potential sites include NCAR, CSIRO, South Africa and India. Registered users will be notified when these sites are activated.
Q. How can I obtain the IPCC DDC CD-ROM?
A. The DDC will be producing a CD-ROM containing all the information and datasets available on the DDC Green Pages. This CD will contain its own browser (Netscape Navigator) and will be compatible with PCs running either Windows '95 or Windows '98, and with Mac OS. There will be no charge for the CD. The CD release date is currently March 1999. You can place an order for the CD by emailing ipcc.ddc@uea.ac.uk or by visiting the CD-ROM page under User Support.
Q. Are results from RCMs available?
A. The DDC is not presently handling any data from Regional Climate Model experiments and has no current plans to do so. An inventory of RCM studies, and other regional downscaling activities, has been compiled by Working Group II of the IPCC. Contact persons are listed here. Relatively few RCM climate change experiments have yet been performed, and it is anticipated that many impact studies will continue to use results from GCM experiments.
Q. How do I get daily GCM data?
A. The DDC is making available to researchers only monthly results for a core set of variables from a number of GCM experiments. The volume of daily data generated by these experiments, together with the different data management systems and different data policies of the modelling centres, means that requests for daily data (or for more specialised variables at monthly timesteps) cannot be handled by the DDC. Requests for daily GCM data should be directed to the respective modelling centres. The relevant contacts are as follows:
Reasonable requests for daily data from these experiments will be met, although precisely what daily data are available, for what periods, for what variables and from which experiments, is subject to the data policies of the respective modelling centres.
Q. Which GCM should I use?
A. Many climate change experiments have been performed with GCMs: between 30 and 40 equilibrium experiments and over 15 transient experiments. Five sets of transient experiments are available through the DDC web site. Four criteria for selection of which GCM(s) to use for an impact study have been suggested: vintage, resolution, validity and representativeness of results.
Vintage. In general, recent model simulations are likely (though by no means certain) to be more reliable than those of an earlier vintage. They are based on recent knowledge, incorporate more processes and feedbacks and are usually of a higher spatial resolution than earlier models.
Resolution. As climate models have evolved and computing power has increased, there has been a tendency towards increased resolution. Some of the early GCMs operated on a horizontal resolution of some 1000 km with between 2 and 10 levels in the vertical. More recent models are run at nearer 250 km spatial resolution with perhaps 20 vertical levels. However, although higher resolution models contain more spatial detail this does not necessarily guarantee a superior model performance.
Validity. A more persuasive criterion for model selection is to adopt the GCMs that simulate the present-day climate most faithfully, on the premise that these GCMs would also yield the most reliable representation of future climate. The approach involves comparing GCM simulations that represent present-day conditions with the observed climate. The modelled and observed data are projected to the same grid, and statistical methods employed to compare, for example, mean values, variability and climatic patterns. Some model-observed comparisons are possible using the Data Visualisation Pages of the DDC.
Representativeness. If results from more than one GCM are to be applied in an impact assessment (and given the known uncertainties of GCMs, this is strongly recommended), another criterion for selection is to examine the representativeness of the results. Where several GCMs are to be selected, it might be prudent to choose models that show a range of changes in a key variable in the study region (for example, models showing little change in precipitation, models showing an increase and models showing a decrease). The models selected may not necessarily be the best validated (see above), although some combination of models satisfying both criteria could be agreed upon.
Q. What CO2 concentration accompanies what GCM scenario?
A. The greenhouse gas forcing scenarios used by all of the GCM experiments reported on the DDC are expressed in terms of equivalent CO2 concentrations. Most of these experiments use a forcing scenario of 1% per annum increase in equivalent greenhouse gas concentrations (HadCM2 also uses a ~0.5% per annum forced scenario). None of the model experiments therefore explicitly calculate, or assume, an actual CO2 (as opposed to an equivalent CO2) concentration curve, yet for many impacts studies it is the actual CO2 concentration that is needed. Of the IPCC IS92 emissions scenarios, the 1% per annum increase in CO2 equivalent concentration is best approximated by the IS92a emissions scenario (according to IPCC 1996 calculations). We can therefore use the mix of greenhouse gases reported in the IS92a scenario to estimate the CO2 concentration path associated with the 1% per annum forcing used by the GCMs (we use the IS92d emissions scenario to estimate the CO2 concentrations for the ~0.5% per annum forced experiments). We have done this using the same set of emissions-concentrations-forcing relationships used in IPCC 1996. The CO2 concentrations for the 2020s, 2050s and 2080s listed in the Tables on the GCM Experiments Pages have been estimated in this way.
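As a purely illustrative sketch of the compounding arithmetic behind a 1% per annum forcing scenario, the snippet below grows an assumed 1990 equivalent-CO2 value of 354 ppmv (an assumption for this example, not a DDC figure) at 1% per year:

```python
# Illustrative arithmetic only: compound growth of equivalent-CO2
# concentration at 1% per annum. The 1990 starting value (354 ppmv)
# is an assumption for this sketch, not a DDC figure.
def equiv_co2(year, start_year=1990, c0=354.0, rate=0.01):
    """Equivalent CO2 concentration (ppmv) under compound growth."""
    return c0 * (1.0 + rate) ** (year - start_year)

for yr in (2020, 2050, 2080):
    print(yr, round(equiv_co2(yr), 1))

# At 1% per annum, concentrations roughly double in about 70 years.
```

Note that this gives the equivalent concentration path; converting it to an actual CO2 path requires the IS92a gas mix and the IPCC 1996 relationships described above.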
Q. Why are changes in Tmean not always the average of changes in Tmin and Tmax?
A. Whether or not the change in mean monthly mean temperature is the average of the changes in mean monthly Tmin and Tmax depends on how the particular GCM calculates its daily Tmean, Tmin and Tmax. In HadCM2, for example, the daily Tmean is the average temperature over 24 hours (the average of 48 half hourly calculations), whereas daily Tmin is the lowest temperature reached in a day and Tmax is the highest temperature reached in a day. Clearly, in this case unless there is a perfect sine curve in the diurnal temperature cycle then daily Tmean need not be the average of daily Tmin and daily Tmax. Since the monthly and 30-year mean monthly values are derived from the daily temperatures, this explains why changes in Tmean may not equal the average of changes in Tmin and Tmax. How close Tmean changes are to the average of Tmin and Tmax changes depends on how the model calculates these daily quantities and on the shape of the diurnal cycle.
Changes in the 30-year mean diurnal temperature range (DTR) for the DDC results are calculated as the difference between the changes in the 30-year mean monthly Tmax and Tmin.
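A small numerical sketch makes this concrete. The diurnal cycle below is invented, but it is sampled every half hour (48 values, as in HadCM2's daily Tmean) and is deliberately asymmetric, so the 24-hour average differs from the midpoint of the extremes:

```python
import math

# Illustration of why Tmean need not equal (Tmin + Tmax) / 2: an
# asymmetric diurnal cycle sampled every half hour (48 values, as in
# HadCM2's daily Tmean). The cycle shape is invented.
samples = []
for i in range(48):
    x = 2 * math.pi * i / 48                      # phase through the day
    samples.append(15.0 + 5.0 * math.sin(x) + 2.0 * math.cos(2 * x))

tmean = sum(samples) / len(samples)               # 24-hour average
tmin, tmax = min(samples), max(samples)           # daily extremes

print(round(tmean, 2))               # 15.0
print(round((tmin + tmax) / 2, 2))   # ~13.28: not equal to Tmean
```

Only for a perfectly symmetric (e.g. pure sine) cycle would the two quantities coincide.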
Q. What sea-level rise accompanies what GCM scenario?
A. It is possible to derive from coupled ocean-atmosphere GCM experiments the contribution of thermal expansion to total sea-level rise. However, no GCM directly generates estimates of the ice-melt contribution (land glaciers and ice sheets). For this reason it is not straightforward to obtain estimates of total global sea-level rise that are consistent with the different GCM experiments listed on the DDC. The thermal expansion of the respective GCMs will be made available through the DDC in due course.
At present however, it is possible to obtain the total global-mean sea-level rise for HadCM2. This is because an off-line ice-melt model has been used in conjunction with HadCM2 climate output to generate the ice-melt contribution and this, when combined with HadCM2's thermal expansion, yields a consistent estimate of total sea-level rise.
Q. What is the climate sensitivity of a GCM?
A. The term "climate sensitivity" refers to the steady-state increase in the global annual mean surface air temperature associated with a given global-mean radiative forcing. It is common practice to use CO2 doubling as a benchmark for comparing GCM climate sensitivities. Thus in practice the climate sensitivity may be defined as the change in global-mean temperature that would ultimately be reached following a doubling of carbon dioxide concentration in the atmosphere (e.g. from 275 ppmv to 550 ppmv). The Intergovernmental Panel on Climate Change (IPCC) has always reported the likely range for this quantity to be between 1.5ºC and 4.5ºC, with a 'mid-range' estimate of 2.5ºC.
Each GCM possesses a different climate sensitivity, depending on the representation of various feedback processes in the model, including water vapour. It is generally assumed that the climate sensitivity of a model is approximately constant over the range of forcings expected for the next century. The climate sensitivity of a model is also largely independent (±10%) of the specific combination of different forcing factors (solar, aerosols, CO2, CH4, etc.) that produce a given global-mean forcing.
The range of climate sensitivities in the DDC models is from about 2.5ºC to 4.0ºC.
Q. How should I define a GCM change field?
A. It is usual practice in climate scenario construction to use the change in climate from present to future conditions simulated by a climate model, rather than to use directly the actual future climate from the model. This is mainly because the climatology of a GCM may sometimes, and for some regions, be rather different from that observed.
The change fields from a GCM experiment can be defined in a number of ways. The approach we have followed on this web site and for the DDC models is to adopt the 30-year simulation period 1961-90 in the models as the reference period and then calculate change fields for future 30-year periods, namely the 2020s (2010-2039), the 2050s (2040-2069) and the 2080s (2070-2099). The data files available from the Green Site Data Download Pages contain these calculated 30-year mean GCM change fields.
It is also possible to calculate change fields using the unforced (control) simulations of the respective GCM (these are available from the Yellow Pages). In this case one should be careful when applying the change fields since they define the magnitude of climate change since pre-industrial times (e.g. ~ 1800) rather than from the 1961-90 period.
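The change-field calculation itself is simple arithmetic: subtract the 30-year reference mean from the future time-slice mean at each gridpoint. The sketch below illustrates this on invented 2x2 grids of 30-year mean temperatures:

```python
# Sketch of forming a 30-year mean change field: future time-slice mean
# minus the 1961-90 reference mean, gridpoint by gridpoint. The 2x2
# grids of 30-year mean temperatures (deg C) below are invented.
baseline_1961_90 = [[14.2, 15.1],    # model 1961-90 means
                    [13.8, 16.0]]
future_2050s = [[16.0, 17.3],        # model 2040-69 means
                [15.1, 18.2]]

change_field = [
    [round(f - b, 2) for f, b in zip(frow, brow)]
    for frow, brow in zip(future_2050s, baseline_1961_90)
]
print(change_field)   # [[1.8, 2.2], [1.3, 2.2]]
```

For a ratio variable such as precipitation, dividing rather than subtracting the two means is a common alternative.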
Q. What are GCM ensembles?
A. GCM predictions of climate change may depend upon the choice of point on the control run at which increasing greenhouse gas concentrations are introduced. For this reason, some modelling centres have performed "ensemble" simulations with their climate model. In such cases, a number of identical model experiments are performed with the same historical changes and future changes in greenhouse gases, but these changes are initiated from different points on the control run. The underlying climate change predicted by each of these model experiments is very similar, showing that the initial condition is not important to the long-term change. However, there are significant year-to-year and decade-to-decade differences in the resulting climate. These differences are due to natural climate variability and are particularly large at regional scales and for some variables such as precipitation. For this reason, results from the different members of an ensemble may be averaged together to provide a more robust estimate of the climate change.
Of the models posted on the DDC web site, HadCM2 and CGCM1 have been used in this way and results from individual ensemble members, as well as the ensemble-mean, may be downloaded.
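Forming an ensemble mean is nothing more than averaging the members' results at each point; the minimal sketch below uses four invented members' 2050s warmings for a single gridbox:

```python
# Minimal illustration of ensemble averaging: the mean change across
# members gives a more robust estimate than any single member. The
# four members' 2050s warmings (deg C) for one gridbox are invented.
members = [2.1, 2.6, 1.8, 2.3]              # one value per member
ensemble_mean = sum(members) / len(members)
print(round(ensemble_mean, 2))   # 2.2
```

The spread among the members (here 1.8 to 2.6) is itself useful as a rough measure of the natural variability discussed above.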
Q. How do I add the GCM fields to my baseline data?
A. GCM change fields generally exist on coarser spatial scales than the observational climate data being used in a project. For example, the observational dataset available from the DDC Pages exists at 0.5º latitude/longitude resolution, whereas the GCM data exist at scales larger than 2.5º latitude/longitude. GCM changes and observed data may be combined in a number of ways.
Whatever method is used, be careful to combine variables that are describing the same meteorological variable, e.g. cloud cover with cloud cover, vapour pressure with vapour pressure, etc. The variables deposited with the DDC by modelling centres are not always directly compatible with the observed variables most commonly used in impacts studies.
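Two common ways of imposing GCM changes on observed baselines are sketched below: additive perturbation for temperature and multiplicative (ratio) perturbation for precipitation. All numbers are invented for illustration:

```python
# Sketch of two common ways to impose GCM changes on observed baselines:
# additive for temperature, multiplicative (ratio) for precipitation.
# All numbers are invented.
obs_temp = [2.1, 4.5, 8.0]         # observed monthly means (deg C)
gcm_dtemp = 1.8                    # GCM temperature change (deg C)
scenario_temp = [t + gcm_dtemp for t in obs_temp]

obs_precip = [60.0, 45.0, 30.0]    # observed monthly totals (mm)
gcm_precip_ratio = 0.9             # GCM future/baseline precipitation ratio
scenario_precip = [p * gcm_precip_ratio for p in obs_precip]

print([round(t, 1) for t in scenario_temp])     # [3.9, 6.3, 9.8]
print([round(p, 1) for p in scenario_precip])   # [54.0, 40.5, 27.0]
```

Ratios are generally preferred for bounded or zero-limited variables such as precipitation, since an additive change could otherwise produce negative values.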
Q. How can I downscale the GCM data to smaller scales?
A. One of the simplest ways of adding spatial detail to GCM-based climate change scenarios is to interpolate GCM-scale changes to a finer resolution and then combine these interpolated changes with observed climate information at the fine resolution (see Question above). This may be achieved using high resolution observed mean monthly climatologies or can be done simply by perturbing an observed monthly or daily time series for a site or catchment by the GCM changes interpolated to the site in question. This approach is sometimes termed 'unintelligent' downscaling because no new meteorological insight is added in the interpolation process that goes beyond the GCM-based changes; the basic spatial patterns of present climate are assumed to remain largely unchanged in the future. This very simple approach to downscaling is easy to apply and allows impact assessment models to use climate scenarios at a resolution that would otherwise be difficult or costly to obtain.
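A hypothetical sketch of this 'unintelligent' approach: bilinearly interpolate a coarse GCM temperature change to a site, then perturb the site's observed series by it. The grid coordinates, change values and observed values are all invented:

```python
# Hypothetical sketch of 'unintelligent' downscaling: bilinearly
# interpolate a coarse GCM temperature change to a site, then perturb
# the site's observed series. Grid coordinates, changes and observed
# values are all invented.
def bilinear(x, y, x0, x1, y0, y1, q00, q10, q01, q11):
    """Interpolate between the four surrounding gridpoint values."""
    tx = (x - x0) / (x1 - x0)
    ty = (y - y0) / (y1 - y0)
    lower = q00 * (1 - tx) + q10 * tx   # along the y0 row
    upper = q01 * (1 - tx) + q11 * tx   # along the y1 row
    return lower * (1 - ty) + upper * ty

# GCM warming (deg C) at the four gridpoints surrounding a site at
# (1.2E, 51.5N), on a 2.5-degree grid:
dT = bilinear(1.2, 51.5, 0.0, 2.5, 50.0, 52.5, 1.5, 1.9, 2.1, 2.4)

site_obs = [3.2, 5.6, 9.1]                       # observed monthly means
scenario = [round(t + dT, 2) for t in site_obs]  # perturbed series
print(round(dT, 4), scenario)
```

The interpolation adds no meteorological information beyond the GCM changes; it merely smooths them to the site location.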
Another downscaling option is to use a higher resolution limited-area model (often called a Regional Climate Model - RCM) to generate climate change scenarios at the required resolution. Such regional climate models typically cover an area the size of Europe, have a spatial resolution of about 30-50 km and are driven by boundary conditions taken from the GCM for one particular period in the present and one in the future. This approach has been adopted by a number of modelling centres around the world. The DDC does not currently contain any results from RCM experiments and not many impact studies have yet used results from RCM experiments.
Because unintelligent downscaling assumes that climate change will be uniform over GCM grid-scales and because RCMs are slow and expensive to run, another set of approaches to the downscaling problem has been developed for scenario applications. These approaches may conveniently be grouped together as statistical downscaling methods. There are at least three broad clusters of methods within this general category - regression methods, circulation typing schemes, and stochastic weather generators. Developing a statistical downscaling model is usually quite time-intensive and will always require very extensive observational data - daily/hourly weather data, for the surface and maybe for the upper air, and usually for several/many sites or gridboxes covering the region of interest. It need not necessarily be a cheaper or easier option than running an RCM. It should also be noted that most downscaling methods and models are developed with a specific application in mind - whether agriculture, forestry, water, etc. - and quite often for a specific geographic region. Not all downscaling methods can easily be transported from one region to another. In each case, just as with a regional climate model, the derived regional scenarios depend on the validity of the GCM output.
Q. How do I get daily GCM data?
A. The DDC is making available to researchers only monthly results for a core set of variables from a number of GCM experiments. The volume of daily data generated by these experiments, together with the different data management systems and different data policies of the modelling centres, means that requests for daily data (or for more specialised variables at monthly timesteps) cannot be handled by the DDC. Requests for daily GCM data should be directed to the respective modelling centres. The relevant contacts are as follows:
Reasonable requests for daily data from these experiments will be met, although precisely what daily data are available, for what periods, for what variables and from which experiments, is subject to the data policies of the respective modelling centres.
Climate Change Scenarios - Construction and Application
Q. Can I construct a scenario using a different climate sensitivity?
A. All of the results from the DDC models assume a specified forcing scenario (e.g. 1% per annum growth in CO2 equivalent concentrations, with or without aerosol forcing) and each model possesses a specific climate sensitivity (e.g. 3.5ºC). These forcing scenarios and climate sensitivities do not span the range of forcings and sensitivities suggested by the IPCC 1996 Report and they do not span the range of forcings suggested by the new SRES emissions scenarios. If users wish to construct climate change scenarios that do capture a wider part of these IPCC ranges, then the DDC model results will need manipulating in some way.
One possible way is to use the results from a simple climate model in which the forcing and climate sensitivity can be specified to the user's wishes. The DDC results can then be scaled according to the global-mean temperature response of the simple model. This approach has some advantages and some disadvantages, some of which are discussed in the following paper, to appear shortly in Climatic Change: Mitchell, J.F.B., Johns, T.C., Eagles, M., Ingram, W.J. and Davis, R.A. (1999) Towards the construction of climate change scenarios. Climatic Change (in press).
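The core of this pattern-scaling idea can be sketched in a few lines: rescale the GCM change field by the ratio of the simple model's global-mean warming (at the desired sensitivity) to the GCM's own global-mean warming. All numbers below are invented for illustration:

```python
# Pattern-scaling sketch: rescale a GCM change field by the ratio of
# global-mean warming from a simple climate model (run at the desired
# climate sensitivity) to the GCM's own global-mean warming. All
# numbers are invented for illustration.
gcm_global_dT = 2.4      # GCM global-mean warming for the 2050s (deg C)
simple_model_dT = 1.6    # simple-model warming at a lower sensitivity
scale = simple_model_dT / gcm_global_dT

gcm_change_field = [1.8, 2.2, 3.1, 2.6]   # regional changes (deg C)
scaled_field = [round(v * scale, 2) for v in gcm_change_field]
print(scaled_field)   # [1.2, 1.47, 2.07, 1.73]
```

The approach assumes the spatial pattern of change stays fixed while only its amplitude scales with global-mean warming, which is one of the caveats discussed in the paper cited above.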
The DDC User Support Service (email: ipcc.ddc@uea.ac.uk) will answer specific questions about this and other approaches to scenario construction.
Q. How do I introduce changes in interannual or interdaily variability into my scenario?
A. The 30-year mean monthly GCM change fields available from the DDC Green Data Download Pages do not contain any information about changes in interannual or interdaily variability. Sometimes this information may be very important to include in an impact study. Changes in interannual variability can be calculated using the monthly time series GCM data available from the DDC Yellow Pages. In this case, we suggest calculating the ratio of the standard deviation in the future time-slice to that in the 1961-90 time-slice and then imposing this ratio on the observed time series climate data (note: if you do not have observed interannual variability data available for your study, then you cannot incorporate interannual variability changes). This ratio will either inflate or deflate the observed interannual variations.
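One simple way to impose such a ratio while preserving the observed mean is to scale departures from the mean, as in the sketch below (observed values and ratio are invented):

```python
import statistics

# Sketch of imposing a GCM-derived change in interannual variability on
# an observed series: scale departures from the mean by the ratio of
# future to 1961-90 standard deviations. Data and ratio are invented.
obs = [11.0, 13.5, 12.2, 10.4, 14.1, 12.8]   # observed annual values
sd_ratio = 1.25   # GCM future std dev / GCM 1961-90 std dev

obs_mean = statistics.mean(obs)
adjusted = [obs_mean + (x - obs_mean) * sd_ratio for x in obs]

# The mean is preserved; the spread is inflated by exactly the ratio.
print(round(statistics.mean(adjusted), 2))                            # 12.33
print(round(statistics.stdev(adjusted) / statistics.stdev(obs), 2))   # 1.25
```

A ratio below 1 would deflate, rather than inflate, the observed variations in the same way.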
To introduce GCM changes in interdaily climate variability, it is necessary to access daily GCM data. The DDC does not hold daily model data and users are directed to the individual modelling centres to gain access.
Q. What are weather generators?
A. Weather generators can be used to construct site-specific or small-scale climate change scenarios, although they rely on a slightly different approach from other statistical downscaling methods. A weather generator is calibrated on an observed daily weather series over some appropriate period, usually for a site, but possibly for a catchment or a small gridbox. The generator is then capable of generating, stochastically, an infinite series of daily weather for the respective spatial domain. This generated time series will possess - in theory - the correct (i.e., the observed) lower and higher order climate statistics for that domain.
The parameters of the weather generator can then be perturbed using output from a GCM, allowing the generator to yield synthetic daily weather for the climate change scenario. To derive the appropriate WG parameter perturbation from the coarse-scale GCM, other downscaling methods need to be employed. One of the weaknesses of weather generators is that because they are stochastically based, they do not always capture the low frequency variations (e.g. multi-year or multi-decadal variations) in climate that may be quite important for certain impacts applications. The following reference compares the performance of two widely used weather generators: Semenov, M.A., Brooks, R.J., Barrow, E.M. and Richardson, C.W. (1998) Comparison of the WGEN and LARS-WG stochastic weather generators in diverse climates. Climate Research, 10, 95-107.
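A toy weather generator illustrates the basic mechanics: a first-order Markov chain governs rain occurrence and an exponential distribution supplies amounts on wet days. All parameter values are invented; a real generator (such as WGEN or LARS-WG) would be calibrated on an observed daily series:

```python
import random

# Toy stochastic weather generator sketch: a first-order Markov chain
# for daily rain occurrence, with exponentially distributed amounts on
# wet days. All parameter values are invented for illustration.
P_WET_GIVEN_DRY = 0.3   # assumed P(wet today | dry yesterday)
P_WET_GIVEN_WET = 0.6   # assumed P(wet today | wet yesterday)
MEAN_WET_AMOUNT = 5.0   # assumed mean rainfall on a wet day (mm)

def generate(n_days, seed=0):
    """Generate a synthetic daily rainfall series (mm/day)."""
    rng = random.Random(seed)
    wet = False
    series = []
    for _ in range(n_days):
        p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = rng.random() < p
        series.append(rng.expovariate(1.0 / MEAN_WET_AMOUNT) if wet else 0.0)
    return series

rain = generate(365)
print(sum(1 for r in rain if r > 0), "wet days generated")
```

Perturbing the generator for a climate change scenario then amounts to adjusting parameters such as the transition probabilities or the mean wet-day amount in line with GCM-derived changes.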
Q. How can I downscale the GCM data to smaller scales?
A. One of the simplest ways of adding spatial detail to GCM-based climate change scenarios is to interpolate GCM-scale changes to a finer resolution and then combine these interpolated changes with observed climate information at the fine resolution (see Question above). This may be achieved using high resolution observed mean monthly climatologies or can be done simply by perturbing an observed monthly or daily time series for a site or catchment by the GCM changes interpolated to the site in question. This approach is sometimes termed 'unintelligent' downscaling because no new meteorological insight is added in the interpolation process that goes beyond the GCM-based changes; the basic spatial patterns of present climate are assumed to remain largely unchanged in the future. This very simple approach to downscaling is easy to apply and allows impact assessment models to use climate scenarios at a resolution that would otherwise be difficult or costly to obtain.
Another downscaling option is to use a higher resolution limited-area model (often called a Regional Climate Model - RCM) to generate climate change scenarios at the required resolution. Such regional climate models typically cover an area the size of Europe, have a spatial resolution of about 30-50 km and are driven by boundary conditions taken from the GCM for one particular period in the present and one in the future. This approach has been adopted by a number of modelling centres around the world. The DDC does not currently contain any results from RCM experiments and not many impact studies have yet used results from RCM experiments.
Because unintelligent downscaling assumes that climate change will be uniform over GCM grid-scales and because RCMs are slow and expensive to run, another set of approaches to the downscaling problem has been developed for scenario applications. These approaches may conveniently be grouped together as statistical downscaling methods. There are at least three broad clusters of methods within this general category - regression methods, circulation typing schemes, and stochastic weather generators. Developing a statistical downscaling model is usually quite time-intensive and will always require very extensive observational data - daily/hourly weather data, for the surface and maybe for the upper air, and usually for several/many sites or gridboxes covering the region of interest. It need not necessarily be a cheaper or easier option than running an RCM. It should also be noted that most downscaling methods and models are developed with a specific application in mind - whether agriculture, forestry, water, etc. - and quite often for a specific geographic region. Not all downscaling methods can easily be transported from one region to another. In each case, just as with a regional climate model, the derived regional scenarios depend on the validity of the GCM output.
Q. Should I take 1961-90 or 1990 as my baseline?
A. IPCC have usually taken the year '1990' as the baseline year for the presentation of emissions scenarios and for calculations of future climate and sea-level change. '1990' has also been adopted by the UN FCCC in their definition of emissions reductions targets. Choosing a single year as a baseline is appropriate for some applications, but not for others.
With regard to climate, for example, a single year is not appropriate to use as the baseline. Climate variability means that a single year may be unusually warm or cold or dry or wet and does not therefore make a useful reference point for measuring climate change. More common in climatological applications is the use of the average climate over a 30-year period to define the reference or baseline climate. A 30-year climatic average smoothes out many of the year-to-year variations in climate, while the individual 30 years of such a period capture much of the interannual and short time-scale variability of climate that may be relevant for an impact application. It is also desirable to use such a multi-year period rather than a single year to define climate change fields extracted from GCM simulations.
For these reasons, we suggest the period 1961-90 generally be used as the baseline period. This period has generally good observed data availability (e.g. the observed climatology described by the DDC), it represents the recent climate to which many present-day human or natural systems are likely to be reasonably well adapted, and it ends in 1990, the year adopted by many IPCC and UN FCCC applications.
Q. How do I add the GCM fields to my baseline data?
A. GCM change fields generally exist on coarser spatial scales than the observational climate data being used in a project. For example, the observational dataset available from the DDC Pages exists at 0.5º latitude/longitude resolution, whereas the GCM data exist at scales larger than 2.5º latitude/longitude. GCM changes and observed data may be combined in a number of ways.
Whatever method is used, be careful to combine variables that are describing the same meteorological variable, e.g. cloud cover with cloud cover, vapour pressure with vapour pressure, etc. The variables deposited with the DDC by modelling centres are not always directly compatible with the observed variables most commonly used in impacts studies.
The forcing scenarios used by the DDC models do not originate directly from any coherent future view of the world. They are an arbitrary imposition of a 1% per annum growth in future greenhouse gas concentrations. In fact, the closest of the IS92 emissions scenarios to this arbitrary forcing is the IS92a scenario (IPCC 1996 calculated the equivalent per annum growth rate in concentrations for IS92a to be about 0.85% per annum). It is therefore not unreasonable to use the IS92a assumptions about population, GDP and energy technology to create the background world in which these DDC modelled climate changes might occur. Similarly, for the ~0.5% per annum forcing scenario used by HadCM2, the IS92d assumptions would be the best to use. These data are held on the DDC Green Pages under Non-Climatic Scenarios.
Recently, the IPCC have commissioned a new set of future world storylines and emissions scenarios. These are called the SRES98 scenarios. Converting these new emissions scenarios into equivalent CO2 concentration growth curves using IPCC 1996 equations yields the SRES A2 storyline as the best approximation for the 1% forced GCM results and the SRES B1 storyline as the best approximation for the ~0.5% forced experiments. Some of the non-climatic background assumptions for these storylines can be picked up from the relevant DDC Green Pages.
Q. Is there a Kyoto Protocol scenario available?
A. The Kyoto Protocol agreed at the Third Conference of the Parties to the UN Framework Convention on Climate Change in December 1997, sets a target for developed countries to reduce net emissions of greenhouse gases by about 5% by the period 2008-2012 with respect to 1990 levels. Should this Protocol be ratified and adhered to, there are modest implications for future climate change and sea level.
Neither the IS92 nor the SRES98 emissions scenarios available from the DDC, nor any of the forcing scenarios used by GCM experiments, include the effect of the Kyoto Protocol on future emissions or radiative forcing. Climate change scenarios obtained from the DDC should be regarded therefore as 'non-interventionist' scenarios. A separate scenario exercise would need to be undertaken to consider the effects of the Kyoto Protocol, although it has been shown that the effect of the Kyoto Protocol on future climate change is likely to be modest.
Q. When will the IPCC Technical Guidelines be ready?
A. The Task Group on Climate Scenarios for Impacts Assessment (TGCIA) is preparing Guidance Material on the use and application of the datasets available through the DDC. This document exists in draft form, but will not be finalised until later in 1999. Some of the answers to these FAQs are taken from the draft document.
Q. Why are changes in Tmean not always the average of changes in Tmin and Tmax?
A. Whether or not the change in mean monthly mean temperature is the average of the changes in mean monthly Tmin and Tmax depends on how the particular GCM calculates its daily Tmean, Tmin and Tmax. In HadCM2, for example, the daily Tmean is the average temperature over 24 hours (the average of 48 half hourly calculations), whereas daily Tmin is the lowest temperature reached in a day and Tmax is the highest temperature reached in a day. Clearly, in this case unless there is a perfect sine curve in the diurnal temperature cycle then daily Tmean need not be the average of daily Tmin and daily Tmax. Since the monthly and 30-year mean monthly values are derived from the daily temperatures, this explains why changes in Tmean may not equal the average of changes in Tmin and Tmax. How close Tmean changes are to the average of Tmin and Tmax changes depends on how the model calculates these daily quantities and on the shape of the diurnal cycle.
Changes in the 30-year mean diurnal temperature range (DTR) for the DDC results are calculated as the difference between the changes in the 30-year mean monthly Tmax and Tmin.
Q. Should I take 1961-90 or 1990 as my baseline?
A. IPCC have usually taken the year '1990' as the baseline year for the presentation of emissions scenarios and for calculations of future climate and sea-level change. '1990' has also been adopted by the UN FCCC in their definition of emissions reductions targets. Choosing a single year as a baseline is appropriate for some applications, but not for others.
With regard to climate, for example, a single year is not appropriate to use as the baseline. Climate variability means that a single year may be unusually warm or cold or dry or wet and does not therefore make a useful reference point for measuring climate change. More common in climatological applications is the use of the average climate over a 30-year period to define the reference or baseline climate. A 30-year climatic average smoothes out many of the year-to-year variations in climate, while the individual 30 years of such a period capture much of the interannual and short time-scale variability of climate that may be relevant for an impact application. It is also desirable to use such a multi-year period rather than a single year to define climate change fields extracted from GCM simulations.
For these reasons, we suggest the period 1961-90 generally be used as the baseline period. This period has generally good observed data availability (e.g. the observed climatology described by the DDC), it represents the recent climate to which many present-day human or natural systems are likely to be reasonably well adapted, and it ends in 1990, the year adopted by many IPCC and UN FCCC applications.
Q. What is the IPCC Data Distribution Centre?
A. The IPCC Data Distribution Centre has been established to provide climate and associated datasets, along with appropriate scientific and technical advice on using these datasets. If you require any further information regarding the IPCC, COP3 and the Kyoto Protocol, or the UN FCCC, you should visit the following sites: