173 results for Asymptotic Mean Squared Errors
Abstract:
The sensitivity of the UK Universities Global Atmospheric Modelling Programme (UGAMP) General Circulation Model (UGCM) to two very different approaches to convective parametrization is described. Comparison is made between a Kuo scheme, which is constrained by large-scale moisture convergence, and a convective-adjustment scheme, which relaxes to observed thermodynamic states. Results from 360-day integrations with perpetual January conditions are used to describe the model's tropical time-mean climate and its variability. Both convection schemes give reasonable simulations of the time-mean climate, but the representation of the main modes of tropical variability is markedly different. The Kuo scheme has much weaker variance, confined to synoptic frequencies near 4 days, and a poor simulation of intraseasonal variability. In contrast, the convective-adjustment scheme has much more transient activity at all time-scales. The various aspects of the two schemes which might explain this difference are discussed. The particular closure on moisture convergence used in this version of the Kuo scheme is identified as being inappropriate.
Abstract:
A regional study of the prediction of extratropical cyclones by the European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) has been performed. An objective feature-tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast error statistics have then been produced for the position, intensity, and propagation speed of the storms. In previous work, data limitations meant it was only possible to present the diagnostics for the entire Northern Hemisphere (NH) or Southern Hemisphere. A larger data sample has allowed the diagnostics to be computed separately for smaller regions around the globe and has made it possible to explore the regional differences in the prediction of storms by the EPS. Results show that in the NH there is a larger ensemble mean error in the position of storms over the Atlantic Ocean. Further analysis revealed that this is mainly due to errors in the prediction of storm propagation speed rather than in direction. Forecast storms propagate too slowly in all regions, but the bias is about twice as large in the NH Atlantic region. The results show that storm intensity is generally overpredicted over the ocean and underpredicted over the land and that the absolute error in intensity is larger over the ocean than over the land. In the NH, large errors occur in the prediction of the intensity of storms that originate as tropical cyclones but then move into the extratropics. The ensemble is underdispersive for the intensity of cyclones (i.e., the spread is smaller than the mean error) in all regions. The spatial patterns of the ensemble mean error and ensemble spread are very different for the intensity of cyclones. Spatial distributions of the ensemble mean error suggest that large errors occur during the growth phase of storm development, but this is not indicated by the spatial distributions of the ensemble spread. In the NH there are further differences. First, the large errors in the prediction of the intensity of cyclones that originate in the tropics are not indicated by the spread. Second, the ensemble mean error is larger over the Pacific Ocean than over the Atlantic, whereas the opposite is true for the spread. The value of a storm-tracking approach to both weather forecasters and developers of forecast systems is also discussed.
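The underdispersion diagnosis above rests on comparing the ensemble spread with the error of the ensemble mean for a tracked storm property such as intensity. The following Python sketch illustrates that comparison on synthetic data; the intensity values, bias, and spread parameters are invented for illustration and are not EPS output.

```python
import numpy as np

rng = np.random.default_rng(0)

n_cases, n_members = 500, 50
truth = rng.normal(0.0, 1.0, size=n_cases)            # analysed storm intensity
# Synthetic ensemble: systematically biased, with too little spread
forecasts = (truth[:, None] + 0.4                      # systematic error
             + rng.normal(0.0, 0.6, size=(n_cases, n_members)))

ens_mean = forecasts.mean(axis=1)
# RMS error of the ensemble mean, and the mean ensemble standard deviation
rmse = np.sqrt(np.mean((ens_mean - truth) ** 2))
spread = forecasts.std(axis=1, ddof=1).mean()

print(f"ensemble-mean RMSE: {rmse:.2f}, mean spread: {spread:.2f}")
# spread < RMSE indicates an underdispersive ensemble for this quantity
```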
Abstract:
The aim of this paper is to demonstrate the importance of changing temperature variability with climate change in assessments of future heat-related mortality. Previous studies have only considered changes in the mean temperature. Here we present estimates of heat-related mortality resulting from climate change for six cities: Boston, Budapest, Dallas, Lisbon, London and Sydney. They are based on climate change scenarios for the 2080s (2070-2099) and the temperature-mortality (t-m) models constructed and validated in Gosling et al. (2007). We propose a novel methodology for assessing the impacts of climate change on heat-related mortality that considers both changes in the mean and variability of the temperature distribution.
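To make the distinction concrete, here is a toy Python illustration of applying a threshold-based temperature-mortality relationship to a baseline temperature distribution under a mean-only shift versus the same shift with increased variability. The distribution parameters, threshold, and slope are hypothetical and do not reproduce the Gosling et al. (2007) models.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical baseline summer daily mean temperatures (deg C)
baseline = rng.normal(loc=22.0, scale=3.0, size=100_000)

def heat_deaths(temps, threshold=25.0, slope=2.0):
    """Toy linear-threshold t-m model: excess deaths per day above threshold."""
    excess = np.clip(temps - threshold, 0.0, None)
    return slope * excess.mean()

# Scenario 1: mean warming only (+3 deg C shift)
mean_only = baseline + 3.0
# Scenario 2: same warming plus 20% more variability about the new mean
mean_and_var = 3.0 + baseline.mean() + 1.2 * (baseline - baseline.mean())

for name, t in [("baseline", baseline),
                ("mean shift only", mean_only),
                ("mean shift + variability", mean_and_var)]:
    print(f"{name:28s} expected excess deaths/day: {heat_deaths(t):.2f}")
```

Because the mortality response is non-linear in temperature, widening the distribution raises the expected burden even when the mean shift is identical, which is the effect the proposed methodology is designed to capture.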
Abstract:
We describe numerical simulations designed to elucidate the role of mean ocean salinity in climate. Using a coupled atmosphere-ocean general circulation model, we study a 100-year sensitivity experiment in which the global-mean salinity is approximately doubled from its present observed value, by adding 35 psu everywhere in the ocean. The salinity increase produces a rapid global-mean sea-surface warming within a few years, caused by reduced vertical mixing associated with changes in cabbeling. The warming is followed by a gradual global-mean sea-surface cooling within a few decades, caused by an increase in the vertical (downward) component of the isopycnal diffusive heat flux. We find no evidence of impacts on the variability of the thermohaline circulation (THC) or El Niño/Southern Oscillation (ENSO). The mean strength of the Atlantic meridional overturning is reduced by 20% and the North Atlantic Deep Water penetrates less deeply. Nevertheless, our results dispute claims that higher salinities for the world ocean have profound consequences for the thermohaline circulation. In additional experiments with doubled atmospheric carbon dioxide, we find that the amplitude and spatial pattern of the global warming signal are modified in the hypersaline ocean. In particular, the equilibrated global-mean sea-surface temperature increase caused by doubling carbon dioxide is reduced by 10%. We infer the existence of a non-linear interaction between the climate responses to modified carbon dioxide and modified salinity.
Abstract:
The Robert–Asselin time filter is widely used in numerical models of weather and climate. It successfully suppresses the spurious computational mode associated with the leapfrog time-stepping scheme. Unfortunately, it also weakly suppresses the physical mode and severely degrades the numerical accuracy. These two concomitant problems are shown to occur because the filter does not conserve the mean state, averaged over the three time slices on which it operates. The author proposes a simple modification to the Robert–Asselin filter, which does conserve the three-time-level mean state. When used in conjunction with the leapfrog scheme, the modification vastly reduces the impacts on the physical mode and increases the numerical accuracy for amplitude errors by two orders, from first-order to third-order accuracy. The modified filter could easily be incorporated into existing general circulation models of the atmosphere and ocean. In principle, it should deliver more faithful simulations at almost no additional computational expense. Alternatively, it may permit the use of longer time steps with no loss of accuracy, reducing the computational expense of a given simulation.
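For reference, a minimal Python sketch of leapfrog time stepping with this family of filters, applied to the test oscillation dx/dt = iωx. The filter displacement d is shared between the middle and newest time levels through a parameter alpha: alpha = 1 recovers the classical Robert–Asselin filter, while alpha near 0.5 conserves the three-time-level mean as the modification proposes. The specific parameter values below are illustrative.

```python
import numpy as np

def leapfrog_filtered(omega=1.0, dt=0.2, n_steps=200, nu=0.2, alpha=0.53):
    """Integrate dx/dt = i*omega*x with leapfrog time stepping.

    nu    -- time-filter strength
    alpha -- fraction of the filter displacement applied to the middle
             time level; alpha = 1 gives the classical Robert-Asselin
             filter, alpha ~ 0.5 conserves the three-time-level mean.
    """
    f = lambda x: 1j * omega * x
    x_prev = 1.0 + 0j                          # x at time n-1
    x_curr = x_prev * np.exp(1j * omega * dt)  # exact first step
    history = [x_prev, x_curr]
    for _ in range(n_steps):
        x_next = x_prev + 2.0 * dt * f(x_curr)       # leapfrog step
        # displacement based on curvature over the three time levels
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
        x_curr = x_curr + alpha * d                  # damp middle level
        x_next = x_next + (alpha - 1.0) * d          # compensate newest level
        x_prev, x_curr = x_curr, x_next
        history.append(x_curr)
    return np.array(history)

# alpha = 1.0 -> classical filter; alpha = 0.53 -> mean-conserving variant
ra = leapfrog_filtered(alpha=1.0)
raw = leapfrog_filtered(alpha=0.53)
```

Comparing the amplitudes |x| of the two runs against the exact solution shows the reduced damping of the physical mode under the mean-conserving variant, at essentially the cost of one extra addition per step.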
Abstract:
This article describes the development and evaluation of the U.K.’s new High-Resolution Global Environmental Model (HiGEM), which is based on the latest climate configuration of the Met Office Unified Model, known as the Hadley Centre Global Environmental Model, version 1 (HadGEM1). In HiGEM, the horizontal resolution has been increased to 0.83° latitude × 1.25° longitude for the atmosphere, and 1/3° × 1/3° globally for the ocean. Multidecadal integrations of HiGEM, and the lower-resolution HadGEM1, are used to explore the impact of resolution on the fidelity of climate simulations. Generally, SST errors are reduced in HiGEM. Cold SST errors associated with the path of the North Atlantic Drift improve, and warm SST errors are reduced in upwelling stratocumulus regions where the simulation of low-level cloud is better at higher resolution. The ocean model in HiGEM allows ocean eddies to be partially resolved, which dramatically improves the representation of sea surface height variability. In the Southern Ocean, most of the heat transport in HiGEM is achieved by resolved eddy motions, which replace the parameterized eddy heat transport of the lower-resolution model. HiGEM is also able to simulate more realistically the small-scale features in the wind stress curl around islands and oceanic SST fronts, which may have implications for oceanic upwelling and ocean biology. Higher resolution in both the atmosphere and the ocean allows coupling to occur on small spatial scales. In particular, the small-scale interaction recently seen in satellite imagery between the atmosphere and tropical instability waves in the tropical Pacific Ocean is realistically captured in HiGEM. Tropical instability waves play a role in improving the simulation of the mean state of the tropical Pacific, which has important implications for climate variability. In particular, all aspects of the simulation of ENSO (spatial patterns, the time scales at which ENSO occurs, and global teleconnections) are much improved in HiGEM.
Abstract:
ATSR-2 active fire data from 1996 to 2000, TRMM VIRS fire counts from 1998 to 2000 and burn scars derived from SPOT VEGETATION (the Global Burnt Area 2000 product) were mapped for Peru and Bolivia to analyse the spatial distribution of burning and its intra- and inter-annual variability. The fire season in the region mainly occurs between May and October, though some variation was found between the six broad habitat types analysed: desert, grassland, savanna, dry forest, moist forest and yungas (the forested valleys on the eastern slope of the Andes). Increased levels of burning were generally recorded in ATSR-2 and TRMM VIRS fire data in response to the 1997/1998 El Niño, but in some areas the El Niño effect was masked by the more marked influences of socio-economic change on land use and land cover. There were differences between the three global datasets: ATSR-2 under-recorded fires in ecosystems with low net primary productivities. This was because fires are set during the day in this region and, when fuel loads are low, burn out before the ATSR-2 overpass, which in this region occurs between 02.45 h and 03.30 h. TRMM VIRS was able to detect these fires because its overpasses cover the entire diurnal range on a monthly basis. The GBA2000 product has significant errors of commission (particularly areas of shadow in the well-dissected eastern Andes) and omission (in the agricultural zone around Santa Cruz, Bolivia and in north-west Peru). Particular attention was paid to biomass burning in high-altitude grasslands, where fire is an important pastoral management technique. Fires and burn scars from Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper (ETM) data for a range of years between 1987 and 2000 were mapped for areas around Parque Nacional Rio Abiseo (Peru) and Parque Nacional Carrasco (Bolivia). Burn scars mapped in the grasslands of these two areas indicate that far more burning had taken place than either the fires or the burn scars derived from global datasets. Mean scar sizes are smaller, and have a smaller range in size between years, in the study area in Peru (6.6-7.1 ha) than in Bolivia (16.9-162.5 ha). Trends in biomass burning in the two highland areas can be explained in terms of the changing socio-economic environments and impacts of conservation. The mismatch between the spatial scale of biomass burning in the high-altitude grasslands and the sensors used to derive global fire products means that an entire component of the fire regime in the region studied is omitted, despite its importance in the farming systems of the Andes.
Abstract:
A high-resolution record of sea-level change spanning the past 1000 years is derived from foraminiferal and chronological analyses of a 2 m thick salt-marsh peat sequence at Chezzetcook, Nova Scotia, Canada. Former mean tide level positions are reconstructed with a precision of ±0.055 m using a transfer function derived from distributions of modern salt-marsh foraminifera. Our age model for the core section older than 300 years is based on 19 AMS C-14 ages and takes into account the individual probability distributions of calibrated radiocarbon ages. The past 300 years is dated by pollen and the isotopes Pb-206, Pb-207, Pb-210, Cs-137 and Am-241. Between AD 1000 and AD 1800, relative sea level rose at a mean rate of 17 cm per century. Apparent pre-industrial rises of sea level dated at AD 1500-1550 and AD 1700-1800 cannot be clearly distinguished when radiocarbon age errors are taken into account. Furthermore, they may be an artefact of fluctuations in atmospheric C-14 production. In the 19th century sea level rose at a mean rate of 1.6 mm/yr. Between AD 1900 and AD 1920, sea-level rise accelerated to the modern mean rate of 3.2 mm/yr. This acceleration corresponds in time with global temperature rise and may therefore be associated with recent global warming.
Abstract:
Recent developments in contracting practice in the UK have built upon recommendations contained in high-profile reports, such as those by Latham and Egan. However, the New Engineering Contract (NEC), endorsed by Latham, is based upon principles of contract drafting that seem open to question. Any contract operates in the context of its legislative environment and current working practices. This report identifies eight contentious hypotheses in the literature on construction contracts and tests their validity in a sample survey that attracted 190 responses. The survey shows, among other things, that while partnership is a positive and useful idea, authoritative contract management is considered more effective and that “win-win” contracts, while desirable, are basically impractical. Further, precision and fairness in contracts are not easy to achieve simultaneously. While participants should know what is in their contracts, they should not routinely resort to legal action; and standard-form contracts should not seek to be universally applicable. Fundamental changes to drafting policy should be undertaken within the context of current legal contract doctrine and with sensitivity to the way that contracts are used in contemporary practice. Attitudes to construction contracting may seem to be changing on the surface, but detailed analysis of what lies behind apparent agreement on new ways of working reveals that attitudes are changing much more slowly than they appear to be.
Abstract:
Background Pharmacy aseptic units prepare and supply injectables to minimise risks. The UK National Aseptic Error Reporting Scheme has been collecting data on pharmacy compounding errors, including near-misses, since 2003. Objectives The cumulative reports from January 2004 to December 2007, inclusive, were analysed. Methods The different variables of product types, error types, staff making and detecting errors, stage at which errors were detected, perceived contributory factors, and potential or actual outcomes were presented by cross-tabulation of data. Results A total of 4691 reports were submitted against an estimated 958 532 items made, giving an overall error rate of 0.49%. Most of the errors were detected before reaching patients, with only 24 detected during or after administration. The highest number of reports related to adult cytotoxic preparations (40%) and the most frequently recorded error was a labelling error (34.2%). Errors were mostly detected at the first check in the assembly area (46.6%). Individual staff error contributed most (78.1%) to overall errors, while errors with paediatric parenteral nutrition were attributed to low staffing levels more often than errors with other products. The majority of errors (68.6%) had no potential patient outcomes attached, while it appeared that paediatric cytotoxic products and paediatric parenteral nutrition were associated with greater levels of perceived patient harm. Conclusions The majority of reports were related to near-misses, and this study highlights scope for examining current arrangements for checking and releasing products, certainly for paediatric cytotoxic and paediatric parenteral nutrition preparations within aseptic units, but in the context of resource and capacity constraints.
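The cross-tabulations described in the Methods can be reproduced with a standard contingency-table routine. The Python sketch below uses pandas with invented example records; the column names and categories are assumptions for illustration, not fields from the actual reporting scheme.

```python
import pandas as pd

# Invented example records mimicking the report fields described above
reports = pd.DataFrame({
    "product_type": ["adult cytotoxic", "paediatric PN", "adult cytotoxic",
                     "paediatric cytotoxic", "adult PN", "paediatric PN"],
    "error_type":   ["labelling", "calculation", "labelling",
                     "wrong drug", "labelling", "calculation"],
})

# Cross-tabulate error types against product types, with totals
table = pd.crosstab(reports["product_type"], reports["error_type"],
                    margins=True, margins_name="Total")
print(table)

# Overall error rate from the figures reported in the abstract
print(f"overall error rate: {4691 / 958_532:.2%}")  # ~0.49%
```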
Abstract:
Airborne laser altimetry has the potential to make frequent detailed observations that are important for many aspects of studying land surface processes. However, the uncertainties inherent in airborne laser altimetry data have rarely been well measured. Uncertainty is often specified only in general terms, such as 20 cm in elevation and 40 cm planimetric. To better constrain these uncertainties, we present an analysis of several datasets acquired specifically to study the temporal consistency of laser altimetry data, and thus assess its operational value. The error budget has three main components, each with its own time regime. For measurements acquired less than 50 ms apart, elevations have a local standard deviation in height of 3.5 cm, enabling the local measurement of surface roughness of the order of 5 cm. Points acquired seconds apart suffer an additional random error due to differential Global Positioning System (DGPS) fluctuation. Measurements made up to an hour apart show an elevation drift of 7 cm over half an hour. Over months, this drift gives rise to a random elevation offset between swathes, with an average of 6.4 cm. The RMS planimetric error in point location was derived as 37.4 cm. We conclude by considering the consequences of these uncertainties for the principal application of laser altimetry in the UK: intertidal zone monitoring.
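A rough way to use such an error budget is to add whichever independent components apply at a given time separation in quadrature. The Python sketch below does this with the figures quoted in the abstract; the regime thresholds and the placeholder for the unspecified short-term DGPS term are assumptions for illustration, not values from the study.

```python
import numpy as np

# Reported elevation-error components (metres); the DGPS term is left as
# a parameter because the abstract gives no magnitude for it.
SIGMA_LOCAL = 0.035   # points < 50 ms apart: local height scatter
SIGMA_DRIFT = 0.07    # drift accumulated over ~half an hour of flight
SIGMA_SWATH = 0.064   # mean random offset between swathes (months apart)

def elevation_sigma(dt_seconds, sigma_dgps=0.0):
    """Rough combined elevation uncertainty for two points dt apart,
    adding the applicable independent components in quadrature."""
    terms = [SIGMA_LOCAL]
    if dt_seconds >= 1.0:
        terms.append(sigma_dgps)      # short-term DGPS fluctuation
    if dt_seconds >= 1800.0:
        terms.append(SIGMA_DRIFT)     # within-flight drift regime
    if dt_seconds >= 30 * 24 * 3600.0:
        terms.append(SIGMA_SWATH)     # between-swathe offsets
    return float(np.sqrt(np.sum(np.square(terms))))

print(f"same-pass roughness floor:   {elevation_sigma(0.01):.3f} m")
print(f"repeat swathes months apart: {elevation_sigma(90 * 24 * 3600):.3f} m")
```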