13 results for Mean-Reverting Process

in CentAUR: Central Archive University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

The frequency of persistent atmospheric blocking events in the 40-yr ECMWF Re-Analysis (ERA-40) is compared with the blocking frequency produced by a simple first-order Markov model designed to predict the time evolution of a blocking index [defined by the meridional contrast of potential temperature on the 2-PVU surface (1 PVU ≡ 1 × 10⁻⁶ K m² kg⁻¹ s⁻¹)]. With the observed spatial coherence built into the model, it reproduces the main regions of blocking occurrence and the frequencies of sector blocking very well. This underlines the importance of the climatological background flow: the locations of high blocking occurrence are the regions where the mean midlatitude meridional potential vorticity (PV) gradient is weak. However, when only persistent blocking episodes are considered, the model is unable to simulate the observed frequencies. It is proposed that this persistence, beyond that given by a red-noise model, is due to the self-sustaining nature of the blocking phenomenon.
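
To make the benchmark concrete, the following is a minimal sketch of a first-order Markov (AR(1), red-noise) model for a daily blocking index; the lag-1 autocorrelation, noise level, blocking threshold and 5-day persistence criterion are illustrative assumptions, not values taken from the paper.

    # Minimal AR(1) (red-noise) sketch of a daily blocking index b(t).
    # r, sigma and the blocking threshold are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n_days = 40 * 365                                  # roughly the 40-yr ERA-40 length
    r, sigma = 0.85, 1.0                               # assumed lag-1 autocorrelation and index std
    b = np.zeros(n_days)
    eps = rng.normal(0.0, sigma * np.sqrt(1 - r**2), n_days)   # keeps var(b) = sigma**2
    for t in range(1, n_days):
        b[t] = r * b[t - 1] + eps[t]

    # A day counts as blocked when the index exceeds a threshold; a persistent
    # event requires at least 5 consecutive blocked days.
    blocked = b > 1.0
    run_lengths = np.diff(np.flatnonzero(np.diff(np.r_[0, blocked.astype(int), 0])))[::2]
    print("blocked-day frequency:", round(blocked.mean(), 3))
    print("persistent (>= 5 day) events:", int((run_lengths >= 5).sum()))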

Relevance:

30.00%

Publisher:

Abstract:

The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration; its optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield; the correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can easily be extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
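
The skill measures quoted above (correlation coefficient and root-mean-square error expressed as a percentage of the mean yield) can be computed as in the short sketch below; the yield series used here are synthetic placeholders, not the 1966-1989 Indian groundnut data.

    # Correlation coefficient and RMSE (as a percentage of the mean observed
    # yield), the two skill measures quoted in the abstract, for synthetic data.
    import numpy as np

    rng = np.random.default_rng(1)
    observed = 1000.0 + 100.0 * rng.standard_normal(24)      # illustrative yields (kg/ha), 24 seasons
    simulated = observed + 80.0 * rng.standard_normal(24)    # illustrative model output

    r = np.corrcoef(observed, simulated)[0, 1]
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    print(f"r = {r:.2f}, RMSE = {100 * rmse / observed.mean():.1f}% of mean yield")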

Relevance:

30.00%

Publisher:

Abstract:

The commercial process in construction projects is an expensive and highly variable overhead. Collaborative working practices carry many benefits, which are widely disseminated, but little information is available about their costs. Transaction Cost Economics is a theoretical framework that seeks explanations for why there are firms and how the boundaries of firms are defined through the "make-or-buy" decision. However, it is not a framework that offers explanations for the relative costs of procuring construction projects in different ways. The idea that different methods of procurement have characteristically different costs is tested by way of a survey. The relevance of Transaction Cost Economics to the study of commercial costs in procurement is doubtful. The survey shows that collaborative working methods cost neither more nor less than traditional methods, but their benefits mean that there is a great deal of enthusiasm for collaboration rather than competition.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a new iterative algorithm for orthogonal frequency division multiplexing (OFDM) joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the relatively little-studied problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Specifically, we apply a hard-decision procedure at every iterative step to overcome the overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the PHN, and finally a more robust and compact fast process based on Givens rotation is proposed to reduce the complexity to a practical level. Numerical simulations are also given to verify the proposed algorithm. (C) 2008 Elsevier B.V. All rights reserved.
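
The hard-decision idea can be illustrated with the much-simplified sketch below, in which a common phase error on QPSK subcarriers is re-estimated from sliced (hard-decided) symbols at each iteration. This is not the authors' algorithm: the Padé PHN representation and the Givens-rotation step are omitted, and all parameters are illustrative.

    # Simplified hard-decision loop: correct a common phase error on QPSK
    # subcarriers, slice to the nearest constellation point, and re-estimate the
    # phase from the sliced symbols. Padé PHN modelling and Givens rotations
    # from the paper are omitted; parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    N = 64                                                    # subcarriers
    bits = rng.integers(0, 2, (2, N)) * 2 - 1                 # random +/-1 symbols per rail
    tx = (bits[0] + 1j * bits[1]) / np.sqrt(2)                # unit-power QPSK symbols

    true_phase = 0.3                                          # common phase error (rad)
    noise = 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    rx = tx * np.exp(1j * true_phase) + noise

    phase = 0.0
    for _ in range(5):
        corrected = rx * np.exp(-1j * phase)
        hard = (np.sign(corrected.real) + 1j * np.sign(corrected.imag)) / np.sqrt(2)  # hard decision
        phase = np.angle(np.vdot(hard, rx))                   # phase of rx relative to the slices

    print(f"estimated phase: {phase:.3f} rad (true: {true_phase} rad)")
    print("symbol errors:", int(np.sum(hard != tx)))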

Relevance:

30.00%

Publisher:

Abstract:

Background: The aim of this study was to evaluate stimulant medication response following a single dose of methylphenidate (MPH) in children and young people with hyperkinetic disorder, using infrared motion analysis combined with a continuous performance task (QbTest system) as objective measures. The hypothesis was that a moderate test dose of stimulant medication could distinguish a robust treatment response, a partial response and non-response in relation to activity, attention and impulse control measures. Methods: The study included 44 children and young people between the ages of 7 and 18 years with a diagnosis of hyperkinetic disorder (F90 & F90.1). A single-dose protocol incorporated the time course effects of both immediate-release MPH and extended-release MPH (Concerta XL, Equasym XL) to determine comparable peak efficacy periods post intake. Results: A robust treatment response, with objective measures reverting to the population mean, was found in 37 participants (84%). Three participants (7%) demonstrated a partial response to MPH, and four participants (9%) were classed as non-responders owing to deteriorating activity measures together with no improvement in attention and impulse control measures. Conclusion: Objective measures provide, early in prescribing, the opportunity to measure treatment response and monitor adverse reactions to stimulant medication. Most treatment responders demonstrated an effective response to MPH on a moderate test dose, facilitating a swift and more optimal titration process.

Relevance:

30.00%

Publisher:

Abstract:

This paper draws from a wider research programme in the UK undertaken for the Investment Property Forum examining liquidity in commercial property. One aspect of liquidity is the process by which transactions occur, including both how properties are selected for sale and the time taken to transact. The paper analyses data from three organisations: a property company, a major financial institution and an asset management company, formerly a major public sector pension fund. The data cover three market states and include sales completed in 1995, 2000 and 2002 in the UK. The research interviewed key individuals within the three organisations to identify any common patterns of activity within the sale process, and also identified the timing of 187 actual transactions from inception of the sale to completion, from which a taxonomy of the transaction process was developed. Interviews with vendors indicated that decisions to sell were a product of a combination of portfolio, specific property and market based issues. Properties were generally not kept in a “readiness for sale” state. The time from the first decision to sell the actual property to completion had a mean of 298 days and a median of 190 days. It is concluded that this study may underestimate the true length of the time to transact, for two reasons. Firstly, the pre-marketing period is rarely recorded in transaction files. Secondly, and more fundamentally, studies of sold properties may contain selection bias: the research indicated that vendors tended to sell properties which it was perceived could be sold at a ‘fair’ price in a reasonable period of time.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the novel use of cluster analysis in the field of industrial process control. The severe multivariable process problems encountered in manufacturing have often led to machine shutdowns, requiring corrective action before operation can resume. Production faults caused by processes running in less efficient regions may be prevented or diagnosed using reasoning based on cluster analysis: the internal complexity of production machinery can be depicted in clusters of multidimensional data points which characterise the manufacturing process. The application of a Mean-Tracking cluster algorithm (developed in Reading) to field data acquired from high-speed machinery is discussed. The objective of this application is to illustrate how machine behaviour can be studied, in particular how regions of erroneous and stable running behaviour can be identified.
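
The general idea, clustering multidimensional machine measurements so that stable and faulty operating regions separate, can be sketched as below. This uses ordinary k-means rather than the Mean-Tracking algorithm, and the sensor data are synthetic placeholders.

    # Cluster synthetic multichannel machine data (speed, temperature, vibration)
    # so that stable and faulty operating regions fall into separate clusters.
    # Ordinary k-means stands in for the Mean-Tracking algorithm of the paper.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    stable = rng.normal([1.0, 50.0, 0.2], [0.05, 1.0, 0.01], size=(200, 3))
    faulty = rng.normal([0.8, 65.0, 0.6], [0.10, 3.0, 0.05], size=(20, 3))
    data = np.vstack([stable, faulty])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
    for label in range(2):
        members = data[km.labels_ == label]
        print(f"cluster {label}: {len(members)} points, centre {members.mean(axis=0).round(2)}")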

Relevance:

30.00%

Publisher:

Abstract:

We bridge the properties of the regular triangular, square, and hexagonal honeycomb Voronoi tessellations of the plane to the Poisson-Voronoi case, thus analyzing in a common framework symmetry-breaking processes and the approach to uniform random distributions of tessellation-generating points. We resort to ensemble simulations of tessellations generated by points whose regular positions are perturbed through a Gaussian noise, whose variance is given by the parameter α² times the square of the inverse of the average density of points. We analyze the number of sides, the area, and the perimeter of the Voronoi cells. For all values α > 0, hexagons constitute the most common class of cells, and 2-parameter gamma distributions provide an efficient description of the statistical properties of the analyzed geometrical characteristics. The introduction of noise destroys the triangular and square tessellations, which are structurally unstable, as their topological properties are discontinuous at α = 0. On the contrary, the honeycomb hexagonal tessellation is topologically stable and, experimentally, all Voronoi cells are hexagonal for small but finite noise with α < 0.12. For all tessellations and for small values of α, we observe a linear dependence on α of the ensemble mean of the standard deviation of the area and perimeter of the cells. Already for a moderate amount of Gaussian noise (α > 0.5), memory of the specific initial unperturbed state is lost, because the statistical properties of the three perturbed regular tessellations are indistinguishable. When α > 2, results converge to those of Poisson-Voronoi tessellations. The geometrical properties of n-sided cells change with α until the Poisson-Voronoi limit is reached for α > 2; in this limit the Desch law for perimeters is shown to be not valid and a square-root dependence on n is established. This law allows for an easy link to the Lewis law for areas and agrees with exact asymptotic results. Finally, for α > 1, the ensemble mean of the cell area and perimeter restricted to the hexagonal cells agrees remarkably well with the full ensemble mean; this reinforces the idea that hexagons, beyond their ubiquitous numerical prominence, can be interpreted as typical polygons in 2D Voronoi tessellations.
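
A sketch of the experimental setup is given below: generating points on a regular triangular lattice (whose Voronoi cells form the honeycomb hexagonal tessellation) are displaced by Gaussian noise and the resulting cells are examined. The noise normalisation used here (standard deviation proportional to α divided by the square root of the point density) and the value α = 0.12 are assumptions for illustration only.

    # Perturb a regular triangular lattice of generating points with Gaussian
    # noise and inspect the side numbers of the bounded Voronoi cells.
    # The noise scaling (alpha over the square root of the density) is an
    # assumed normalisation; alpha = 0.12 is just an example value.
    import numpy as np
    from scipy.spatial import Voronoi

    rng = np.random.default_rng(4)
    alpha, n = 0.12, 30
    x, y = np.meshgrid(np.arange(n, dtype=float), np.arange(n) * np.sqrt(3) / 2)
    x[1::2] += 0.5                                            # offset alternate rows: triangular lattice
    pts = np.column_stack([x.ravel(), y.ravel()])

    density = len(pts) / (np.ptp(pts[:, 0]) * np.ptp(pts[:, 1]))
    pts = pts + rng.normal(0.0, alpha / np.sqrt(density), pts.shape)

    vor = Voronoi(pts)
    regions = (vor.regions[i] for i in vor.point_region)
    sides = [len(r) for r in regions if r and -1 not in r]    # bounded cells only
    print(f"mean number of sides of bounded cells: {np.mean(sides):.2f}")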

Relevance:

30.00%

Publisher:

Abstract:

Numerical experiments are described that pertain to the climate of a coupled atmosphere–ocean–ice system in the absence of land, driven by modern-day orbital and CO2 forcing. Millennial time-scale simulations yield a mean state in which ice caps reach down to 55° of latitude and both the atmosphere and ocean comprise eastward- and westward-flowing zonal jets, whose structure is set by their respective baroclinic instabilities. Despite the zonality of the ocean, it is remarkably efficient at transporting heat meridionally through the agency of Ekman transport and eddy-driven subduction. Indeed the partition of heat transport between the atmosphere and ocean is much the same as the present climate, with the ocean dominating in the Tropics and the atmosphere in the mid–high latitudes. Variability of the system is dominated by the coupling of annular modes in the atmosphere and ocean. Stochastic variability inherent to the atmospheric jets drives variability in the ocean. Zonal flows in the ocean exhibit decadal variability, which, remarkably, feeds back to the atmosphere, coloring the spectrum of annular variability. A simple stochastic model can capture the essence of the process. Finally, it is briefly reviewed how the aquaplanet can provide information about the processes that set the partition of heat transport and the climate of Earth.
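
A sketch of the kind of simple stochastic model alluded to above is given below (not the authors' model): a noise-driven, rapidly damped atmospheric annular-mode index forces a slowly damped oceanic zonal-flow index, which feeds weakly back onto the atmosphere and colours its spectrum. All parameters are illustrative.

    # Two-variable stochastic sketch: a fast, noise-driven atmospheric
    # annular-mode index A forces a slowly damped oceanic zonal-flow index O,
    # and O feeds weakly back onto A. Damping times and couplings are assumed.
    import numpy as np

    rng = np.random.default_rng(5)
    n_days = 50 * 360
    A = np.zeros(n_days)                     # atmospheric annular-mode index
    O = np.zeros(n_days)                     # oceanic zonal-flow index
    tau_a, tau_o = 10.0, 1000.0              # assumed damping times (days)
    couple, feedback = 0.02, 0.05            # assumed coupling strengths

    for t in range(1, n_days):
        A[t] = A[t - 1] - A[t - 1] / tau_a + feedback * O[t - 1] + rng.standard_normal()
        O[t] = O[t - 1] - O[t - 1] / tau_o + couple * A[t - 1]

    lag = 30
    r_lag = np.corrcoef(A[:-lag], A[lag:])[0, 1]
    print(f"lag-{lag} day autocorrelation of the atmospheric index: {r_lag:.2f}")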

Relevance:

30.00%

Publisher:

Abstract:

We evaluate the ability of process-based models to reproduce observed global mean sea-level change. When the models are forced by changes in natural and anthropogenic radiative forcing of the climate system and anthropogenic changes in land-water storage, the average of the modelled sea-level change for the periods 1900–2010, 1961–2010 and 1990–2010 is about 80%, 85% and 90% of the observed rise. The modelled rate of rise is over 1 mm yr⁻¹ prior to 1950, decreases to less than 0.5 mm yr⁻¹ in the 1960s, and increases to 3 mm yr⁻¹ by 2000. When observed regional climate changes are used to drive a glacier model and an allowance is included for an ongoing adjustment of the ice sheets, the modelled sea-level rise is about 2 mm yr⁻¹ prior to 1950, similar to the observations. The model results encompass the observed rise and the model average is within 20% of the observations, about 10% when the observed ice sheet contributions since 1993 are added, increasing confidence in future projections for the 21st century. The increased rate of rise since 1990 is not part of a natural cycle but a direct response to increased radiative forcing (both anthropogenic and natural), which will continue to grow with ongoing greenhouse gas emissions.

Relevance:

30.00%

Publisher:

Abstract:

Sudden stratospheric warmings (SSWs) are the most prominent vertical coupling process in the middle atmosphere; they occur during winter and are caused by the interaction of planetary waves (PWs) with the zonal mean flow. Vertical coupling has also been identified during the equinox transitions, and is similarly associated with PWs. We argue that there is a characteristic aspect of the autumn transition in northern high latitudes, which we call the “hiccup”, and which acts like a “mini SSW”, i.e. like a small minor warming. We study the average characteristics of the hiccup based on a superimposed epoch analysis using a nudged version of the Canadian Middle Atmosphere Model, representing 30 years of historical data. Hiccups can be identified in about half the years studied. The mesospheric zonal wind results are compared to radar observations over Andenes (69°N, 16°E) for the years 2000–2013. A comparison of the average characteristics of hiccups and SSWs shows both similarities and differences between the two vertical coupling processes.
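
The superimposed epoch (composite) step can be sketched as follows: segments of a zonal-wind time series are aligned on a list of central event dates and averaged. The wind series, event dates and 60-day window below are synthetic stand-ins, not the model or radar data.

    # Superimposed epoch (composite) step: align zonal-wind segments on a list
    # of event dates and average them. The wind series and event indices here
    # are synthetic stand-ins for the model and radar data.
    import numpy as np

    rng = np.random.default_rng(6)
    u = rng.standard_normal(30 * 365)                      # daily zonal-wind anomaly, 30 "years"
    events = np.array([300 + 365 * k for k in range(15)])  # assumed key date in half of the years
    window = 60                                            # days either side of the key date

    segments = np.array([u[e - window:e + window + 1] for e in events])
    composite = segments.mean(axis=0)                      # epoch-mean evolution through the event
    print("composite length (days):", composite.size)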

Relevance:

30.00%

Publisher:

Abstract:

The subject of climate feedbacks focuses attention on global mean surface air temperature (GMST) as the key metric of climate change. But what does knowledge of past and future GMST tell us about the climate of specific regions? In the context of the ongoing UNFCCC process, this is an important question for policy-makers as well as for scientists. The answer depends on many factors, including the mechanisms causing changes, the timescale of the changes, and the variables and regions of interest. This paper provides a review and analysis of the relationship between changes in GMST and changes in local climate, first in observational records and then in a range of climate model simulations, which are used to interpret the observations. The focus is on decadal timescales, which are of particular interest in relation to recent and near-future anthropogenic climate change. It is shown that GMST primarily provides information about forced responses, but that understanding and quantifying internal variability is essential to projecting climate and climate impacts on regional-to-local scales. The relationship between local forced responses and GMST is often linear but may be nonlinear, and can be greatly complicated by competition between different forcing factors. Climate projections are limited not only by uncertainties in the signal of climate change but also by uncertainties in the characteristics of real-world internal variability. Finally, it is shown that the relationship between GMST and local climate provides a simple approach to climate change detection, and a useful guide to attribution studies.
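
In its simplest (linear) form, the relationship discussed here can be quantified by regressing a local decadal-mean temperature anomaly on GMST, with the residual spread giving a crude measure of the internal variability that GMST cannot explain. The sketch below uses synthetic anomaly series and an assumed local scaling of 1.6 K per 1 K of GMST.

    # Linear scaling sketch: regress a local decadal-mean temperature anomaly
    # on GMST; the slope is the local scaling factor and the residual spread is
    # a crude measure of internal variability. Both series are synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    gmst = np.linspace(0.0, 1.0, 15) + 0.05 * rng.standard_normal(15)   # decadal GMST anomalies (K)
    local = 1.6 * gmst + 0.25 * rng.standard_normal(15)                 # assumed forced scaling + internal noise

    slope, intercept = np.polyfit(gmst, local, 1)
    residual_std = np.std(local - (slope * gmst + intercept))
    print(f"local change per 1 K of GMST: {slope:.2f} K; residual std: {residual_std:.2f} K")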

Relevance:

30.00%

Publisher:

Abstract:

The vertical profile of aerosol is important for its radiative effects, but weakly constrained by observations on the global scale, and highly variable among different models. To investigate the controlling factors in one particular model, we investigate the effects of individual processes in HadGEM3–UKCA and compare the resulting diversity of aerosol vertical profiles with the inter-model diversity from the AeroCom Phase II control experiment. In this way we show that (in this model at least) the vertical profile is controlled by a relatively small number of processes, although these vary among aerosol components and particle sizes. We also show that sufficiently coarse variations in these processes can produce a similar diversity to that among different models in terms of the global-mean profile and, to a lesser extent, the zonal-mean vertical position. However, there are features of certain models' profiles that cannot be reproduced, suggesting the influence of further structural differences between models. In HadGEM3–UKCA, convective transport is found to be very important in controlling the vertical profile of all aerosol components by mass. In-cloud scavenging is very important for all except mineral dust. Growth by condensation is important for sulfate and carbonaceous aerosol (along with aqueous oxidation for the former and ageing by soluble material for the latter). The vertical extent of biomass-burning emissions into the free troposphere is also important for the profile of carbonaceous aerosol. Boundary-layer mixing plays a dominant role for sea salt and mineral dust, which are emitted only from the surface. Dry deposition and below-cloud scavenging are important for the profile of mineral dust only. In this model, the microphysical processes of nucleation, condensation and coagulation dominate the vertical profile of the smallest particles by number (e.g. total CN  >  3 nm), while the profiles of larger particles (e.g. CN  >  100 nm) are controlled by the same processes as the component mass profiles, plus the size distribution of primary emissions. We also show that the processes that affect the AOD-normalised radiative forcing in the model are predominantly those that affect the vertical mass distribution, in particular convective transport, in-cloud scavenging, aqueous oxidation, ageing and the vertical extent of biomass-burning emissions.