974 results for deterministic trend


Relevance: 20.00%

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy Filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved without introducing additional computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in ensemble space instead of in state space, and the advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble splits into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reversed by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding distortion of the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
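To make the time-stepping change concrete, the following minimal sketch applies a leapfrog step followed by the Robert-Asselin-Williams filter in the form given by Williams (2009); the test equation, the filter strength nu, the value alpha = 0.53 and all variable names are illustrative assumptions, not the SPEEDY implementation.

    import numpy as np

    def leapfrog_raw(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
        """Leapfrog integration of dx/dt = f(x) with the RAW filter.

        nu    : Robert-Asselin filter strength (illustrative value)
        alpha : RAW parameter; alpha = 1 recovers the classical RA filter,
                alpha ~ 0.53 is the value suggested by Williams (2009).
        """
        x_prev = x0                      # filtered state at step n-1
        x_curr = x0 + dt * f(x0)         # first step: forward Euler
        traj = [x_prev, x_curr]
        for _ in range(nsteps - 1):
            x_next = x_prev + 2.0 * dt * f(x_curr)         # leapfrog step
            d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
            x_curr_filtered = x_curr + alpha * d           # filter middle level
            x_next = x_next - (1.0 - alpha) * d            # and new upper level
            x_prev, x_curr = x_curr_filtered, x_next
            traj.append(x_curr)
        return np.array(traj)

    # usage sketch: damped oscillation dx/dt = -x
    traj = leapfrog_raw(lambda x: -x, x0=1.0, dt=0.1, nsteps=200)

Setting alpha = 1 recovers the classical Robert-Asselin filter; values of alpha near 0.53 are those usually quoted as preserving the three-time-level mean while still damping the computational mode.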

Relevance: 20.00%

Abstract:

Ensemble clustering (EC) can arise in data assimilation with ensemble square root filters (EnSRFs) using non-linear models: an M-member ensemble splits into a single outlier and a cluster of M−1 members. The stochastic Ensemble Kalman Filter does not present this problem. Modifications to the EnSRFs based on periodic resampling of the ensemble through random rotations have been proposed to address it. We introduce a metric to quantify the presence of EC and present evidence to dispel the notion that EC leads to filter failure. Starting from a univariate model, we show that EC is not a permanent but a transient phenomenon; it occurs intermittently in non-linear models. We perform a series of data assimilation experiments using a standard EnSRF and an EnSRF modified by resampling through random rotations. The modified EnSRF alleviates the issues associated with EC, but at the cost of the traceability of individual ensemble trajectories, and it cannot use some of the algorithms that enhance the performance of the standard EnSRF. In the non-linear regimes of low-dimensional models, the analysis root mean square error of the standard EnSRF slowly grows with ensemble size if the size is larger than the dimension of the model state. However, we do not observe this problem in a more complex model that uses an ensemble size much smaller than the dimension of the model state, along with inflation and localisation. Overall, we find that transient EC does not handicap the performance of the standard EnSRF.
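The abstract does not spell out the clustering metric, so the sketch below uses a simple, hypothetical gap diagnostic (distance of the most remote member from the rest of the ensemble, relative to the spread of the remaining M−1 members) purely to illustrate how EC could be quantified; it is not the metric introduced in the paper.

    import numpy as np

    def clustering_gap(ensemble):
        """Illustrative (not the paper's) diagnostic of ensemble clustering.

        ensemble : array of shape (M, n) with M members of an n-dimensional state.
        Returns the ratio between the distance of the most remote member from
        the mean of the other M-1 members and the spread of those M-1 members.
        A large ratio suggests an outlier-plus-cluster structure.
        """
        mean = ensemble.mean(axis=0)
        dists = np.linalg.norm(ensemble - mean, axis=1)
        i_out = np.argmax(dists)                      # candidate outlier
        rest = np.delete(ensemble, i_out, axis=0)
        rest_mean = rest.mean(axis=0)
        gap = np.linalg.norm(ensemble[i_out] - rest_mean)
        spread = np.linalg.norm(rest - rest_mean, axis=1).mean()
        return gap / spread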

Relevance: 20.00%

Abstract:

In the European Union, first-tier assessment of the long-term risk to birds and mammals from pesticides is based on calculation of a deterministic long-term toxicity/exposure ratio (TERlt). The ratio is developed from generic herbivores and insectivores and applied to all species. This paper describes two case studies that implement proposed improvements to the way long-term risk is assessed. These refined methods require calculation of a TER for each of five identified phases of reproduction (phase-specific TERs) and the use of adjusted No Observed Effect Levels (NOELs) to incorporate variation in species sensitivity to pesticides. They also involve progressive refinement of the exposure estimate so that it applies to particular species, rather than to generic indicators, and relates spraying date to the onset of reproduction. The effect of using these new methods on the assessment of risk is described. Each refinement did not necessarily alter the calculated TER value in a way that was predictable or consistent across both case studies. However, use of adjusted NOELs always reduced TERs, and relating spraying date to the onset of reproduction increased most phase-specific TERs. The case studies suggest that the current first-tier TERlt assessment may underestimate risk in some circumstances and that phase-specific assessments can help identify appropriate risk-reduction measures. The way in which deterministic phase-specific assessments can currently be implemented to enhance first-tier assessment is outlined.
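The quantity underlying the assessment is a ratio of toxicity to exposure. A minimal sketch of phase-specific TERs is given below, assuming dictionaries keyed by reproductive phase and a single sensitivity-adjustment factor; both are illustrative simplifications of the refined methods described above.

    def phase_specific_ters(noel_by_phase, exposure_by_phase, adjustment=1.0):
        """Illustrative phase-specific TER calculation (hypothetical layout).

        noel_by_phase     : dict phase -> No Observed Effect Level (mg/kg bw/day)
        exposure_by_phase : dict phase -> estimated daily exposure (mg/kg bw/day)
        adjustment        : factor applied to the NOEL to reflect variation in
                            species sensitivity (an assumption of this sketch).
        """
        return {phase: (noel_by_phase[phase] * adjustment) / exposure_by_phase[phase]
                for phase in noel_by_phase}

    # usage sketch: a phase-specific TER below the regulatory trigger would
    # indicate that refined assessment or risk-reduction measures are needed.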

Relevance: 20.00%

Abstract:

Dynamics affects the distribution and abundance of stratospheric ozone directly through transport of ozone itself and indirectly through its effect on ozone chemistry via temperature and transport of other chemical species. Dynamical processes must be considered in order to understand past ozone changes, especially in the northern hemisphere where there appears to be significant low-frequency variability which can look “trend-like” on decadal time scales. A major challenge is to quantify the predictable, or deterministic, component of past ozone changes. Over the coming century, changes in climate will affect the expected recovery of ozone. For policy reasons it is important to be able to distinguish and separately attribute the effects of ozone-depleting substances and greenhouse gases on both ozone and climate. While the radiative-chemical effects can be relatively easily identified, this is not so evident for dynamics — yet dynamical changes (e.g., changes in the Brewer-Dobson circulation) could have a first-order effect on ozone over particular regions. Understanding the predictability and robustness of such dynamical changes represents another major challenge. Chemistry-climate models have recently emerged as useful tools for addressing these questions, as they provide a self-consistent representation of dynamical aspects of climate and their coupling to ozone chemistry. We can expect such models to play an increasingly central role in the study of ozone and climate in the future, analogous to the central role of global climate models in the study of tropospheric climate change.

Relevance: 20.00%

Abstract:

Total ozone trends are typically studied using linear regression models that assume a first-order autoregression of the residuals [so-called AR(1) models]. We consider total ozone time series over 60°S–60°N from 1979 to 2005 and show that most latitude bands exhibit long-range correlated (LRC) behavior, meaning that ozone autocorrelation functions decay by a power law rather than exponentially as in AR(1). At such latitudes the uncertainties of total ozone trends are greater than those obtained from AR(1) models, and the expected time required to detect ozone recovery is correspondingly longer. We find no evidence of LRC behavior in southern middle and high subpolar latitudes (45°–60°S), where the long-term ozone decline attributable to anthropogenic chlorine is greatest. We thus confirm an earlier prediction based on an AR(1) analysis that this region (especially the highest latitudes, and especially the South Atlantic) is the optimal location for the detection of ozone recovery, with a statistically significant ozone increase attributable to chlorine likely to be detectable by the end of the next decade. In northern middle and high latitudes, on the other hand, there is clear evidence of LRC behavior. This increases the uncertainties on the long-term trend attributable to anthropogenic chlorine by about a factor of 1.5 and lengthens the expected time to detect ozone recovery by a similar amount (from ∼2030 to ∼2045). If the long-term changes in ozone are instead fit by a piecewise-linear trend rather than by stratospheric chlorine loading, then the strong decrease of northern middle- and high-latitude ozone during the first half of the 1990s and its subsequent increase in the second half of the 1990s project more strongly onto the trend and make a smaller contribution to the noise. This both increases the trend and weakens the LRC behavior at these latitudes, to the extent that ozone recovery (according to this model, and in the sense of a statistically significant ozone increase) is already on the verge of being detected. The implications of this rather controversial interpretation are discussed.
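For reference, the contrast the analysis rests on can be written as follows; the trend-uncertainty scaling is the standard result for long-range correlated noise and is stated here as background rather than quoted from the paper.

    % AR(1) residuals: the autocorrelation decays exponentially
    \rho(k) = \phi^{|k|}
    % LRC residuals: power-law decay with exponent 0 < \gamma < 1,
    % equivalently a Hurst exponent H = 1 - \gamma/2 > 1/2
    \rho(k) \sim C\,k^{-\gamma}
    % standard scaling for the uncertainty of a least-squares linear trend
    % estimated from n years of such noise (white noise has H = 1/2):
    \sigma_{\hat{\beta}} \propto n^{H-2}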

Relevance: 20.00%

Abstract:

Hourly data (1994-2009) of surface ozone concentrations at eight monitoring sites have been investigated to assess target level and long-term objective exceedances and their trends. The European Union (EU) ozone target value for human health (60 ppb, maximum daily 8-hour running mean) has been exceeded in a number of years at almost all sites, but never more often than the set limit of 25 exceedances in one year. The second highest annual hourly and fourth highest annual 8-hourly mean ozone concentrations have shown a statistically significant negative trend for the inland sites of Cork-Glashaboy, Monaghan and Lough Navar, and no significant trend for the Mace Head site. Peak afternoon ozone concentrations averaged over the three-year period from 2007 to 2009 are lower than the corresponding values over the three-year period from 1996 to 1998 for two sites: Cork-Glashaboy and Lough Navar. The EU long-term objective value of AOT40 (Accumulated Ozone Exposure over a threshold of 40 ppb) for the protection of vegetation (3 ppm-hours, calculated from May to July) has been exceeded, on an individual-year basis, at two sites: Mace Head and Valentia. The critical level for the protection of forest (10 ppm-hours from April to September) has not been exceeded at any site except Valentia in 2003. AOT40-Vegetation shows a significant negative trend for a 3-year running average at the Cork-Glashaboy (-0.13±0.02 ppm-hours per year) and Lough Navar (-0.05±0.02 ppm-hours per year) sites, and a negative but not statistically significant trend at Monaghan (-0.03±0.03 ppm-hours per year). No statistically significant trend was observed for the coastal site of Mace Head. Overall, with the exception of the Mace Head and Monaghan sites, ozone measurement records at Irish sites show a downward trend in the peak values that affect human health and vegetation.
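A minimal sketch of the AOT40 calculation for vegetation is given below, assuming an hourly ozone series in ppb with a pandas DatetimeIndex and taking 08:00-20:00 as the daylight window; the exact daylight definition used for the Irish sites is an assumption here.

    import pandas as pd

    def aot40_vegetation(hourly_ppb, day_start=8, day_end=20):
        """AOT40 in ppm-hours from an hourly ozone series (pandas Series, ppb).

        Accumulates max(O3 - 40 ppb, 0) over daylight hours (assumed here to be
        08:00-20:00) from May to July, following the vegetation metric.
        """
        s = hourly_ppb
        s = s[(s.index.month >= 5) & (s.index.month <= 7)]          # May-July
        s = s[(s.index.hour >= day_start) & (s.index.hour < day_end)]
        excess = (s - 40.0).clip(lower=0.0)                          # ppb above 40
        return excess.sum() / 1000.0                                 # ppb-h -> ppm-h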

Relevance: 20.00%

Abstract:

Although difference-stationary (DS) and trend-stationary (TS) processes have been subject to considerable analysis, there are no direct comparisons of the two models when each in turn is the data-generation process (DGP). We examine the consequences of incorrectly choosing between these models for forecasting, for both known and estimated parameters. Three sets of Monte Carlo simulations illustrate the analysis: they evaluate the biases in conventional standard errors when each model is mis-specified, compute the relative mean-square forecast errors of the two models under both DGPs, and investigate autocorrelated errors, which allow both models to better approximate the converse DGP. The outcomes are surprisingly different from established results.
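For reference, the two data-generation processes in their textbook forms are shown below; the paper's exact specification (drift terms, error autocorrelation) may differ.

    % Trend-stationary (TS) DGP: stationary errors around a deterministic trend
    y_t = \alpha + \beta t + \varepsilon_t
    % Difference-stationary (DS) DGP: a unit root with drift, so shocks persist
    \Delta y_t = \mu + \varepsilon_t
    \quad\Longrightarrow\quad
    y_t = y_0 + \mu t + \textstyle\sum_{s=1}^{t}\varepsilon_s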

Relevance: 20.00%

Abstract:

This paper considers the use of Association Rule Mining (ARM) and our proposed Transaction-based Rule Change Mining (TRCM) to identify the rule types present in tweets' hashtags over consecutive periods of time and their linkage to real-life occurrences. The resulting algorithm is termed TRCM-RTI, in reference to Rule Type Identification. We create Time Frame Windows (TFWs) to detect evolvement statuses and calculate the lifespan of hashtags in online tweets, and we link RTI to real-life events by monitoring and recording rule evolvement patterns in TFWs on the Twitter network.
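The sketch below is a much-simplified illustration of mining pairwise hashtag rules in one Time Frame Window and classifying their change between two consecutive windows; the thresholds, rule representation and change categories are assumptions and do not reproduce the published TRCM-RTI algorithm.

    from itertools import combinations
    from collections import Counter

    def hashtag_rules(tweets, min_support=0.01, min_conf=0.5):
        """Very simplified pairwise association rules (A -> B) over hashtag
        'transactions'. tweets: list of sets of hashtags, one set per tweet."""
        n = len(tweets)
        pair_counts, item_counts = Counter(), Counter()
        for tags in tweets:
            item_counts.update(tags)
            pair_counts.update(combinations(sorted(tags), 2))
        rules = {}
        for (a, b), c in pair_counts.items():
            if c / n >= min_support:
                if c / item_counts[a] >= min_conf:
                    rules[(a, b)] = c / item_counts[a]   # confidence of A -> B
                if c / item_counts[b] >= min_conf:
                    rules[(b, a)] = c / item_counts[b]   # confidence of B -> A
        return rules

    def rule_changes(window_t, window_t1):
        """Classify rules between two consecutive Time Frame Windows."""
        r0, r1 = set(window_t), set(window_t1)
        return {"unchanged": r0 & r1, "new": r1 - r0, "dead": r0 - r1}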

Relevance: 20.00%

Abstract:

This paper explores the criticism that system dynamics is a ‘hard’ or ‘deterministic’ systems approach. This criticism is seen to have four interpretations and each is addressed from the perspectives of social theory and systems science. Firstly, system dynamics is shown to offer not prophecies but Popperian predictions. Secondly, it is shown to involve the view that system structure only partially, not fully, determines human behaviour. Thirdly, the field's assumptions are shown not to constitute a grand content theory—though its structural theory and its attachment to the notion of causality in social systems are acknowledged. Finally, system dynamics is shown to be significantly different from systems engineering. The paper concludes that such confusions have arisen partially because of limited communication at the theoretical level from within the system dynamics community but also because of imperfect command of the available literature on the part of external commentators. Improved communication on theoretical issues is encouraged, though it is observed that system dynamics will continue to justify its assumptions primarily from the point of view of practical problem solving. The answer to the question in the paper's title is therefore: on balance, no.

Relevance: 20.00%

Abstract:

This study explores the prediction errors of tropical cyclones (TCs) in the European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) for the Northern Hemisphere summer period in five recent years. Results for the EPS are contrasted with those for the higher-resolution deterministic forecasts. Various metrics of location and intensity errors are considered and contrasted for verification based on IBTrACS and on the numerical weather prediction (NWP) analysis (NWPa). Motivated by the aim of exploring extended TC life cycles, location and intensity measures based on lower-tropospheric vorticity are introduced and contrasted with traditional verification metrics. Results show that location errors are almost identical whether verified against IBTrACS or the NWPa. However, intensity, in the form of mean sea level pressure (MSLP) minima and 10-m wind speed maxima, is significantly underpredicted relative to IBTrACS. Using the NWPa for verification results in much better consistency between the different intensity error metrics and indicates that lower-tropospheric vorticity provides a good indication of vortex strength, with error results showing relationships similar to those based on MSLP and 10-m wind speeds for the different forecast types. The interannual variation in forecast errors is discussed in relation to changes in the forecast and NWPa system, and variations in forecast errors between different ocean basins are discussed in terms of the propagation characteristics of the TCs.
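As an illustration of the kind of verification metrics involved, the sketch below computes a great-circle location error and an MSLP intensity error for a single matched forecast-observed TC position; it is a generic example, not the error calculation used in the study.

    import numpy as np

    def track_errors(lat_f, lon_f, lat_o, lon_o, mslp_f, mslp_o):
        """Illustrative TC verification metrics.

        Location error: great-circle (haversine) distance in km between the
        forecast and observed centres. Intensity error: forecast minus
        observed MSLP minimum (hPa).
        """
        R = 6371.0                                   # Earth radius, km
        phi1, phi2 = np.radians(lat_f), np.radians(lat_o)
        dphi = phi2 - phi1
        dlmb = np.radians(lon_o - lon_f)
        a = np.sin(dphi / 2) ** 2 + np.cos(phi1) * np.cos(phi2) * np.sin(dlmb / 2) ** 2
        location_err = 2 * R * np.arcsin(np.sqrt(a))
        intensity_err = mslp_f - mslp_o
        return location_err, intensity_err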

Relevance: 20.00%

Abstract:

This paper studies the relationship between institutional investor holdings and stock misvaluation in the U.S. between 1980 and 2010. I find that institutional investors overweight overvalued stocks and underweight undervalued stocks in their portfolios, taking the market portfolio as a benchmark. Cross-sectionally, institutional investors hold more overvalued stocks than undervalued stocks. Time-series analysis also shows that institutional ownership of overvalued portfolios increases with the portfolios' degree of overvaluation. As an investment strategy, institutional investors' riding of stock misvaluation is neither driven by fund flows from individual investors into institutions nor industry-specific. Consistent with the agency-problem explanation, investment companies and independent investment advisors have a higher tendency to ride stock misvaluation than other institutions. There is weak evidence that institutional investors earn a positive profit by riding stock misvaluation. My findings challenge models that view individual investors as noise traders and disregard the role of institutional investors in stock market misvaluation.
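A minimal sketch of the benchmark comparison is given below, assuming dollar holdings and market capitalisations per stock; the actual study works with richer portfolio and misvaluation data.

    def benchmark_overweight(inst_value, market_cap):
        """Institutional over/underweight relative to the market portfolio.

        inst_value : dict stock -> dollar value held by institutions
        market_cap : dict stock -> total market capitalisation
        Returns stock -> (institutional weight - market weight); positive
        values mean institutions overweight the stock. Illustrative only.
        """
        tot_inst = sum(inst_value.values())
        tot_mkt = sum(market_cap.values())
        return {s: inst_value[s] / tot_inst - market_cap[s] / tot_mkt
                for s in market_cap}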

Relevance: 20.00%

Abstract:

Objective. To investigate mortality in which paracoccidioidomycosis appears on any line or in any part of the death certificate. Method. Mortality data for 1985-2005 were obtained from the multiple cause-of-death database maintained by the Sao Paulo State Data Analysis System (SEADE). Standardized mortality coefficients were calculated for paracoccidioidomycosis as the underlying cause of death and as an associated cause of death, as well as for the total number of times paracoccidioidomycosis was mentioned on the death certificates. Results. During this 21-year period, there were 1,950 deaths related to paracoccidioidomycosis; the disease was the underlying cause of death in 1,164 cases (59.69%) and an associated cause of death in 786 (40.31%). Between 1985 and 2005 the records show a 59.8% decline in the mortality coefficient for paracoccidioidomycosis as the underlying cause and a 53.0% decline in mortality from it as an associated cause. The largest numbers of deaths occurred among men, in the older age groups, and among rural workers, with an upward trend in the winter months. The main causes associated with paracoccidioidomycosis as the underlying cause of death were pulmonary fibrosis, chronic lower respiratory tract diseases, and pneumonias. Malignant neoplasms and AIDS were the main underlying causes when paracoccidioidomycosis was an associated cause of death. The decision tables had to be adapted for the automated processing of causes of death on death certificates where paracoccidioidomycosis was mentioned. Conclusions. Using the multiple cause-of-death method together with the traditional underlying cause-of-death approach provides a new angle on research aimed at broadening our understanding of the natural history of paracoccidioidomycosis.
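A minimal sketch of a directly standardized mortality coefficient is given below; the age grouping, the choice of standard population and the per-million scale are assumptions of this sketch, not the exact SEADE procedure.

    def standardized_mortality_coefficient(deaths, population, standard_pop,
                                           per=1_000_000):
        """Directly age-standardized mortality coefficient (illustrative).

        deaths, population, standard_pop : dicts keyed by age group.
        Returns deaths per `per` inhabitants, weighting the age-specific
        rates by a standard population (the standard chosen here is an
        assumption of this sketch).
        """
        total_std = sum(standard_pop.values())
        rate = sum((deaths[a] / population[a]) * standard_pop[a]
                   for a in standard_pop) / total_std
        return rate * per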

Relevance: 20.00%

Abstract:

We have investigated plasma turbulence at the edge of a tokamak plasma using data from electrostatic potential fluctuations measured in the Brazilian tokamak TCABR. Recurrence quantification analysis has been used to provide diagnostics of the deterministic content of the series. We have focused our analysis on the radial dependence of the potential fluctuations and their characterization by recurrence-based diagnostics. Our main result is that the deterministic content of the experimental signals is most pronounced in the external part of the plasma column, just inside the plasma radius. Since the chaoticity of the signals follows the same trend, we conclude that the electrostatic plasma turbulence at the tokamak plasma edge can be partially explained by means of a deterministic nonlinear system.
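The recurrence-based diagnostics rest on a recurrence matrix and measures such as determinism (DET); the sketch below uses their standard definitions, with the threshold and the minimum line length as illustrative parameters rather than the values used for the TCABR signals.

    import numpy as np

    def recurrence_matrix(x, eps):
        """Binary recurrence matrix R[i, j] = 1 if states i and j are within eps."""
        x = np.asarray(x, dtype=float)
        if x.ndim == 1:
            x = x[:, None]                       # scalar series as 1-D states
        d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
        return (d <= eps).astype(int)

    def determinism(R, l_min=2):
        """DET: fraction of recurrence points on diagonal lines of length >= l_min
        (the main diagonal is excluded)."""
        N = R.shape[0]
        lines = []
        for k in range(-(N - 1), N):
            if k == 0:
                continue                         # skip the line of identity
            run = 0
            for v in np.diag(R, k):
                if v:
                    run += 1
                elif run:
                    lines.append(run)
                    run = 0
            if run:
                lines.append(run)
        total = sum(lines)
        return sum(l for l in lines if l >= l_min) / total if total else 0.0

    # usage sketch: R = recurrence_matrix(signal, eps=0.1 * signal.std())
    #               det = determinism(R)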

Relevance: 20.00%

Abstract:

Heavy-ion total reaction cross-section measurements for more than 1100 reaction cases, covering 61 target nuclei in the range ⁶Li to ²³⁸U and 158 projectile nuclei from ²H to ⁸⁴Kr (mostly exotic ones), have been analyzed in a systematic way by using an empirical, three-parameter formula applicable to projectile kinetic energies above the Coulomb barrier. The analysis has shown that the average total nuclear binding energy per nucleon of the interacting nuclei and their radii are the chief quantities that describe the cross-section patterns. A great amount of the cross-section data (87%) has been quite satisfactorily reproduced by the proposed formula; therefore, total reaction cross-section predictions for new, not yet experimentally investigated reaction cases can be obtained within 25% (or much less) uncertainty.