939 results for monitoring process mean and variance
Abstract:
Environmental data sets of pollutant concentrations in air, water, and soil frequently include unquantified sample values reported only as being below the analytical method detection limit. These values, referred to as censored values, should be considered in the estimation of distribution parameters, as each represents some pollutant concentration between zero and the detection limit. Most of the currently accepted methods for estimating the population parameters of environmental data sets containing censored values rely upon the assumption of an underlying normal (or transformed normal) distribution. This assumption can result in unacceptable levels of error in parameter estimation because of the unbounded left tail of the normal distribution. With the beta distribution, which is bounded over the same range as a distribution of concentrations, $[0 \le x \le 1]$, parameter estimation errors resulting from improper distribution bounds are avoided. This work developed a method that uses the beta distribution to estimate population parameters from censored environmental data sets and evaluated its performance against currently accepted methods that rely upon an underlying normal (or transformed normal) distribution. Data sets were generated assuming values of the mean, standard deviation, and number of variates typical of environmental pollutant evaluation. For each set of model values, data sets were generated assuming that the data were distributed normally, lognormally, or according to a beta distribution. For varying levels of censoring, two established methods of parameter estimation, regression on normal order statistics and regression on lognormal order statistics, were used to estimate the known mean and standard deviation of each data set. The method developed for this study, employing a beta distribution assumption, was also used to estimate parameters, and the relative accuracy of all three methods was compared. For data sets of all three distribution types, and for censoring levels up to 50%, the performance of the new method equaled, if not exceeded, that of the two established methods. Because of its robustness in parameter estimation regardless of distribution type or censoring level, the method employing the beta distribution should be considered for full development in estimating parameters for censored environmental data sets.
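For illustration of the established baseline this abstract benchmarks against, the following minimal sketch implements regression on normal order statistics for a left-censored sample. The function name, the Blom plotting positions, and the synthetic data are assumptions made for this sketch, not the paper's code.

```python
import numpy as np
from scipy import stats

def ros_normal(detects, n_censored):
    """Regression on normal order statistics for left-censored data.

    detects: quantified values (above the detection limit)
    n_censored: number of values reported only as < detection limit
    Returns estimated (mean, std) of the underlying normal distribution.
    """
    n = len(detects) + n_censored
    # Censored values occupy the lowest ranks; detects get the rest
    ranks = np.arange(n_censored + 1, n + 1)
    pp = (ranks - 0.375) / (n + 0.25)      # Blom plotting positions
    z = stats.norm.ppf(pp)                 # normal order-statistic quantiles
    # Linear fit: sorted detected value = mean + std * z
    slope, intercept, *_ = stats.linregress(z, np.sort(detects))
    return intercept, slope                # (estimated mean, estimated std)

# Example: 20 samples, those below the 30th percentile treated as censored
rng = np.random.default_rng(0)
sample = rng.normal(10.0, 2.0, 20)
dl = np.quantile(sample, 0.3)
detects = sample[sample >= dl]
print(ros_normal(detects, n_censored=int((sample < dl).sum())))
```

The estimated mean and standard deviation are read off as the intercept and slope of the fit of the sorted detected values against the normal quantiles of their plotting positions.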
Abstract:
This article describes in short sections the use and interpretation of indirect blood pressure measurements, central venous pressure, body temperature, pulse oximetry, end-tidal CO2 measurements, pulse and heart rate, urine production, and emergency laboratory values. Most of these parameters are directly or indirectly linked to the perfusion of the patient. Optimizing these values is one of the most important goals in emergency and critical care medicine.
Abstract:
For the early diagnosis and therapy of alcohol-related disorders, alcohol biomarkers are highly valuable. Concerning specificity, indirect markers can be influenced by non-ethanol-related factors, whereas direct markers are formed only after ethanol consumption. The sensitivity of the direct markers depends on the cutoffs of the analytical methods and on the material analyzed, and it plays an important role in their utilization in different fields of application. Until recently, the biomarker phosphatidylethanol was used to differentiate between social drinking and alcohol abuse. After method optimization, the detection limit could be lowered, and phosphatidylethanol became sensitive enough to detect even the consumption of low amounts of alcohol. This perspective reviews the most common alcohol biomarkers and summarizes new developments for monitoring alcohol consumption habits.
Abstract:
BACKGROUND AND AIMS The structured IBD Ahead 'Optimised Monitoring' programme was designed to obtain the opinion, insight and advice of gastroenterologists on optimising the monitoring of Crohn's disease activity in four settings: (1) assessment at diagnosis, (2) monitoring in symptomatic patients, (3) monitoring in asymptomatic patients, and (4) postoperative follow-up. For each of these settings, four monitoring methods were discussed: (a) symptom assessment, (b) endoscopy, (c) laboratory markers, and (d) imaging. Based on a literature search and expert opinion compiled during an international consensus meeting, recommendations were given to answer the question 'which diagnostic method, when, and how often'. The International IBD Ahead Expert Panel advised tailoring this guidance to the healthcare system and the specific prerequisites of each country. The IBD Ahead Swiss National Steering Committee proposes best-practice recommendations adapted for Switzerland. METHODS The IBD Ahead Steering Committee identified key questions and provided the Swiss Expert Panel with a structured literature search. The expert panel agreed on a set of statements. During an international expert meeting, the consolidated outcome of the national meetings was merged into final statements agreed by the participating International and National Steering Committee members - the IBD Ahead 'Optimised Monitoring' Consensus. RESULTS A systematic assessment of symptoms, endoscopy findings, and laboratory markers, with special emphasis on faecal calprotectin, is deemed necessary even in symptom-free patients. The choice of recommended imaging methods is adapted to the specific situation in Switzerland and highlights the importance of ultrasonography and magnetic resonance imaging besides endoscopy. CONCLUSION The recommendations stress the importance of monitoring disease activity on a regular basis and by objective parameters, such as faecal calprotectin and endoscopy with detailed documentation of findings. Physicians should not rely on symptoms alone and should adapt the monitoring schedule and choice of options to the individual situation.
Abstract:
Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the 'real' frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest, and rare trajectories may not necessarily be captured owing to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent in both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m-long road section in the Swiss Alps, we present results from 488 rockfalls detected in 1260 trees. We show that tree impact data can not only be used (i) to reconstruct the real frequency of rockfalls for individual cells, but also serve (ii) to calibrate the rockfall model Rockyfor3D and (iii) to transform simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones. The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
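Conceptually, the transformation of simulated trajectories into real frequencies can be pictured as rescaling simulated passage counts by a calibration factor derived from tree-ring frequencies in cells where both are available. The sketch below is a hypothetical illustration of that idea only; it is not the Rockyfor3D workflow, and all names and numbers are assumptions.

```python
import numpy as np

# Hypothetical per-cell data: simulated passage counts from a trajectory
# model and observed rockfall frequencies (events/year) reconstructed
# from impact scars in trees.
sim_passages = np.array([120, 300, 45, 80])    # simulated trajectories per cell
obs_freq = np.array([0.8, 2.1, np.nan, 0.5])   # tree-ring frequencies; NaN = no trees

# Calibration factor: observed events per simulated trajectory, averaged
# over the cells where tree-ring data exist.
mask = ~np.isnan(obs_freq)
k = np.mean(obs_freq[mask] / sim_passages[mask])

# Real-frequency estimate for every cell, including the treeless one.
real_freq = k * sim_passages
print(real_freq)
```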
Abstract:
We describe the recovery of three daily meteorological records for the southern Alps (Domodossola, Riva del Garda, and Rovereto), all starting in the second half of the nineteenth century. We use these new data, along with additional records, to study regional changes in mean temperature and in extreme indices of heat-wave and cold-spell frequency and duration over the period 1874–2015. The records are homogenized using subdaily cloud cover observations as a constraint for the statistical model, an approach not previously applied in the literature. A case study based on a record of parallel observations between a traditional meteorological window and a modern screen shows that the use of cloud cover can reduce the root-mean-square error of the homogenization by up to 30% in comparison with an unaided statistical correction. We find that mean temperature in the southern Alps has increased by 1.4°C per century over the analyzed period, with larger increases in daily minimum temperatures than in maximum temperatures. The number of hot days in summer has more than tripled, and a similar increase is observed in the duration of heat waves. The number of cold days in winter has dropped at a similar rate. These trends are mainly caused by climate change over the last few decades.
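One way to picture a cloud-cover-aided homogenization is as a correction whose size depends on cloudiness, since the radiation error of a window installation is largest on clear days. The sketch below fits such a cloud-dependent bias from hypothetical parallel observations; it is a schematic assumption, not the authors' statistical model.

```python
import numpy as np

# Hypothetical parallel observations: temperature from a traditional
# window installation and from a modern screen, plus cloud cover (okta).
rng = np.random.default_rng(1)
clouds = rng.integers(0, 9, 500)            # 0 = clear sky, 8 = overcast
bias = 1.2 - 0.12 * clouds                  # assumed: window reads warm on clear days
t_screen = rng.normal(15, 8, 500)
t_window = t_screen + bias + rng.normal(0, 0.3, 500)

# Fit the bias as a linear function of cloud cover ...
A = np.column_stack([np.ones_like(clouds), clouds])
coef, *_ = np.linalg.lstsq(A, t_window - t_screen, rcond=None)

# ... and homogenize the old record with a cloud-dependent correction.
t_corrected = t_window - (coef[0] + coef[1] * clouds)
print(np.round(coef, 3))
```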
Abstract:
Land and water management in semi-arid regions requires detailed information on precipitation distribution, including extremes and changes therein. Such information is often lacking. This paper describes statistics of mean and extreme precipitation in a unique data set from the Mount Kenya region, encompassing around 50 stations with at least 30 years of data. We describe the data set, including quality control procedures and statistical break detection. Trends in mean precipitation and in extreme indices calculated from these data for individual rainy seasons are compared with corresponding trends in reanalysis products. From 1979 to 2011, mean precipitation decreased at 75% of the stations during the 'long rains' (March to May) and increased at 70% of the stations during the 'short rains' (October to December). Corresponding trends are found in the number of heavy precipitation days and in the maximum consecutive 5-day precipitation. Conversely, an increase in consecutive dry days within both main rainy seasons is found. However, trends are statistically significant in only very few cases. Reanalysis data sets agree with observations with respect to interannual variability, while correlations are considerably lower for monthly deviations (ratios) from the mean annual cycle. While some products reproduce the rainfall climatology well and others the spatial trend pattern, no product reproduces both.
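The extreme indices named here (heavy precipitation days, maximum consecutive 5-day precipitation, consecutive dry days) follow standard climate-index definitions and can be computed directly from a daily series. The sketch below uses common ETCCDI-style thresholds (1 mm for a wet day, 10 mm for a heavy day) as assumptions.

```python
import numpy as np

def extreme_indices(precip, wet_day=1.0, heavy_day=10.0):
    """Compute three standard precipitation indices from a daily series (mm).

    Returns (number of heavy precipitation days, maximum consecutive
    5-day precipitation Rx5day, longest run of consecutive dry days CDD).
    """
    heavy_days = int(np.sum(precip >= heavy_day))
    # Rx5day: maximum of all rolling 5-day sums
    rx5day = float(np.max(np.convolve(precip, np.ones(5), mode="valid")))
    # CDD: longest run of days below the wet-day threshold
    dry = precip < wet_day
    cdd = run = 0
    for d in dry:
        run = run + 1 if d else 0
        cdd = max(cdd, run)
    return heavy_days, rx5day, cdd

rng = np.random.default_rng(2)
season = rng.gamma(0.4, 8.0, 92)   # hypothetical March-May daily rainfall
print(extreme_indices(season))
```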
Abstract:
Large-scale studies of ocean biogeochemistry and carbon cycling have often partitioned the ocean into regions along lines of latitude and longitude despite the fact that spatially more complex boundaries would be closer to the true biogeography of the ocean. Herein, we define 17 open-ocean biomes classified from four observational data sets: sea surface temperature (SST), spring/summer chlorophyll a concentrations (Chl a), ice fraction, and maximum mixed layer depth (maxMLD) on a 1° × 1° grid. By considering interannual variability for each input, we create dynamic ocean biome boundaries that shift annually between 1998 and 2010. Additionally, we create a core biome map, which includes only the grid cells whose biome assignment does not change across the 13 years of the time-varying biomes. These biomes can be used in future studies to distinguish large-scale ocean regions based on biogeochemical function.
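Computationally, the core biome map reduces to keeping grid cells whose biome label is identical in every year of the record. Below is a minimal sketch under assumed array shapes, with random labels standing in for the real classification.

```python
import numpy as np

# Hypothetical annual biome labels on a 1° x 1° grid:
# shape (years, lat, lon), integer biome IDs 1..17, for 1998-2010.
rng = np.random.default_rng(3)
labels = rng.integers(1, 18, size=(13, 180, 360))

# Core biome map: keep only cells whose assignment never changes
# across the 13 years; all other cells are masked with 0.
stable = np.all(labels == labels[0], axis=0)
core = np.where(stable, labels[0], 0)
print(f"{stable.mean():.1%} of grid cells are core cells")
```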
Abstract:
The liberalization of electricity markets more than ten years ago in the vast majority of developed countries has introduced the need for modelling and forecasting electricity prices and volatilities, in both the short and the long term. There is thus a need for methodology able to deal with the most important features of electricity price series, which are well known for presenting structure not only in the conditional mean but also in time-varying conditional variances. In this work we propose a new model that allows the extraction of conditionally heteroskedastic common factors from the vector of electricity prices. These common factors are estimated jointly with their relationship to the original vector of series and with the dynamics affecting both their conditional mean and variance. The model is estimated under a state-space formulation. The proposed model is applied to extract seasonal common dynamic factors as well as common volatility factors for electricity prices, and the estimation results are used to forecast electricity prices and their volatilities in the Spanish zone of the Iberian Market. Several simplified/alternative models are also considered as benchmarks to illustrate that the proposed approach is superior to all of them in terms of explanatory and predictive power.
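As a schematic of the state-space machinery involved, the sketch below extracts a single common factor from several series with a Kalman filter. It is a deliberately simplified illustration with constant innovation variances and known parameters, not the authors' conditionally heteroskedastic factor model.

```python
import numpy as np

# Minimal one-factor state-space sketch: k observed price series share a
# latent AR(1) common factor f_t, extracted by a Kalman filter.
#   f_t = phi * f_{t-1} + eta_t,   y_t = loadings * f_t + eps_t
# (In the article the factor innovations are conditionally
# heteroskedastic; here the variances are held constant for brevity.)

def kalman_factor(y, loadings, phi, q, r):
    """y: (T, k) observations; returns filtered factor estimates (T,)."""
    T, k = y.shape
    f, p = 0.0, 1.0                              # state mean and variance
    out = np.empty(T)
    for t in range(T):
        # Predict
        f, p = phi * f, phi**2 * p + q
        # Update with the k observations at time t
        S = np.outer(loadings, loadings) * p + np.eye(k) * r
        K = p * loadings @ np.linalg.inv(S)      # Kalman gain (1 x k)
        f = f + K @ (y[t] - loadings * f)
        p = (1.0 - K @ loadings) * p
        out[t] = f
    return out

# Simulated example: three "price" series loading on one common factor
rng = np.random.default_rng(4)
T, lam = 300, np.array([1.0, 0.8, 1.2])
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.95 * f[t - 1] + rng.normal(0, 0.3)
y = f[:, None] * lam + rng.normal(0, 0.5, (T, 3))
fhat = kalman_factor(y, lam, phi=0.95, q=0.09, r=0.25)
print(np.corrcoef(f, fhat)[0, 1])    # filtered factor tracks the truth
```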