1000 results for Rademacher averages


Relevance: 10.00%

Abstract:

INTRODUCTION Echocardiography is the standard clinical approach for quantifying the severity of aortic stenosis (AS). A comprehensive examination of its overall reproducibility, with simultaneous estimation of the variance components contributed by operators, readers, probe applications, and beats, has not been undertaken. METHODS AND RESULTS Twenty-seven subjects with AS were scanned over 7 months in the echocardiography department by a median of 3 different operators. For each patient and each operator, multiple runs of beats from multiple probe positions were stored for later analysis by multiple readers. The coefficient of variation was 13.3%, 15.9%, 17.6%, and 20.2% for the aortic peak velocity (Vmax), aortic velocity time integral (VTI), and left ventricular outflow tract (LVOT) Vmax and VTI, respectively. The largest individual contributors to the overall variability were beat-to-beat variability (9.0%, 9.3%, 9.5%, and 9.4%, respectively) and the inability of an individual operator to apply the probe to precisely the same position twice (8.3%, 9.4%, 12.9%, and 10.7%, respectively). The tracing (intra-reader), reader (inter-reader), and operator (inter-operator) contributions were less important. CONCLUSIONS Reproducibility of measurements in AS is poorer than often reported in the literature. The source of this variability does not appear, as traditionally believed, to result from a lack of training or from operator- and reader-specific factors. Rather, the unavoidable beat-to-beat biological variability and the inherent impossibility of applying the ultrasound probe in exactly the same position each time are the largest contributors. Consequently, guidelines suggesting greater standardisation of procedures and further training for sonographers are unlikely to improve precision. Clinicians themselves should be wary of relying on even three-beat averages, as their expected coefficient of variation is 10.3% for the peak velocity at the aortic valve.
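
To see why averaging over a few beats helps less than one might hope, the sketch below combines independent variance components in quadrature; only the 13.3% total and the 9.0% beat-to-beat coefficient of variation are taken from the abstract, and the simple two-component split is an assumption rather than the study's full variance-component model (which gives 10.3% for three-beat averages).

    import math

    # Illustrative variance components for aortic Vmax, as coefficients of variation (%).
    cv_beat = 9.0                                 # beat-to-beat biological variability (from the abstract)
    cv_other = math.sqrt(13.3**2 - cv_beat**2)    # everything else, so a single beat totals 13.3%

    def cv_of_n_beat_average(n_beats: int) -> float:
        """CV of an n-beat average, assuming independent components:
        averaging shrinks only the beat-to-beat term by 1/sqrt(n)."""
        return math.sqrt(cv_other**2 + cv_beat**2 / n_beats)

    print(round(cv_of_n_beat_average(1), 1))   # 13.3 (single beat)
    print(round(cv_of_n_beat_average(3), 1))   # ~11.1 under this simplified split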

Relevance: 10.00%

Abstract:

Environmental computer models are deterministic models used to predict environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, of unknown calibration, and not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy for fitting the model parameters. Model validation for the eastern United States shows a considerable improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain the spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modeling using some external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
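
For concreteness, the 8-hour average defined above can be computed with a small helper; the function name and the toy hourly values are illustrative only, not part of the forecasting system.

    from typing import Sequence

    def eight_hour_average(observed: Sequence[float], forecast: Sequence[float]) -> float:
        """Current 8-hour average ozone: the previous four observed hours,
        the current observed hour, and the forecasts for the next three hours."""
        if len(observed) < 5 or len(forecast) < 3:
            raise ValueError("need at least 5 observed and 3 forecast hourly values")
        window = list(observed[-5:]) + list(forecast[:3])   # 5 observed + 3 forecast hours
        return sum(window) / 8.0

    # toy usage with made-up hourly ozone values (ppb)
    obs = [41.0, 44.0, 47.0, 50.0, 52.0]    # hours t-4 ... t
    fcst = [53.0, 51.0, 48.0]               # hours t+1 ... t+3
    print(eight_hour_average(obs, fcst))    # 48.25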

Relevance: 10.00%

Abstract:

In this work the turbulence in the atmospheric boundary layer under convective conditions is modelled. To this end, the equations that describe the atmospheric motion are expressed through Reynolds averages and therefore require closures. The work consists in modifying the TKE-l closure used in the BOLAM (Bologna Limited Area Model) forecast model. In particular, the single-column model extracted from BOLAM is used and modified to obtain three further closure schemes: a non-local term is added to the flux-gradient relations used to close the second-order moments appearing in the evolution equation of the turbulent kinetic energy, so that the flux-gradient relations become more suitable for simulating an unstable boundary layer. Finally, the results obtained from the original single-column model and from the three new schemes are compared with each other and with the observations provided by the well-known GABLS2 case from the literature.
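
The abstract does not state the exact form of the added non-local term; a commonly used counter-gradient extension of the local flux-gradient relation for the heat flux, shown here only to illustrate the general idea, is

\[
\overline{w'\theta'} = -K_h\,\frac{\partial \overline{\theta}}{\partial z}
\quad\longrightarrow\quad
\overline{w'\theta'} = -K_h\left(\frac{\partial \overline{\theta}}{\partial z} - \gamma_\theta\right),
\]

where $K_h$ is the eddy diffusivity (built from the turbulent kinetic energy and the mixing length $l$ in a TKE-l scheme) and $\gamma_\theta$ is the non-local counter-gradient correction that permits an upward heat flux even where the local gradient vanishes in a convective layer.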

Relevance: 10.00%

Abstract:

Climate monitoring requires an operational, spatio-temporal analysis of climate variability. With the aim of producing ready-to-use maps on a regular basis, it is helpful to show at a glance the spatial variability of the climate elements and their changes over time. For current and recent years the Deutscher Wetterdienst has developed a standard procedure for producing such maps. The method used to produce them varies between the climate elements, depending on the data basis, the natural variability, and the availability of in-situ data.

As part of the analysis of spatio-temporal variability in this dissertation, several interpolation methods are applied to the mean temperature of the five decades of the years 1951-2000 for a relatively large area, Region VI of the World Meteorological Organization (Europe and the Middle East). Climatologically, the region covers a rather heterogeneous study area extending from Greenland in the north-west to Syria in the south-east.

The central aim of the dissertation is to develop a method for the spatial interpolation of the mean decadal temperature values for Region VI. This method should be suitable in future for the operational production of monthly climate maps, be transferable to other climate elements, and be applicable anywhere given the appropriate software. Two central databases are used in this dissertation: so-called CLIMAT data over land and ship data over the sea.

Essentially, the transfer of the temperature point values to the area by spatial interpolation is carried out in three steps. The first step is a multiple regression that reduces the station values to a common level using the four predictors geographic latitude, elevation above sea level, annual temperature amplitude, and thermal continentality. In the second step the reduced temperature values, the so-called residuals, are interpolated with the radial basis function method from the family of neural network models (NNM). In the last step the interpolated temperature grids are raised back to their original level by inverting the multiple regression of step one with the help of the four predictors.

For every station value the difference between the value estimated by the interpolation and the true measured value is computed and summarized by the geostatistical measure of the root mean square error (RMSE). The central advantages of the procedure are its faithful reproduction of the values, the absence of generalization, and the avoidance of interpolation islands. The developed procedure is transferable to other climate elements such as precipitation, snow depth, or sunshine duration.
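
The three-step procedure lends itself to a compact outline. The following Python sketch only illustrates the workflow described above; the variable names, the thin-plate-spline kernel, and the leave-one-out check of the RMSE are assumptions, and the dissertation's own radial-basis-function (neural network) implementation may differ.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def interpolate_decadal_temperature(X, t, lonlat, grid_X, grid_lonlat):
        """X: station predictors [latitude, elevation, annual amplitude, continentality];
        t: station mean decadal temperatures; lonlat: station coordinates;
        grid_X, grid_lonlat: the same quantities on the target grid."""
        # Step 1: multiple regression on the four predictors; the residuals are
        # the station values reduced to a common level.
        A = np.column_stack([np.ones(len(t)), X])
        beta, *_ = np.linalg.lstsq(A, t, rcond=None)
        residuals = t - A @ beta

        # Step 2: interpolate the residuals spatially with radial basis functions.
        rbf = RBFInterpolator(lonlat, residuals, kernel="thin_plate_spline")
        grid_residuals = rbf(grid_lonlat)

        # Step 3: invert the regression on the grid to raise the interpolated
        # residuals back to their original level.
        grid_A = np.column_stack([np.ones(len(grid_X)), grid_X])
        grid_t = grid_A @ beta + grid_residuals

        # Station-level check: RMSE between estimated and measured values,
        # computed leave-one-out so exact interpolation does not give zero error.
        errors = []
        for i in range(len(t)):
            keep = np.arange(len(t)) != i
            rbf_i = RBFInterpolator(lonlat[keep], residuals[keep], kernel="thin_plate_spline")
            errors.append(A[i] @ beta + rbf_i(lonlat[i:i + 1])[0] - t[i])
        rmse = float(np.sqrt(np.mean(np.square(errors))))
        return grid_t, rmse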

Relevance: 10.00%

Abstract:

Recombinant human tumour necrosis factor (TNF) has a selective effect on angiogenic vessels in tumours. Given that it induces vasoplegia, its clinical use has been limited to administration through isolated limb perfusion (ILP) for regionally advanced melanomas and soft tissue sarcomas of the limbs. When combined with the alkylating agent melphalan, a single ILP produces a very high objective response rate. In melanoma, the complete response (CR) rate is around 80% and the overall objective response rate greater than 90%. In soft tissue sarcomas that are inextirpable, ILP is a neoadjuvant treatment resulting in limb salvage in 80% of the cases. The CR rate averages 20% and the objective response rate is around 80%. The mode of action of TNF-based ILP involves two distinct and successive effects on the tumour-associated vasculature: first, an increase in endothelium permeability leading to improved chemotherapy penetration within the tumour tissue, and second, a selective killing of angiogenic endothelial cells resulting in tumour vessel destruction. The mechanism whereby these events occur involves rapid (of the order of minutes) perturbation of cell-cell adhesive junctions and inhibition of alphavbeta3 integrin signalling in tumour-associated vessels, followed by massive death of endothelial cells and tumour vascular collapse 24 hours later. New, promising approaches for the systemic use of TNF in cancer therapy include TNF targeting by means of single chain antibodies or endothelial cell ligands, or combined administration with drugs perturbing integrin-dependent signalling and sensitizing angiogenic endothelial cells to TNF-induced death.

Relevance: 10.00%

Abstract:

BACKGROUND: Trauma care is expensive. However, reliable data on the exact lifelong costs incurred by a major trauma patient are lacking. Discussion usually focuses on direct medical costs, underestimating the consequential costs resulting from absence from work and permanent disability. METHODS: Direct medical costs and consequential costs of 63 major trauma survivors (Injury Severity Score, ISS >13) treated at a Swiss trauma center from 1995 to 1996 were assessed 5 years after trauma. The following cost evaluation methods were used: the correction cost method (direct cost of restoring the original state), the human capital method (indirect cost of lost productivity), the contingent valuation method (human cost as lost quality of life), and macroeconomic estimates. RESULTS: Mean ISS was 26.8 +/- 9.5 (mean +/- SD). In all, 22 patients (35%) were disabled, incurring discounted average lifelong total costs of USD 1,293,800, compared with 41 patients (65%) who recovered without disability and incurred costs of USD 147,200 (average of both groups, USD 547,800). Two thirds of these costs were attributable to a loss of production, whereas only one third resulted from the cost of correction. Primary hospital treatment (USD 27,800 +/- 37,800) was only a minor fraction of the total cost, less than the estimated cost of police and the judiciary. Loss of quality of life led to considerable intangible human costs, similar in size to the real costs. CONCLUSIONS: Trauma costs are commonly underestimated. Direct medical costs make up only a small part of the total costs. Consequential costs, such as lost productivity, are well in excess of the usual medical costs. Mere cost averages give a false estimate of the costs incurred by patients with and without disabilities.

Relevance: 10.00%

Abstract:

Markov chain Monte Carlo (MCMC) is a method of producing a correlated sample in order to estimate features of a complicated target distribution via simple ergodic averages. A fundamental question in MCMC applications is: when should the sampling stop? That is, when are the ergodic averages good estimates of the desired quantities? We consider a method that stops the MCMC sampling the first time the width of a confidence interval based on the ergodic averages is less than a user-specified value. Calculating Monte Carlo standard errors is therefore a critical step in assessing the output of the simulation. In particular, we consider the regenerative simulation and batch means methods of estimating the variance of the asymptotic normal distribution. We describe sufficient conditions for the strong consistency and asymptotic normality of both methods and investigate their finite-sample properties in a variety of examples.
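
A minimal sketch of the batch-means estimator and the fixed-width stopping rule described above; the function names, the batch-size choice, and the defaults are illustrative, and the regenerative-simulation variant is not shown.

    import numpy as np

    def batch_means_se(x: np.ndarray) -> float:
        """Monte Carlo standard error of the ergodic average via batch means,
        using about sqrt(n) batches of size sqrt(n)."""
        n = len(x)
        b = int(np.floor(np.sqrt(n)))        # batch size
        a = n // b                           # number of batches
        means = x[: a * b].reshape(a, b).mean(axis=1)
        var_hat = b * np.var(means, ddof=1)  # estimate of the asymptotic variance
        return float(np.sqrt(var_hat / n))

    def run_until_fixed_width(next_draw, eps=0.01, z=1.96, min_n=1000, check_every=1000):
        """Keep sampling (next_draw() returns the chain's next correlated draw)
        until the half-width z * SE of the interval for the ergodic average
        falls below the user-specified eps."""
        draws = []
        while True:
            draws.extend(next_draw() for _ in range(check_every))
            if len(draws) >= min_n:
                x = np.asarray(draws)
                if z * batch_means_se(x) < eps:
                    return x.mean(), len(draws)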

Relevance: 10.00%

Abstract:

We propose a method for diagnosing confounding bias under a model which links a spatially and temporally varying exposure and health outcome. We decompose the association into orthogonal components, corresponding to distinct spatial and temporal scales of variation. If the model fully controls for confounding, the exposure effect estimates should be equal at the different temporal and spatial scales. We show that the overall exposure effect estimate is a weighted average of the scale-specific exposure effect estimates. We use this approach to estimate the association between monthly averages of fine particles (PM2.5) over the preceding 12 months and monthly mortality rates in 113 U.S. counties from 2000-2002. We decompose the association between PM2.5 and mortality into two components: 1) the association between “national trends” in PM2.5 and mortality; and 2) the association between “local trends,” defined as county-specific deviations from national trends. This second component provides evidence as to whether counties having steeper declines in PM2.5 also have steeper declines in mortality relative to their national trends. We find that the exposure effect estimates are different at these two spatio-temporal scales, which raises concerns about confounding bias. We believe that the association between trends in PM2.5 and mortality at the national scale is more likely to be confounded than is the association between trends in PM2.5 and mortality at the local scale. If the association at the national scale is set aside, there is little evidence of an association between 12-month exposure to PM2.5 and mortality.
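
As a rough illustration of the weighted-average statement above (notation is mine, not the paper's): if the exposure is split into orthogonal components, $X = X_{\text{national}} + X_{\text{local}}$, then the slope from a single regression of the outcome on $X$ satisfies

\[
\hat{\beta}_{\text{overall}} = w\,\hat{\beta}_{\text{national}} + (1-w)\,\hat{\beta}_{\text{local}},
\qquad
w = \frac{\operatorname{Var}(X_{\text{national}})}{\operatorname{Var}(X_{\text{national}}) + \operatorname{Var}(X_{\text{local}})},
\]

so that with full control of confounding the two scale-specific estimates coincide and the weighting is immaterial, whereas a gap between them, as found here, points to confounding at one of the scales.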

Relevance: 10.00%

Abstract:

We used differential GPS measurements from a 13-station GPS network spanning the Santa Ana Volcano and Coatepeque Caldera to characterize the inter-eruptive activity and tectonic movements near these two active and potentially hazardous features. Caldera-forming events occurred 70-40 ka, and eruptive activity at the Santa Ana/Izalco volcanoes occurred as recently as 2005. Twelve differential stations were surveyed for 1 to 2 hours on a monthly basis from February through September 2009 and tied to a centrally located continuous GPS station, which serves as the reference site for this volcanic network. Repeatabilities of the averages from 20-minute sessions taken over 20 hours or longer range from 2-11 mm in the horizontal (north and east) components of the inter-station baselines, suggesting a lower detection limit for the horizontal components of any short-term tectonic or volcanic deformation. Repeatabilities of the vertical baseline component range from 12-34 mm. Analysis of the precipitable water vapor in the troposphere suggests that tropospheric decorrelation as a function of baseline length and variable site elevations is the most likely source of vertical error. Differential motions of the 12 sites relative to the continuous reference site reveal inflation from February through July at several sites surrounding the caldera, with vertical displacements ranging from 61 mm to 139 mm, followed by a lower-magnitude deflation event on 1.8-7.4 km-long baselines. Uplift rates for the inflationary period reach 300 mm/yr with 1σ uncertainties of +/- 26-119 mm. Only one other station outside the caldera exhibits a similar deformation trend, suggesting a localized source. The results suggest that differential GPS measurements from short-duration occupations over short baselines can be a useful monitoring tool at sub-tropical volcanoes and calderas.
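
As one simple illustration of what "repeatability" could mean here (the exact computation used in the survey is not stated in the abstract), it can be taken as the scatter of the per-session averages of a baseline component.

    import numpy as np

    def repeatability(session_means: np.ndarray) -> float:
        """Repeatability of one baseline component: the sample standard deviation
        of the 20-minute session averages from a long occupation (illustrative
        definition; a weighted or detrended variant may be used in practice)."""
        return float(np.std(session_means, ddof=1))

    # toy usage: north-component session averages for one baseline, in mm
    north_mm = np.array([3.1, -1.4, 0.8, 2.2, -0.5, 1.9, -2.7, 0.3])
    print(f"repeatability: {repeatability(north_mm):.1f} mm")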

Relevance: 10.00%

Abstract:

In recent years, advanced metering infrastructure (AMI) has become a main research focus because the traditional power grid can no longer meet development requirements. There has been an ongoing effort to increase the number of AMI devices that provide real-time data readings to improve system observability. AMI deployed across secondary distribution networks provides load and consumption information for individual households, which can improve grid management. The significant upgrade costs associated with retrofitting existing meters with network-capable sensing can be reduced by using image-processing methods to extract usage information from images of the existing meters. This thesis presents a new solution that uses online exchange of power-consumption data with a cloud server without modifying the existing electromechanical analog meters. In this framework, a systematic approach to extracting energy data from images replaces the manual reading process. A case study compares the digital-imaging approach with the averages determined by visual readings over a one-month period.