50 results for: 1 sigma standard deviation for the average

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 100.00%

Abstract:

We present results from an intercomparison program of CO2, δ(O2/N2) and δ13CO2 measurements from atmospheric flask samples. Flask samples are collected on a bi-weekly basis at the High Altitude Research Station Jungfraujoch in Switzerland for three European laboratories: the University of Bern, Switzerland; the University of Groningen, the Netherlands; and the Max Planck Institute for Biogeochemistry in Jena, Germany. Almost four years of measurements of CO2, δ(O2/N2) and δ13CO2 are compared in this paper to assess the measurement compatibility of the three laboratories. While the average difference of the CO2 measurements between the laboratories in Bern and Jena meets the compatibility goal defined by the World Meteorological Organization, the standard deviation of the average differences between all laboratories is not within the required goal. However, the obtained annual trends and seasonalities are the same within their estimated uncertainties. For δ(O2/N2), significant differences are observed between the three laboratories. The comparison for δ13CO2 yields the least compatible results, and the required goals are not met between the three laboratories. Our study shows the importance of regular intercomparison exercises for identifying potential biases between laboratories and the need to improve the quality of atmospheric measurements.
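
The compatibility check described above reduces to simple statistics on paired flask measurements. The following minimal sketch illustrates the two quantities compared in the paper, the mean inter-laboratory difference and its 1-sigma scatter; all numbers are invented, and the ±0.1 ppm CO2 threshold is our assumption for the commonly cited WMO/GAW goal:

```python
import numpy as np

# Hypothetical paired flask results (ppm CO2) for the same sampling dates,
# standing in for two of the laboratory records discussed above.
co2_lab1 = np.array([395.12, 396.48, 394.77, 397.02, 398.31])
co2_lab2 = np.array([395.05, 396.61, 394.70, 397.15, 398.22])

diff = co2_lab1 - co2_lab2
mean_diff = diff.mean()        # average inter-laboratory difference
sd_diff = diff.std(ddof=1)     # 1-sigma scatter of the differences

WMO_CO2_GOAL = 0.1  # ppm; assumed value of the WMO/GAW compatibility goal
print(f"mean diff = {mean_diff:+.3f} ppm, sd = {sd_diff:.3f} ppm")
print("mean within goal:", abs(mean_diff) <= WMO_CO2_GOAL)
print("scatter within goal:", sd_diff <= WMO_CO2_GOAL)
```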

Relevance: 100.00%

Abstract:

Telomeres have emerged as crucial cellular elements in aging and various diseases including cancer. To measure the average length of telomere repeats in cells, we describe our protocols that use fluorescence in situ hybridization (FISH) with labeled peptide nucleic acid (PNA) probes specific for telomere repeats in combination with fluorescence measurements by flow cytometry (flow FISH). Flow FISH analysis can be performed using commercially available flow cytometers and has the unique advantage over other telomere-length measurement methods of providing multi-parameter information on the length of telomere repeats in thousands of individual cells. The accuracy and reproducibility of the measurements are improved by automating most pipetting (aspiration and dispensing) steps and by including an internal standard (control cells) with a known telomere length in every tube. The basic protocol for the analysis of nucleated blood cells from 22 different individuals takes about 12 h spread over 2-3 days.
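
The role of the internal standard can be illustrated with a simple ratio calibration; this formula is our illustration only, as the published protocol's actual correction steps are more involved:

```python
# Hypothetical flow-FISH readout: background-corrected mean fluorescence
# intensity (MFI) of the telomere-PNA probe for sample and internal control.
def telomere_length_kb(sample_mfi, control_mfi, control_length_kb):
    """Estimate average telomere length relative to an internal standard
    of known telomere length (names and scaling are illustrative)."""
    return sample_mfi / control_mfi * control_length_kb

# Example: control cell line with a known average telomere length of 10 kb.
print(telomere_length_kb(sample_mfi=820.0, control_mfi=1025.0,
                         control_length_kb=10.0))  # -> 8.0 kb
```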

Relevance: 100.00%

Abstract:

BACKGROUND: A high proportion of patients with essential hypertension need a combination therapy to reach the therapeutic goal. In the present study, the tolerability and efficacy of a fixed, once-daily combination of the AT1 blocker losartan (100 mg) and the diuretic hydrochlorothiazide (HCTZ) (25 mg) was investigated in patients in a real-life setting. Special consideration was given to the results of ambulatory 24-hour blood pressure (ABP) measurements. METHODS: The open-label, prospective, non-interventional surveillance study took place from October 2005 to June 2006. A total of 1139 patients over 18 years of age were included whose blood pressure could not be adequately controlled with HCTZ alone and for whom an individual dose titration of losartan and HCTZ had already been performed. RESULTS: The average age (± standard deviation) of the patients was 61.2 ± 11.6 years; 55.8% were men. Comorbidities were common: left ventricular hypertrophy was present in 3.1% of the patients, coronary heart disease in 30.1%, chronic heart failure in 11.8% and status post myocardial infarction in 10.5%. In addition to the losartan/HCTZ treatment, 61.0% of the patients received a second antihypertensive medicine. After an average treatment duration of 50.4 ± 17.2 days, the baseline systolic blood pressure of 160.8 ± 16.3 mmHg decreased by 24.0 ± 17.0 mmHg (-14.4%) and the diastolic blood pressure of 94.4 ± 9.9 mmHg decreased by 11.8 ± 10.2 mmHg (-11.8%). For the ABP measurements, the overall average systolic and diastolic blood pressures fell by 16.9 ± 14.2 mmHg and 8.8 ± 10.3 mmHg, the daytime averages by 17.3 ± 14.8 mmHg and 9.0 ± 10.2 mmHg, and the night-time averages by 15.1 ± 17.6 mmHg and 7.8 ± 11.7 mmHg, respectively. In twelve of the 1139 patients (1.1%), a total of 15 adverse events occurred; a causal connection with the medication was suspected in only one case (one patient with three events). CONCLUSION: The combination of losartan/HCTZ 100/25 mg, as the sole therapy or in addition to other antihypertensive medicines, was well tolerated and effective in a real-life setting for patients, many of whom had comorbidities. Efficacy was also demonstrated during the night by the ABP measurements.

Relevance: 100.00%

Abstract:

The Twentieth Century Reanalysis (20CR) is an atmospheric dataset consisting of 56 ensemble members, which covers the entire globe and reaches back to 1871. To assess the suitability of this dataset for studying past extremes, we analysed a prominent extreme event, namely the Galveston Hurricane, which made landfall in September 1900 in Texas, USA. The ensemble mean of 20CR shows a track of the pressure minimum with a small standard deviation among the 56 ensemble members in the area of the Gulf of Mexico. However, there are systematic differences between the assimilated “Best Track” from the International Best Track Archive for Climate Stewardship (IBTrACS) and the ensemble mean track in 20CR. East of the Strait of Florida, the tracks derived from 20CR are located systematically northeast of the assimilated track, while in the Gulf of Mexico the 20CR tracks are systematically shifted to the southwest compared with the IBTrACS position. The hurricane can also be observed in the wind field, which shows a cyclonic rotation and a relatively calm zone in the centre of the hurricane. The 20CR data reproduce the pressure gradient and the cyclonic wind field. Regarding the amplitude of the wind speeds, however, the ensemble mean values from 20CR are significantly lower than the wind speeds known from measurements.
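
Deriving a per-member track of the pressure minimum, and the ensemble spread around it, is conceptually simple. A toy sketch for a single analysis time follows; the pressure field is synthetic and all numbers are invented:

```python
import numpy as np

# Toy stand-in for one 20CR analysis time: 56 ensemble members of sea-level
# pressure on a lat/lon grid, with a synthetic low near 25N, 85W plus noise.
rng = np.random.default_rng(42)
nens, nlat, nlon = 56, 41, 61
lats = np.linspace(15.0, 35.0, nlat)
lons = np.linspace(-100.0, -70.0, nlon)
LON, LAT = np.meshgrid(lons, lats)
low = -40.0 * np.exp(-((LAT - 25.0) ** 2 + (LON + 85.0) ** 2) / 8.0)
slp = 1015.0 + low + rng.normal(0.0, 1.0, size=(nens, nlat, nlon))

# Locate the pressure minimum in every member, then summarise the spread.
flat_idx = slp.reshape(nens, -1).argmin(axis=1)
iy, ix = np.unravel_index(flat_idx, (nlat, nlon))
centre_lat, centre_lon = lats[iy], lons[ix]
print(f"mean centre: {centre_lat.mean():.2f}N {abs(centre_lon.mean()):.2f}W")
print(f"1-sigma spread: {centre_lat.std(ddof=1):.2f} deg lat, "
      f"{centre_lon.std(ddof=1):.2f} deg lon")
```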

Relevance: 100.00%

Abstract:

Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable for small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss performance index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
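
To make the small-investor extensions concrete, here is a minimal sketch of one of the four models, the mean-absolute-deviation (MAD) variant, as a mixed-integer program with integral transaction units and a flat per-stock fee. All data, the fee and the risk cap are invented, and the formulation is our simplified reading, not the paper's exact model:

```python
import pulp  # MILP modeller; solved with the bundled CBC solver

prices = {"A": 50.0, "B": 120.0, "C": 80.0}            # CHF per share
scenarios = {                                           # historical returns
    "A": [0.01, -0.02, 0.03, 0.00],
    "B": [0.02, 0.01, -0.01, 0.02],
    "C": [-0.01, 0.03, 0.02, -0.02],
}
T = 4
mu = {s: sum(r) / T for s, r in scenarios.items()}      # mean returns
capital, fee, risk_cap = 10_000.0, 9.0, 150.0           # CHF (assumptions)

m = pulp.LpProblem("small_investor_MAD", pulp.LpMaximize)
n = {s: pulp.LpVariable(f"n_{s}", lowBound=0, cat="Integer") for s in prices}
y = {s: pulp.LpVariable(f"y_{s}", cat="Binary") for s in prices}
d = [pulp.LpVariable(f"d_{t}", lowBound=0) for t in range(T)]

m += pulp.lpSum(mu[s] * prices[s] * n[s] for s in prices)  # expected gain (CHF)
m += pulp.lpSum(prices[s] * n[s] + fee * y[s] for s in prices) <= capital
for s in prices:                               # fee is paid iff stock is bought
    m += prices[s] * n[s] <= capital * y[s]
for t in range(T):                             # linearised absolute deviations
    dev = pulp.lpSum((scenarios[s][t] - mu[s]) * prices[s] * n[s] for s in prices)
    m += d[t] >= dev
    m += d[t] >= -dev
m += pulp.lpSum(d) <= T * risk_cap             # mean |deviation| <= risk cap

m.solve(pulp.PULP_CBC_CMD(msg=False))
print({s: int(n[s].value()) for s in prices})
```

The variance-based, maximum-loss and conditional value-at-risk variants differ only in how the risk constraint is expressed.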

Relevance: 100.00%

Abstract:

The planning of refractive surgical interventions is a challenging task. Numerical modeling has been proposed as a solution to support surgical intervention and predict visual acuity, but validation on patient-specific interventions has been missing. The purpose of this study was to validate numerical predictions of the post-operative corneal topography induced by the incisions required for cataract surgery. The corneal topography of 13 patients was assessed preoperatively and postoperatively (1-day and 30-day follow-up) with a Pentacam tomography device. The preoperatively acquired corneal geometry – anterior and posterior surfaces and pachymetry data – was used to build patient-specific finite element models. For each patient, the effects of the cataract incisions were simulated numerically and the resulting corneal surfaces were compared to the clinical postoperative measurements at the one-day and 30-day follow-ups. Results showed that the model reproduced the experimental measurements with an error on the surgically induced sphere of 0.38 D one day postoperatively and 0.19 D 30 days postoperatively. The standard deviation of the surgically induced cylinder was 0.54 D at the first postoperative day and 0.38 D 30 days postoperatively. The prediction errors in surface elevation and curvature were below the topography measurement device accuracy of ±5 μm and ±0.25 D at the 30-day follow-up. These results show that finite element simulations of corneal biomechanics can predict the post-operative corneal shape after cataract surgery to within the accuracy of the topography measurement device. We conclude that numerical simulation can become a valuable tool for planning corneal incisions in cataract surgery and other ophthalmosurgical procedures in order to optimize patients' refractive outcome and visual function.
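
One common way to quantify surgically induced cylinder, shown here purely for illustration (the abstract does not specify the vector convention used), is to represent each cylinder/axis pair as a double-angle vector and take the magnitude of the pre-to-post difference:

```python
import math

def cyl_vector(cyl_d, axis_deg):
    """Cylinder (dioptres) at an axis (degrees) as a double-angle vector."""
    a = math.radians(2.0 * axis_deg)
    return (cyl_d * math.cos(a), cyl_d * math.sin(a))

def induced_cylinder(pre_cyl, pre_axis, post_cyl, post_axis):
    """Magnitude of the surgically induced cylinder in dioptres."""
    x0, y0 = cyl_vector(pre_cyl, pre_axis)
    x1, y1 = cyl_vector(post_cyl, post_axis)
    return math.hypot(x1 - x0, y1 - y0)

# Example (invented): 0.75 D @ 90 preop -> 1.25 D @ 100 postop
print(f"{induced_cylinder(0.75, 90, 1.25, 100):.2f} D")  # -> 0.60 D
```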

Relevance: 100.00%

Abstract:

Middle-atmospheric water vapour can be used as a tracer for dynamical processes. It is mainly measured by satellite instruments and ground-based microwave radiometers. Ground-based instruments capable of measuring middle-atmospheric water vapour are sparse but valuable as they complement satellite measurements, are relatively easy to maintain and have a long lifetime. MIAWARA-C is a ground-based microwave radiometer for middle-atmospheric water vapour designed for use on measurement campaigns for both atmospheric case studies and instrument intercomparisons. MIAWARA-C's retrieval version 1.1 (v1.1) is set up in such a way as to provide a consistent data set even if the instrument is operated from different locations on a campaign basis. The sensitive altitude range for v1.1 extends from 4 hPa (37 km) to 0.017 hPa (75 km). For v1.1 the estimated systematic error is approximately 10% at all altitudes; at lower altitudes it is dominated by uncertainties in the calibration, while with increasing altitude the influence of spectroscopic and temperature uncertainties grows. The estimated random error increases with altitude from 5 to 25%. MIAWARA-C measures two polarisations of the incident radiation in separate receiver channels and can therefore provide two measurements of the same air mass with independent instrumental noise. The standard deviation of the difference between the profiles obtained from the two polarisations is in excellent agreement with the estimated random measurement error of v1.1. In this paper, the quality of the v1.1 data is assessed for measurements obtained at two different locations: (1) a total of 25 months of measurements in the Arctic (Sodankylä, 67.37° N, 26.63° E) and (2) nine months of measurements at mid-latitudes (Zimmerwald, 46.88° N, 7.46° E). For both locations, MIAWARA-C's profiles are compared to measurements from the satellite experiments Aura MLS and MIPAS. In addition, comparisons to ACE-FTS and SOFIE are presented for the Arctic and to the ground-based radiometer MIAWARA for the mid-latitude campaigns. In general, all intercomparisons show high correlation coefficients, confirming the ability of MIAWARA-C to monitor temporal variations on the order of days. The biases are generally below 13% and within the estimated systematic uncertainty of MIAWARA-C. No consistent wet or dry bias is identified for MIAWARA-C. In addition, comparisons to the reference instruments indicate that the estimated random error of v1.1 is a realistic measure of the random variation in the retrieved profiles between 45 and 70 km.
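
The consistency check between the two polarisation channels can be made quantitative. Assuming the two retrieved profiles $x_1$, $x_2$ carry independent random errors of equal size $\sigma$ (our simplification, not necessarily the authors' exact convention), simple error propagation gives

$$\operatorname{sd}(x_1 - x_2) = \sqrt{\sigma_1^2 + \sigma_2^2} = \sqrt{2}\,\sigma \qquad (\sigma_1 = \sigma_2 = \sigma),$$

so the single-channel random error can be validated by comparing $\operatorname{sd}(x_1 - x_2)/\sqrt{2}$ with the estimated random error of v1.1.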

Relevance: 100.00%

Abstract:

We examined outcomes and trends in surgery and radiation use for patients with locally advanced esophageal cancer, for whom the optimal treatment is unclear. Trends in surgery and radiation for patients with T1-T3N1M0 squamous cell or adenocarcinoma of the mid or distal esophagus in the Surveillance, Epidemiology, and End Results (SEER) database from 1998 to 2008 were analyzed using generalized linear models including year as predictor; SEER does not record chemotherapy data. Local treatment was classified as unimodal if patients had only surgery or radiation and bimodal if they had both. Five-year cancer-specific survival (CSS) and overall survival (OS) were analyzed using propensity-score adjusted Cox proportional-hazard models. Overall 5-year survival for the 3295 patients identified (mean age 65.1 years, standard deviation 11.0) was 18.9% (95% confidence interval: 17.3-20.7). Local treatment was bimodal for 1274 (38.7%) and unimodal for 2021 (61.3%) patients; 1325 (40.2%) had radiation alone and 696 (21.1%) underwent only surgery. The use of bimodal therapy (32.8-42.5%, P = 0.01) and radiation alone (29.3-44.5%, P < 0.001) increased significantly from 1998 to 2008. Bimodal therapy predicted improved CSS (hazard ratio [HR]: 0.68, P < 0.001) and OS (HR: 0.58, P < 0.001) compared with unimodal therapy. For the first 7 months (before the survival curves crossed), CSS after radiation therapy alone was similar to surgery alone (HR: 0.86, P = 0.12), while OS was worse for surgery only (HR: 0.70, P = 0.001). However, after that initial timeframe, radiation therapy alone was associated with worse CSS (HR: 1.43, P < 0.001) and OS (HR: 1.46, P < 0.001). The use of radiation to treat locally advanced mid and distal esophageal cancers increased from 1998 to 2008. Survival was best when both surgery and radiation were used.
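
A propensity-score adjusted Cox model of the kind described can be sketched as follows. The toy data, the single covariate and the adjustment-by-covariate approach are our assumptions, since the paper's exact specification is not given here:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.DataFrame({                      # invented data standing in for SEER
    "age":     [62, 70, 58, 66, 74, 61, 69, 55, 72, 60],
    "bimodal": [ 1,  0,  1,  0,  0,  1,  1,  0,  1,  0],
    "months":  [34,  7, 12, 40,  5, 41,  9, 28, 22, 15],
    "event":   [ 0,  1,  1,  0,  1,  0,  1,  1,  0,  1],   # 1 = death
})

# Step 1: propensity score = estimated probability of receiving bimodal therapy.
ps_model = LogisticRegression().fit(df[["age"]], df["bimodal"])
df["pscore"] = ps_model.predict_proba(df[["age"]])[:, 1]

# Step 2: Cox proportional-hazards model adjusted for the propensity score.
cph = CoxPHFitter()
cph.fit(df[["bimodal", "pscore", "months", "event"]],
        duration_col="months", event_col="event")
cph.print_summary()
```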

Relevance: 100.00%

Abstract:

RATIONALE In biomedical journals, authors sometimes use the standard error of the mean (SEM) for data description, which has been called inappropriate or incorrect. OBJECTIVE To assess the frequency of incorrect use of the SEM in articles in three selected cardiovascular journals. METHODS AND RESULTS All original journal articles published in 2012 in Cardiovascular Research, Circulation: Heart Failure and Circulation Research were assessed by two assessors for inappropriate use of the SEM when providing descriptive information on empirical data. We also assessed whether the authors stated in the methods section that the SEM would be used for data description. Of 441 articles included in this survey, 64% (282 articles) contained at least one instance of incorrect use of the SEM, with two journals having a prevalence above 70% and "Circulation: Heart Failure" having the lowest value (27%). In 81% of the articles with incorrect use of the SEM, the authors had explicitly stated that they used the SEM for data description, and in 89% SEM error bars were used instead of 95% confidence intervals. Basic science studies had a 7.4-fold higher level of inappropriate SEM use (74%) than clinical studies (10%). LIMITATIONS The selection of the three cardiovascular journals was based on a subjective initial impression of inappropriate SEM use. The observed results are not representative of all cardiovascular journals. CONCLUSION In three selected cardiovascular journals we found a high level of inappropriate SEM use, often accompanied by explicit methods statements declaring its use for data description, especially in basic science studies. To improve this situation, these and other journals should provide clear instructions to authors on how to report descriptive information on empirical data.
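
The distinction at the heart of the survey is that the standard deviation (SD) describes the spread of the data, while the standard error of the mean, SEM = SD/√n, describes the precision of the estimated mean and shrinks as the sample grows. A minimal demonstration with toy numbers:

```python
import numpy as np

# SD stays roughly constant as n grows; SEM = SD / sqrt(n) keeps shrinking,
# which is why SEM understates variability when used for data description.
rng = np.random.default_rng(1)
for n in (10, 100, 1000):
    x = rng.normal(loc=120.0, scale=15.0, size=n)  # invented measurements
    sd = x.std(ddof=1)
    sem = sd / np.sqrt(n)
    print(f"n={n:5d}  SD={sd:5.1f}  SEM={sem:5.2f}")
```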

Relevance: 100.00%

Abstract:

HIV-1-infected cells in peripheral blood can be grouped into different transcriptional subclasses. Quantifying the turnover of these cellular subclasses can provide important insights into the viral life cycle and the generation and maintenance of latently infected cells. We used previously published data from five patients chronically infected with HIV-1 who initiated combination antiretroviral therapy (cART). Patient-matched PCR for unspliced and multiply spliced viral RNAs combined with limiting dilution analysis provided measurements of transcriptional profiles at the single-cell level. Furthermore, measurement of intracellular transcripts and extracellular virion-enclosed HIV-1 RNA allowed us to distinguish productive from non-productive cells. We developed a mathematical model describing the dynamics of plasma virus and the transcriptional subclasses of HIV-1-infected cells. Fitting the model to the data allowed us to better understand the phenotype of different transcriptional subclasses and their contribution to the overall turnover of HIV-1 before and during cART. The average number of virus-producing cells in peripheral blood is small during chronic infection. We find that a substantial fraction of cells can become defectively infected. Assuming that the infection is homogeneous throughout the body, we estimate an average in vivo viral burst size on the order of 10⁴ virions per cell. Our study provides novel quantitative insights into the turnover and development of different subclasses of HIV-1-infected cells, and indicates that cells containing solely unspliced viral RNA are a good marker for viral latency. The model illustrates how the pool of latently infected cells becomes rapidly established during the first months of acute infection and continues to increase slowly during the first years of chronic infection. A detailed understanding of this process will be useful for evaluating viral eradication strategies that aim to deplete the latent reservoir of HIV-1.
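
The burst-size estimate can be read in terms of the standard viral-dynamics framework (an assumption on our part; the abstract does not spell out the model equations): with virus-producing cells $P$ dying at rate $\delta$ and producing virions at rate $p$, the burst size is the expected number of virions a cell produces over its lifetime,

$$\frac{dV}{dt} = pP - cV, \qquad N = \frac{p}{\delta} \approx 10^{4}\ \text{virions per cell},$$

where $c$ is the clearance rate of plasma virus $V$.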

Relevance: 100.00%

Abstract:

We present precise iron stable isotope ratios measured by multicollector-ICP mass spectrometry (MC-ICP-MS) of human red blood cells (erythrocytes) and blood plasma from 12 healthy male adults taken during a clinical study. The accurate determination of stable isotope ratios in plasma first required substantial method development work, as minor iron amounts in plasma had to be separated from a large organic matrix prior to mass-spectrometric analysis to avoid spectroscopic interferences and shifts in the mass spectrometer's mass-bias. The 56Fe/54Fe ratio in erythrocytes, expressed as permil difference from the “IRMM-014” iron reference standard (δ56/54Fe), ranges from −3.1‰ to −2.2‰, a range typical for male Caucasian adults. The individual subject erythrocyte iron isotope composition can be regarded as uniform over the 21 days investigated, as variations (±0.059 to ±0.15‰) are mostly within the analytical precision of reference materials. In plasma, δ56/54Fe values measured in two different laboratories range from −3.0‰ to −2.0‰, and are on average 0.24‰ higher than those in erythrocytes. However, this difference is barely resolvable within one standard deviation of the differences (0.22‰). Taking into account the possible contamination due to hemolysis (iron concentrations are only 0.4 to 2 ppm in plasma compared to approx. 480 ppm in erythrocytes), we model the pure plasma δ56/54Fe to be on average 0.4‰ higher than that in erythrocytes. Hence, the plasma iron isotope signature lies between that of the liver and that of erythrocytes. This difference can be explained by redox processes involved during cycling of iron between transferrin and ferritin.
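
For reference, the delta notation used above is standard, and the hemolysis correction can be read as a two-component mixture (the mixing relation is our simplified mass balance, not necessarily the authors' exact model):

$$\delta^{56/54}\mathrm{Fe} = \left[\frac{(^{56}\mathrm{Fe}/^{54}\mathrm{Fe})_{\mathrm{sample}}}{(^{56}\mathrm{Fe}/^{54}\mathrm{Fe})_{\mathrm{IRMM\text{-}014}}} - 1\right] \times 1000\ \text{‰}, \qquad \delta_{\mathrm{meas}} = f\,\delta_{\mathrm{ery}} + (1 - f)\,\delta_{\mathrm{plasma}},$$

where $f$ is the fraction of the measured plasma iron contributed by lysed erythrocytes. With erythrocyte iron at roughly 480 ppm against 0.4 to 2 ppm in plasma, even trace hemolysis makes $f$ non-negligible, which is why the pure plasma value must be modelled rather than measured directly.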

Relevance: 100.00%

Abstract:

The decomposition technique introduced by Blinder (1973) and Oaxaca (1973) is widely used to study outcome differences between groups. For example, the technique is commonly applied to the analysis of the gender wage gap. However, despite the procedure's frequent use, very little attention has been paid to the issue of estimating the sampling variances of the decomposition components. We therefore suggest an approach that introduces consistent variance estimators for several variants of the decomposition. The accuracy of the new estimators under ideal conditions is illustrated with the results of a Monte Carlo simulation. As a second check, the estimators are compared to bootstrap results obtained using real data. In contrast to previously proposed statistics, the new method takes into account the extra variation imposed by stochastic regressors.
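
For readers unfamiliar with the technique, a minimal twofold decomposition looks as follows. This numpy sketch with simulated data illustrates the point estimates only, not the variance estimators proposed in the paper:

```python
import numpy as np

# Twofold Blinder-Oaxaca decomposition: gap = explained + unexplained,
# using group B's coefficients as the reference. Data are simulated.
def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(7)
nA, nB = 300, 300
XA = np.column_stack([np.ones(nA), rng.normal(12, 2, nA)])  # const, education
XB = np.column_stack([np.ones(nB), rng.normal(11, 2, nB)])
yA = XA @ np.array([1.0, 0.08]) + rng.normal(0, 0.3, nA)    # log wages
yB = XB @ np.array([0.8, 0.07]) + rng.normal(0, 0.3, nB)

bA, bB = ols(XA, yA), ols(XB, yB)
xbarA, xbarB = XA.mean(axis=0), XB.mean(axis=0)

gap = yA.mean() - yB.mean()
explained = (xbarA - xbarB) @ bB      # endowments, valued at B's returns
unexplained = xbarA @ (bA - bB)       # coefficients component
print(f"gap={gap:.3f} explained={explained:.3f} unexplained={unexplained:.3f}")
```

Because OLS with an intercept guarantees that each group mean outcome equals the mean covariates times the fitted coefficients, the two components sum to the gap exactly.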