902 results for errors-in-variables model


Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Abstract:

We address the problem of selecting the best linear unbiased predictor (BLUP) of the latent value (e.g., serum glucose fasting level) of sample subjects with heteroskedastic measurement errors. Using a simple example, we compare the usual mixed model BLUP to a similar predictor based on a finite population mixed model (FPMM) setup with two sources of variability, the first of which corresponds to simple random sampling and the second to heteroskedastic measurement errors. Under this last approach, we show that when measurement errors are subject-specific, the BLUP shrinkage constants are based on a pooled measurement error variance, as opposed to the individual ones generally considered for the usual mixed model BLUP. In contrast, when the heteroskedastic measurement errors are measurement condition-specific, the FPMM BLUP involves different shrinkage constants. We also show that in this setup, when measurement errors are subject-specific, the usual mixed model predictor is biased but has a smaller mean squared error than the FPMM BLUP, which points to some difficulties in the interpretation of such predictors. (C) 2011 Elsevier B.V. All rights reserved.
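To make the contrast concrete, here is a minimal sketch of the two shrinkage schemes on synthetic data, assuming the population mean and both variance components are known (in practice they are estimated); all names and numbers are illustrative, not taken from the paper.

```python
# Minimal sketch: usual mixed-model BLUP with individual shrinkage
# constants versus an FPMM-style BLUP that pools the subject-specific
# measurement error variances. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
mu = 100.0                                  # known population mean (assumed)
sigma2_b = 4.0                              # between-subject variance
sigma2_e = rng.uniform(0.5, 3.0, size=n)    # subject-specific error variances

latent = mu + rng.normal(0.0, np.sqrt(sigma2_b), size=n)   # true latent values
y = latent + rng.normal(0.0, np.sqrt(sigma2_e), size=n)    # observed values

# Usual mixed-model BLUP: one shrinkage constant per subject.
lam_i = sigma2_b / (sigma2_b + sigma2_e)
blup_usual = mu + lam_i * (y - mu)

# FPMM-style BLUP for subject-specific errors: a single pooled variance.
lam_pooled = sigma2_b / (sigma2_b + sigma2_e.mean())
blup_fpmm = mu + lam_pooled * (y - mu)

print("MSE, usual BLUP:", np.mean((blup_usual - latent) ** 2))
print("MSE, FPMM BLUP :", np.mean((blup_fpmm - latent) ** 2))
```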

Abstract:

Introduction: We conducted the present study to investigate whether early large-volume crystalloid infusion can restore gut mucosal blood flow and mesenteric oxygen metabolism in severe sepsis. Methods: Anesthetized and mechanically ventilated male mongrel dogs were challenged with an intravenous injection of live Escherichia coli (6 × 10⁹ colony-forming units/ml per kg over 15 min). After 90 min they were randomly assigned to one of two groups – control (no fluids; n = 13) or lactated Ringer's solution (32 ml/kg per hour; n = 14) – and followed for 60 min. Cardiac index, mesenteric blood flow, mean arterial pressure, systemic and mesenteric oxygen-derived variables, blood lactate and gastric carbon dioxide tension (PCO2; by gas tonometry) were assessed throughout the study. Results: E. coli infusion significantly decreased arterial pressure, cardiac index, mesenteric blood flow, and systemic and mesenteric oxygen delivery, and increased arterial and portal lactate, intramucosal PCO2, the PCO2 gap (the difference between gastric mucosal and arterial PCO2), and the systemic and mesenteric oxygen extraction ratios in both groups. The Ringer's solution group had a significantly higher cardiac index and systemic oxygen delivery, and a lower oxygen extraction ratio and PCO2 gap, at 165 min as compared with control animals. However, infusion of lactated Ringer's solution was unable to restore the PCO2 gap. There were no significant differences between groups in mesenteric oxygen delivery, oxygen extraction ratio, or portal lactate at the end of the study. Conclusion: Significant disturbances occur in the systemic and mesenteric beds during bacteremic severe sepsis. Although large-volume infusion of lactated Ringer's solution restored systemic hemodynamic parameters, it was unable to correct the gut mucosal PCO2 gap.
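For readers unfamiliar with the derived variables, the sketch below computes them from standard physiologic formulas; the numeric inputs are illustrative and are not data from this study.

```python
# Standard formulas for the oxygen-derived variables and the tonometric
# PCO2 gap named above; all numeric inputs below are illustrative.
def arterial_o2_content(hb_g_dl, sat_frac, pao2_mmhg):
    """CaO2 (mL O2/dL): 1.34 mL O2 per g Hb, plus dissolved O2."""
    return 1.34 * hb_g_dl * sat_frac + 0.0031 * pao2_mmhg

def o2_delivery_index(cardiac_index, cao2):
    """DO2I (mL O2/min/m^2); the factor 10 converts dL to L."""
    return cardiac_index * cao2 * 10.0

def o2_extraction_ratio(cao2, cvo2):
    """Fraction of delivered oxygen consumed by the tissues."""
    return (cao2 - cvo2) / cao2

def pco2_gap(gastric_mucosal_pco2, arterial_pco2):
    """Gastric mucosal-arterial PCO2 difference (mmHg), a perfusion marker."""
    return gastric_mucosal_pco2 - arterial_pco2

cao2 = arterial_o2_content(hb_g_dl=12.0, sat_frac=0.97, pao2_mmhg=95.0)
print(o2_delivery_index(3.5, cao2))     # DO2I for CI = 3.5 L/min/m^2
print(o2_extraction_ratio(cao2, 11.0))  # with venous content 11 mL/dL
print(pco2_gap(55.0, 38.0))             # widened gap, as in hypoperfusion
```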

Abstract:

The quality of temperature and humidity retrievals from the infrared SEVIRI sensors on the geostationary Meteosat Second Generation (MSG) satellites is assessed by means of a one-dimensional variational (1D-Var) algorithm. The study is performed with the aim of improving the spatial and temporal resolution of available observations to feed analysis systems designed for high-resolution regional-scale numerical weather prediction (NWP) models. The non-hydrostatic forecast model COSMO (COnsortium for Small scale MOdelling) in the ARPA-SIM operational configuration is used to provide background fields. Only clear-sky observations over sea are processed. An optimised 1D-Var set-up comprising the two water vapour and the three window channels is selected. It maximises the reduction of errors in the model backgrounds while ensuring ease of operational implementation through accurate bias correction procedures and correct radiative transfer simulations. The 1D-Var retrieval quality is first quantified in relative terms, employing statistics to estimate the reduction in the background model errors. Additionally, the absolute retrieval accuracy is assessed by comparing the analysis with independent radiosonde and satellite observations. The inclusion of satellite data brings a substantial reduction in the warm and dry biases present in the forecast model. Moreover, it is shown that the retrieval profiles generated by the 1D-Var are well correlated with the radiosonde measurements. Subsequently, the 1D-Var technique is applied to two three-dimensional case studies: a false-alarm case that occurred in Friuli-Venezia Giulia on 8 July 2004 and a heavy-precipitation case that occurred in the Emilia-Romagna region between 9 and 12 April 2005. The impact of satellite data for these two events is evaluated in terms of increments in the column-integrated water vapour and saturation water vapour, in the 2-metre temperature and specific humidity, and in the surface temperature. To improve the 1D-Var technique, a method to calculate flow-dependent model error covariance matrices is also assessed. The approach employs members from an ensemble forecast system generated by perturbing physical parameterisation schemes inside the model. The improved set-up applied to the case of 8 July 2004 shows a substantially neutral impact.
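The building block of any such scheme is the variational analysis step. The sketch below uses a linear observation operator and made-up covariances in place of the radiative transfer model and the real B and R matrices, so it is only a schematic of the 1D-Var update, not the operational configuration described here.

```python
# Schematic 1D-Var analysis step with a linear observation operator H.
# In the real system H is a radiative transfer model and the update is
# found by iterative minimisation; the equation below is the linear core.
import numpy as np

rng = np.random.default_rng(1)
n_levels, n_channels = 5, 3

xb = rng.normal(size=n_levels)               # background profile (T, q)
B = 0.5 * np.eye(n_levels)                   # background error covariance
H = rng.normal(size=(n_channels, n_levels))  # linearised observation operator
R = 0.1 * np.eye(n_channels)                 # observation error covariance
y = H @ rng.normal(size=n_levels)            # synthetic satellite observations

# Analysis: xa = xb + B H^T (H B H^T + R)^{-1} (y - H xb)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
xa = xb + K @ (y - H @ xb)

# Analysis error covariance A = (I - K H) B quantifies the reduction
# of the background errors achieved by assimilating the channels.
A = (np.eye(n_levels) - K @ H) @ B
print("background error variances:", np.diag(B))
print("analysis error variances  :", np.round(np.diag(A), 3))
```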

Abstract:

Nanoindentation is a valuable tool for characterization of biomaterials due to its ability to measure local properties in heterogeneous, small or irregularly shaped samples. However, applying nanoindentation to compliant, hydrated biomaterials leads to many challenges, including adhesion between the nanoindenter tip and the sample. Although adhesion leads to overestimation of the modulus of compliant samples when analyzing nanoindentation data using traditional analysis techniques, most studies of biomaterials have ignored its effects. This paper demonstrates two methods for managing adhesion in nanoindentation analysis, the nano-JKR force curve method and the surfactant method, through application to two biomedically relevant compliant materials, poly(dimethyl siloxane) (PDMS) elastomers and poly(ethylene glycol) (PEG) hydrogels. The nano-JKR force curve method accounts for adhesion during data analysis using equations based on the Johnson-Kendall-Roberts (JKR) adhesion model, while the surfactant method eliminates adhesion during data collection, allowing data analysis using traditional techniques. In this study, indents performed in air or water resulted in adhesion between the tip and the sample, while testing the same materials submerged in Optifree Express® contact lens solution eliminated tip-sample adhesion in most samples. Modulus values from the two methods were within 7% of each other, despite different hydration conditions and evidence of adhesion. Using surfactant also did not significantly alter the properties of the tested material, allowed accurate modulus measurements using commercial software, and facilitated nanoindentation testing in fluids. This technique shows promise for more accurate and faster determination of modulus values from nanoindentation of compliant, hydrated biological samples. Copyright 2013 Elsevier Ltd. All rights reserved.
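The JKR relations behind the nano-JKR analysis can be stated compactly. The sketch below encodes the standard JKR contact radius and pull-off force for a spherical tip; the parameter values are illustrative, and a full nano-JKR fit would estimate the modulus and work of adhesion from a measured force curve.

```python
# Standard JKR relations for a spherical tip on a flat, compliant sample;
# parameter values below are illustrative, not fitted data.
import numpy as np

def jkr_contact_radius(P, R, K, w):
    """Contact radius a (m) under load P (N), tip radius R (m), reduced
    modulus K = (4/3)E* (Pa) and work of adhesion w (J/m^2):
    a^3 = (R/K) * (P + 3*pi*w*R + sqrt(6*pi*w*R*P + (3*pi*w*R)**2))."""
    t = 3.0 * np.pi * w * R
    return ((R / K) * (P + t + np.sqrt(6.0 * np.pi * w * R * P + t**2))) ** (1 / 3)

def jkr_pulloff_force(R, w):
    """Tensile force at separation: P_c = -(3/2)*pi*w*R, independent of K."""
    return -1.5 * np.pi * w * R

R = 5e-6    # tip radius, m
K = 2e6     # reduced modulus, Pa (PDMS-like compliance)
w = 0.05    # work of adhesion, J/m^2

print("contact radius at 100 nN:", jkr_contact_radius(1e-7, R, K, w), "m")
print("pull-off force          :", jkr_pulloff_force(R, w), "N")
```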

Abstract:

Multi-site time series studies of air pollution and mortality and morbidity have figured prominently in the literature as comprehensive approaches for estimating acute effects of air pollution on health. Hierarchical models are generally used to combine site-specific information and to estimate pooled air pollution effects, taking into account both within-site statistical uncertainty and across-site heterogeneity. Within a site, characteristics of time series data on air pollution and health (small pollution effects, missing data, highly correlated predictors, non-linear confounding, etc.) make modelling all sources of uncertainty challenging. One potential consequence is underestimation of the statistical variance of the site-specific effects to be combined. In this paper we investigate the impact of variance underestimation on the pooled relative rate estimate. We focus on two-stage normal-normal hierarchical models and on underestimation of the statistical variance at the first stage. Through mathematical considerations and simulation studies, we found that variance underestimation does not affect the pooled estimate substantially. However, some sensitivity of the pooled estimate to variance underestimation is observed when the number of sites is small and the underestimation is severe. These simulation results are applicable to any two-stage normal-normal hierarchical model for combining information from site-specific results, and they can easily be extended to more general hierarchical formulations. We also examined the impact of variance underestimation on the national average relative rate estimate from the National Morbidity Mortality Air Pollution Study, and we found that variance underestimation of as much as 40% has little effect on the national average.
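The question studied here is easy to probe in a toy simulation. The sketch below pools synthetic site-specific estimates with inverse-variance weights and compares correctly reported first-stage variances against variances underestimated by 40%; all settings are illustrative, and the heterogeneity variance is treated as known.

```python
# Toy two-stage normal-normal pooling: how much does underestimating the
# first-stage variances move the pooled estimate? Settings illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_sites, mu_true, tau2 = 90, 0.5, 0.04

beta = rng.normal(mu_true, np.sqrt(tau2), size=n_sites)  # true site effects
v = rng.uniform(0.05, 0.5, size=n_sites)                 # true first-stage variances
beta_hat = rng.normal(beta, np.sqrt(v))                  # site-specific estimates

def pooled(estimates, v_reported, tau2):
    """Inverse-variance weighted pooled effect, tau^2 treated as known."""
    w = 1.0 / (v_reported + tau2)
    return np.sum(w * estimates) / np.sum(w)

print("correctly reported variances:", pooled(beta_hat, v, tau2))
print("variances underestimated 40%:", pooled(beta_hat, 0.6 * v, tau2))
```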

Abstract:

Decadal-to-century scale trends for a range of marine environmental variables in the upper mesopelagic layer (UML, 100–600 m) are investigated using results from seven Earth System Models forced by a high greenhouse gas emission scenario. The models as a class represent the observation-based distribution of oxygen (O2) and carbon dioxide (CO2), although major mismatches between observation-based and simulated values remain for individual models. By year 2100 all models project an increase in SST of between 2 °C and 3 °C, and a decrease in the pH and in the saturation state of water with respect to calcium carbonate minerals in the UML. A decrease in the total ocean inventory of dissolved oxygen by 2% to 4% is projected by the range of models. Projected O2 changes in the UML show a complex pattern with both increasing and decreasing trends, reflecting the subtle balance of competing factors such as circulation, production, remineralization, and temperature changes. Projected changes in the total volume of hypoxic and suboxic waters remain relatively small in all models. A widespread increase of CO2 in the UML is projected. The median of the CO2 distribution between 100 and 600 m shifts from 0.1–0.2 mol m⁻³ in year 1990 to 0.2–0.4 mol m⁻³ in year 2100, primarily as a result of the invasion of anthropogenic carbon from the atmosphere. The co-occurrence of changes in a range of environmental variables indicates the need to further investigate their synergistic impacts on marine ecosystems and Earth System feedbacks.

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Abstract:

The relationship between spot volume and variation for all protein spots observed on large-format 2D gels, using silver stain technology and a model system based on mammalian NS0 cell extracts, is reported. By running multiple gels we have shown that the reproducibility of data generated in this way is dependent on individual protein spot volumes, which in turn are directly correlated with the coefficient of variation. The coefficients of variation across all observed protein spots were highest for low-abundance proteins, which are the primary contributors to process error, and lowest for more abundant proteins. Using the relationship between spot volume and coefficient of variation, we show it is necessary to calculate variation for individual protein spot volumes. The inherent limitations of silver staining therefore mean that errors in individual protein spot volumes, and not a global error, must be considered when assessing significant changes in protein spot volume. (C) 2003 Elsevier Science (USA). All rights reserved.
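The abundance-dependence of the coefficient of variation is straightforward to compute from replicate gels. The sketch below does this on synthetic spot volumes under an assumed error model in which absolute noise grows sub-linearly with abundance; it illustrates the per-spot calculation, not the actual data.

```python
# Per-spot coefficient of variation (CV) across replicate gels, on
# synthetic data with an assumed error model; not the study's data.
import numpy as np

rng = np.random.default_rng(3)
n_spots, n_gels = 500, 6

mean_volume = rng.lognormal(mean=4.0, sigma=1.5, size=n_spots)
# Assumed error model: absolute noise ~ sqrt(volume), so relative
# variation is largest for low-abundance spots.
noise_sd = 2.0 * np.sqrt(mean_volume)
volumes = mean_volume[:, None] + rng.normal(0.0, noise_sd[:, None], (n_spots, n_gels))
volumes = np.clip(volumes, 1e-6, None)       # spot volumes cannot be negative

cv = volumes.std(axis=1, ddof=1) / volumes.mean(axis=1)
order = np.argsort(mean_volume)
print("median CV, low-abundance third :", np.median(cv[order[: n_spots // 3]]))
print("median CV, high-abundance third:", np.median(cv[order[-(n_spots // 3):]]))
```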

Abstract:

Software simulation models are computer programs that need to be verified and debugged like any other software. In previous work, a method for error isolation in simulation models was proposed. The method relies on a set of feature matrices that can be used to determine which part of the model implementation is responsible for deviations in the output of the model. Currently these feature matrices have to be generated by hand from the model implementation, which is a tedious and error-prone task. In this paper, a method based on mutation analysis, together with prototype tool support, is presented for verifying the manually generated feature matrices. The application of the method and tool to a model for wastewater treatment shows that the feature matrices can be verified effectively using a minimal number of mutants.
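In outline, the verification idea can be reproduced with a toy model: mutate one part of the implementation at a time and check that the hand-made feature matrix predicts exactly which outputs deviate. The sketch below is a generic illustration under that reading of the method, not the authors' tool.

```python
# Toy mutation analysis for verifying a hand-made feature matrix: each
# mutant perturbs one model part; the matrix says which outputs may move.
import numpy as np

def model(x, rate_a=0.1, rate_b=0.2):
    """Toy model with two parts feeding two outputs."""
    return np.array([x * np.exp(-rate_a),    # depends only on rate_a
                     x * np.exp(-rate_b)])   # depends only on rate_b

# Feature matrix: model part -> outputs a fault in that part can affect.
feature_matrix = {"rate_a": np.array([1, 0]), "rate_b": np.array([0, 1])}

# One mutant per model part (here: a perturbed rate constant).
mutants = {"rate_a": {"rate_a": 0.15}, "rate_b": {"rate_b": 0.30}}

x = 10.0
reference = model(x)
for part, change in mutants.items():
    deviates = (~np.isclose(model(x, **change), reference)).astype(int)
    ok = np.array_equal(deviates, feature_matrix[part])
    print(f"mutant in {part}: feature matrix row verified = {ok}")
```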

Abstract:

The purpose of this paper is to demonstrate the existence of a strong and significant effect of complexity in aphasia, independent of other variables including length. Complexity was found to be a strong and significant predictor of accurate repetition in a group of 13 Italian aphasic patients when it was entered in a regression equation either simultaneously with, or after, a large number of other variables. Significant effects were found both when complexity was measured in terms of the number of complex onsets (as in a recent paper by Nickels & Howard, 2004) and when it was measured in a more comprehensive way. Significant complexity effects were also found with matched lists contrasting simple and complex words, and in analyses of errors. Effects of complexity, however, were restricted to patients with articulatory difficulties. Reasons for this association and for the lack of significant results in Nickels and Howard (2004) are discussed. © 2005 Psychology Press Ltd.
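The regression logic, complexity entered after other predictors such as length, can be sketched as a likelihood-ratio comparison of nested logistic models. The data below are synthetic and the predictors simplified; this shows the form of the test, not the study's analysis.

```python
# Nested logistic models on synthetic data: does complexity still predict
# accurate repetition once length is in the model? Illustrative only.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(4)
n = 400
length = rng.integers(3, 10, size=n).astype(float)      # word length
complexity = rng.integers(0, 3, size=n).astype(float)   # e.g. complex onsets

logit = 2.0 - 0.25 * length - 0.6 * complexity          # assumed true model
correct = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

base = sm.Logit(correct, sm.add_constant(length)).fit(disp=0)
X = sm.add_constant(np.column_stack([length, complexity]))
full = sm.Logit(correct, X).fit(disp=0)

# Likelihood-ratio test for the added complexity term (1 df).
lr = 2.0 * (full.llf - base.llf)
print(f"LR = {lr:.2f}, p = {stats.chi2.sf(lr, df=1):.4f}")
```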

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either not available or not even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering or at least estimating this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interacting between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
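As a flavour of the statistical core of such a tool, the sketch below fits a parametric distribution to an expert's elicited quartiles, in the spirit of the SHELF approach; the normal form and the judgement values are assumptions made for illustration.

```python
# Fit a normal distribution to an expert's elicited quartiles by matching
# quantiles (SHELF-style); the elicited values here are made up.
import numpy as np
from scipy import stats, optimize

elicited = {0.25: 12.0, 0.50: 15.0, 0.75: 19.0}   # probability -> judgement
probs = np.array(list(elicited.keys()))
values = np.array(list(elicited.values()))

def loss(params):
    mu, sigma = params
    if sigma <= 0:                      # keep the scale parameter valid
        return np.inf
    return np.sum((stats.norm.ppf(probs, mu, sigma) - values) ** 2)

res = optimize.minimize(loss, x0=[15.0, 5.0], method="Nelder-Mead")
mu, sigma = res.x
print(f"fitted normal: mu = {mu:.2f}, sigma = {sigma:.2f}")
```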

Abstract:

2000 Mathematics Subject Classification: 62J05, 62J10, 62F35, 62H12, 62P30.

Abstract:

Introduction: Since 2005, the workload of community pharmacists in England has increased, with a concomitant increase in stress and work pressure. However, it is unclear how these factors are impacting on the ability of community pharmacists to ensure accuracy during the dispensing process. This research seeks to extend our understanding of the nature, outcome, and predictors of dispensing errors. Methodology: A retrospective analysis of a purposive sample of incident report forms (IRFs) from the database of a pharmacist indemnity insurance provider was conducted. Data collected included: type of error, degree of harm caused, pharmacy and pharmacist demographics, and possible contributory factors. Results: In total, 339 files from UK community pharmacies were retrieved from the database. The files dated from June 2006 to November 2011. Incorrect item (45.1%, n = 153/339) followed by incorrect strength (24.5%, n = 83/339) were the most common forms of error. Almost half (41.6%, n = 147/339) of the patients suffered some form of harm, ranging from minor harm (26.7%, n = 87/339) to death (0.3%, n = 1/339). Insufficient staff (51.6%, n = 175/339), similar packaging (40.7%, n = 138/339) and the pharmacy being busier than normal (39.5%, n = 134/339) were identified as key contributory factors. Cross-tabular analysis against the final accuracy check variable revealed significant associations with the pharmacy location (P < 0.024), dispensary layout (P < 0.025), insufficient staff (P < 0.019), and busier-than-normal (P < 0.005) variables. Conclusion: The results provide an overview of some of the individual, organisational and technical factors at play at the time of a dispensing error, and highlight the need to examine further the relationships between these factors and dispensing error occurrence.
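The cross-tabular tests reported above are chi-squared tests of association on contingency tables. The sketch below runs one on an invented 2x2 table (busier than normal versus whether a final accuracy check was performed); the counts are illustrative, not the study's data.

```python
# Chi-squared test of association on an illustrative 2x2 table; the
# counts are invented, not taken from the incident report data.
import numpy as np
from scipy.stats import chi2_contingency

#                         check done | check missed
table = np.array([[40, 60],    # pharmacy busier than normal
                  [70, 30]])   # normal workload

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```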