902 results for errors-in-variables model


Relevance:

100.00%

Publisher:

Abstract:

Hybrid quantum mechanics/molecular mechanics (QM/MM) simulations provide a powerful tool for studying chemical reactions, especially in complex biochemical systems. In most work to date, the quantum region is kept fixed throughout the simulation and is defined in an ad hoc way based on chemical intuition and available computational resources. The simulation errors associated with a given choice of the quantum region are, however, rarely assessed in a systematic manner. Here we study the dependence of two relevant quantities on the QM region size: the force error at the center of the QM region and the free energy of a proton transfer reaction. Taking lysozyme as our model system, we find that in an apolar region the average force error decreases rapidly with increasing QM region size. In contrast, the average force error at the polar active site is considerably higher, exhibits large oscillations, decreases more slowly, and may not fall below acceptable limits even for a quantum region radius of 9.0 Å. Although computation of free energies could only be afforded up to a radius of 6.0 Å, the results were found to change considerably within these limits. These errors demonstrate that the results of QM/MM calculations are heavily affected by the definition of the QM region (not only its size), and we propose that a convergence test be part of setting up QM/MM simulations.
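The proposed convergence test amounts to scanning the QM radius until the force error stabilises below a tolerance. A minimal sketch, with purely hypothetical error values and tolerance (the abstract does not give numbers):

```python
def qm_region_converged(radii, mean_force_errors, tol=1.0):
    """Return the smallest QM radius whose mean force error falls below
    `tol` (units illustrative), or None if the scan never converges --
    mirroring the kind of convergence test the abstract proposes."""
    for r, err in zip(radii, mean_force_errors):
        if err < tol:
            return r
    return None

# Hypothetical scan: the apolar site converges quickly, the polar one does not.
radii = [3.0, 4.5, 6.0, 7.5, 9.0]
apolar_errors = [4.2, 1.8, 0.7, 0.3, 0.2]   # decays fast
polar_errors  = [6.5, 5.1, 3.9, 2.8, 2.1]   # stays above tolerance

print(qm_region_converged(radii, apolar_errors))  # 6.0
print(qm_region_converged(radii, polar_errors))   # None
```
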

Abstract:

Body-size measurement errors are usually ignored in stock assessments, but may be important when body-size data (e.g., from visual surveys) are imprecise. We used experiments and models to quantify measurement errors and their effects on assessment models for sea scallops (Placopecten magellanicus). Errors in size data obscured modes from strong year classes and increased the frequency and size of the largest and smallest sizes, potentially biasing growth, mortality, and biomass estimates. Modeling techniques for errors in age data proved useful for errors in size data. In terms of goodness of model fit to the assessment data, it was more important to accommodate variance than bias. Models that accommodated size errors fitted size data substantially better. We recommend experimental quantification of errors along with a modeling approach that accommodates measurement errors, because a direct algebraic approach was not robust and because error parameters were difficult to estimate in our assessment model. The importance of measurement errors depends on many factors and should be evaluated on a case-by-case basis.
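The blurring effect described here can be illustrated by adding Gaussian measurement error to a hypothetical bimodal size distribution (two strong year classes); the means and error standard deviation below are assumptions, not the paper's experimental estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" shell heights (mm): two strong year classes.
true_sizes = np.concatenate([rng.normal(80, 3, 5000),
                             rng.normal(110, 3, 5000)])

# Gaussian measurement error with an assumed 6 mm standard deviation.
measured = true_sizes + rng.normal(0, 6, true_sizes.size)

# Error inflates the overall spread, smearing the two year-class modes
# and pushing observations into the largest and smallest size bins.
print(measured.std() > true_sizes.std())   # True
```
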

Abstract:

Particle tracking techniques are often used to assess the local mechanical properties of cells and biological fluids. The extracted trajectories are exploited to compute the mean-squared displacement that characterizes the dynamics of the probe particles. Limited spatial resolution and statistical uncertainty are the limiting factors that alter the accuracy of the mean-squared displacement estimation. We precisely quantified the effect of localization errors on the determination of the mean-squared displacement by separating them into two distinct contributions. A "static error" arises in the position measurements of immobilized particles. A "dynamic error" comes from the particle motion during the finite exposure time that is required for visualization. We calculated the propagation of these errors into the mean-squared displacement. We examined the impact of our error analysis on theoretical model fluids used in biorheology. These theoretical predictions were verified for purely viscous fluids using simulations and a multiple-particle tracking technique performed with video microscopy. We showed that the static contribution can be confidently corrected in dynamics studies by using static experiments performed at a similar noise-to-signal ratio. This groundwork allowed us to achieve higher resolution in the mean-squared displacement, and thus to increase the accuracy of microrheology studies.
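For a purely viscous fluid (free diffusion), the standard static/dynamic error decomposition gives an apparent MSD of 2dD(τ − t_E/3) + 2dε², where ε is the static localization error and t_E the exposure time. A sketch of the correction, assuming that standard form:

```python
def apparent_msd(tau, D, eps, t_exp, dim=1):
    """Apparent MSD of a freely diffusing tracer under the standard
    decomposition: the static localization error adds 2*dim*eps**2,
    while motion blur over the exposure time subtracts 2*dim*D*t_exp/3."""
    return 2 * dim * D * (tau - t_exp / 3.0) + 2 * dim * eps ** 2

def corrected_msd(msd_meas, eps, t_exp, D, dim=1):
    """Recover the true MSD by removing both error contributions
    (for illustration D is taken as known; in practice it is fitted)."""
    return msd_meas - 2 * dim * eps ** 2 + 2 * dim * D * t_exp / 3.0

tau, D, eps, t_exp = 0.1, 1.0, 0.05, 0.02        # illustrative values
meas = apparent_msd(tau, D, eps, t_exp)
print(abs(corrected_msd(meas, eps, t_exp, D) - 2 * D * tau) < 1e-12)  # True
```
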

Abstract:

We consider the problem of conducting inference on nonparametric high-frequency estimators without knowing their asymptotic variances. We prove that a multivariate subsampling method achieves this goal under general conditions that were not previously available in the literature. We suggest a procedure for a data-driven choice of the bandwidth parameters. Our simulation study indicates that the subsampling method is much more robust than the plug-in method based on the asymptotic expression for the variance. Importantly, the subsampling method reliably estimates the variability of the Two Scale estimator even when its parameters are chosen to minimize the finite sample Mean Squared Error; in contrast, the plug-in estimator substantially underestimates the sampling uncertainty. By construction, the subsampling method delivers estimates of the variance-covariance matrices that are always positive semi-definite. We use the subsampling method to study the dynamics of financial betas of six stocks on the NYSE. We document significant variation in betas within 2006, and find that tick data captures more variation in betas than data sampled at moderate frequencies such as every five or twenty minutes. To capture this variation we estimate a simple dynamic model for betas. The variance estimation is also important for the correction of the errors-in-variables bias in such models. We find that the bias corrections are substantial, and that betas are more persistent than the naive estimators would lead one to believe.
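As a sketch of the general idea, here is plain overlapping-block subsampling for the mean of an i.i.d. sample, which is far simpler than the paper's multivariate high-frequency setting but shows the mechanics: recompute the statistic on subsamples of length b and rescale the spread:

```python
import numpy as np

def subsample_variance(x, stat, b):
    """Estimate the asymptotic variance of sqrt(n)*(stat(x) - theta)
    from overlapping subsamples of length b (textbook subsampling;
    the paper's high-frequency version is considerably more involved)."""
    n = len(x)
    theta_n = stat(x)
    thetas = np.array([stat(x[i:i + b]) for i in range(n - b + 1)])
    return b * np.mean((thetas - theta_n) ** 2)

rng = np.random.default_rng(1)
x = rng.normal(0, 2, 20_000)            # true asymptotic variance of the mean: 4
est = subsample_variance(x, np.mean, b=200)
print(est)                               # should land near 4
```
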

Abstract:

The results of an investigation into the limits of the random errors contained in the basic data of Physical Oceanography, and their propagation through the computational procedures, are presented in this thesis. It also suggests a method that increases the reliability of the derived results. The thesis is presented in eight chapters, including the introductory chapter. Chapter 2 discusses the general theory of errors that is relevant in the context of the propagation of errors in Physical Oceanographic computations. The error components contained in the independent oceanographic variables, namely temperature, salinity and depth, are delineated and quantified in chapter 3. Chapter 4 discusses and derives the magnitude of errors in the computation of the dependent oceanographic variables, namely density in situ, σt, specific volume and specific volume anomaly, due to the propagation of errors contained in the independent oceanographic variables. The errors propagated into the computed values of the derived quantities, namely dynamic depth and relative currents, are estimated and presented in chapter 5. Chapter 6 reviews the existing methods for the identification of the level of no motion and suggests a method for the identification of a reliable zero reference level. Chapter 7 discusses the available methods for the extension of the zero reference level into shallow regions of the oceans and suggests a new, more reliable method. A procedure of graphical smoothing of dynamic topographies between the error limits, to provide more reliable results, is also suggested in this chapter. Chapter 8 deals with the computation of the geostrophic current from these smoothed values of dynamic heights, with reference to the selected zero reference level. The summary and conclusions are also presented in this chapter.
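The propagation analysed in chapters 3-5 rests on the standard first-order rule for independent random errors, σ_f = sqrt(Σ (∂f/∂x_i · σ_i)²). A sketch with illustrative sensitivities and instrument errors (not the thesis's actual values):

```python
import math

def propagated_error(partials, sigmas):
    """First-order (Gaussian) propagation of independent random errors:
    sigma_f = sqrt(sum_i (df/dx_i * sigma_i)**2) -- the standard rule for
    tracing T, S and depth errors into derived oceanographic quantities."""
    return math.sqrt(sum((p * s) ** 2 for p, s in zip(partials, sigmas)))

# Illustrative numbers only: sensitivities of a derived quantity to
# temperature (0.2 per degC), salinity (0.8 per PSU) and depth
# (0.001 per m), with instrument errors of 0.02 degC, 0.01 PSU and 5 m.
print(propagated_error([0.2, 0.8, 0.001], [0.02, 0.01, 5.0]))
```
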

Abstract:

Data assimilation provides techniques for combining observations and prior model forecasts to create initial conditions for numerical weather prediction (NWP). The relative weighting assigned to each observation in the analysis is determined by its associated error. Remote sensing data usually has correlated errors, but the correlations are typically ignored in NWP. Here, we describe three approaches to the treatment of observation error correlations. For an idealized data set, the information content under each simplified assumption is compared with that under correct correlation specification. Treating the errors as uncorrelated results in a significant loss of information. However, retention of an approximated correlation gives clear benefits.
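The information loss from ignoring correlations can be illustrated with a toy two-observation analysis, measuring information content as degrees of freedom for signal, tr(KH); all covariances below are illustrative, not the paper's idealized data set:

```python
import numpy as np

def dfs(B, H, R):
    """Degrees of freedom for signal, tr(K H), for the optimal gain
    K = B H^T (H B H^T + R)^{-1} -- a common information-content measure."""
    S = H @ B @ H.T + R
    K = B @ H.T @ np.linalg.inv(S)
    return np.trace(K @ H)

B = np.eye(2)                       # background error covariance (toy)
H = np.eye(2)                       # direct observations of the state
R_true = np.array([[1.0, 0.8],
                   [0.8, 1.0]])     # correlated observation errors
R_diag = np.diag(np.diag(R_true))   # correlations ignored

# Using the correct correlated R extracts more information than
# treating the errors as uncorrelated.
print(dfs(B, H, R_true) > dfs(B, H, R_diag))   # True
```
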

Abstract:

This paper analyzes the performance of Enhanced relay-enabled Distributed Coordination Function (ErDCF) for wireless ad hoc networks under transmission errors. The idea of ErDCF is to use high data rate nodes to work as relays for the low data rate nodes. ErDCF achieves higher throughput and reduces energy consumption compared to IEEE 802.11 Distributed Coordination Function (DCF) in an ideal channel environment. However, there is a possibility that this expected gain may decrease in the presence of transmission errors. In this work, we modify the saturation throughput model of ErDCF to accurately reflect the impact of transmission errors under different rate combinations. It turns out that the throughput gain of ErDCF can still be maintained under reasonable link quality and distance.
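The basic mechanism by which transmission errors erode throughput can be sketched with a frame-level success probability; this is a toy calculation under a uniform bit-error-rate assumption, not ErDCF's full saturation model:

```python
def frame_success(ber, bits):
    """Probability that a frame of `bits` bits survives a channel with
    independent bit errors at rate `ber`."""
    return (1.0 - ber) ** bits

def degraded_throughput(ideal_mbps, ber, bits):
    """First-order throughput under errors: each corrupted frame is lost,
    so the ideal saturation throughput scales by the success probability."""
    return ideal_mbps * frame_success(ber, bits)

# Illustrative numbers: an 11 Mbps ideal rate, 8000-bit frames, BER 1e-5.
print(degraded_throughput(11.0, 1e-5, 8000))   # roughly 10.2 Mbps
```
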

Abstract:

Assimilation of temperature observations into an ocean model near the equator often results in a dynamically unbalanced state with unrealistic overturning circulations. The way in which these circulations arise from systematic errors in the model or its forcing is discussed. A scheme is proposed, based on the theory of state augmentation, which uses the departures of the model state from the observations to update slowly evolving bias fields. Results are summarized from an experiment applying this bias correction scheme to an ocean general circulation model. They show that the method produces more balanced analyses and a better fit to the temperature observations.
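The state-augmentation idea can be sketched with a scalar toy model in which the observation-minus-forecast departure updates both the analysis and a slowly evolving bias estimate; the drift magnitude and gains below are illustrative only:

```python
def assimilate(analysis, bias, obs):
    """One toy assimilation cycle with an augmented bias state: the model
    has a systematic cooling drift, and the departure nudges the bias
    field slowly so it learns that drift."""
    forecast = analysis - 0.3            # model with a systematic drift
    corrected = forecast + bias          # bias-corrected forecast
    departure = obs - corrected
    bias += 0.1 * departure              # slow bias update (augmented state)
    analysis = corrected + 0.6 * departure
    return analysis, bias

analysis, bias = 20.0, 0.0
for _ in range(200):
    analysis, bias = assimilate(analysis, bias, obs=21.0)
print(round(bias, 2))                    # 0.3 -- the learned systematic drift
```

Because the bias gain is much smaller than the state gain, the bias field absorbs only the slowly varying, systematic part of the departures, which is the essence of the scheme.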

Abstract:

SST errors in the tropical Atlantic are large and systematic in current coupled general-circulation models. We analyse the growth of these errors in the region of the south-eastern tropical Atlantic in initialised decadal hindcast integrations for three of the models participating in the Coupled Model Intercomparison Project 5. A variety of causes for the initial bias development are identified, but in all cases considered, ocean-atmosphere coupling is found to be crucial for their maintenance. The mechanism involves an oceanic “bridge” between the Equator and the Benguela-Angola coastal seas, which communicates sub-surface ocean anomalies and couples SSTs in the south-eastern tropical Atlantic to the winds over the Equator. The resulting coupling between SSTs, winds and precipitation represents a positive feedback for warm SST errors in the south-eastern tropical Atlantic.

Abstract:

Aim: To examine the causes of prescribing and monitoring errors in English general practices and provide recommendations for how they may be overcome. Design: Qualitative interview and focus group study with purposive sampling and thematic analysis informed by Reason’s accident causation model. Participants: General practice staff participated in a combination of semi-structured interviews (n=34) and six focus groups (n=46). Setting: Fifteen general practices across three primary care trusts in England. Results: We identified seven categories of high-level error-producing conditions: the prescriber, the patient, the team, the task, the working environment, the computer system, and the primary-secondary care interface. Each of these was further broken down to reveal various error-producing conditions. The prescriber’s therapeutic training, drug knowledge and experience, knowledge of the patient, perception of risk, and physical and emotional health were all identified as possible causes. The patient’s characteristics and the complexity of the individual clinical case were also found to have contributed to prescribing errors. The importance of feeling comfortable within the practice team was highlighted, as was the safety implication of general practitioners (GPs) signing prescriptions generated by nurses when the GP had not seen the patient. The working environment, with its high workload, time pressures and interruptions, and computer-related issues, such as mis-selecting drugs from electronic pick-lists and overriding alerts, were all highlighted as possible causes of prescribing errors, and these conditions were often interconnected. Conclusion: This study has highlighted the complex underlying causes of prescribing and monitoring errors in general practices, several of which are amenable to intervention.

Abstract:

Understanding the sources of systematic errors in climate models is challenging because of coupled feedbacks and error compensation. The developing seamless approach proposes that the identification and correction of short-term climate model errors have the potential to improve the modeled climate on longer time scales. In previous studies, initialised atmospheric simulations of a few days have been used to compare fast physics processes (convection, cloud processes) among models. The present study explores how initialised seasonal to decadal hindcasts (re-forecasts) relate transient week-to-month errors of the ocean and atmospheric components to the coupled model’s long-term pervasive SST errors. A protocol is designed to attribute the SST biases to their source processes. It includes five steps: (1) identify and describe biases in a coupled stabilized simulation, (2) determine the time scale of the advent of the bias and its propagation, (3) find the geographical origin of the bias, (4) evaluate the degree of coupling in the development of the bias, and (5) find the field responsible for the bias. This strategy has been implemented with a set of experiments based on the initial adjustment of initialised simulations, exploring various degrees of coupling. In particular, hindcasts give the time scale of bias advent, regionally restored experiments show the geographical origin, and ocean-only simulations isolate the field responsible for the bias and evaluate the degree of coupling in its development. This strategy is applied to four prominent SST biases of the IPSLCM5A-LR coupled model in the tropical Pacific that are largely shared by other coupled models, including the Southeast Pacific warm bias and the equatorial cold tongue bias. Using the proposed protocol, we demonstrate that the East Pacific warm bias appears within a few months and is caused by a lack of upwelling due to too-weak meridional coastal winds off Peru. The cold equatorial bias, which surprisingly takes 30 years to develop, is the result of equatorward advection of midlatitude cold SST errors. Despite large development efforts, the current generation of coupled models shows little improvement. The strategy proposed in this study is a further step toward moving from the current ad hoc approach to a bias-targeted, priority-setting, systematic model development approach.

Abstract:

This article develops a life-cycle general equilibrium model with heterogeneous agents who make choices of nondurables consumption, investment in owner-occupied housing, and labour supply. Agents retire at a specific age and receive Social Security benefits that depend on average past earnings. The model is calibrated and numerically solved, and it matches stylized U.S. aggregate statistics and generates average life-cycle profiles of its decision variables consistent with the data and the literature. We also conduct an exercise of complete elimination of the Social Security system and compare its results with the benchmark economy. The results enable us to emphasize the importance of endogenous labour supply and benefits for agents' consumption-smoothing behaviour.

Abstract:

We consider a model for rattling in single-stage gearbox systems with backlash, consisting of two wheels under sinusoidal driving; the equations of motion are analytically integrated between two impacts of the gear teeth. Just after each impact, a mapping is used to obtain the dynamical variables. By varying the control parameters, we observe rich dynamical behavior in this system, and we focus on intermittent switching between laminar oscillations and chaotic bursting, as well as crises, which are sudden changes in the chaotic behavior. The corresponding transient basins in phase space are found to be riddled-like, with a highly interwoven fractal structure.
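The gearbox map itself is specific to the paper's geometry, but the impact-to-impact mapping idea and the sensitive dependence typical of such chaotic regimes can be illustrated with a generic map iterated event-to-event; the Chirikov standard map is used here purely as a stand-in, with illustrative parameters:

```python
import math

def standard_map(theta, p, k=7.0):
    """One iteration of the Chirikov standard map, a generic stand-in for
    impact-to-impact mappings; k=7 puts typical orbits in the chaotic sea."""
    p = p + k * math.sin(theta)
    theta = (theta + p) % (2 * math.pi)
    return theta, p

a = (1.0, 0.5)
b = (1.0, 0.5 + 1e-8)            # nearby initial condition
max_sep = 0.0
for _ in range(100):
    a = standard_map(*a)
    b = standard_map(*b)
    max_sep = max(max_sep, abs(a[1] - b[1]))
print(max_sep > 1e-3)            # trajectories diverge: sensitive dependence
```
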

Abstract:

According to the working memory model, the phonological loop is the component of working memory specialized in processing and manipulating limited amounts of speech-based information. The Children's Test of Nonword Repetition (CNRep) is a suitable measure of phonological short-term memory for English-speaking children, which was validated by the Brazilian Children's Test of Pseudoword Repetition (BCPR) as a Portuguese-language version. The objectives of the present study were: i) to investigate developmental aspects of phonological memory processing by error analysis in the nonword repetition task, and ii) to examine phoneme (substitution, omission and addition) and order (migration) errors made in the BCPR by 180 normal Brazilian children of both sexes aged 4-10, from preschool to 4th grade. The dominant error was substitution [F(3,525) = 180.47; P < 0.0001]. The performance was age-related [F(4,175) = 14.53; P < 0.0001]. The length effect, i.e., more errors in long than in short items, was observed [F(3,519) = 108.36; P < 0.0001]. In 5-syllable pseudowords, errors occurred mainly in the middle of the stimuli, before the syllabic stress [F(4,16) = 6.03; P = 0.003]; substitutions appeared more at the end of the stimuli, after the stress [F(12,48) = 2.27; P = 0.02]. In conclusion, the BCPR error analysis supports the idea that phonological loop capacity is relatively constant during development, although school learning increases the efficiency of this system. Moreover, there are indications that long-term memory contributes to holding the memory trace. The findings are discussed in terms of the distinctiveness, clustering and redintegration hypotheses.

Abstract:

Systematic errors can have a significant effect on GPS observables. For medium and long baselines, the major systematic error sources are ionospheric and tropospheric refraction and GPS satellite orbit errors; for short baselines, multipath is more relevant. These errors degrade the accuracy of positioning accomplished by GPS, which is a critical problem for high-precision GPS positioning applications. Recently, a method has been suggested to mitigate these errors: the semiparametric model with the penalised least squares technique. It uses a natural cubic spline to model the errors as a function that varies smoothly in time. The systematic error functions, ambiguities, and station coordinates are estimated simultaneously. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least-squares method.
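The penalised-least-squares idea can be sketched in a stripped-down form: a single smooth systematic-error function estimated with a second-difference roughness penalty (the actual method also estimates ambiguities and station coordinates simultaneously, and uses natural cubic splines rather than this discrete penalty):

```python
import numpy as np

def smooth_systematic_error(y, lam=50.0):
    """Penalised least squares in its simplest discrete form: minimise
    ||y - g||^2 + lam * ||D2 g||^2, where D2 is the second-difference
    operator, so g is a smoothly varying estimate of the systematic error."""
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)       # (n-2) x n second differences
    A = np.eye(n) + lam * D2.T @ D2
    return np.linalg.solve(A, y)

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
truth = np.sin(2 * np.pi * t)                   # slowly varying systematic error
y = truth + rng.normal(0, 0.3, t.size)          # noisy observations of it
g = smooth_systematic_error(y)

# The penalised estimate tracks the smooth error better than the raw data.
print(np.mean((g - truth) ** 2) < np.mean((y - truth) ** 2))  # True
```

The penalty weight plays the same role as the smoothing parameter in the spline formulation: it trades fidelity to the observations against smoothness of the estimated error function.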