964 results for Roundness errors


Relevance:

20.00%

Publisher:

Abstract:

This study examined the effectiveness of an inpatient electronic medication record system in reducing medication errors in Singaporean hospitals. This pre- and post-intervention study involving a control group was undertaken in two Singaporean acute care hospitals. In one hospital the inpatient electronic medication record system was implemented, while in the other the paper-based medication record system was retained. The mean difference in medication error incidence of 0.06 between pre-intervention (0.72 per 1000 patient days) and post-intervention (0.78 per 1000 patient days) for the two hospitals was not statistically significant (95% CI: [−0.26, 0.20]). The mean incidence differences in medication errors relating to prescription, dispensing, and administration were also not statistically significant. Common system failures involved a lack of medication knowledge among health professionals and the lack of a systematic approach to identifying correct dosages. There was no difference in the incidence of medication errors following the introduction of the electronic medication record system. More work is needed on how this system can reduce medication error rates and improve medication safety. © 2013 Wiley Publishing Asia Pty Ltd.
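The rates above are incidences per 1000 patient days. As a minimal sketch of that arithmetic (the error counts and patient-day totals below are hypothetical, not figures from the study):

```python
def incidence_per_1000_patient_days(n_errors: int, patient_days: int) -> float:
    """Medication errors expressed per 1000 patient days."""
    return 1000 * n_errors / patient_days

# Hypothetical illustration: 72 errors over 100,000 patient days pre-intervention,
# 78 errors over 100,000 patient days post-intervention.
pre = incidence_per_1000_patient_days(72, 100_000)    # 0.72
post = incidence_per_1000_patient_days(78, 100_000)   # 0.78
difference = post - pre                               # ≈ 0.06
```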

Relevance:

20.00%

Publisher:

Abstract:

It is well known that in the context of the classical regression model with heteroskedastic errors, while ordinary least squares (OLS) is not efficient, the weighted least squares (WLS) and quasi-maximum likelihood (QML) estimators that utilize the information contained in the heteroskedasticity are. In the context of unit root testing with conditional heteroskedasticity, while intuition suggests that a similar result should apply, the relative performance of the tests associated with the OLS, WLS and QML estimators is not well understood. In particular, while QML has been shown to be able to generate more powerful tests than OLS, not much is known regarding the relative performance of the WLS-based test. By providing an in-depth comparison of the tests, the current paper fills this gap in the literature.
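The OLS-versus-WLS contrast the abstract builds on can be sketched in a few lines of numpy: when the error variance function is known (an assumption made here purely for illustration), WLS rescales each observation by the inverse error standard deviation before fitting.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(1.0, 5.0, n)
sigma = 0.5 * x                          # error s.d. grows with x: heteroskedasticity
y = 2.0 * x + rng.normal(0.0, sigma)

X = np.column_stack([np.ones(n), x])

# OLS: consistent but inefficient under heteroskedasticity
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# WLS: weight observations by the inverse error variance (assumed known here),
# which is equivalent to OLS on the sqrt(w)-rescaled data
w = 1.0 / sigma**2
Xw = X * np.sqrt(w)[:, None]
yw = y * np.sqrt(w)
beta_wls = np.linalg.lstsq(Xw, yw, rcond=None)[0]
```

Both slope estimates center on the true value 2.0; the efficiency gain of WLS shows up as a smaller sampling variance across repeated draws.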

Relevance:

20.00%

Publisher:

Abstract:

The CADF test of Pesaran (J Appl Econ 22:265–312, 2007) is among the most popular univariate unit root tests for cross-section correlated panels. Yet, the existing asymptotic analysis of this test statistic is limited to a model in which the errors are assumed to follow a simple AR(1) structure with homogeneous autoregressive coefficients. One reason for this is that the model involves an intricate identification issue, as both the serial and cross-section correlation structures of the errors are unobserved. The purpose of the current paper is to tackle this issue and, in so doing, extend the existing analysis to the case of AR(p) errors with possibly heterogeneous coefficients.
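The CADF regression itself is straightforward to write down: the ADF regression for each unit is augmented with the lagged level and first difference of the cross-section average. A minimal numpy sketch of the basic case (no lagged differences, simulated data; this illustrates the test statistic's construction, not the paper's extension):

```python
import numpy as np

def cadf_tstat(y: np.ndarray, i: int) -> float:
    """t-ratio on the lagged level in Pesaran's CADF regression for unit i.

    y is a T x N panel. The ADF regression for unit i is augmented with
    the lagged cross-section average and its first difference (basic
    CADF(0) case, a sketch without lagged differences).
    """
    ybar = y.mean(axis=1)                 # cross-section average at each t
    dy = np.diff(y[:, i])
    X = np.column_stack([
        np.ones(len(dy)),
        y[:-1, i],                        # lagged level of unit i
        ybar[:-1],                        # lagged cross-section average
        np.diff(ybar),                    # differenced cross-section average
    ])
    beta = np.linalg.lstsq(X, dy, rcond=None)[0]
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

# Simulated panel: a common stochastic-trend factor plus unit-specific random walks
rng = np.random.default_rng(1)
f = np.cumsum(rng.normal(size=200))
panel = f[:, None] + np.cumsum(rng.normal(size=(200, 5)), axis=0)
t0 = cadf_tstat(panel, 0)
```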

Relevance:

20.00%

Publisher:

Abstract:

This paper considers two-sided tests for the parameter of an endogenous variable in an instrumental variable (IV) model with heteroskedastic and autocorrelated errors. We develop the finite-sample theory of weighted-average power (WAP) tests with normal errors and a known long-run variance. We introduce two weights which are invariant to orthogonal transformations of the instruments; e.g., changing the order in which the instruments appear. While tests using the MM1 weight can be severely biased, optimal tests based on the MM2 weight are naturally two-sided when errors are homoskedastic. We propose two boundary conditions that yield two-sided tests whether errors are homoskedastic or not. The locally unbiased (LU) condition is related to the power around the null hypothesis and is a weaker requirement than unbiasedness. The strongly unbiased (SU) condition is more restrictive than LU, but the associated WAP tests are easier to implement. Several tests are SU in finite samples or asymptotically, including tests robust to weak IV (such as the Anderson-Rubin, score, conditional quasi-likelihood ratio, and I. Andrews' (2015) PI-CLC tests) and two-sided tests which are optimal when the sample size is large and instruments are strong. We refer to the WAP-SU tests based on our weights as MM1-SU and MM2-SU tests. Dropping the restrictive assumptions of normality and known variance, the theory is shown to remain valid at the cost of asymptotic approximations. The MM2-SU test is optimal under the strong IV asymptotics, and outperforms other existing tests under the weak IV asymptotics.
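Of the weak-IV-robust tests named above, the Anderson-Rubin test is the simplest to illustrate: under H0: β = β0 the instruments should have no explanatory power for y − β0·x. The sketch below uses the homoskedastic F-form on simulated data (an assumption made for brevity; the paper's setting, with heteroskedastic and autocorrelated errors, calls for a robust variance estimator instead):

```python
import numpy as np

def anderson_rubin_stat(y, x, Z, beta0):
    """Anderson-Rubin statistic for H0: beta = beta0 in y = x*beta + u.

    Regress y - beta0*x on the instruments Z (with a constant) and return
    the F-statistic for the joint significance of Z. Under H0 and
    homoskedastic errors it is approximately F(k, n-k-1) regardless of
    instrument strength.
    """
    n, k = Z.shape
    e = y - beta0 * x
    X1 = np.column_stack([np.ones(n), Z])
    r1 = e - X1 @ np.linalg.lstsq(X1, e, rcond=None)[0]  # unrestricted residuals
    r0 = e - e.mean()                                    # restricted: constant only
    rss1, rss0 = r1 @ r1, r0 @ r0
    return ((rss0 - rss1) / k) / (rss1 / (n - k - 1))

rng = np.random.default_rng(2)
n = 2000
Z = rng.normal(size=(n, 2))
v = rng.normal(size=n)
x = Z @ np.array([1.0, 0.5]) + v
u = 0.3 * v + rng.normal(size=n)          # u correlated with v: x is endogenous
y = 1.5 * x + u
ar_true = anderson_rubin_stat(y, x, Z, 1.5)   # true null: statistic stays small
ar_false = anderson_rubin_stat(y, x, Z, 0.0)  # false null: statistic is large
```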

Relevance:

20.00%

Publisher:

Abstract:

SOUZA, Anderson A. S. ; SANTANA, André M. ; BRITTO, Ricardo S. ; GONÇALVES, Luiz Marcos G. ; MEDEIROS, Adelardo A. D. Representation of Odometry Errors on Occupancy Grids. In: INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, 5., 2008, Funchal, Portugal. Proceedings... Funchal, Portugal: ICINCO, 2008.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To determine the prevalence of refractive errors in the public and private school system in the city of Natal, Northeastern Brazil. Methods: Refractometry was performed on both eyes of 1,024 randomly selected students enrolled in the 2001 school year, and the data were evaluated with the SPSS Data Editor 10.0. Ametropia was divided into: 1- 0.1 to 0.99 diopter (D); 2- 1.00 to 2.99 D; 3- 3.00 to 5.99 D; and 4- 6 D or greater. Astigmatism was regrouped into: I- with-the-rule (axis from 0 to 30 and 150 to 180 degrees), II- against-the-rule (axis between 60 and 120 degrees) and III- oblique (axis between >30 and <60 and between >120 and <150 degrees). The age groups were categorized as: 1- 5 to 10 years; 2- 11 to 15 years; 3- 16 to 20 years; 4- 21 years or over. Results: Among the refractive errors, hyperopia was the most common at 71%, followed by astigmatism (34%) and myopia (13.3%). Of the students with myopia and hyperopia, 48.5% and 34.1% had astigmatism, respectively. With respect to diopters, 58.1% of myopic students were in group 1, and 39% were distributed between groups 2 and 3. Hyperopia was mostly found in group 1 (61.7%), as was astigmatism (70.6%). The association of the astigmatism axes of both eyes showed 92.5% with a with-the-rule axis in both eyes, while the percentage for those with an against-the-rule axis was 82.1%, and it was even lower for the oblique axis (50%). Conclusion: The results differed from those of most international studies, mainly from the Orient, which point to myopia as the most common refractive error, and corroborate the national ones, in which hyperopia predominates.
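The axis grouping used in the study is a simple partition of the 0-180 degree range, which can be sketched directly (function name is illustrative, not from the paper):

```python
def classify_astigmatism_axis(axis_deg: float) -> str:
    """Group an astigmatism axis (0-180 degrees) following the study's scheme:
    with-the-rule (0-30 or 150-180), against-the-rule (60-120),
    oblique (>30 and <60, or >120 and <150)."""
    if axis_deg <= 30 or axis_deg >= 150:
        return "with-the-rule"
    if 60 <= axis_deg <= 120:
        return "against-the-rule"
    return "oblique"
```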

Relevance:

20.00%

Publisher:

Abstract:

Thermal and air environment conditions inside animal housing facilities change throughout the day owing to the influence of the external environment. For statistical and geostatistical analyses to be representative, a large number of spatially distributed points within the facility must be monitored. This work proposes that the variation over time of the environmental variables of interest for animal production, monitored inside animal housing facilities, can be modeled accurately from records that are discrete in time. The objective of this work was to develop a numerical method to correct the temporal variations of these environmental variables, transforming the data so that the observations become independent of the time spent during measurement. The proposed method adjusted the values recorded with time delays to those expected at the exact moment of interest, as if the data had been measured simultaneously, at that moment, at all spatially distributed points. The numerical correction model for environmental variables was validated for the environmental parameter air temperature: the values corrected by the method did not differ, by Tukey's test at 5% probability, from the actual values recorded by dataloggers.
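The correction described above, mapping readings taken with time delays to the values expected at a single reference instant, can be sketched as interpolation of each point's own time series to that instant (a simplified stand-in for the paper's numerical method, using linear interpolation for brevity):

```python
import numpy as np

def correct_to_reference_time(times, values, t_ref):
    """Estimate a monitoring point's value at t_ref by interpolating
    its own discrete-in-time records (simplified stand-in for the
    paper's numerical correction)."""
    return np.interp(t_ref, times, values)

# A point visited at minutes 0, 10 and 20 while air temperature drifts linearly;
# we want the value it would have shown at minute 5, the reference instant.
times = np.array([0.0, 10.0, 20.0])
temps = np.array([24.0, 25.0, 26.0])
corrected = correct_to_reference_time(times, temps, 5.0)   # 24.5
```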

Relevance:

20.00%

Publisher:

Abstract:

Systems based on artificial neural networks have high computational rates due to the use of a massive number of simple processing elements and the high degree of connectivity between these elements. This paper presents a novel approach to solve robust parameter estimation problem for nonlinear model with unknown-but-bounded errors and uncertainties. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the network convergence to the equilibrium points. A solution for the robust estimation problem with unknown-but-bounded error corresponds to an equilibrium point of the network. Simulation results are presented as an illustration of the proposed approach. Copyright (C) 2000 IFAC.

Relevance:

20.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Several factors render carotenoid determination inherently difficult. Thus, in spite of advances in analytical instrumentation, discrepancies in quantitative results on carotenoids can be encountered in the international literature. A good part of the errors comes from the pre-chromatographic steps, such as: a sampling scheme that does not yield samples representative of the food lots under investigation; sample preparation that does not maintain representativity and guarantee homogeneity of the analytical sample; incomplete extraction; physical losses of carotenoids during the various steps, especially during partition or washing and by adsorption to the glass walls of containers; and isomerization and oxidation of carotenoids during analysis. On the other hand, although currently considered the method of choice for carotenoids, high performance liquid chromatography (HPLC) is subject to various sources of errors, such as: incompatibility of the injection solvent and the mobile phase, resulting in distorted or split peaks; erroneous identification; unavailability, impurity and instability of carotenoid standards; quantification of highly overlapping peaks; low recovery from the HPLC column; errors in the preparation of standard solutions and in the calibration procedure; and calculation errors. Illustrations of the possible errors in the quantification of carotenoids by HPLC are presented.
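Among the HPLC error sources listed is the calibration procedure. A minimal sketch of external-standard calibration, in which peak areas for standards of known concentration define a line used to quantify an unknown (all numbers below are hypothetical, chosen only to illustrate the arithmetic):

```python
import numpy as np

# Hypothetical standards: carotenoid concentration (ug/mL) vs. HPLC peak area
conc = np.array([1.0, 2.0, 4.0, 8.0])
area = np.array([105.0, 210.0, 420.0, 840.0])

# Linear calibration curve: area = slope * conc + intercept.
# Errors in preparing these standards propagate directly into the slope,
# and hence into every concentration read off the curve.
slope, intercept = np.polyfit(conc, area, 1)

unknown_area = 315.0
unknown_conc = (unknown_area - intercept) / slope   # 3.0 ug/mL
```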

Relevance:

20.00%

Publisher:

Abstract:

According to the working memory model, the phonological loop is the component of working memory specialized in processing and manipulating limited amounts of speech-based information. The Children's Test of Nonword Repetition (CNRep) is a suitable measure of phonological short-term memory for English-speaking children, which was validated by the Brazilian Children's Test of Pseudoword Repetition (BCPR) as a Portuguese-language version. The objectives of the present study were: i) to investigate developmental aspects of phonological memory processing by error analysis in the nonword repetition task, and ii) to examine phoneme (substitution, omission and addition) and order (migration) errors made in the BCPR by 180 normal Brazilian children of both sexes aged 4-10, from preschool to 4th grade. The dominant error was substitution [F(3,525) = 180.47; P < 0.0001]. Performance was age-related [F(4,175) = 14.53; P < 0.0001]. A length effect, i.e., more errors in long than in short items, was observed [F(3,519) = 108.36; P < 0.0001]. In 5-syllable pseudowords, errors occurred mainly in the middle of the stimuli, before the syllabic stress [F(4,16) = 6.03; P = 0.003]; substitutions appeared more at the end of the stimuli, after the stress [F(12,48) = 2.27; P = 0.02]. In conclusion, the BCPR error analysis supports the idea that phonological loop capacity is relatively constant during development, although school learning increases the efficiency of this system. Moreover, there are indications that long-term memory contributes to holding the memory trace. The findings were discussed in terms of the distinctiveness, clustering and redintegration hypotheses.
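The four error categories analyzed (substitution, omission, addition, migration) can be sketched as a simple character-level classifier comparing a target item with a child's response (a simplification for illustration; the study works at the phoneme level, and the function name is hypothetical):

```python
def classify_repetition_error(target: str, response: str) -> str:
    """Classify a repetition error into the study's categories, at the
    character level: omission (shorter response), addition (longer),
    migration (same symbols, different order), else substitution."""
    if response == target:
        return "correct"
    if len(response) < len(target):
        return "omission"
    if len(response) > len(target):
        return "addition"
    if sorted(response) == sorted(target):
        return "migration"
    return "substitution"
```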

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Systematic errors can have a significant effect on GPS observables. In medium and long baselines the major systematic error sources are ionospheric and tropospheric refraction and GPS satellite orbit errors, while in short baselines multipath is more relevant. These errors degrade the accuracy of positioning accomplished by GPS, which is a critical problem for high-precision GPS positioning applications. Recently, a method has been suggested to mitigate these errors: the semiparametric model and the penalised least squares technique. It uses a natural cubic spline to model the errors as a function which varies smoothly in time. The systematic error functions, ambiguities and station coordinates are estimated simultaneously. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method.
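The core idea, a natural cubic spline absorbing a systematic error that varies smoothly in time, can be sketched on simulated residuals (a simplified illustration with hypothetical numbers, not the paper's penalised least squares estimator, which also solves for ambiguities and coordinates):

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(3)
t = np.linspace(0.0, 600.0, 601)                  # epochs (s)
systematic = 0.02 * np.sin(2 * np.pi * t / 600)   # smooth multipath-like error (m)
residuals = systematic + rng.normal(0.0, 0.002, t.size)

# Natural cubic spline through local averages of the residuals: a function
# varying smoothly in time that stands in for the systematic-error estimate
knots = np.arange(0.0, 601.0, 50.0)
knot_vals = [residuals[np.abs(t - k) <= 25.0].mean() for k in knots]
spline = CubicSpline(knots, knot_vals, bc_type="natural")

cleaned = residuals - spline(t)                   # mostly measurement noise remains
```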