103 results for "Forward error correcting code"
Abstract:
Summary points:
- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or an outcome variable (disease)
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant
- Increasing sample size will help minimise the impact of measurement error in an outcome variable, but will only make estimates more precisely wrong when the error is in an exposure variable
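The contrast these points describe can be illustrated with a small simulation (a sketch, not from the article; the `ols_slope` helper and all noise scales are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=n)             # true exposure
y = 2.0 * x + rng.normal(size=n)   # true slope = 2

def ols_slope(x, y):
    """Slope of the least-squares line of y on x."""
    return np.polyfit(x, y, 1)[0]

# Random error in the exposure attenuates the slope toward the null:
# classical attenuation shrinks it by var(x) / (var(x) + var(error)).
x_noisy = x + rng.normal(scale=1.0, size=n)
print(ols_slope(x, y))        # ≈ 2.0
print(ols_slope(x_noisy, y))  # ≈ 1.0 (attenuation factor 1 / (1 + 1))

# Random error in the outcome leaves the slope unbiased but inflates
# its standard error, so confidence intervals widen instead.
y_noisy = y + rng.normal(scale=3.0, size=n)
print(ols_slope(x, y_noisy))  # ≈ 2.0, just estimated less precisely
```

Increasing `n` tightens the estimate in the outcome-error case, but in the exposure-error case it only converges more precisely to the attenuated value.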
Abstract:
Kirton's Adaption-Innovation Inventory (KAI) is a widely-used measure of "cognitive style." Surprisingly, there is very little research investigating the discriminant and incremental validity of the KAI. In two studies (n = 213), we examined whether (a) we could predict KAI scores with the "big five" personality dimensions and (b) the KAI scores predicted leadership behavior when controlling for personality and ability. Correcting for measurement error, we found that KAI scores were predicted mostly by personality and gender (multiple R = 0.82). KAI scores did not predict variance in leadership while controlling for established predictors. Our findings add to recent literature that questions the uniqueness and utility of cognitive style or similar "style" constructs; researchers using such measures must control for the big five factors and correct for measurement error to avoid confounded interpretations.
Abstract:
Optimal behavior relies on flexible adaptation to environmental requirements, notably based on the detection of errors. The impact of error detection on subsequent behavior typically manifests as a slowing down of RTs following errors. Precisely how errors impact the processing of subsequent stimuli and in turn shape behavior remains unresolved. To address these questions, we used an auditory spatial go/no-go task where continual feedback informed participants of whether they were too slow. We contrasted auditory-evoked potentials to left-lateralized go and right no-go stimuli as a function of performance on the preceding go stimuli, generating a 2 × 2 design with "preceding performance" (fast hit [FH], slow hit [SH]) and stimulus type (go, no-go) as within-subject factors. SHs were more often followed by another SH on the next trial than FHs were, supporting our assumption that SHs engaged effects similar to errors. Electrophysiologically, auditory-evoked potentials modulated topographically as a function of preceding performance at 80-110 msec poststimulus onset and then as a function of stimulus type at 110-140 msec, indicative of changes in the underlying brain networks. Source estimations revealed stronger activity of prefrontal regions in response to stimuli after successful than after error trials, followed by a stronger response of parietal areas to no-go than go stimuli. We interpret these results in terms of a shift from a fast automatic to a slow controlled form of inhibitory control induced by the detection of errors, manifesting during low-level integration of task-relevant features of subsequent stimuli, which in turn influences response speed.
Abstract:
The multiscale finite-volume (MSFV) method is designed to reduce the computational cost of elliptic and parabolic problems with highly heterogeneous anisotropic coefficients. The reduction is achieved by splitting the original global problem into a set of local problems (with approximate local boundary conditions) coupled by a coarse global problem. It has been shown recently that the numerical errors in MSFV results can be reduced systematically with an iterative procedure that provides a conservative velocity field after any iteration step. The iterative MSFV (i-MSFV) method can be obtained with an improved (smoothed) multiscale solution to enhance the localization conditions, with a Krylov subspace method [e.g., the generalized-minimal-residual (GMRES) algorithm] preconditioned by the MSFV system, or with a combination of both. In a multiphase-flow system, a balance between accuracy and computational efficiency should be achieved by finding a minimum number of i-MSFV iterations (on pressure), which is necessary to achieve the desired accuracy in the saturation solution. In this work, we extend the i-MSFV method to sequential implicit simulation of time-dependent problems. To control the error of the coupled saturation/pressure system, we analyze the transport error caused by an approximate velocity field. We then propose an error-control strategy on the basis of the residual of the pressure equation. At the beginning of simulation, the pressure solution is iterated until a specified accuracy is achieved. To minimize the number of iterations in a multiphase-flow problem, the solution at the previous timestep is used to improve the localization assumption at the current timestep. Additional iterations are used only when the residual becomes larger than a specified threshold value. Numerical results show that only a few iterations on average are necessary to improve the MSFV results significantly, even for very challenging problems. 
Therefore, the proposed adaptive strategy yields efficient and accurate simulation of multiphase flow in heterogeneous porous media.
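The residual-based control described above can be sketched generically (a minimal illustration only: a plain Jacobi sweep stands in for the actual MSFV smoothing and coarse-scale operators, and all names are hypothetical):

```python
import numpy as np

def iterate_until_residual(A, b, x_prev, tol=1e-8, max_iters=200):
    """Iterate on A x = b only while the relative residual exceeds a
    threshold, starting from the previous-timestep solution x_prev
    (mirroring the strategy of reusing the old pressure solution to
    reduce the iteration count). One Jacobi sweep per iteration stands
    in for the multiscale smoothing step."""
    x = x_prev.copy()
    iters = 0
    while np.linalg.norm(b - A @ x) > tol * np.linalg.norm(b) and iters < max_iters:
        x = x + (b - A @ x) / np.diag(A)  # one smoothing sweep
        iters += 1
    return x, iters

# A small diagonally dominant system as a stand-in for a pressure equation
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x, iters = iterate_until_residual(A, b, x_prev=np.zeros(2))
```

When the system changes little between timesteps, starting from `x_prev` means the residual test is satisfied after only a few sweeps, which is the adaptive behaviour the abstract reports.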
Abstract:
Whole-body counting is a technique of choice for assessing the intake of gamma-emitting radionuclides. An appropriate calibration is necessary, which is done either by experimental measurement or by Monte Carlo (MC) calculation. The aim of this work was to validate a MC model for calibrating whole-body counters (WBCs) by comparing the results of computations with measurements performed on an anthropomorphic phantom, and to investigate the effect of a change in the phantom's position on the WBC counting sensitivity. The GEANT MC code was used for the calculations, and an IGOR phantom loaded with several types of radionuclides was used for the experimental measurements. The results show reasonable agreement between measurements and MC computation. A 1-cm error in phantom positioning changes the activity estimation by >2%. Considering that a 5-cm deviation in the positioning of the phantom may occur in a realistic counting scenario, the uncertainty of the activity measured by a WBC is ∼10-20%.
Abstract:
Forensic document examiners may be confronted with handwriting produced under unconventional conditions. These atypical circumstances could give rise to greater variability in the shape of the writing, particularly when a priori unusual positions of the body and/or of the writing surface are involved. Indeed, despite its apparently stereotyped/standardised aspect, the result of learning from a model, our handwriting is characterised by an intrinsic variability of form, which evolves over time and which, in its qualitative dimension, gives handwriting its individual character. In other words, we never write the same way twice. This intra-individual variability (or intra-variability), observed under the conventional condition, that is, seated in front of a horizontal surface, could increase under unconventional conditions, for example in an uncomfortable position. This could make it more difficult to identify writings produced under an unconventional or unknown condition. Not knowing the circumstances in which a handwritten note was produced, or failing to inquire into them, could lead the examiner to errors of judgement. And the mere fact of studying a trace on which the body can exert an influence makes handwriting examination a speciality distinct from other forensic disciplines. In this respect, the written trace differs from other types of "inanimate" traces (physical, chemical, biochemical) considered invariable (but potentially sensitive to other phenomena such as temperature, atmospheric pressure...). Indeed, because the writing movement is commanded and controlled by the brain, it carries a certain variability. 
It is therefore quite logical to think that knowledge of the neuroscientific mechanisms underlying this movement will facilitate the understanding of the phenomena observed from a forensic point of view. Two experiments were conducted to compare the performance of subjects writing under different conditions (conventional vs. unconventional). The results showed that five of the seven unconventional conditions had no significant impact on handwriting variability. Taken together, the results provide forensic examiners with leads for better understanding handwriting produced under unusual conditions.
Abstract:
Current measures of ability emotional intelligence (EI), including the well-known Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), suffer from several limitations, including low discriminant validity and questionable construct and incremental validity. We show that the MSCEIT is largely predicted by personality dimensions, general intelligence, and demographics, with multiple R's with the MSCEIT branches up to .66; for the general EI factor this relation was even stronger (multiple R = .76). As concerns the factor structure of the MSCEIT, we found support for four first-order factors, which had differential relations with personality, but no support for a higher-order global EI factor. We discuss implications for employing the MSCEIT, including (a) using the single branch scores rather than the total score, (b) always controlling for personality and general intelligence to ensure unbiased parameter estimates in the EI factors, and (c) correcting for measurement error. Failure to account for these methodological aspects may severely compromise predictive validity testing. We also discuss avenues for the improvement of ability-based tests.
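The correction recommended in point (c) is commonly done with Spearman's correction for attenuation; a minimal sketch (the observed correlation and reliabilities below are illustrative values, not figures from the study):

```python
def correct_for_attenuation(r_xy, rel_x, rel_y):
    """Disattenuate an observed correlation r_xy given the reliability
    of each measure (e.g. Cronbach's alpha): r_true = r_xy / sqrt(rel_x * rel_y)."""
    return r_xy / (rel_x * rel_y) ** 0.5

# e.g. an observed r of .60 between scales with reliabilities .80 and .90
print(round(correct_for_attenuation(0.60, 0.80, 0.90), 2))  # 0.71
```

The corrected value estimates the correlation between the underlying constructs rather than between the error-contaminated scores, which is why ignoring it biases validity estimates downward.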
Abstract:
Objective: Jaundice is the clinical manifestation of hyperbilirubinemia. It is considered a sign of either a liver disease or, less often, of a hemolytic disorder. It can be divided into obstructive and non-obstructive types, involving an increase of direct (conjugated) bilirubin or of indirect (non-conjugated) bilirubin, respectively, but it can also manifest as a mixed type. Methods: This article updates current knowledge concerning the etiology, pathophysiological mechanisms, complications, and treatment of jaundice by reviewing the latest medical literature. It also presents an approach to the pathogenesis and treatment of jaundice in special populations such as neonates and pregnant women. Results: Treatment consists in the management of the underlying diseases responsible for the jaundice and of its complications. The clinical prognosis of jaundice depends on the etiology. Surgical treatment of jaundiced patients is associated with high mortality and morbidity rates. Studies have shown that the severity of jaundice and the presence of malignant disease are important risk factors for post-operative mortality. Conclusions: Early detection of jaundice is of vital importance because of its involvement in malignancy or in other benign conditions requiring immediate treatment in order to avoid further complications.