927 results for "Error correction coding"


Relevance: 20.00%

Abstract:

A Web-based tool developed to automatically correct relational database schemas is presented. This tool has been integrated into a more general e-learning platform and is used to reinforce teaching and learning in database courses. The platform assigns each student a set of database problems selected from a common repository. The student has to design a relational database schema and enter it into the system through a user-friendly interface specifically designed for this purpose. The correction tool corrects the design and shows the detected errors. The student then has the chance to correct them and submit a new solution. These steps can be repeated as many times as required until a correct solution is obtained. Currently, this system is being used in several introductory database courses at the University of Girona, with very promising results.
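
The correct-and-resubmit loop described above can be sketched as a simple schema comparison. This is a hypothetical minimal model (the abstract does not describe the tool's internals); `check_schema`, the dict-based schema representation, and the sample tables are all illustrative assumptions:

```python
# Hypothetical minimal model of the correct-and-resubmit loop; the abstract
# does not describe the tool's internals, so the schema representation and
# the checks below are illustrative assumptions.
def check_schema(student, reference):
    """Compare a student's relational schema against a reference solution.

    Each schema maps table name -> {"attrs": set of attributes, "pk": set of
    primary-key attributes}. Returns a list of error messages (empty if correct).
    """
    errors = []
    for table, spec in reference.items():
        if table not in student:
            errors.append(f"missing table: {table}")
            continue
        got = student[table]
        for attr in sorted(spec["attrs"] - got["attrs"]):
            errors.append(f"{table}: missing attribute {attr}")
        if got["pk"] != spec["pk"]:
            errors.append(f"{table}: wrong primary key {sorted(got['pk'])}")
    return errors

reference = {"Student": {"attrs": {"id", "name"}, "pk": {"id"}}}
attempt = {"Student": {"attrs": {"id"}, "pk": {"id", "name"}}}
errors = check_schema(attempt, reference)
```

An empty error list ends the loop; otherwise the messages are shown to the student, who revises the schema and resubmits.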

Relevance: 20.00%

Abstract:

The author studies the error and complexity of the discrete random walk Monte Carlo technique for radiosity, using both the shooting and gathering methods. The author shows that the shooting method exhibits a lower complexity than the gathering method and, under some constraints, has a linear complexity. This is an improvement over a previous result that pointed to an O(n log n) complexity. The author gives and compares three unbiased estimators for each method, and obtains closed forms and bounds for their variances. The author also bounds the expected value of the mean square error (MSE). Some of the results obtained are also shown.
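
The variance analysis rests on standard properties of unbiased Monte Carlo estimators. A generic sketch of such an estimator follows; it is not the paper's shooting or gathering estimator (those are specialized to radiosity), and `mc_estimate` and the test integrand are illustrative:

```python
import random

# Generic unbiased Monte Carlo estimator (illustrative; the paper's shooting
# and gathering estimators for radiosity are specialized versions of this idea).
def mc_estimate(f, n, rng):
    """Unbiased estimate of the integral of f over [0, 1], plus the variance
    of that estimate (which shrinks as O(1/n))."""
    samples = [f(rng.random()) for _ in range(n)]
    mean = sum(samples) / n
    sample_var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, sample_var / n

rng = random.Random(0)
est, var = mc_estimate(lambda x: 3 * x * x, 10_000, rng)  # true integral = 1
```

Comparing estimators, as the paper does, amounts to comparing their variances at equal sample cost.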

Relevance: 20.00%

Abstract:

PURPOSE: Respiratory motion correction remains a challenge in coronary magnetic resonance imaging (MRI), and current techniques, such as navigator gating, suffer from sub-optimal scan efficiency and ease-of-use. To overcome these limitations, an image-based self-navigation technique is proposed that uses "sub-images" and compressed sensing (CS) to obtain translational motion correction in 2D. The method was preliminarily implemented as a 2D technique and tested for feasibility for targeted coronary imaging. METHODS: During a 2D segmented radial k-space data acquisition, heavily undersampled sub-images were reconstructed from the readouts collected during each cardiac cycle. These sub-images may then be used for respiratory self-navigation. Alternatively, a CS reconstruction may be used to create the sub-images, so as to partially compensate for the heavy undersampling. Both approaches were quantitatively assessed using simulations and in vivo studies, and the resulting self-navigation strategies were then compared to conventional navigator gating. RESULTS: Sub-images reconstructed using CS showed a lower artifact level than sub-images reconstructed without CS. As a result, the final image quality was significantly better when using CS-assisted self-navigation than with the non-CS approach. Moreover, while both self-navigation techniques led to a 69% scan time reduction compared to navigator gating, there was no significant difference in image quality between the CS-assisted self-navigation technique and conventional navigator gating, despite the significant decrease in scan time. CONCLUSIONS: CS-assisted self-navigation with 2D translational motion correction demonstrated the feasibility of producing coronary MRA data with image quality comparable to that obtained with conventional navigator gating, without the use of additional acquisitions or motion modeling, while still allowing for 100% scan efficiency and improved ease-of-use. In conclusion, compressed sensing may become a critical adjunct for 2D translational motion correction in free-breathing cardiac imaging with high spatial resolution. An expansion to modern 3D approaches is now warranted.
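
The core of translational self-navigation is estimating the shift between each sub-image and a reference. A toy 1-D cross-correlation version is sketched below; the actual method operates on 2-D sub-images reconstructed from radial k-space, and `estimate_shift` and the profiles are illustrative:

```python
# Toy 1-D analogue of translational self-navigation: estimate each sub-image's
# displacement against a reference by maximizing circular cross-correlation,
# so the shift can be undone before combining data. The real method operates
# on 2-D sub-images; this function and the profiles are illustrative.
def estimate_shift(reference, profile, max_shift=4):
    """Return the integer shift that best aligns profile with reference."""
    n = len(profile)

    def corr(shift):
        return sum(reference[i] * profile[(i + shift) % n] for i in range(n))

    return max(range(-max_shift, max_shift + 1), key=corr)

reference = [0, 0, 1, 2, 3, 0, 0, 0]
moved = [0, 0, 0, 0, 1, 2, 3, 0]  # reference displaced by 2 samples
shift = estimate_shift(reference, moved)
```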

Relevance: 20.00%

Abstract:

The global emergence and spread of malaria parasites resistant to antimalarial drugs is the major problem in malaria control. The genetic basis of the parasite's resistance to the antimalarial drug chloroquine (CQ) is well-documented, allowing for the analysis of field isolates of malaria parasites to address evolutionary questions concerning the origin and spread of CQ-resistance. Here, we present DNA sequence analyses of both the second exon of the Plasmodium falciparum CQ-resistance transporter (pfcrt) gene and the 5' end of the P. falciparum multidrug-resistance 1 (pfmdr-1) gene in 40 P. falciparum field isolates collected from eight different localities of Odisha, India. First, we genotyped the samples for the pfcrt K76T and pfmdr-1 N86Y mutations in these two genes, which are the mutations primarily implicated in CQ-resistance. We further analyzed amino acid changes in codons 72-76 of the pfcrt haplotypes. Interestingly, the K76T and N86Y mutations were found to co-exist in 32 of the 40 isolates, which were of either the CVIET or SVMNT haplotype, while the remaining eight isolates were of the CVMNK haplotype. In total, eight nonsynonymous single nucleotide polymorphisms (SNPs) were observed, six in the pfcrt gene and two in the pfmdr-1 gene. One poorly studied SNP in the pfcrt gene (A97T) was found at high frequency in many P. falciparum samples. Population genetic analyses of these two gene fragments revealed comparatively higher nucleotide diversity in the pfcrt gene than in the pfmdr-1 gene. Furthermore, linkage disequilibrium was found to be tight between closely spaced SNPs of the pfcrt gene. Finally, both the pfcrt and the pfmdr-1 genes were found to evolve under the standard neutral model of molecular evolution.
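
Nucleotide diversity, the statistic used above to compare the pfcrt and pfmdr-1 fragments, is the average pairwise proportion of differing sites among aligned sequences. A minimal sketch on toy sequences (not the study's data):

```python
from itertools import combinations

# Nucleotide diversity (pi): average pairwise proportion of differing sites
# among aligned sequences. Toy sequences, not the study's data.
def nucleotide_diversity(seqs):
    length = len(seqs[0])
    pairs = list(combinations(seqs, 2))
    diffs = sum(sum(a != b for a, b in zip(s, t)) for s, t in pairs)
    return diffs / (len(pairs) * length)

pi = nucleotide_diversity(["ACGT", "ACGA", "ACTA"])  # 4 mismatches over 3 pairs x 4 sites
```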

Relevance: 20.00%

Abstract:

BACKGROUND: The lysophosphatidic acid LPA₁ receptor regulates plasticity and neurogenesis in the adult hippocampus. Here, we studied whether the absence of the LPA₁ receptor modulates the detrimental effects of chronic stress on hippocampal neurogenesis and spatial memory. METHODOLOGY/PRINCIPAL FINDINGS: Male LPA₁-null (NULL) and wild-type (WT) mice were assigned to control or chronic stress conditions (21 days of restraint, 3 h/day). Immunohistochemistry for bromodeoxyuridine and endogenous markers was performed to examine hippocampal cell proliferation, survival, the number and maturation of young neurons, hippocampal structure, and apoptosis. Corticosterone levels were measured in a separate cohort of mice. Finally, the hole-board test assessed spatial reference and working memory. Under control conditions, NULL mice showed reduced cell proliferation, a defective population of young neurons, reduced hippocampal volume and moderate spatial memory deficits. However, the primary result is that chronic stress impaired hippocampal neurogenesis in NULLs more severely than in WT mice in terms of cell proliferation; apoptosis; the number and maturation of young neurons; and both the volume and neuronal density in the granular zone. Only stressed NULLs presented hypocortisolemia. Moreover, a dramatic deficit in spatial reference memory consolidation was observed in chronically stressed NULL mice, in contrast to the minor effect observed in stressed WT mice. CONCLUSIONS/SIGNIFICANCE: These results reveal that the absence of the LPA₁ receptor aggravates the chronic stress-induced impairment of hippocampal neurogenesis and its dependent functions. Thus, modulation of the LPA₁ receptor pathway may be of interest for the treatment of stress-induced hippocampal pathology.

Relevance: 20.00%

Abstract:

[This corrects the article on p. e12773 in vol. 5.].

Relevance: 20.00%

Abstract:

This work develops a tool that enables teachers to create tests and correct them automatically, so that the assessment process is much faster and free of human error.

Relevance: 20.00%

Abstract:

BACKGROUND: Recently, some US cohorts have shown a moderate association between red and processed meat consumption and mortality, supporting the results of previous studies among vegetarians. The aim of this study was to examine the association of red meat, processed meat, and poultry consumption with the risk of early death in the European Prospective Investigation into Cancer and Nutrition (EPIC). METHODS: Included in the analysis were 448,568 men and women without prevalent cancer, stroke, or myocardial infarction, and with complete information on diet, smoking, physical activity and body mass index, who were between 35 and 69 years old at baseline. Cox proportional hazards regression was used to examine the association of meat consumption with all-cause and cause-specific mortality. RESULTS: As of June 2009, 26,344 deaths were observed. After multivariate adjustment, a high consumption of red meat was related to higher all-cause mortality (hazard ratio (HR) = 1.14, 95% confidence interval (CI) 1.01 to 1.28, 160+ versus 10 to 19.9 g/day), and the association was stronger for processed meat (HR = 1.44, 95% CI 1.24 to 1.66, 160+ versus 10 to 19.9 g/day). After correction for measurement error, higher all-cause mortality remained significant only for processed meat (HR = 1.18, 95% CI 1.11 to 1.25, per 50 g/day). We estimated that 3.3% (95% CI 1.5% to 5.0%) of deaths could be prevented if all participants had a processed meat consumption of less than 20 g/day. Significant associations with processed meat intake were observed for cardiovascular diseases, cancer, and 'other causes of death'. The consumption of poultry was not related to all-cause mortality. CONCLUSIONS: The results of our analysis support a moderate positive association between processed meat consumption and mortality, in particular due to cardiovascular diseases, but also to cancer.
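
The "fraction of deaths preventable" figure is the kind of quantity given by a population attributable fraction. A minimal sketch using Levin's formula with illustrative inputs follows; the study's 3.3% estimate additionally reflects covariate adjustment and the full exposure distribution:

```python
# Levin's population attributable fraction: the share of cases that would be
# removed if the exposure were eliminated. Inputs are illustrative; the
# study's 3.3% estimate additionally reflects covariate adjustment and the
# actual exposure distribution.
def attributable_fraction(p_exposed, relative_risk):
    excess = p_exposed * (relative_risk - 1)
    return excess / (1 + excess)

paf = attributable_fraction(p_exposed=0.25, relative_risk=1.18)
```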

Relevance: 20.00%

Abstract:

Clinical use of Stejskal-Tanner diffusion-weighted images is hampered by the geometric distortions that result from the large residual 3-D eddy current fields induced by the diffusion-weighting gradients. In this work, we aimed to predict, using linear response theory, the residual 3-D eddy current field required for geometric distortion correction, based on phantom eddy current field measurements. The predicted 3-D eddy current field reduced the root mean square error of the residual eddy current field to ~1 Hz. The model's performance was tested on diffusion-weighted images of four normal volunteers. Following distortion correction, the Stejskal-Tanner diffusion-weighted images were found to be of comparable quality to those corrected with image registration (FSL) at low b-values. Unlike registration techniques, the correction was not hindered by low SNR at high b-values, and it resulted in improved image quality relative to FSL. Characterizing the 3-D eddy current field with linear response theory enables prediction of the eddy current field required to correct eddy-current-induced geometric distortions for a wide range of clinical and high b-value protocols.
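
Linear response theory here amounts to assuming the residual field scales linearly with the applied gradients, so calibration reduces to a least-squares fit. A scalar toy sketch (the paper models the full 3-D field; the amplitudes and field values below are illustrative):

```python
# Scalar toy version of the linear-response calibration: fit field = k * amplitude
# from phantom measurements, then predict the residual field for a new gradient
# setting. The paper fits the full 3-D field; the numbers here are illustrative.
def fit_linear_response(amplitudes, fields):
    """Least-squares slope through the origin: k = sum(a*f) / sum(a*a)."""
    return sum(a * f for a, f in zip(amplitudes, fields)) / sum(a * a for a in amplitudes)

k = fit_linear_response([10.0, 20.0, 30.0], [2.1, 3.9, 6.0])  # phantom calibration
predicted_field = k * 25.0  # predicted residual field (Hz) at a new amplitude
```

Once predicted, the field can be converted to a pixel-shift map and the distortion undone, which is what removes the dependence on image registration at high b-values.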

Relevance: 20.00%

Abstract:

BACKGROUND Missed, delayed or incorrect diagnoses are considered to be diagnostic errors. The aim of this paper is to describe the methodology of a study to analyse cognitive aspects of the process by which primary care (PC) physicians diagnose dyspnoea. It examines the possible links between the use of heuristics, suboptimal cognitive acts and diagnostic errors, using Reason's taxonomy of human error (slips, lapses, mistakes and violations). The influence of situational factors (professional experience, perceived overwork and fatigue) is also analysed. METHODS Cohort study of new episodes of dyspnoea in patients receiving care from family physicians and residents at PC centres in Granada (Spain). With an initial expected diagnostic error rate of 20%, and a sampling error of 3%, 384 episodes of dyspnoea are calculated to be required. In addition to filling out the electronic medical record of the patients attended, each physician fills out 2 specially designed questionnaires about the diagnostic process performed in each case of dyspnoea. The first questionnaire includes questions on the physician's initial diagnostic impression, the 3 most likely diagnoses (in order of likelihood), and the diagnosis reached after the initial medical history and physical examination. It also includes items on the physicians' perceived overwork and fatigue during patient care. The second questionnaire records the confirmed diagnosis once it is reached. The complete diagnostic process is peer-reviewed to identify and classify the diagnostic errors. The possible use of heuristics of representativeness, availability, and anchoring and adjustment in each diagnostic process is also analysed. Each audit is reviewed with the physician responsible for the diagnostic process. Finally, logistic regression models are used to determine if there are differences in the diagnostic error variables based on the heuristics identified. 
DISCUSSION This work sets out a new approach to studying the diagnostic decision-making process in PC, taking advantage of new technologies which allow immediate recording of the decision-making process.
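
Sample-size requirements of the kind stated in the methods are commonly derived from Cochran's formula for a proportion. A minimal sketch with illustrative inputs (the exact values and corrections the authors applied are not fully specified in the abstract):

```python
import math

# Cochran's sample-size formula for estimating a proportion. Inputs are
# illustrative; the exact values and corrections used by the authors are
# not fully specified in the abstract.
def sample_size(p, margin, z=1.96):
    """Smallest n such that a 95% CI for proportion p has half-width <= margin."""
    return math.ceil(z * z * p * (1 - p) / margin ** 2)

n = sample_size(p=0.5, margin=0.05)  # the classic worst-case inputs
```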

Relevance: 20.00%

Abstract:

In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with the generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates in the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in about a threefold increase in the strength of association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model. Moreover, the extent of adjustment for error is influenced by the number and forms of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
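
The essence of regression calibration is deattenuation: dividing the observed estimate by the regression-dilution factor. A one-line sketch with toy numbers (the study's two-part model additionally handles excess zeroes, heteroscedasticity, and covariates, and the lambda below is illustrative):

```python
import math

# Deattenuation in one line: divide the observed log hazard ratio by the
# regression-dilution (attenuation) factor lambda. Toy numbers only; the
# study's two-part model also handles excess zeroes, heteroscedasticity
# and covariates.
def calibrate(beta_observed, attenuation_lambda):
    return beta_observed / attenuation_lambda

beta_obs = math.log(0.95)              # observed log hazard ratio (illustrative)
beta_cal = calibrate(beta_obs, 1 / 3)  # lambda = 1/3 triples the log hazard ratio
```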

Relevance: 20.00%

Abstract:

It has been suggested that converting, via a process of cross-coding, the listing used by the Swiss Disability Insurance (SDI) for its statistics into codes of the International Classification of Impairments, Disabilities, and Handicaps (ICIDH) would improve the quality and international comparability of disability statistics for Switzerland. Using two different methods, we tested the feasibility of this cross-coding on a consecutive sample of 204 insured persons examined at one of the medical observation centres of the SDI. Cross-coding is impossible, for all practical purposes, in a proportion varying between 30% and 100%, depending on the method of cross-coding, the level of disablement and the required quality of the resulting codes. Failure is due to a lack of validity of the SDI codes: diseases are poorly described, and the consequences of diseases (disability and handicap, including loss of earning capacity) are insufficiently described or not described at all. Assessment of disability and handicap would provide necessary information for the SDI. It is concluded that the SDI should promote the use of the ICIDH in Switzerland, especially among medical practitioners, whose assessment of work capacity is the key element in the decision to award benefits or propose rehabilitation.

Relevance: 20.00%

Abstract:

BACKGROUND: Co-morbidity information derived from administrative data needs to be validated to allow its regular use. We assessed the evolution in the accuracy of coding for Charlson and Elixhauser co-morbidities at three time points over a 5-year period, following the introduction of International Classification of Diseases, 10th Revision (ICD-10) coding of hospital discharges. METHODS: Cross-sectional time-trend evaluation study of coding accuracy using hospital chart data of 3,499 randomly selected patients who were discharged in 1999, 2001 and 2003 from two teaching and one non-teaching hospital in Switzerland. We measured sensitivity, positive predictive values, and kappa values for agreement between administrative data coded with ICD-10 and chart data as the 'reference standard' for recording 36 co-morbidities. RESULTS: For the 17 Charlson co-morbidities, the sensitivity, median (min-max), was 36.5% (17.4-64.1) in 1999, 42.5% (22.2-64.6) in 2001 and 42.8% (8.4-75.6) in 2003. For the 29 Elixhauser co-morbidities, the sensitivity was 34.2% (1.9-64.1) in 1999, 38.6% (10.5-66.5) in 2001 and 41.6% (5.1-76.5) in 2003. Between 1999 and 2003, sensitivity estimates increased for 30 co-morbidities and decreased for 6. The increase in sensitivities was statistically significant for six conditions, and the decrease was significant for one. Kappa values increased for 29 co-morbidities and decreased for seven. CONCLUSIONS: The accuracy of administrative data in recording clinical conditions improved slightly between 1999 and 2003. These findings are of relevance to all jurisdictions introducing new coding systems, because they demonstrate a phenomenon of improved administrative data accuracy that may reflect a coding 'learning curve' with the new system.
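
The reported statistics (sensitivity, positive predictive value, kappa) all derive from a 2x2 table of administrative coding against chart review for each co-morbidity. A sketch with toy counts:

```python
# Sensitivity, positive predictive value, and Cohen's kappa from a 2x2 table
# of administrative coding (test) vs chart review (reference). Toy counts only.
def agreement(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    p_observed = (tp + tn) / n
    # Chance agreement: P(both say present) + P(both say absent)
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, ppv, kappa

sens, ppv, kappa = agreement(tp=40, fp=10, fn=60, tn=890)
```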

Relevance: 20.00%

Abstract:

Hermansky-Pudlak syndrome (HPS) is a genetic disorder characterized by oculocutaneous albinism, a bleeding tendency and susceptibility to pulmonary fibrosis. No curative therapy is available. Genetic correction directed to the lungs, bone marrow and/or gastro-intestinal tract might provide alternative forms of treatment for the disease's multi-systemic complications. We demonstrate that lentiviral-mediated gene transfer corrects the expression and function of the HPS1 gene in patients' dermal melanocytes, which opens the way to the development of gene therapy for HPS.