899 results for Error correction methods


Relevance: 30.00%

Publisher:

Abstract:

Introduction: Myocardial Perfusion Imaging (MPI) is a very important tool in the assessment of Coronary Artery Disease (CAD) patients, and worldwide data demonstrate increasingly wide use and clinical acceptance. Nevertheless, it is a complex process and quite vulnerable to a number of possible artefacts, some of which seriously affect the overall quality and clinical utility of the data obtained. One of the most inconvenient, yet relatively frequent (20% of cases), artefacts is related to patient motion during image acquisition. In most of those situations the data are evaluated and a decision is made between (A) accepting the results as they are, considering that the "noise" so introduced does not seriously affect the final clinical information, or (B) repeating the acquisition. Another possibility is to use the motion correction software provided in the software package of any current gamma camera. The aim of this study is to compare the quality of the final images obtained after the application of motion correction software with those obtained after repetition of the acquisition. Material and Methods: Thirty MPI cases affected by motion artefacts and subsequently repeated were used. A group of three independent expert Nuclear Medicine clinicians, blinded to the origin of the images, was invited to evaluate 30 sets of three images (one set per patient): (A) the original, motion-uncorrected image; (B) the original image after motion correction; and (C) the repeated acquisition, free of motion. The results were statistically analysed.
Results and Conclusion: The results demonstrate that motion correction software is useful essentially when the amplitude of the movement is not too large (a threshold that proved hard to define precisely, owing to discrepancies between clinicians and other factors, notably differences between camera brands). When the amplitude of the movement is too large, the percentage of agreement between clinicians is much higher and repetition of the examination is unanimously considered indispensable.
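As a rough illustration of what such motion correction does (vendor packages are proprietary; this is a minimal sketch, not the software evaluated in the study), an integer axial patient shift in a projection frame can be estimated and undone by maximising the correlation of row count profiles:

```python
import numpy as np

def correct_axial_shift(reference, frame, max_shift=5):
    """Estimate and undo an integer axial (row) shift of `frame`
    relative to `reference` by maximising row-profile correlation.
    A toy stand-in for vendor motion-correction software."""
    ref_profile = reference.sum(axis=1)   # counts per detector row
    frm_profile = frame.sum(axis=1)
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        score = np.dot(ref_profile, np.roll(frm_profile, s))
        if score > best_score:
            best_shift, best_score = s, score
    # undo the estimated displacement
    return np.roll(frame, best_shift, axis=0), best_shift
```

Real correction software works on interpolated sub-pixel shifts and full sinograms; this only captures the basic idea of re-aligning displaced frames.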

Relevance: 30.00%

Publisher:

Abstract:

Introduction: The quantification of differential renal function in adults can be difficult due to many factors, one of which is the variation in kidney depth and the attenuation caused by the tissue between the kidney and the camera. Some authors state that the lower attenuation in pediatric patients makes attenuation correction algorithms unnecessary. This study compares the values of differential renal function obtained with and without attenuation correction techniques. Material and Methods: Images from a group of 15 individuals (aged 3 ± 2 years) were used, and two attenuation correction methods were applied: Tonnesen correction factors and the geometric mean method. The mean acquisition time (post 99mTc-DMSA administration) was 3.5 ± 0.8 hours. Results: The absence of attenuation correction apparently leads to consistent values that correlate well with those obtained when attenuation correction methods are incorporated. The differences between the values obtained with and without attenuation correction were not significant. Conclusion: The decision not to apply any attenuation correction method can apparently be justified by the minor differences observed in the relative kidney uptake values. Nevertheless, when a truly accurate value of the relative kidney uptake is required, an attenuation correction method should be used.
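The two correction approaches compared above can be written down compactly. A minimal Python sketch with invented count values, an approximate soft-tissue attenuation coefficient for 140 keV photons, and a Tonnesen-style exponential depth correction; this is an illustration of the formulas, not the authors' implementation:

```python
import math

# Approximate linear attenuation coefficient of soft tissue
# at 140 keV (99mTc), in cm^-1 -- an assumed textbook value.
MU_TC99M = 0.153

def geometric_mean(anterior, posterior):
    """Depth-independent count estimate from conjugate (ant/post) views."""
    return math.sqrt(anterior * posterior)

def tonnesen_corrected(posterior_counts, depth_cm):
    """Posterior-view counts corrected for attenuation at an
    estimated kidney depth (Tonnesen-style exponential factor)."""
    return posterior_counts * math.exp(MU_TC99M * depth_cm)

def differential_function(left, right):
    """Left kidney share of total corrected counts, in percent."""
    return 100.0 * left / (left + right)
```

For example, conjugate-view counts of 1200/1800 (left) and 1100/1650 (right) give a left differential function of roughly 52%.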

Relevance: 30.00%

Publisher:

Abstract:

Microarrays allow thousands of genes to be monitored simultaneously, quantifying the abundance of transcripts under the same experimental condition at the same time. Among the various available array technologies, two-channel cDNA microarray experiments, the focus of this work, appear in numerous technical protocols associated with genomic studies. Microarray experiments involve many steps, and each one can affect the quality of the raw data. Background correction and normalization are preprocessing techniques used to clean and correct the raw data when undesirable fluctuations arise from technical factors. Several recent studies have shown that no preprocessing strategy outperforms the others in all circumstances, so it seems difficult to provide general recommendations. In this work we propose the use of exploratory techniques to visualize the effects of preprocessing methods on the statistical analysis of two-channel cancer microarray data sets in which the cancer types (classes) are known. The arrow plot was used to select differentially expressed genes, and the graph of profiles resulting from correspondence analysis was used to visualize the results. Six background correction methods and six normalization methods were combined, yielding 36 preprocessing pipelines, which were analysed on a published cDNA microarray database (Liver), available at http://genome-www5.stanford.edu/, whose microarrays were already classified by cancer type. All statistical analyses were performed using the R statistical software.
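The 6 × 6 grid of preprocessing combinations amounts to a Cartesian product of background and normalization methods. A toy Python sketch with two methods per step and invented intensities (the study itself used R; the method names here are generic placeholders):

```python
import itertools
import numpy as np

# Toy two-channel intensities (R = Cy5, G = Cy3) with an additive
# background on the red channel -- invented values for illustration.
R, G = np.array([520.0, 260.0, 1040.0]), np.array([500.0, 250.0, 1000.0])
R_bg, G_bg = np.full(3, 20.0), np.full(3, 0.0)

background = {
    "none":     lambda sig, bg: sig,
    "subtract": lambda sig, bg: np.maximum(sig - bg, 1.0),  # floor avoids log(0)
}
normalization = {
    "none":   lambda m: m,
    "median": lambda m: m - np.median(m),  # centre log-ratios at zero
}

# Every (background, normalization) pair yields one preprocessed
# set of log-ratios M = log2(R/G).
results = {}
for b, n in itertools.product(background, normalization):
    m = np.log2(background[b](R, R_bg) / background[b](G, G_bg))
    results[(b, n)] = normalization[n](m)
```

With 6 methods per step, the same loop produces the 36 pipelines compared in the study.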

Relevance: 30.00%

Publisher:

Abstract:

Introduction: Visual anomalies that affect school-age children represent an important public health problem. Data on their prevalence are lacking in Portugal but are needed for planning vision services. This study was conducted to determine the prevalence of strabismus, decreased visual acuity, and uncorrected refractive error in Portuguese children aged 6 to 11 years. Methods and materials: A cross-sectional study was carried out on a sample of 672 school-age children (7.69 ± 1.19 years). Children received an orthoptic assessment (visual acuity, ocular alignment, and ocular movements) and non-cycloplegic autorefraction. Results: After the orthoptic assessment, 13.8% of the children were considered abnormal (n = 93). Manifest strabismus was found in 4% of the children. The rate of esotropia (2.1%) was slightly higher than that of exotropia (1.8%). Strabismus rates did not differ significantly by sex (p = 0.681) or school grade (p = 0.228). Decreased distance visual acuity was present in 11.3% of the children. Visual acuity ≤ 20/66 (0.5 logMAR) was found in 1.3% of the children. We also found that 10.3% of the children had an uncorrected refractive error. Conclusions: Strabismus affects a small proportion of Portuguese school-age children. Decreased visual acuity and uncorrected refractive error affected a significant proportion of school-age children. New policies need to be developed to address this public health problem.
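The prevalence figures reported above are point estimates; for service planning an interval estimate is often also wanted. A minimal sketch (not part of the study) computing a Wilson 95% confidence interval for the observed 93/672 abnormal rate:

```python
import math

def prevalence_with_ci(cases, n, z=1.96):
    """Point prevalence and Wilson 95% confidence interval
    for a proportion of `cases` out of `n` examined."""
    p = cases / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return p, centre - half, centre + half
```

For 93 abnormal findings among 672 children this gives a prevalence of about 13.8% with a 95% CI of roughly 11.4% to 16.7%.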

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: To examine whether demographic and socioeconomic conditions, oral health subjectivity and the characteristics of dental care are associated with users' dissatisfaction with such care. METHODS: Cross-sectional study of 781 people who required dental care in Montes Claros, MG, Southeastern Brazil, in 2012, a medium-sized city in the north of Minas Gerais. Household interviews were conducted to assess the users' dissatisfaction with dental care (dependent variable) and demographic and socioeconomic conditions, oral health subjectivity and characteristics of the dental care (independent variables). The sample size was calculated for a finite population, with an estimated dissatisfaction proportion of 50.0%, a 5.0% margin of error, a 5.0% non-response rate and a design effect of 2.0. Logistic regression was used, and odds ratios were calculated with a 5% significance level and 95% confidence intervals. RESULTS: Of the interviewed individuals, 9.0% (7.9% with correction for the design effect) were dissatisfied with the care provided. Dissatisfaction was associated with lower educational level; negative self-assessment of oral health; the perception that the care provider was unable to give dental care; and negative evaluation of the way patients were treated, of the cleanliness of the examination rooms and toilets, and of the size of the waiting and examination rooms. CONCLUSIONS: The rate of dissatisfaction with dental care was low. Dissatisfaction was associated with socioeconomic conditions, oral health subjectivity, the professionals' skill in the professional-patient relationship, and facility infrastructure. Educational interventions by the responsible agencies, aimed at improving professionals' quality of care, are suggested, as is improvement of the infrastructure of the care units.
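The associations reported come from logistic regression; in its simplest form an odds ratio with a Woolf 95% confidence interval can be computed directly from a 2×2 table. A sketch with invented counts, not the study's fitted model:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed & dissatisfied,   b = exposed & satisfied,
    c = unexposed & dissatisfied, d = unexposed & satisfied."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

A multivariable logistic model additionally adjusts each odds ratio for the other covariates, which the raw 2×2 calculation cannot do.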

Relevance: 30.00%

Publisher:

Abstract:

Introduction: The Standard Uptake Value (SUV) is a measurement of the uptake in a tumor normalized on the basis of a distribution volume, and is used to quantify 18F-Fluorodeoxyglucose (FDG) uptake in tumors such as primary lung tumors. Several sources of error can affect its accuracy. Normalization can be based on body weight, body surface area (BSA) or lean body mass (LBM). The aim of this study is to compare the influence of three normalization volumes on the calculation of SUV: body weight (SUVW), BSA (SUVBSA) and LBM (SUVLBM), with and without glucose correction, in patients with a known primary lung tumor. The correlation between SUV and weight, height, blood glucose level, injected activity and time between injection and image acquisition is also evaluated. Methods: The sample included 30 subjects (8 female and 22 male) with primary lung tumor and a clinical indication for 18F-FDG Positron Emission Tomography (PET). Images were acquired on a Siemens Biograph according to the department's protocol. The maximum-pixel SUVW was obtained for each abnormal uptake focus through a semiautomatic VOI with 3D isocontour quantification (threshold 2.5). The radioactivity concentration (kBq/ml) was obtained, and SUVW, SUVBSA, SUVLBM and the glucose-corrected SUVs were calculated mathematically. Results: Statistically significant differences between SUVW, SUVBSA and SUVLBM, and between SUVWgluc, SUVBSAgluc and SUVLBMgluc, were observed (p < 0.001). The blood glucose level showed significant positive correlations with SUVW (r=0.371; p=0.043) and SUVLBM (r=0.389; p=0.034). SUVBSA was independent of variations in the blood glucose level. Conclusion: The measurement of a radiopharmaceutical's tumor uptake normalized on the basis of different distribution volumes remains variable. Further investigation on this subject is recommended.
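The three normalisation volumes can be made explicit. A hedged Python sketch using the Du Bois BSA and James LBM formulas, common choices in the literature although the abstract does not state which formulas the study used:

```python
def suv_bw(c_tissue_kbq_ml, injected_kbq, weight_kg):
    """SUV normalised to body weight (assumes tissue density ~1 g/mL,
    so 1 kg of body weight ~ 1000 mL of distribution volume)."""
    return c_tissue_kbq_ml / (injected_kbq / (weight_kg * 1000.0))

def bsa_dubois(weight_kg, height_cm):
    """Body surface area (m^2), Du Bois formula."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def lbm_james(weight_kg, height_cm, male=True):
    """Lean body mass (kg), James formula."""
    if male:
        return 1.10 * weight_kg - 128.0 * (weight_kg / height_cm) ** 2
    return 1.07 * weight_kg - 148.0 * (weight_kg / height_cm) ** 2

def glucose_corrected(suv, glycemia_mg_dl, reference=100.0):
    """Scale the SUV by blood glucose relative to a reference level
    (100 mg/dL is a common, but not universal, choice)."""
    return suv * glycemia_mg_dl / reference
```

Replacing the body-weight denominator with BSA or LBM yields SUVBSA and SUVLBM, which is why the three values differ systematically for the same lesion.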

Relevance: 30.00%

Publisher:

Abstract:

Proceedings of the Information Technology Applications in Biomedicine, Ioannina - Epirus, Greece, October 26-28, 2006

Relevance: 30.00%

Publisher:

Abstract:

The SiC optical processor for error detection and correction is realized using a double pin/pin a-SiC:H photodetector with front- and back-biased optical gating elements. Data show that the background acts as a selector that picks one or more states by splitting portions of the multiple optical input signals across the front and back photodiodes. Boolean operations such as exclusive OR (EXOR) and three-bit addition are demonstrated optically with a combination of such switching devices: when one or all of the inputs are present the output is amplified and the system behaves as an XOR gate representing the SUM, whereas when two or three inputs are on the system acts as an AND gate indicating the presence of the CARRY bit. Additional parity logic operations are performed using the four incoming pulsed communication channels, which are transmitted and checked for errors together. As a simple example of this approach, we describe an all-optical processor for error detection and correction and then provide an experimental demonstration of this fault-tolerant reversible system in emerging nanotechnology.
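The Boolean behaviour described (XOR for the SUM, AND/majority for the CARRY, plus parity over four channels) is that of a full adder combined with an even-parity check. A plain software model of those truth tables, purely to make the logic concrete, not a model of the optical device itself:

```python
def one_bit_sum(a, b, c=0):
    """SUM output of a full adder: XOR of the inputs
    (high when an odd number of inputs is high)."""
    return a ^ b ^ c

def carry(a, b, c=0):
    """CARRY output: high when at least two inputs are high
    (the AND/majority behaviour described in the text)."""
    return int(a + b + c >= 2)

def parity_check(channels):
    """Even-parity check over pulsed channels: True when the
    number of high bits is even (no single-bit error detected)."""
    return sum(channels) % 2 == 0
```

Note that "one or all inputs on" gives SUM = 1, and "two or three inputs on" gives CARRY = 1, exactly matching the gate behaviour reported for the photodetector.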

Relevance: 30.00%

Publisher:

Abstract:

Work presented within the European Master in Computational Logics, as a partial requirement for the degree of Master in Computational Logics

Relevance: 30.00%

Publisher:

Abstract:

Dissertation to Obtain the Degree of Master in Biomedical Engineering

Relevance: 30.00%

Publisher:

Abstract:

Objective: To compare measurements of the upper-arm cross-sectional areas (total arm area, arm muscle area, and arm fat area) of healthy neonates as calculated by anthropometry with the values obtained by ultrasonography. Materials and methods: This study was performed on 60 consecutively born healthy neonates: gestational age (mean ± SD) 39.6 ± 1.2 weeks, birth weight 3287.1 ± 307.7 g, 27 males (45%) and 33 females (55%). Mid-arm circumference and tricipital skinfold thickness were measured on the left upper mid-arm according to the conventional anthropometric method to calculate total arm area, arm muscle area and arm fat area. The ultrasound evaluation was performed at the same arm location using a Toshiba Sonolayer SSA-250A®, which allows calculation of the total arm area, arm muscle area and arm fat area from the number of pixels enclosed in the plotted areas. Statistical analysis: parametric and non-parametric tests were used, as appropriate, to compare measurements of paired samples and of groups of samples. Results: No significant differences between males and females were found in any of the evaluated measurements, estimated either by anthropometry or by ultrasound. The median total arm area also did not differ significantly between the two methods (P = 0.337). Although there is evidence of concordance between the total arm area measurements (r = 0.68, 95% CI: 0.55–0.77), the two methods differed for arm muscle area and arm fat area. The estimated median arm muscle area measured by ultrasound was significantly lower than that estimated by the anthropometric method, differing by as much as 111% (P < 0.001). The estimated median ultrasound measurement of the arm fat area was higher than the anthropometric value by as much as 31% (P < 0.001).
Conclusion: Compared with ultrasound, measurements using skinfold thickness and mid-arm circumference without further correction may lead to overestimation of the cross-sectional muscle area and underestimation of the cross-sectional fat area. The correlation between the two methods could be interpreted as an indication for a further search for correction factors in the equations.
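The anthropometric side of the comparison rests on the standard circular-model equations for the arm (total area from the circumference, muscle area from the circumference corrected by the skinfold, fat area as the difference). A minimal sketch with illustrative values, not the study's data:

```python
import math

def arm_areas(mac_cm, tsf_cm):
    """Upper-arm cross-sectional areas (cm^2) from mid-arm
    circumference (MAC) and tricipital skinfold thickness (TSF),
    using the standard equations that model the arm as a circle:
      total  = MAC^2 / (4*pi)
      muscle = (MAC - pi*TSF)^2 / (4*pi)
      fat    = total - muscle
    """
    total = mac_cm ** 2 / (4 * math.pi)
    muscle = (mac_cm - math.pi * tsf_cm) ** 2 / (4 * math.pi)
    fat = total - muscle
    return total, muscle, fat
```

The circular model and the assumption of a uniform fat ring are exactly what the ultrasound comparison calls into question, which is why correction factors for these equations are suggested.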

Relevance: 30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance: 30.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Doctor in Environmental Engineering, Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate whether left ventricular end-systolic diameters (ESD) ≥ 51 mm in patients (pt) with severe chronic mitral regurgitation (MR) are predictors of a poor prognosis after mitral valve surgery (MVS). METHODS: Eleven pt (aged 36 ± 13 years) were studied in the preoperative period (pre), median of 36 days; in the early postoperative period (post1), median of 9 days; and in the late postoperative period (post2), mean of 38.5 ± 37.6 months. Clinical and echocardiographic data were gathered from each pt with MR and end-systolic diameter ≥ 51 mm (mean = 57 ± 4 mm) to evaluate the result of MVS. Ten patients were in NYHA Class III/IV. RESULTS: All but 2 pt improved in functional class. Two pt died from heart failure and infectious endocarditis 14 and 11 months, respectively, after valve replacement. According to ejection fraction (EF) in post2, we identified 2 groups: group 1 (n=6), whose EF decreased in post1 but increased in post2 (p=0.01), and group 2 (n=5), whose EF decreased progressively from post1 to post2 (p=0.10). All pt with symptoms lasting ≤ 48 months had improvement in EF in post2 (p=0.01). CONCLUSION: ESD ≥ 51 mm is not always associated with a poor prognosis after MVS in patients with MR. Symptoms lasting up to 48 months are associated with improvement in left ventricular function.

Relevance: 30.00%

Publisher:

Abstract:

The verification and analysis of programs with probabilistic features is a necessary task in current scientific and technological practice. The success, and subsequent widespread adoption, of hardware-level implementations of communication protocols and of probabilistic solutions to distributed problems make the use of stochastic agents as programming elements more than interesting. In many of these cases the use of random agents produces better and more efficient solutions; in others it provides solutions where none can be found by traditional methods. These algorithms are generally embedded in multiple hardware mechanisms, so an error in them can produce an undesired multiplication of its harmful effects. Currently, the greatest effort in the analysis of probabilistic programs goes into the study and development of tools called probabilistic model checkers. Given a finite model of the stochastic system, these tools automatically obtain several performance measures for it. Although this can be quite useful for verifying programs, for general-purpose systems it becomes necessary to check more complete specifications concerning the correctness of the algorithm. It would even be interesting to obtain the properties of the system automatically, in the form of invariants and counterexamples. This project aims to address the problem of static analysis of probabilistic programs through the use of deductive tools such as theorem provers and SMT solvers, which have shown their maturity and effectiveness in attacking problems of traditional programming. In order not to lose automation in the methods, we will work within the framework of Abstract Interpretation, which provides an outline for our theoretical development.
At the same time, we will put these foundations into practice through concrete implementations that use those tools.