905 results for computer-aided qualitative data analysis software
Abstract:
Maxillofacial trauma resulting from falls in elderly patients is a major social and health care concern. Most of these traumatic events involve mandibular fractures. The aim of this study was to analyze stress distributions from traumatic loads applied on the symphyseal, parasymphyseal, and mandibular body regions in the elderly edentulous mandible using finite-element analysis (FEA). Computed tomographic analysis of an edentulous macerated human mandible from a patient approximately 65 years old was performed. The bone structure was converted into a 3-dimensional stereolithographic model, which was used to construct the computer-aided design (CAD) geometry for FEA. In the CAD model, the mechanical properties of both cortical and cancellous bone were characterized as isotropic and elastic. The condyles were constrained to prevent free movement in the x-, y-, and z-axes during simulation, representing the restraining action of the masticatory muscles during trauma. Three different simulations were performed: loads of 700 N were applied perpendicular to the surface of the cortical bone in the symphyseal, parasymphyseal, and mandibular body regions. The simulation results were evaluated according to equivalent von Mises stress distributions. Traumatic load at the symphyseal region generated low stress levels in the mental region and high stress levels in the mandibular neck. Traumatic load at the parasymphyseal region concentrated the resulting stress close to the mental foramen. Traumatic load in the mandibular body generated extensive stress in the mandibular body, angle, and ramus. FEA enabled precise mapping of the stress distribution in an elderly edentulous human mandible (neck and mandibular angle) in response to 3 different traumatic load conditions. This knowledge can help guide emergency responders as they evaluate patients after a traumatic event.
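For reference, the equivalent von Mises stress used to summarize the simulation results is a standard scalar measure derived from the principal stresses; the usual definition is sketched below (this is the generic formula, not anything specific to this study's FEA package):

```latex
% Equivalent von Mises stress in terms of the principal stresses
\sigma_{vM} = \sqrt{\tfrac{1}{2}\left[(\sigma_1-\sigma_2)^2 + (\sigma_2-\sigma_3)^2 + (\sigma_3-\sigma_1)^2\right]}
```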
Abstract:
Due to the imprecise nature of biological experiments, biological data are often characterized by the presence of redundant and noisy instances. This may be due to errors that occurred during data collection, such as contamination of laboratory samples. This is the case for gene expression data, where the equipment and tools currently in use frequently produce noisy measurements. Machine Learning algorithms have been successfully used in gene expression data analysis. Although many Machine Learning algorithms can deal with noise, detecting and removing noisy instances from the training data set can help the induction of the target hypothesis. This paper evaluates the use of distance-based pre-processing techniques for noise detection in gene expression data classification problems. The evaluation analyzes the effectiveness of the investigated techniques in removing noisy data, measured by the accuracy obtained by different Machine Learning classifiers on the pre-processed data.
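The abstract does not name the specific filters evaluated; a k-nearest-neighbour edited filter is one common distance-based approach to noise detection, sketched below as an illustration (the function name, the value of k and the placeholder data are ours, not the paper's):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def knn_noise_filter(X, y, k=5):
    """Remove instances whose class label disagrees with the majority of
    their k nearest neighbours (leave-one-out predictions)."""
    knn = KNeighborsClassifier(n_neighbors=k)
    keep = np.ones(len(y), dtype=bool)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i           # leave the i-th instance out
        knn.fit(X[mask], y[mask])
        if knn.predict(X[i:i + 1])[0] != y[i]:  # neighbourhood disagrees
            keep[i] = False                     # flag as potential noise
    return X[keep], y[keep]

# Placeholder expression matrix: 60 samples x 100 genes, binary class labels
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 100))
y = rng.integers(0, 2, size=60)
X_clean, y_clean = knn_noise_filter(X, y, k=5)
print(len(y), "->", len(y_clean), "instances after filtering")
```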
Abstract:
PURPOSE: The main goal of this study was to develop and compare two different techniques for the classification of specific types of corneal shapes when Zernike coefficients are used as inputs. A feed-forward artificial Neural Network (NN) and discriminant analysis (DA) techniques were used. METHODS: The inputs for both the NN and the DA were the first 15 standard Zernike coefficients for 80 previously classified corneal elevation data files from an Eyesys System 2000 Videokeratograph (VK), installed at the Departamento de Oftalmologia of the Escola Paulista de Medicina, São Paulo. The NN had 5 output neurons, each associated with one of 5 typical corneal shapes: keratoconus, with-the-rule astigmatism, against-the-rule astigmatism, "regular" or "normal" shape, and post-PRK. RESULTS: The NN and DA responses were statistically analyzed in terms of accuracy ([true positives + true negatives]/total number of cases). The mean overall results for all cases were 94% for the NN and 84.8% for the DA. CONCLUSION: Although we used a relatively small database, the results obtained in the present study indicate that Zernike polynomials as descriptors of corneal shape may be a reliable input for the diagnostic automation of VK maps, using either NN or DA.
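As a rough illustration of the two classifiers compared (the paper's network architecture, training settings and data are not reproduced here, so the hyperparameters and synthetic inputs below are placeholders), a sketch with 15 Zernike coefficients as inputs and 5 corneal-shape classes:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder data: 80 corneal maps x 15 Zernike coefficients, 5 shape classes
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 15))
y = rng.integers(0, 5, size=80)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

nn = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
da = LinearDiscriminantAnalysis()
nn.fit(X_train, y_train)
da.fit(X_train, y_train)

# Overall fraction of correctly classified cases for each technique
print("NN accuracy:", nn.score(X_test, y_test))
print("DA accuracy:", da.score(X_test, y_test))
```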
Abstract:
Natural products have widespread biological activities, including inhibition of mitochondrial enzyme systems. Some of these activities, for example cytotoxicity, may result from alteration of cellular bioenergetics. Based on previous computer-aided drug design (CADD) studies and on reported structure-activity relationship (SAR) data, one assumed mechanism of action of natural products against parasitic infections is NADH-oxidase inhibition. In this study, chemometric tools such as Principal Component Analysis (PCA), Consensus PCA (CPCA), and partial least squares regression (PLS) were applied to a set of forty natural compounds acting as NADH-oxidase inhibitors. The calculations were performed using the VolSurf+ program. The formalisms employed produced good exploratory and predictive results. The independent variables (descriptors) with a hydrophobic profile were strongly correlated with the biological data.
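A minimal sketch of the exploratory (PCA) and regression (PLS) steps mentioned, using generic randomly generated descriptors in place of the actual VolSurf+ descriptors and activity data, which are not reproduced here:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

# Placeholder matrix: 40 compounds x 20 molecular descriptors, plus activity values
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 20))           # descriptors (e.g. hydrophobic profiles)
y = rng.normal(size=40)                 # NADH-oxidase inhibition data

Xs = StandardScaler().fit_transform(X)  # autoscale before chemometric analysis

pca = PCA(n_components=2).fit(Xs)       # exploratory analysis
print("Explained variance:", pca.explained_variance_ratio_)

pls = PLSRegression(n_components=2).fit(Xs, y)  # predictive (regression) model
print("R^2 on training data:", pls.score(Xs, y))
```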
Abstract:
A four-parameter extension of the generalized gamma distribution capable of modelling a bathtub-shaped hazard rate function is defined and studied. The beauty and importance of this distribution lie in its ability to model monotone and non-monotone failure rate functions, which are quite common in lifetime data analysis and reliability. The new distribution has a number of well-known lifetime distributions as special sub-models, such as the exponentiated Weibull, exponentiated generalized half-normal, exponentiated gamma and generalized Rayleigh, among others. We derive two infinite sum representations for its moments. We calculate the density of the order statistics and two expansions for their moments. The method of maximum likelihood is used for estimating the model parameters, and the observed information matrix is obtained. Finally, a real data set from the medical area is analysed.
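The abstract does not give the distribution explicitly; one standard way to obtain a four-parameter family with the listed "exponentiated" sub-models is to raise the cdf of the three-parameter generalized gamma to a positive power, a construction sketched below (the authors' exact parameterization may differ):

```latex
% Generalized gamma cdf via the incomplete gamma ratio
G(x;\alpha,\tau,k) = \frac{\gamma\!\left(k,\,(x/\alpha)^{\tau}\right)}{\Gamma(k)}, \qquad x>0,
% Exponentiated (four-parameter) extension
F(x;\alpha,\tau,k,\beta) = \left[G(x;\alpha,\tau,k)\right]^{\beta}, \qquad \beta>0 .
```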
Abstract:
Two major factors are likely to impact the utilisation of remotely sensed data in the near future: (1) an increase in the number and availability of commercial and non-commercial image data sets with a range of spatial, spectral and temporal dimensions, and (2) increased access to image display and analysis software through GIS. A framework was developed to provide an objective approach to selecting remotely sensed data sets for specific environmental monitoring problems. Preliminary applications of the framework have provided successful approaches for monitoring disturbed and restored wetlands in southern California.
Abstract:
This paper develops an interactive approach for exploratory spatial data analysis. Measures of attribute similarity and spatial proximity are combined in a clustering model to support the identification of patterns in spatial information. Relationships between the developed clustering approach, spatial data mining and choropleth display are discussed. An analysis of property crime rates in Brisbane, Australia, is presented. A surprising finding of this research is that there are substantial inconsistencies in the standard choropleth display options found in two widely used commercial geographical information systems, both in terms of definition and performance. The comparative results demonstrate the usefulness and appeal of the developed approach in a geographical information system environment for exploratory spatial data analysis.
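The abstract does not state how attribute similarity and spatial proximity are weighted; one simple illustrative combination is a convex mixture of the two distance matrices with an analyst-chosen weight w, fed into standard hierarchical clustering (all names and values below are placeholders, not the paper's model):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster

# Placeholder data: one attribute (e.g. crime rate) and centroid coordinates per areal unit
rng = np.random.default_rng(2)
attrs = rng.normal(size=(50, 1))
coords = rng.uniform(size=(50, 2))

d_attr = squareform(pdist(attrs))       # attribute dissimilarity
d_space = squareform(pdist(coords))     # spatial proximity expressed as distance

w = 0.5                                 # trade-off between attribute and space
d_combined = w * d_attr / d_attr.max() + (1 - w) * d_space / d_space.max()

# Hierarchical clustering on the combined dissimilarity matrix
Z = linkage(squareform(d_combined, checks=False), method="average")
labels = fcluster(Z, t=5, criterion="maxclust")
print(labels)
```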
Abstract:
Objective: To compare rates of self-reported use of health services between rural, remote and urban South Australians. Methods: Secondary data analysis of a population-based survey to assess health and well-being, conducted in South Australia in 2000. In all, 2,454 adults were randomly selected and interviewed using the computer-assisted telephone interview (CATI) system. We analysed health service use by Accessibility and Remoteness Index of Australia (ARIA) category. Results: There was no statistically significant difference in the median number of uses of the four types of health services studied across ARIA categories. Significantly fewer residents of highly accessible areas reported never using primary care services (14.4% vs. 22.2% in very remote areas), and significantly more reported high use (≥6 visits, 29.3% vs. 21.5%). Fewer residents of remote areas reported never attending hospital (65.6% vs. 73.8% in highly accessible areas). Frequency of use of mental health services was not statistically significantly different across ARIA categories. Very remote residents were more likely to spend at least one night in a public hospital (15.8%) than were residents of other areas (e.g. 5.9% for highly accessible areas). Conclusion: The self-reported frequency of use of a range of health services in South Australia was broadly similar across ARIA categories. However, use of primary care services was higher among residents of highly accessible areas, and public hospital use increased with increasing remoteness. There is no evidence of systematic rural disadvantage in terms of self-reported health service utilisation in this State.
Wavelet correlation between subjects: A time-scale data driven analysis for brain mapping using fMRI
Abstract:
Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach which predicts the changes in the observed BOLD response based on an expected hemodynamic response function (HRF). When the task is cognitively complex, or in cases of disease, variations in the shape and/or delay of the response may reduce the reliability of the results. This paper introduces a novel exploratory method for fMRI data which attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts or other confounding factors. The new method is based on the fusion of correlation analysis and the discrete wavelet transform to identify similarities in the time course of the BOLD signal across a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized pictures of human faces expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time-domain intersubject correlation analysis.
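A simplified sketch of the core idea, decomposing each subject's BOLD time course with a discrete wavelet transform and correlating the coefficients between subjects scale by scale; the wavelet family, decomposition level and synthetic time courses below are illustrative choices, not necessarily those of the paper:

```python
import numpy as np
import pywt

def wavelet_intersubject_correlation(ts_a, ts_b, wavelet="db4", level=4):
    """Correlate two subjects' BOLD time courses in the wavelet domain,
    returning one correlation coefficient per decomposition level."""
    coeffs_a = pywt.wavedec(ts_a, wavelet, level=level)
    coeffs_b = pywt.wavedec(ts_b, wavelet, level=level)
    return [np.corrcoef(ca, cb)[0, 1] for ca, cb in zip(coeffs_a, coeffs_b)]

# Placeholder time courses for two subjects under the same stimulation protocol
rng = np.random.default_rng(3)
stimulus = np.sin(np.linspace(0, 8 * np.pi, 256))
ts1 = stimulus + 0.5 * rng.normal(size=256)
ts2 = stimulus + 0.5 * rng.normal(size=256)

print(wavelet_intersubject_correlation(ts1, ts2))
```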
Abstract:
Aim: To compare morphometric data of the eyelid fissure and the levator muscle function (LF) before and up to 6 months after transcutaneous injection of five units of Botox® in patients with upper lid retraction (ULR) from congestive or fibrotic thyroid eye disease (TED). Methods: Twenty-four patients with ULR from TED underwent transcutaneous injection of 5 units (0.1 ml) of Botox in one eye only. Patients were divided into two groups: 12 with congestive-stage TED (CG) and 12 with fibrotic-stage TED (FG). Bilateral lid fissure measurements using digital imaging and computer-aided analysis were taken at baseline and at regular intervals (2 weeks, 1 month, 3 months and 6 months) after unilateral Botox injection. Mean values taken at the different follow-up points were compared between the two groups. Results: Most patients experienced marked improvement in ULR, with a mean reduction of 3.81 mm in FG and 3.05 mm in CG. The upper eyelid margin reflex distance, fissure height and total area of exposed interpalpebral fissure were significantly smaller for 1 month in CG and for 3 months in FG. A reduction in LF and in the difference between lateral and medial lid fissure measurements was observed in both groups. The treatment effect lasted significantly longer in FG than in CG. Conclusions: A single 5-unit Botox injection improved ULR, reduced LF and produced an adequate lid contour in patients with congestive or fibrotic TED. The effect lasts longer in patients with fibrotic orbitopathy than in patients with congestive orbitopathy.
Abstract:
The Sciatic Functional Index (SFI) is a useful tool for the evaluation of functional recovery of the sciatic nerve of rats in a number of experimental injuries and treatments. Although it is an objective method, it depends on the examiner's ability to adequately recognize and mark the previously established footprint key points, which is an entirely subjective step and thus potentially interferes with the calculations according to the mathematical formulae proposed by different authors. Therefore, an interpersonal evaluation of the reproducibility of a computer-aided SFI method was carried out here to study data variability. A severe crush injury was produced on a 5 mm-long segment of the right sciatic nerve of 20 Wistar rats (a 5000 g load directly applied for 10 min), and the SFI was measured by four different examiners (one experienced and three newcomers) preoperatively and at weekly intervals from the 1st to the 8th postoperative week. Three measurements were made for each print, and the average was calculated and used for statistical analysis. The results showed that interpersonal correlation was high (0.82) in the 3rd, 4th, 5th, 7th and 8th weeks, with an unexpected but significant (p < 0.01) drop in the 6th week. There was virtually no interpersonal correlation (correlation index close to 0) in the 1st and 2nd weeks, a period during which the variability between animals and between examiners (p = 0.24 and 0.32, respectively) was similar, certainly due to a poor definition of the footprints. The authors conclude that the SFI method studied here is only reliable from the 3rd week onward after a severe lesion of the sciatic nerve of rats.
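For context, a widely used formulation of the SFI (commonly attributed to Bain, Mackinnon and Hunter) combines three footprint measurements from the experimental (E) and normal (N) sides; the version below is that common formula and may differ from the specific formulae compared by the authors:

```latex
% PL = print length, TS = toe spread (1st-5th toes), IT = intermediary toe spread (2nd-4th toes)
\mathrm{SFI} = -38.3\,\frac{EPL - NPL}{NPL} + 109.5\,\frac{ETS - NTS}{NTS} + 13.3\,\frac{EIT - NIT}{NIT} - 8.8
```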
Abstract:
Purpose: The purpose of our study was to compare signal characteristics and image quality of MR imaging at 3.0 T and 1.5 T in patients with diffuse parenchymal liver disease. Materials and methods: 25 consecutive patients with diffuse parenchymal liver disease underwent abdominal MR imaging at both 3.0 T and 1.5 T within a 6-month interval. A retrospective study was conducted to obtain quantitative and qualitative data from both the 3.0 T and the 1.5 T MRI. Quantitative image analysis was performed by measuring the signal-to-noise ratios (SNRs) and the contrast-to-noise ratios (CNRs) and comparing them with the Student's t-test. Qualitative image analysis was assessed by grading each sequence on a 3-point and a 4-point scale for the presence of artifacts and for image quality, respectively; statistical analysis consisted of the Wilcoxon signed-rank test. Results: The mean SNRs and CNRs of the liver parenchyma and the portal vein were significantly higher at 3.0 T than at 1.5 T on the portal and equilibrium phases of volumetric interpolated breath-hold examination (VIBE) images (P < 0.05). The mean SNRs were significantly higher at 3.0 T than at 1.5 T on T1-weighted spoiled gradient echo (SGE) images (P < 0.05). However, there were no significant differences on T2-weighted short-inversion-time inversion recovery (STIR) images. The overall image quality of the noncontrast T1- and T2-weighted sequences was significantly better at 1.5 T than at 3.0 T (P < 0.01). In contrast, the overall image quality of the post-gadolinium VIBE sequence was significantly better at 3.0 T than at 1.5 T (P < 0.01). Conclusions: MR imaging with the post-gadolinium VIBE sequence at 3.0 T has quantitative and qualitative advantages for evaluating diffuse parenchymal liver disease.
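The quantitative comparison rests on the standard definitions of SNR and CNR; the usual formulas are sketched below (the study's exact region-of-interest and noise-measurement conventions are not reproduced here):

```latex
% S = mean signal intensity in a region of interest, \sigma_{noise} = SD of background noise
\mathrm{SNR}_{\text{tissue}} = \frac{S_{\text{tissue}}}{\sigma_{\text{noise}}}, \qquad
\mathrm{CNR}_{A,B} = \frac{S_{A} - S_{B}}{\sigma_{\text{noise}}}
```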
Abstract:
Background: A frequent question in obstetric practice is whether placental vascular changes can account for abnormal Doppler flow velocity findings in maternal uterine vessels in pregnancies without maternal or fetal pathology. Methods: A retrospective morphometric study was performed using 27 placentas from patients who underwent Doppler flow velocity examination during pregnancy. The placentas were morphologically examined using hematoxylin-eosin staining. Measurements of the villi were made with a video camera coupled to a common light microscope and a computer running automatic image-analysis software. Results: Of the 27 placentas, 13 (48%) were from patients with unaltered Doppler results and 14 (52%) from patients with altered Doppler results. The number of stem villi vessels was significantly larger in the placentas of patients with Doppler exam alterations (P = 0.003). This group also presented greater stem villi vessel thickness, although the difference was not significant. The number of intermediary and terminal villi vessels was greater in the placentas of patients with altered Doppler exams (P < 0.001), and a greater terminal villi area was observed in these cases (P < 0.001). Conclusion: The morphological evidence that uterine artery Doppler flow velocity alterations are associated with placental vascular alterations demonstrates the importance of this exam during prenatal care, even in the absence of maternal-fetal alterations.
Abstract:
OBJECTIVES: 1. To critically evaluate a variety of mathematical methods of calculating effective population size (Ne) by conducting comprehensive computer simulations and by analysis of empirical data collected from the Moreton Bay population of tiger prawns. 2. To lay the groundwork for the application of the technology in the NPF. 3. To produce software for the calculation of Ne, and to make it widely available.
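As one example of the class of mathematical methods referred to, the classical temporal method (Nei-Tajima/Waples) estimates Ne from the standardized variance F of allele-frequency change over t generations, with a correction for the sample sizes S0 and St taken at the two time points; the form below is the commonly cited version and may not match every estimator evaluated in the project:

```latex
\widehat{N}_e = \frac{t}{2\left[\hat{F} - \dfrac{1}{2S_0} - \dfrac{1}{2S_t}\right]}
```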
Abstract:
Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics, 44(2), 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure, and a naive implementation can be computationally inefficient. To reduce the computational cost, a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to the diagnosis of iron deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
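The computational bottleneck mentioned, evaluating the mixture probability of each multivariate bin at every EM iteration, can be illustrated for the bivariate Gaussian case, where the probability of a rectangular bin follows from the joint cdf by inclusion-exclusion over the bin corners; a sketch under that assumption (not the authors' implementation, and the mixture parameters below are placeholders):

```python
import numpy as np
from scipy.stats import multivariate_normal

def rect_prob(mean, cov, lower, upper):
    """P(lower1 < X1 <= upper1, lower2 < X2 <= upper2) for a bivariate normal,
    computed from the joint cdf by inclusion-exclusion over the bin corners."""
    mvn = multivariate_normal(mean=mean, cov=cov)
    (a1, a2), (b1, b2) = lower, upper
    return (mvn.cdf([b1, b2]) - mvn.cdf([a1, b2])
            - mvn.cdf([b1, a2]) + mvn.cdf([a1, a2]))

# Mixture probability of one bin: weighted sum of per-component bin probabilities
weights = [0.6, 0.4]
means = [np.array([0.0, 0.0]), np.array([3.0, 2.0])]
covs = [np.eye(2), np.eye(2)]
bin_lower, bin_upper = (0.5, -0.5), (1.5, 0.5)
p_bin = sum(w * rect_prob(m, c, bin_lower, bin_upper)
            for w, m, c in zip(weights, means, covs))
print(p_bin)
```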