38 results for Sectional Twin Data
Abstract:
PURPOSE: Fatty liver disease (FLD) is an increasingly prevalent disease that can be reversed if detected early. Ultrasound is the safest and most ubiquitous method for identifying FLD. Because expert sonographers are required to interpret liver ultrasound images accurately, their absence results in interobserver variability. For more objective interpretation, high accuracy, and quick second opinions, computer-aided diagnostic (CAD) techniques may be exploited. The purpose of this work is to develop one such CAD technique for the accurate classification of normal livers and abnormal livers affected by FLD. METHODS: In this paper, the authors present a CAD technique (called Symtosis) that uses a novel combination of significant features based on the texture, wavelet transform, and higher order spectra of the liver ultrasound images in various supervised learning-based classifiers in order to determine parameters that classify normal and FLD-affected abnormal livers. RESULTS: On evaluating the proposed technique on a database of 58 abnormal and 42 normal liver ultrasound images, the authors were able to achieve a high classification accuracy of 93.3% using the decision tree classifier. CONCLUSIONS: This high accuracy, combined with the completely automated classification procedure, makes the authors' proposed technique highly suitable for clinical deployment and usage.
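The pipeline summarized above (pre-extracted image features fed to a supervised classifier) can be sketched very compactly. The following is a minimal, illustrative sketch using scikit-learn's DecisionTreeClassifier on synthetic feature vectors standing in for the texture/wavelet/higher-order-spectra features; it is not the authors' Symtosis implementation, and all names and values here are assumptions.

```python
# Illustrative only: a feature-vector + decision-tree pipeline of the kind the
# abstract describes (synthetic data, not the authors' Symtosis implementation).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical pre-extracted features: one row per ultrasound image, columns
# standing in for texture, wavelet, and higher-order-spectra measures.
n_abnormal, n_normal, n_features = 58, 42, 12
X = np.vstack([
    rng.normal(loc=1.0, scale=0.5, size=(n_abnormal, n_features)),  # FLD-affected
    rng.normal(loc=0.0, scale=0.5, size=(n_normal, n_features)),    # normal livers
])
y = np.array([1] * n_abnormal + [0] * n_normal)  # 1 = abnormal, 0 = normal

clf = DecisionTreeClassifier(max_depth=4, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(f"mean CV accuracy: {scores.mean():.3f}")
```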
Abstract:
In this work, the identification and diagnosis of the various stages of chronic liver disease are addressed. The classification results of a support vector machine, a decision tree, and a k-nearest neighbor classifier are compared. Ultrasound image intensity and textural features are used jointly with clinical and laboratory data in the staging process. The classifiers are trained using a population of 97 patients at six different stages of chronic liver disease and a leave-one-out cross-validation strategy. The best results are obtained using the support vector machine with a radial-basis kernel, with an overall accuracy of 73.20%. The good performance of the method is a promising indicator that it can be used, in a non-invasive way, to provide reliable information about chronic liver disease staging.
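The evaluation protocol named here (an RBF-kernel SVM with leave-one-out cross-validation over 97 patients) can be reproduced in outline as follows; this is a minimal sketch with synthetic data and assumed feature dimensions, not the study's code or features.

```python
# Illustrative only: SVM with an RBF kernel evaluated by leave-one-out
# cross-validation, as described in the abstract (synthetic data).
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Hypothetical feature matrix: 97 patients, mixed image + clinical/laboratory
# features, labelled with one of six disease stages (0..5).
X = rng.normal(size=(97, 10))
y = rng.integers(0, 6, size=97)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(model, X, y, cv=LeaveOneOut())  # one held-out patient per fold
print(f"leave-one-out accuracy: {scores.mean():.4f}")
```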
Abstract:
In this work, the liver contour is semi-automatically segmented and quantified in order to help identify and diagnose diffuse liver disease. The features extracted from the liver contour are used jointly with clinical and laboratory data in the staging process. The classification results of a support vector machine, a Bayesian classifier, and a k-nearest neighbor classifier are compared. A population of 88 patients at five different stages of diffuse liver disease and a leave-one-out cross-validation strategy are used in the classification process. The best results are obtained using the k-nearest neighbor classifier, with an overall accuracy of 80.68%. The good performance of the proposed method is a reliable indicator that it can improve the information available for the staging of diffuse liver disease.
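For comparison, a k-nearest neighbor variant of the same leave-one-out protocol, with contour-derived features concatenated to clinical and laboratory data, might look like the sketch below (again synthetic and purely illustrative, not the study's features or code).

```python
# Illustrative only: contour-derived features concatenated with clinical/laboratory
# data and classified with k-nearest neighbors under leave-one-out cross-validation.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

contour_features = rng.normal(size=(88, 6))    # e.g. curvature/roughness descriptors
clinical_features = rng.normal(size=(88, 5))   # e.g. laboratory measurements
X = np.hstack([contour_features, clinical_features])
y = rng.integers(0, 5, size=88)                # five hypothetical disease stages

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
scores = cross_val_score(knn, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.4f}")
```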
Abstract:
Dissertation submitted to obtain the Master's degree in Informatics Engineering
Abstract:
Project work submitted to obtain the Master's degree in Informatics and Computer Engineering
Abstract:
A heteropaternal male twin case, in which two men were alleged fathers, was investigated at the request of the Court. Up to 37 PCR-based polymorphic DNA systems were studied in this case, which was complicated by a paternal ACTBP2 mutation detected in one twin. This is the first report of an STR mutation in a double paternity case in which both biological fathers were indisputably identified. The STR systems enable the resolution of these complex genetic relationships even in a case where a mutation in one STR locus was encountered.
Abstract:
Final Master's project submitted to obtain the Master's degree in Mechanical Engineering - Maintenance and Production branch
Abstract:
A detailed analytic and numerical study of baryogenesis through leptogenesis is performed in the framework of the standard model of electroweak interactions extended by the addition of three right-handed neutrinos, leading to the seesaw mechanism. We analyze the connection between GUT-motivated relations for the quark and lepton mass matrices and the possibility of obtaining a viable leptogenesis scenario. In particular, we analyze whether the constraints imposed by SO(10) GUTs can be compatible with all the available solar, atmospheric and reactor neutrino data and, simultaneously, be capable of producing the required baryon asymmetry via the leptogenesis mechanism. It is found that the Just-So(2) and SMA solar solutions lead to viable leptogenesis even for the simplest SO(10) GUT, while the LMA, LOW and VO solar solutions would require a different hierarchy for the Dirac neutrino masses in order to generate the observed baryon asymmetry. Some implications on CP violation at low energies and on neutrinoless double beta decay are also considered. (C) 2002 Elsevier Science B.V. All rights reserved.
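For orientation only, the (type I) seesaw relation invoked above takes the standard form below; the notation is generic and not taken from the paper itself.

```latex
% Standard type-I seesaw relation (generic notation, quoted for orientation):
% m_D is the Dirac neutrino mass matrix, M_R the heavy right-handed Majorana mass matrix.
\begin{equation}
  m_\nu \simeq - m_D \, M_R^{-1} \, m_D^{T},
  \qquad \text{valid for } M_R \gg m_D .
\end{equation}
```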
Abstract:
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve as the signature variability, the number of endmembers, and the signal-to-noise ratio increase. In any case, there are always endmembers incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
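For reference, the linear mixing model behind assumption 1) and the sum-to-one constraint that breaks assumption 2) are usually written as follows (standard notation, not quoted from the paper):

```latex
% Linear mixing model with the sum-to-one abundance constraint that makes the
% sources statistically dependent (standard notation):
\begin{equation}
  \mathbf{x} = \mathbf{M}\,\mathbf{s} + \mathbf{n},
  \qquad
  s_i \ge 0, \quad \sum_{i=1}^{p} s_i = 1 ,
\end{equation}
% where x is the observed spectrum vector, the columns of M are the p endmember
% signatures, s holds the abundance fractions (sources), and n is additive noise.
```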
Abstract:
Chapter in Book Proceedings with Peer Review. First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Chapter in Book Proceedings with Peer Review. First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
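The two geometric facts exploited by VCA lend themselves to a compact illustration: repeatedly project the data onto a direction orthogonal to the subspace spanned by the endmembers found so far and keep the pixel with the largest projection. The sketch below is a simplified, illustrative variant of that idea in Python/NumPy, assuming pure pixels exist in the data; it is not the published VCA algorithm.

```python
# Illustrative only: a simplified VCA-style vertex extraction (not the published VCA).
import numpy as np

def vca_like_endmembers(X, p, seed=0):
    """Pick p candidate endmembers from X (bands x pixels) by projecting the data
    onto directions orthogonal to the span of the endmembers found so far and
    keeping the pixel with the largest absolute projection (simplex vertices)."""
    rng = np.random.default_rng(seed)
    bands, _ = X.shape
    E = np.zeros((bands, p))          # estimated endmember signatures
    idx = []
    for k in range(p):
        w = rng.normal(size=bands)
        if k > 0:
            # Remove the component of w lying in the span of the found endmembers.
            Q, _ = np.linalg.qr(E[:, :k])
            w = w - Q @ (Q.T @ w)
        proj = w @ X                  # scalar projection of every pixel onto w
        j = int(np.argmax(np.abs(proj)))
        E[:, k] = X[:, j]
        idx.append(j)
    return E, idx

# Tiny synthetic example: 3 endmembers mixed with sum-to-one abundances.
rng = np.random.default_rng(42)
M = rng.uniform(0.1, 1.0, size=(50, 3))      # 50 bands, 3 true endmembers
A = rng.dirichlet(np.ones(3), size=1000).T   # abundances, columns sum to 1
X = M @ A
X[:, :3] = M                                 # ensure pure pixels exist
E, idx = vca_like_endmembers(X, p=3)
print("selected pixel indices:", idx)
```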
Abstract:
The hand is one of the most important instruments of the human body, mainly due to the possibility of grip movements. Grip strength has been described as an important predictor of functional capacity. There are several factors that may influence it, such as gender, age, and anthropometric characteristics. Functional capacity refers to the ability to perform the daily activities that allow the individual to self-care and live with autonomy. The Composite Physical Function (CPF) scale is an evaluation tool for functional capacity that includes daily activities, self-care, sports activities, upper limb function, and gait capacity. In 2011, Portugal had 15% of its population in the young age group (0-14 years) and 19% in the elderly group (over 65 years). Considering this double-ageing phenomenon, it is important to understand the effect of grip strength in elderly individuals, taking into account their characteristics and the need to maintain independence for as long as possible.
Abstract:
For a long time the allegorical activity was considered dogmatic and equated with artistic fossilization, archaic religious propensity and lack of creativity. However, Walter Benjamin (1928) and Paul De Man (1969), among other illustrious thinkers, came to its defense, exalting, instead, its cryptic, hybrid and abstract nature, which, incidentally, are the main characteristics of modern art. “Twin Peaks – Fire Walk with Me” (David Lynch, 1992) is a wonderful object of analysis, despite being one of the most misunderstood films in the history of cinema. The fact that its narrative is a prequel to the cult television series “Twin Peaks” and incorporates many of the characters of that show, explicitly denigrating the moral image of the protagonist, Laura Palmer, brought about an intense rejection by the fans of the series, as well as the indifference of the cinephilic community in general. However, one must go deeper, in order to understand Lynch’s brave accomplishment and its artfulness. Indeed, the opus is a powerful cinematic allegory because it contains a double layer of metaphorical meaning, one of them being explicitly metacinematic. Thus, besides assuming itself as a filmic daimonic allegory, occurring in a spiritual universe of Good versus Evil, the film is also an authorial discourse on cinema itself. More specifically, it is an allegory of spectatorship, according to Robert Stam’s definition, where the existence and crossing over to “another side” duplicates the architecture of movie theatres and the psychic processes involved in film viewing.
Abstract:
Introduction: Visual anomalies that affect school-age children represent an important public health problem. Data on their prevalence are lacking in Portugal but are needed for planning vision services. This study was conducted to determine the prevalence of strabismus, decreased visual acuity, and uncorrected refractive error in Portuguese children aged 6 to 11 years. Methods and materials: A cross-sectional study was carried out on a sample of 672 school-age children (7.69 ± 1.19 years). Children received an orthoptic assessment (visual acuity, ocular alignment, and ocular movements) and non-cycloplegic autorefraction. Results: After the orthoptic assessment, 13.8% of children were considered abnormal (n = 93). Manifest strabismus was found in 4% of the children. The rate of esotropia (2.1%) was slightly higher than that of exotropia (1.8%). Strabismus rates did not differ significantly by sex (p = 0.681) or grade (p = 0.228). Decreased visual acuity at distance was present in 11.3% of children. Visual acuity ≤ 20/66 (0.5 logMAR) was found in 1.3% of the children. We also found that 10.3% of children had an uncorrected refractive error. Conclusions: Strabismus affects a small proportion of Portuguese school-age children. Decreased visual acuity and uncorrected refractive error affected a significant proportion of school-age children. New policies need to be developed to address this public health problem.