87 results for data complexity
Abstract:
PURPOSE: Fatty liver disease (FLD) is an increasingly prevalent disease that can be reversed if detected early. Ultrasound is the safest and most ubiquitous method for identifying FLD. Since expert sonographers are required to accurately interpret liver ultrasound images, their scarcity results in interobserver variability. For more objective interpretation, high accuracy, and quick second opinions, computer-aided diagnostic (CAD) techniques may be exploited. The purpose of this work is to develop one such CAD technique for accurate classification of normal livers and abnormal livers affected by FLD. METHODS: In this paper, the authors present a CAD technique (called Symtosis) that uses a novel combination of significant features based on the texture, wavelet transform, and higher-order spectra of the liver ultrasound images in various supervised learning-based classifiers in order to determine parameters that classify normal and FLD-affected abnormal livers. RESULTS: On evaluating the proposed technique on a database of 58 abnormal and 42 normal liver ultrasound images, the authors achieved a high classification accuracy of 93.3% using the decision tree classifier. CONCLUSIONS: This high accuracy, added to the completely automated classification procedure, makes the authors' proposed technique highly suitable for clinical deployment and usage.
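As a hedged illustration of the wavelet-transform texture features mentioned above (not the authors' actual Symtosis pipeline), a single-level 2-D Haar decomposition and its subband energies can be sketched as follows; the choice of subband energy as the feature is an assumption for demonstration:

```python
import numpy as np

def haar_subband_energies(img):
    """Single-level 2-D Haar decomposition of an image with even
    dimensions; returns the mean energy of the four subbands
    (LL, LH, HL, HH) as simple texture descriptors."""
    img = np.asarray(img, dtype=float)
    # Pairwise averages (a) and differences (d) along rows.
    a = (img[:, ::2] + img[:, 1::2]) / 2.0
    d = (img[:, ::2] - img[:, 1::2]) / 2.0
    # Repeat along columns to obtain the four subbands.
    ll = (a[::2] + a[1::2]) / 2.0
    lh = (a[::2] - a[1::2]) / 2.0
    hl = (d[::2] + d[1::2]) / 2.0
    hh = (d[::2] - d[1::2]) / 2.0
    return [float(np.mean(s ** 2)) for s in (ll, lh, hl, hh)]
```

A constant patch puts all its energy in the LL subband, while texture shows up in LH/HL/HH; the paper combines such features with texture and higher-order spectra before classification.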
Abstract:
In this work, the identification and diagnosis of various stages of chronic liver disease is addressed. The classification results of a support vector machine, a decision tree, and a k-nearest neighbor classifier are compared. Ultrasound image intensity and textural features are used jointly with clinical and laboratory data in the staging process. The classifiers are trained using a population of 97 patients at six different stages of chronic liver disease and a leave-one-out cross-validation strategy. The best results are obtained using the support vector machine with a radial-basis kernel, with 73.20% overall accuracy. The good performance of the method is a promising indicator that it can be used, in a non-invasive way, to provide reliable information about chronic liver disease staging.
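The leave-one-out strategy described above can be sketched as follows; a simple 1-nearest-neighbour rule stands in for the paper's SVM, and all names here are illustrative rather than the authors' code:

```python
import numpy as np

def loocv_accuracy(X, y, classify):
    """Leave-one-out cross-validation: each sample is held out once,
    the classifier is fit on the remaining samples, and the held-out
    label is predicted."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        hits += classify(X[mask], y[mask], X[i]) == y[i]
    return hits / len(y)

def nearest_neighbour(X_train, y_train, x):
    """1-NN stand-in classifier: predict the label of the closest
    training sample in Euclidean distance."""
    return y_train[np.argmin(np.linalg.norm(X_train - x, axis=1))]
```

With only 97 patients, leave-one-out makes the most of the data at the cost of n training runs, which is why it is a common choice in small clinical cohorts.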
Abstract:
In this work, the liver contour is semi-automatically segmented and quantified in order to help the identification and diagnosis of diffuse liver disease. The features extracted from the liver contour are used jointly with clinical and laboratory data in the staging process. The classification results of a support vector machine, a Bayesian classifier, and a k-nearest neighbor classifier are compared. A population of 88 patients at five different stages of diffuse liver disease and a leave-one-out cross-validation strategy are used in the classification process. The best results are obtained using the k-nearest neighbor classifier, with an overall accuracy of 80.68%. The good performance of the proposed method is a promising indication that it can improve the information available for the staging of diffuse liver disease.
Abstract:
This study sought to describe, exhaustively, the process of planning, negotiation, implementation and evaluation of the Execution Contract (Contrato de Execução) signed between the Câmara Municipal de Sintra and the Ministry of Education in 2009. This contract is an instrument provided for in the regulations governing the transfer of competences to municipalities in matters of education, under the regime established by Decree-Law no. 144/2008 of 28 July. Once the research problem and objectives were defined, the investigation centred on a case study describing and interpreting the process and the actions taken by the stakeholders between 2008 and 2011. Data obtained from documentary sources were cross-checked against interviews with the officials responsible for the municipal education portfolio (Pelouro da Educação) and the directors of the School Clusters (Agrupamentos de Escolas), in the light of the literature review and of the contributions of other researchers in this field. The investigation concluded that the contractualisation process was rather complex given the reality of this municipality, and that the legislation has several gaps regarding the contractualisation of the aforementioned transfer of competences, notably because it attempts to generalise something that is not at all generalisable: the field of education, given the complexity of the educational territories in question and of the stakeholders involved.
Abstract:
Project submitted for the degree of Master in Informatics and Computer Engineering
Abstract:
Dissertation submitted for the degree of Master in Informatics Engineering
Abstract:
The regulatory mechanisms by which hydrogen peroxide (H2O2) modulates the activity of transcription factors in bacteria (OxyR and PerR), lower eukaryotes (Yap1, Maf1, Hsf1 and Msn2/4) and mammalian cells (AP-1, NRF2, CREB, HSF1, HIF-1, TP53, NF-κB, NOTCH, SP1 and SCREB-1) are reviewed. The complexity of regulatory networks increases throughout the phylogenetic tree, reaching a high level of complexity in mammals. Multiple H2O2 sensors and pathways are triggered, converging on the regulation of transcription factors at several levels: (i) synthesis of the transcription factor by upregulating transcription or increasing both mRNA stability and translation; (ii) stability of the transcription factor by decreasing its association with the ubiquitin E3 ligase complex or by inhibiting this complex; (iii) cytoplasm-nuclear traffic by exposing/masking nuclear localization signals, or by releasing the transcription factor from partners or from membrane anchors; and (iv) DNA binding and nuclear transactivation by modulating transcription factor affinity towards DNA, co-activators or repressors, and by targeting specific regions of chromatin to activate individual genes. We also discuss how H2O2 biological specificity results from diverse thiol protein sensors, with different reactivity of their sulfhydryl groups towards H2O2, being activated by different concentrations and times of exposure to H2O2. The specific regulation of local H2O2 concentrations is also crucial and results from H2O2 localized production and removal controlled by signals. Finally, we formulate equations to extract from typical experiments quantitative data concerning H2O2 reactivity with sensor molecules. Rate constants of 140 M⁻¹ s⁻¹ and ≥1.3 × 10³ M⁻¹ s⁻¹ were estimated, respectively, for the reaction of H2O2 with KEAP1 and with an unknown target that mediates NRF2 protein synthesis.
In conclusion, the multitude of H2O2 targets and mechanisms provides an opportunity for highly specific effects on gene regulation that depend on the cell type and on signals received from the cellular microenvironment.
Abstract:
Project work submitted for the degree of Master in Informatics and Computer Engineering
Abstract:
Master's degree in Auditing
Abstract:
A detailed analytic and numerical study of baryogenesis through leptogenesis is performed in the framework of the standard model of electroweak interactions extended by the addition of three right-handed neutrinos, leading to the seesaw mechanism. We analyze the connection between GUT-motivated relations for the quark and lepton mass matrices and the possibility of obtaining a viable leptogenesis scenario. In particular, we analyze whether the constraints imposed by SO(10) GUTs can be compatible with all the available solar, atmospheric and reactor neutrino data and, simultaneously, be capable of producing the required baryon asymmetry via the leptogenesis mechanism. It is found that the Just-So(2) and SMA solar solutions lead to viable leptogenesis even for the simplest SO(10) GUT, while the LMA, LOW and VO solar solutions would require a different hierarchy for the Dirac neutrino masses in order to generate the observed baryon asymmetry. Some implications for CP violation at low energies and for neutrinoless double beta decay are also considered. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
Signal subspace identification is a crucial first step in many hyperspectral processing algorithms such as target detection, change detection, classification, and unmixing. The identification of this subspace enables a correct dimensionality reduction, yielding gains in algorithm performance and complexity and in data storage. This paper introduces a new minimum mean squared error-based approach to infer the signal subspace in hyperspectral imagery. The method, termed hyperspectral signal identification by minimum error, is eigendecomposition-based, unsupervised, and fully automatic (i.e., it does not depend on any tuning parameters). It first estimates the signal and noise correlation matrices and then selects the subset of eigenvalues that best represents the signal subspace in the least-squares sense. State-of-the-art performance of the proposed method is illustrated using simulated and real hyperspectral images.
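A loose sketch of the eigenvalue-selection idea follows; the keep-rule used here (retain an eigen-direction when its observed power exceeds twice its projected noise power) is a simplifying assumption for illustration, not the exact criterion derived in the paper:

```python
import numpy as np

def select_subspace(R_y, R_n):
    """Simplified HySime-style selection: eigendecompose the estimated
    signal correlation R_s = R_y - R_n and keep each eigen-direction
    whose observed power exceeds twice its projected noise power
    (an assumed, simplified rule)."""
    R_s = R_y - R_n
    _, E = np.linalg.eigh(R_s)          # eigenvectors as columns
    keep = [E[:, i] for i in range(E.shape[1])
            if E[:, i] @ R_y @ E[:, i] > 2 * (E[:, i] @ R_n @ E[:, i])]
    return np.array(keep).T             # basis of the inferred signal subspace
```

Directions dominated by noise contribute more error when kept than when dropped, which is the trade-off the minimum-error criterion formalizes.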
Abstract:
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, compromising the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, there are always endmembers incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
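The statistical dependence induced by the sum-to-one constraint can be checked numerically; modelling the abundance fractions as Dirichlet-distributed is an assumption made here purely for illustration:

```python
import numpy as np

def abundance_covariance(n_pixels=10000, n_endmembers=3, seed=0):
    """Sample abundance fractions under the physical sum-to-one
    constraint (Dirichlet draws, each row summing to 1) and return
    their empirical covariance matrix."""
    rng = np.random.default_rng(seed)
    A = rng.dirichlet(np.ones(n_endmembers), size=n_pixels)
    return np.cov(A, rowvar=False)
```

The off-diagonal entries come out negative: knowing one fraction constrains the others, which is exactly the dependence that violates ICA's independence assumption.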
Abstract:
Chapter in Book Proceedings with Peer Review. First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
The hand is one of the most important instruments of the human body, mainly due to the possibility of grip movements. Grip strength has been described as an important predictor of functional capacity. Several factors may influence it, such as gender, age, and anthropometric characteristics. Functional capacity refers to the ability to perform daily activities which allow the individual to self-care and to live with autonomy. The Composite Physical Function (CPF) scale is an evaluation tool for functional capacity that includes daily activities, self-care, sports activities, upper limb function, and gait capacity. In 2011, Portugal had 15% young population (0-14 years) and 19% elderly population (over 65 years). Considering this double-ageing phenomenon, it is important to understand the effect of grip strength in elderly individuals, considering their characteristics, given the need to maintain independence for as long as possible.
Abstract:
Estimating gestational age (GA) in fetal skeletal remains is important in forensic contexts. For this purpose, forensic experts rely on the assessment of the dental calcification pattern and/or the study of the skeleton. In the latter, the diaphyseal length of long bones is one of the most widely used methods, yet the regression equations employed come from dated works or are based on ultrasound data, whose measurements differ from those taken directly on the bone. The main goal of this work is to derive regression equations for the Portuguese population based on measurements of the femur, tibia and humerus diaphyses in postmortem radiographs. The sample comprises 80 fetuses of known GA. As this is a retrospective study, cases were selected on the basis of clinical and anatomopathological information, excluding those whose normal growth was actually or potentially compromised. The results confirmed a strong correlation between the length of the studied diaphyses and GA, with the femur showing the strongest correlation (r = 0.967; p < 0.01). It was thus possible to obtain a regression equation for each of the bones studied. In conclusion, the objectives of the study were met with the derivation of regression equations for the bones studied. In future work, the sample will be enlarged in order to validate and consolidate the results obtained in this study.
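The derivation of such a regression equation can be sketched as an ordinary least-squares fit; the data below are entirely hypothetical and do not reproduce the study's measurements:

```python
import numpy as np

def fit_regression(lengths_mm, ga_weeks):
    """Least-squares line GA = a * length + b relating diaphyseal
    length to gestational age, plus Pearson's correlation r."""
    a, b = np.polyfit(lengths_mm, ga_weeks, 1)
    r = np.corrcoef(lengths_mm, ga_weeks)[0, 1]
    return a, b, r
```

Given measured femur lengths and known gestational ages, the fitted (a, b) pair is the population-specific regression equation, and r quantifies how well a straight line describes the relationship.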