Abstract:
Chronic liver disease (CLD) is usually an asymptomatic, progressive, and ultimately potentially fatal disease. In this study, an automatic hierarchical procedure to stage CLD using ultrasound images, laboratory tests, and clinical records is described. The first stage of the proposed method, called the clinical based classifier (CBC), discriminates healthy from pathologic conditions. When nonhealthy conditions are detected, the method refines the results into three exclusive pathologies on a hierarchical basis: 1) chronic hepatitis; 2) compensated cirrhosis; and 3) decompensated cirrhosis. The features used, as well as the classifiers (Bayes, Parzen, support vector machine, and k-nearest neighbor), are optimally selected for each stage. A large multimodal feature database was specifically built for this study, containing 30 chronic hepatitis cases, 34 compensated cirrhosis cases, and 36 decompensated cirrhosis cases, all validated after histopathologic analysis by liver biopsy. The CBC classification scheme outperformed the nonhierarchical one-against-all scheme, achieving an overall accuracy of 98.67% for the normal detector, 87.45% for the chronic hepatitis detector, and 95.71% for the cirrhosis detector.
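The staging cascade described above can be illustrated with a small sketch: a first detector separates normal from pathologic cases, and only pathologic cases are passed to the next stage. This is a minimal sketch assuming generic feature matrices and scikit-learn classifiers; the variable names (X, y_healthy, y_stage) and the per-stage model choices are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Illustrative data: rows are patients, columns are multimodal features
# (ultrasound descriptors, laboratory tests, clinical records).
rng = np.random.default_rng(0)
X = rng.normal(size=(130, 12))
y_healthy = rng.integers(0, 2, size=130)   # 0 = healthy, 1 = pathologic
y_stage = rng.integers(1, 4, size=130)     # 1 = hepatitis, 2/3 = cirrhosis

# Stage 1: healthy vs. pathologic (the "normal detector").
stage1 = GaussianNB().fit(X, y_healthy)

# Stage 2: among pathologic cases only, refine into the three pathologies.
mask = y_healthy == 1
stage2 = SVC().fit(X[mask], y_stage[mask])

def predict_cascade(x):
    """Return 0 for healthy, otherwise the predicted pathology stage."""
    x = x.reshape(1, -1)
    if stage1.predict(x)[0] == 0:
        return 0
    return stage2.predict(x)[0]

print(predict_cascade(X[0]))
```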
Abstract:
PURPOSE: Fatty liver disease (FLD) is an increasingly prevalent disease that can be reversed if detected early. Ultrasound is the safest and most ubiquitous method for identifying FLD. Since expert sonographers are required to accurately interpret liver ultrasound images, a lack of such expertise results in interobserver variability. For more objective interpretation, high accuracy, and quick second opinions, computer-aided diagnostic (CAD) techniques may be exploited. The purpose of this work is to develop one such CAD technique for accurate classification of normal livers and abnormal livers affected by FLD. METHODS: In this paper, the authors present a CAD technique (called Symtosis) that uses a novel combination of significant features based on the texture, wavelet transform, and higher order spectra of the liver ultrasound images in various supervised learning-based classifiers in order to determine parameters that classify normal and FLD-affected abnormal livers. RESULTS: On evaluating the proposed technique on a database of 58 abnormal and 42 normal liver ultrasound images, the authors were able to achieve a high classification accuracy of 93.3% using the decision tree classifier. CONCLUSIONS: This high accuracy, added to the completely automated classification procedure, makes the authors' proposed technique highly suitable for clinical deployment and usage.
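As a rough illustration of this kind of pipeline, the sketch below extracts simple wavelet-energy and first-order texture features from an image and feeds them to a decision tree. It is only a sketch under stated assumptions: the feature set is a crude stand-in for the paper's texture/wavelet/higher-order-spectra combination, and the random images are placeholders for real ultrasound data.

```python
import numpy as np
import pywt
from sklearn.tree import DecisionTreeClassifier

def features(img):
    """Wavelet-energy and first-order texture features (illustrative only)."""
    cA, (cH, cV, cD) = pywt.dwt2(img, "db1")   # one-level 2-D wavelet transform
    energies = [np.mean(np.square(c)) for c in (cA, cH, cV, cD)]
    texture = [img.mean(), img.std()]          # crude first-order texture stats
    return np.array(energies + texture)

# Illustrative stand-in for real liver ultrasound images.
rng = np.random.default_rng(1)
imgs = rng.normal(size=(100, 64, 64))
labels = rng.integers(0, 2, size=100)          # 0 = normal, 1 = FLD

X = np.stack([features(im) for im in imgs])
clf = DecisionTreeClassifier().fit(X, labels)
print(clf.predict(X[:3]))
```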
Abstract:
Doctoral Thesis, Marine Sciences (Marine Biology)
Abstract:
In this work, the identification and diagnosis of various stages of chronic liver disease are addressed. The classification results of a support vector machine, a decision tree, and a k-nearest neighbor classifier are compared. Ultrasound image intensity and textural features are used jointly with clinical and laboratorial data in the staging process. The classifiers are trained using a population of 97 patients at six different stages of chronic liver disease and a leave-one-out cross-validation strategy. The best results are obtained using the support vector machine with a radial-basis kernel, with 73.20% overall accuracy. The good performance of the method is a promising indicator that it can be used, in a noninvasive way, to provide reliable information about chronic liver disease staging.
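A minimal sketch of the evaluation protocol described here, leave-one-out cross-validation of an RBF-kernel support vector machine, is shown below; the feature matrix and six-stage labels are random stand-ins, not the study's data.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

# Illustrative stand-in: 97 patients, joint image + clinical + laboratorial features.
rng = np.random.default_rng(2)
X = rng.normal(size=(97, 20))
y = rng.integers(0, 6, size=97)   # six disease stages

# RBF-kernel SVM evaluated with leave-one-out cross-validation.
scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=LeaveOneOut())
print(f"overall accuracy: {scores.mean():.2%}")
```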
Abstract:
In this work, the liver contour is semi-automatically segmented and quantified in order to help the identification and diagnosis of diffuse liver disease. The features extracted from the liver contour are used jointly with clinical and laboratorial data in the staging process. The classification results of a support vector machine, a Bayesian classifier, and a k-nearest neighbor classifier are compared. A population of 88 patients at five different stages of diffuse liver disease and a leave-one-out cross-validation strategy are used in the classification process. The best results are obtained using the k-nearest neighbor classifier, with an overall accuracy of 80.68%. The good performance of the proposed method is a reliable indicator that it can improve the information available for the staging of diffuse liver disease.
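The feature-fusion step described here, joining contour descriptors with clinical and laboratorial data before classification, can be sketched as a simple column-wise concatenation feeding a k-nearest neighbor model. The arrays below are hypothetical placeholders for the study's measurements.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(3)
contour = rng.normal(size=(88, 8))    # shape/curvature descriptors of the liver contour
clinical = rng.normal(size=(88, 5))   # clinical record variables
lab = rng.normal(size=(88, 6))        # laboratorial test values
y = rng.integers(0, 5, size=88)       # five stages of diffuse liver disease

# Joint feature vector: contour + clinical + laboratorial data.
X = np.hstack([contour, clinical, lab])
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(knn.predict(X[:3]))
```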
Abstract:
Steatosis, also known as fatty liver, corresponds to an abnormal retention of lipids within the hepatic cells and reflects an impairment of the normal processes of synthesis and elimination of fat. Several causes may lead to this condition, namely obesity, diabetes, or alcoholism. In this paper, an automatic classification algorithm is proposed for the diagnosis of liver steatosis from ultrasound images. The features are selected in order to capture the same characteristics used by the physicians in the diagnosis of the disease based on visual inspection of the ultrasound images. The algorithm, designed in a Bayesian framework, computes two images: i) a despeckled one, containing the anatomic and echogenic information of the liver, and ii) an image containing only the speckle, used to compute the textural features. These images are computed from the estimated RF signal generated by the ultrasound probe, where the dynamic range compression performed by the equipment is taken into account. A Bayes classifier, trained with data manually classified by expert clinicians and used as ground truth, reaches an overall accuracy of 95% and a sensitivity of 100%. The main novelties of the method are the estimation of the RF and speckle images, which makes it possible to accurately compute textural features of the liver parenchyma relevant for the diagnosis.
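The two-image decomposition described above can be loosely illustrated as follows: a smoothing filter stands in for the Bayesian despeckling step, the multiplicative speckle residual is obtained by division, and a naive Bayes classifier consumes texture statistics of the speckle field. This is only a sketch under those simplifying assumptions, not the paper's RF-estimation method.

```python
import numpy as np
from scipy.ndimage import median_filter
from sklearn.naive_bayes import GaussianNB

def speckle_features(img, eps=1e-6):
    """Split an image into despeckled and speckle parts; return texture stats."""
    despeckled = median_filter(img, size=5)   # stand-in for Bayesian despeckling
    speckle = img / (despeckled + eps)        # multiplicative speckle residual
    return np.array([speckle.mean(), speckle.std(),
                     despeckled.mean(), despeckled.std()])

rng = np.random.default_rng(4)
imgs = np.abs(rng.normal(size=(60, 64, 64))) + 1.0  # positive-valued stand-in images
labels = rng.integers(0, 2, size=60)                # 0 = normal, 1 = steatosis

X = np.stack([speckle_features(im) for im in imgs])
clf = GaussianNB().fit(X, labels)
print(clf.predict(X[:3]))
```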
Abstract:
Master's Degree in Electrical and Computer Engineering. Specialization Area: Industrial Systems and Planning.
Abstract:
The stock market, globally, has recently proven to be one of the main sources of stimulus for the securities market. Its impact on the general public is enormous, and its importance to companies is vital. It is therefore worth understanding how financial theory has addressed the valuation and the understanding of the price formation process. From the 1950s to the present day, it is worth examining how different authors have treated this question and what the results of that confrontation have been. Above all, it is important to understand Stephen Ross's approach and the Arbitrage Pricing Theory. Following this approach, and with the emergence of the Multi Index Model, it became possible to estimate the evolution of a share price with greater precision, insofar as it depends on a broad set of variables covering a wide area of influence. Ross's contribution is therefore decisive. In the end, the aim is to retain the best technique and theory for defending the investor's interests. Given this, it remains to determine the best statistical technique for carrying out these empirical studies.
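To make the Multi Index Model concrete: a security's return is regressed on several indices (factors), and the fitted coefficients describe its exposure to each. A minimal sketch with ordinary least squares follows; the two factors and the simulated returns are purely hypothetical.

```python
import numpy as np

# Hypothetical monthly data: returns of two indices (factors) and one stock.
rng = np.random.default_rng(5)
f1 = rng.normal(0.01, 0.04, size=120)   # e.g., a market index
f2 = rng.normal(0.00, 0.03, size=120)   # e.g., a sector index
r = 0.002 + 1.1 * f1 + 0.4 * f2 + rng.normal(0, 0.02, size=120)

# Multi index model: r_t = a + b1*f1_t + b2*f2_t + e_t, fitted by least squares.
A = np.column_stack([np.ones_like(f1), f1, f2])
coef, *_ = np.linalg.lstsq(A, r, rcond=None)
print(f"alpha={coef[0]:.4f}, b1={coef[1]:.2f}, b2={coef[2]:.2f}")
```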
Abstract:
Purpose: To describe and compare the content of instruments that assess environmental factors using the International Classification of Functioning, Disability and Health (ICF). Methods: A systematic search of the PubMed, CINAHL, and PEDro databases was conducted using a pre-determined search strategy. The identified instruments were screened independently by two investigators, and meaningful concepts were linked to the most precise ICF category according to published linking rules. Results: Six instruments were included, containing 526 meaningful concepts. Instruments had between 20% and 98% of their items linked to categories in Chapter 1. The highest percentage of items from one instrument linked to categories in Chapters 2–5 varied between 9% and 50%. The presence or absence of environmental factors in a specific context is assessed by three instruments, while the other three assess the intensity of the impact of environmental factors. Discussion: Instruments differ in their content and type of assessment, and several items are linked to the same ICF category. Most instruments primarily assess products and technology (Chapter 1), highlighting the need to deepen the discussion on the theory that supports the measurement of environmental factors. This discussion should be thorough and lead to the development of methodologies and new tools that capture the underlying concepts of the ICF.
Abstract:
OBJECTIVE: To develop a Charlson-like comorbidity index based on the clinical conditions and weights of the original Charlson comorbidity index. METHODS: Clinical conditions and weights were adapted from the International Classification of Diseases, 10th revision, and applied to a single hospital admission diagnosis. The study included 3,733 patients over 18 years of age who were admitted to a public general hospital in the city of Rio de Janeiro, southeast Brazil, between Jan 2001 and Jan 2003. The index distribution was analyzed by gender, type of admission, blood transfusion, intensive care unit admission, age, and length of hospital stay. Two logistic regression models were developed to predict in-hospital mortality, including: a) the aforementioned variables and the risk-adjustment index (full model); and b) the risk-adjustment index and patient's age (reduced model). RESULTS: Of all patients analyzed, 22.3% had risk scores >1, and the in-hospital mortality rate was 4.5% (66.0% of those who died had scores >1). Except for gender and type of admission, all variables were retained in the logistic regression. The models including the developed risk index had an area under the receiver operating characteristic curve of 0.86 (full model) and 0.76 (reduced model). Each unit increase in the risk score was associated with a nearly 50% increase in the odds of in-hospital death. CONCLUSIONS: The risk index developed was able to effectively discriminate the odds of in-hospital death, which can be useful when limited information is available from hospital databases.
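The scoring-plus-regression design reads as: map each admission diagnosis to a Charlson weight, then use the summed score (with age) as a predictor of in-hospital death in a logistic model. A minimal sketch follows; the weight table is truncated to a few illustrative ICD-10 prefixes, and the patient data are simulated, not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# A few illustrative Charlson-style weights keyed by ICD-10 prefix (not the full table).
WEIGHTS = {"I21": 1, "I50": 1, "C78": 6, "K74": 3, "E11": 1}

def risk_score(icd10_code):
    """Return the comorbidity weight for an admission diagnosis (0 if unlisted)."""
    return WEIGHTS.get(icd10_code[:3], 0)

# Simulated cohort: diagnosis code, age, and in-hospital death outcome.
rng = np.random.default_rng(6)
codes = rng.choice(list(WEIGHTS) + ["Z00"], size=500)
age = rng.integers(18, 95, size=500)
score = np.array([risk_score(c) for c in codes])
death = rng.random(500) < 1 / (1 + np.exp(-(0.4 * score + 0.03 * age - 5)))

# Reduced model: risk-adjustment score + age.
X = np.column_stack([score, age])
model = LogisticRegression().fit(X, death)
print("AUC:", roc_auc_score(death, model.predict_proba(X)[:, 1]))
```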
Abstract:
3rd SMTDA Conference Proceedings, 11–14 June 2014, Lisbon, Portugal.
Abstract:
This work aims to present an overview of the economic and industrial reality of the ornamental granite processing sector in Portugal and to analyze the sawing process using multi-blade gang saws and steel shot, since this is the block-sectioning method most widely used by the large industries of the sector. Given the economic importance of this production operation in the industry in question, the goals of this project were defined as the statistical analysis of production costs; the definition of calculation formulas to predict the average sawing cost; and the study of economically viable and environmentally sustainable solutions to the problem of the sludge resulting from purging the gang saws. To carry out this project, data were collected by implementing control and recording routines, using standardized, easy-to-fill production charts completed by the operators of this equipment. This data collection made it possible to isolate, quantify, and formulate the profitability factors of the sawing process by selecting, within the study sample obtained, a set of sawing runs with similar characteristics and values close to the statistical mean. From the data of these sawing runs, polynomial trend curves were generated and used to analyze the variations in the average sawing cost caused by variations in the factor under study. The formulation of the profitability factors and the statistical data obtained then allowed the development of formulas for calculating the average sawing cost, which establish the production cost differentiated by thickness, with or without the incorporation of the profitability factors. As a result of the project, a useful set of conclusions was obtained for the industrial sector in question, highlighting the importance, for processing costs, of gang-saw occupancy and the profitable use of a confined space, of the resistance offered to sawing by the granites, and of the height difference between blocks in the same load.
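The polynomial trend curves mentioned above can be reproduced in spirit with a least-squares polynomial fit of average sawing cost against the factor under study; the sketch below uses NumPy's polynomial fitting with entirely hypothetical cost data.

```python
import numpy as np

# Hypothetical observations: gang-saw occupancy (%) vs. average sawing cost (EUR/m2).
occupancy = np.array([55, 60, 65, 70, 75, 80, 85, 90], dtype=float)
cost = np.array([16.1, 14.8, 13.9, 13.2, 12.8, 12.5, 12.4, 12.6])

# Second-degree polynomial trend curve fitted by least squares.
trend = np.polynomial.Polynomial.fit(occupancy, cost, deg=2)

# Predicted average cost at 72% occupancy.
print(f"predicted cost at 72% occupancy: {trend(72):.2f} EUR/m2")
```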
Abstract:
A new high-performance architecture for the computation of all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable, and unified structure that is able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16, and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrated the superior performance and hardware efficiency levels provided by the proposed structure, which outperforms its more prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows the computation, in real time, of all the transforms mentioned above for resolutions as high as 8k Ultra High Definition Television (UHDTV) (7680×4320 @ 30 fps).
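For reference, the smallest of the transforms listed above, the 4×4 forward integer DCT of H.264/AVC, is a pair of integer matrix multiplications, Y = C X Cᵀ. The sketch below computes it in plain Python/NumPy purely to illustrate the arithmetic the hardware implements; it says nothing about the proposed architecture itself.

```python
import numpy as np

# H.264/AVC 4x4 forward core transform matrix (integer approximation of the DCT).
C = np.array([[1,  1,  1,  1],
              [2,  1, -1, -2],
              [1, -1, -1,  1],
              [1, -2,  2, -1]])

def forward_4x4(block):
    """Apply the 4x4 integer transform Y = C @ X @ C.T (the scaling is folded
    into quantization in the real codec and omitted here)."""
    return C @ block @ C.T

X = np.arange(16).reshape(4, 4)   # an arbitrary 4x4 residual block
print(forward_4x4(X))
```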
Abstract:
This chapter aims to demonstrate how PAOL - Unit for Innovation in Education, a project from ISCAP - School of Accounting and Administration of Oporto ....
Abstract:
Conference: 39th Annual Conference of the IEEE Industrial Electronics Society (IECON), Nov 10–14, 2013