971 results for Scale invariant feature transform
Abstract:
Purpose - To develop and validate a psychometric scale for assessing image quality for chest radiographs.
Abstract:
For a better assessment and definition of an individual's intervention plan, the existence of valid and reliable assessment instruments for the Portuguese population is increasingly important. Objective: To translate and adapt the Trunk Impairment Scale (TIS) for the Portuguese population of post-stroke patients, and to evaluate its psychometric properties. Methodology: The TIS was translated into Portuguese and culturally adapted for the Portuguese population. Its psychometric properties, including validity, reliability, inter-rater agreement, internal consistency, sensitivity, specificity and responsiveness, were evaluated in a population diagnosed with stroke and in a control group of healthy participants. Eighty individuals took part in the study, divided into two groups: post-stroke individuals (40) and a group without pathology (40). Participants were assessed with the Berg Balance Scale, the Functional Independence Measure, the Fugl-Meyer Physical Performance Scale and the TIS, in order to evaluate the latter's psychometric properties. Assessments were carried out by two experienced physiotherapists, and the retest was performed after 48 hours. Data were recorded and processed with SPSS 21.0. Results: The internal consistency of the TIS was moderate to high (Cronbach's alpha = 0.909). Regarding inter-rater reliability, items 1 and 4 had the lowest values (0.759 and 0.527, respectively) and items 5 and 6 had the highest Kappa values (0.830 and 0.893, respectively). Regarding criterion validity, no correlation was found with the Fugl-Meyer Physical Performance Scale, the Berg Balance Scale or the Functional Independence Measure (r = 0.166, r = 0.017 and r = -0.002, respectively).
Regarding construct validity, the median value was higher for items 1 to 5, suggesting differences between the post-stroke group and the healthy group (p < 0.001). For the remaining two items (6 and 7), no differences were found between the responses of the two groups (p > 0.001). Conclusion: The results of this study suggest that the Portuguese version of the TIS shows good levels of reliability and internal consistency, as well as good inter-rater agreement.
Abstract:
Work presented in the scope of the Master's in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering
Abstract:
This work describes the use of Pulsed Electric Fields (PEF) to control protozoan contamination of a microalgae culture in an industrial 2.7 m³ microalgae photobioreactor. The contaminated culture was treated with PEF for 6 h at an average field of 900 V/cm, with 65 μs pulses at 50 Hz. Working with recirculation, the entire culture was uniformly exposed to the PEF throughout the assay. The development of the microalgae and protozoan populations was followed, and the results showed that PEF is effective in the selective elimination of protozoa from microalgae cultures, causing growth arrest, death or cell rupture in the protozoa without affecting microalgae productivity. Specifically, the results show a reduction of the active protozoan population of 87% after the 6 h treatment and of 100% after a few days under the normal cultivation regime. At the same time, the microalgae growth rate remained unaffected. © 2014 Elsevier B.V.
Abstract:
Climatic reconstructions based on palynological data from Aquitaine outcrops emphasize an important degradation phase during the Lower Serravallian. Climatic and environmental changes can be related to sea-level variations (Bur 5 / Lan 1, Lan 2 / Ser 1 and Ser 2 cycles). Transgressive phases feature warmer conditions and more open environments, whereas regressive phases are marked by a cooler climate and an expansion of the forest cover. From the Langhian to the Middle Serravallian, a general cooling is highlighted, with the disappearance of most megathermic taxa and a transition from a warm and dry climate to warm-temperate and much more humid conditions. These conclusions are consistent with studies of bordering areas and place the major degradation phase at around 14 Ma. The palynological data fill a gap in the climatic evolution of Southern France, providing a connection between the Lower and Upper Miocene, both well recorded. At the scale of Western Europe, these results document the latitudinal climatic gradient across the Northern Hemisphere while capturing the transition between the Mediterranean area and the northeastern Atlantic seaboard.
Abstract:
Even though Software Transactional Memory (STM) is one of the most promising approaches to simplify concurrent programming, current STM implementations incur significant overheads that render them impractical for many real-sized programs. The key insight of this work is that we do not need to use the same costly barriers for all the memory managed by a real-sized application: since typically only a small fraction of the memory is under contention, lightweight barriers may be used for the remainder. In this work, we propose a new solution based on an approach of adaptive object metadata (AOM) to promote the use of a fast path to access objects that are not under contention. We show that this approach is able to make the performance of an STM competitive with the best fine-grained lock-based approaches in some of the most challenging benchmarks. (C) 2015 Elsevier Inc. All rights reserved.
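The fast-path idea behind adaptive object metadata can be sketched roughly as follows (an illustrative Python sketch, not the paper's actual STM implementation; the class and field names are hypothetical): each object carries a metadata flag, and only objects marked as contended pay for full synchronization.

```python
import threading

class AdaptiveRef:
    """Illustrative sketch of the AOM idea: reserve costly barriers
    for contended objects and use a lightweight fast path otherwise."""

    def __init__(self, value):
        self._value = value
        self._contended = False          # adaptive object metadata
        self._lock = threading.Lock()

    def read(self):
        if not self._contended:          # fast path: no barrier
            return self._value
        with self._lock:                 # slow path: full synchronization
            return self._value

    def write(self, value):
        # a real STM would detect conflicts and flip the flag adaptively;
        # here we simply mark the object as contended on every write
        self._contended = True
        with self._lock:
            self._value = value
```

A real implementation would also revert objects back to the fast path once contention subsides, which is where the adaptivity of the metadata comes in.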
Abstract:
Work presented in the scope of the Master's in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering
Abstract:
In the last decade, local image features have been widely used in robot visual localization. To assess image similarity, a strategy exploiting these features compares raw descriptors extracted from the current image with those in the models of places. This paper addresses the ensuing step in this process, where a combining function must be used to aggregate results and assign each place a score. Casting the problem in the multiple classifier systems framework, we compare several candidate combiners with respect to their performance in the visual localization task. For this evaluation, we selected the most popular methods in the class of non-trained combiners, namely the sum and product rules. A deeper insight into the potential of these combiners is provided through a discriminativity analysis involving the algebraic rules and two extensions of these methods: the threshold and the weighted modifications. In addition, a voting method previously used in robot visual localization is assessed. Furthermore, we address the process of constructing a model of the environment by describing how the model granularity impacts performance. All combiners are tested on a visual localization task carried out on a public dataset. It is experimentally demonstrated that the sum rule extensions globally achieve the best performance, confirming the general agreement on the robustness of this rule in other classification problems. The voting method, whilst competitive with the product rule in its standard form, is shown to be outperformed by its modified versions.
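A minimal sketch of the non-trained combiners discussed above (the place names and per-feature match scores are made up for illustration): each place aggregates its scores through a combining rule, and the place with the highest aggregate is chosen.

```python
import math

# Hypothetical per-feature match scores for three candidate places,
# one entry per local feature matched against each place model.
scores = {
    "corridor": [0.9, 0.8, 0.7],
    "kitchen":  [0.4, 0.9, 0.3],
    "office":   [0.2, 0.1, 0.6],
}

def sum_rule(s):
    """Aggregate by summing the individual scores."""
    return sum(s)

def product_rule(s):
    """Aggregate by multiplying the individual scores."""
    return math.prod(s)

def threshold_sum(s, t=0.5):
    """Threshold modification: sum only the scores at or above t."""
    return sum(x for x in s if x >= t)

def localize(scores, combiner):
    """Assign each place a combined score and pick the best one."""
    return max(scores, key=lambda p: combiner(scores[p]))

print(localize(scores, sum_rule))  # → corridor
```

The weighted modification would multiply each score by a per-feature weight before combining; the threshold variant above simply suppresses weak, potentially noisy matches.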
Abstract:
In machine learning and pattern recognition tasks, the use of feature discretization techniques may have several advantages. The discretized features may hold enough information for the learning task at hand, while ignoring minor fluctuations that are irrelevant or harmful for that task. The discretized features have more compact representations that may yield both better accuracy and lower training time, as compared to the use of the original features. However, in many cases, mainly with medium and high-dimensional data, the large number of features usually implies that there is some redundancy among them. Thus, we may further apply feature selection (FS) techniques on the discrete data, keeping the most relevant features, while discarding the irrelevant and redundant ones. In this paper, we propose relevance and redundancy criteria for supervised feature selection techniques on discrete data. These criteria are applied to the bin-class histograms of the discrete features. The experimental results, on public benchmark data, show that the proposed criteria can achieve better accuracy than widely used relevance and redundancy criteria, such as mutual information and the Fisher ratio.
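The bin-class histogram idea can be illustrated with mutual information, one of the baseline relevance criteria mentioned above (a simplified sketch of the general approach, not the paper's own criteria): the histogram counts how often each discrete bin co-occurs with each class, and relevance is scored from those counts.

```python
from collections import Counter
import math

def bin_class_histogram(feature, labels):
    """Count occurrences of each (bin, class) pair for a discrete feature."""
    return Counter(zip(feature, labels))

def mutual_information(feature, labels):
    """MI between a discretized feature and the class labels, computed
    from the bin-class histogram; higher means more relevant."""
    n = len(feature)
    joint = bin_class_histogram(feature, labels)
    pf = Counter(feature)   # marginal counts per bin
    pc = Counter(labels)    # marginal counts per class
    mi = 0.0
    for (b, c), nbc in joint.items():
        p_joint = nbc / n
        mi += p_joint * math.log2(p_joint * n * n / (pf[b] * pc[c]))
    return mi
```

A feature whose bins perfectly separate a balanced binary class attains MI = 1 bit, while a feature independent of the class scores 0; a redundancy criterion would compare such histograms between pairs of features.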
Abstract:
Project Work presented as a partial requirement for obtaining the degree of Master in Geographic Information Science and Systems
Abstract:
We examine the constraints on the two Higgs doublet model (2HDM) due to the stability of the scalar potential and absence of Landau poles at energy scales below the Planck scale. We employ the most general 2HDM that incorporates an approximately Standard Model (SM) Higgs boson with a flavor aligned Yukawa sector to eliminate potential tree-level Higgs-mediated flavor changing neutral currents. Using basis independent techniques, we exhibit robust regimes of the 2HDM parameter space with a 125 GeV SM-like Higgs boson that is stable and perturbative up to the Planck scale. Implications for the heavy scalar spectrum are exhibited.
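For reference, in the conventional basis-dependent notation (the paper itself works basis-independently), the standard tree-level conditions for the 2HDM quartic potential to be bounded from below read, for a real $\lambda_5$:

\begin{equation}
\lambda_1 > 0, \qquad
\lambda_2 > 0, \qquad
\lambda_3 > -\sqrt{\lambda_1 \lambda_2}, \qquad
\lambda_3 + \lambda_4 - |\lambda_5| > -\sqrt{\lambda_1 \lambda_2}.
\end{equation}

Requiring these inequalities to hold under renormalization-group running up to the Planck scale, together with the absence of Landau poles, is what carves out the allowed parameter regimes described above.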
Abstract:
Master’s Thesis in Computer Engineering
Abstract:
More than ever, the number of decision support methods and computer-aided diagnostic systems applied to various areas of medicine is increasing. In breast cancer research, much work has been done to reduce false positives when such systems are used as a double-reading method. In this study, we present a set of data mining techniques applied to build a decision support system for breast cancer diagnosis. The method is geared to assist clinical practice in identifying mammographic findings such as microcalcifications, masses and even normal tissue, in order to avoid misdiagnosis. A reliable database was used, with 410 images from about 115 patients, containing prior radiologist reviews of microcalcification, mass and normal tissue findings. Throughout this work, two feature extraction techniques were used: the gray level co-occurrence matrix and the gray level run length matrix. For classification purposes, we considered various scenarios, according to distinct patterns of lesions, and several classifiers, in order to determine the best performance in each case. The classifiers used were Naïve Bayes, Support Vector Machines, k-nearest Neighbors and Decision Trees (J48 and Random Forests). The results in distinguishing mammographic findings revealed high positive predictive values (PPV) and very good accuracy. Furthermore, related results on the classification of breast density and the BI-RADS® scale are also presented. The best predictive method for all tested groups was the Random Forest classifier, and the best performance was achieved in the distinction of microcalcifications. The conclusions based on the several tested scenarios represent a new perspective on breast cancer diagnosis using data mining techniques.
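The gray level co-occurrence matrix mentioned above can be sketched in a few lines (an illustrative pure-Python version; real pipelines use optimized libraries and multiple offsets): it counts how often pairs of gray levels co-occur at a fixed pixel offset, and texture features such as contrast are then derived from it.

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Gray level co-occurrence matrix for a single offset (dx, dy).
    `image` is a 2-D list of integer gray levels in [0, levels)."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for y in range(rows):
        for x in range(cols):
            ny, nx = y + dy, x + dx
            if 0 <= ny < rows and 0 <= nx < cols:
                m[image[y][x]][image[ny][nx]] += 1
    return m

def contrast(m):
    """Contrast, one of the Haralick texture features derived from a GLCM:
    weighs each co-occurrence by the squared gray level difference."""
    total = sum(sum(row) for row in m)
    return sum(m[i][j] * (i - j) ** 2
               for i in range(len(m))
               for j in range(len(m))) / total
```

In a mammography pipeline, features like this contrast value (computed over regions of interest) would form the input vectors fed to the classifiers listed above.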
Abstract:
The study of electricity markets operation has been gaining increasing importance in the last years, as a result of the new challenges that the restructuring process produced. Currently, a large amount of information concerning electricity markets is available, as market operators provide, after a period of confidentiality, data regarding market proposals and transactions. These data can be used as a source of knowledge to define realistic scenarios, which are essential for understanding and forecasting electricity market behaviour. The development of tools able to extract, transform, store and dynamically update data is of great importance to go a step further into the comprehension of electricity markets and of the behaviour of the involved entities. In this paper, an adaptable tool capable of downloading, parsing and storing data from market operators' websites is presented, assuring constant updating and reliability of the stored data.
Abstract:
The study of Electricity Markets operation has been gaining increasing importance in the last years, as a result of the new challenges that the restructuring produced. Currently, a large amount of information concerning Electricity Markets is available, as market operators provide, after a period of confidentiality, data regarding market proposals and transactions. These data can be used as a source of knowledge to define realistic scenarios, essential for understanding and forecasting Electricity Markets behaviour. The development of tools able to extract, transform, store and dynamically update data is of great importance to go a step further into the comprehension of Electricity Markets and the behaviour of the involved entities. In this paper we present an adaptable tool capable of downloading, parsing and storing data from market operators' websites, ensuring that the stored data remains up to date and reliable.
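The parse-and-store step of such a tool can be sketched as follows (a minimal Python illustration; the CSV layout, column names and table schema are hypothetical, since each market operator publishes its own format): parsed rows are upserted into a local database so that repeated downloads keep the stored data current without duplicating it.

```python
import csv
import io
import sqlite3

# Hypothetical downloaded payload; a real tool would have one parser
# per market operator's published format.
raw = """date,hour,price,volume
2015-01-01,1,45.2,1200
2015-01-01,2,43.8,1100
"""

def store(csv_text, conn):
    """Parse market data and upsert it, keyed by (date, hour), so that
    re-running the tool updates rather than duplicates stored records."""
    conn.execute("""CREATE TABLE IF NOT EXISTS transactions (
        date TEXT, hour INTEGER, price REAL, volume REAL,
        PRIMARY KEY (date, hour))""")
    for row in csv.DictReader(io.StringIO(csv_text)):
        conn.execute(
            "INSERT OR REPLACE INTO transactions VALUES (?, ?, ?, ?)",
            (row["date"], int(row["hour"]),
             float(row["price"]), float(row["volume"])))
    conn.commit()

conn = sqlite3.connect(":memory:")
store(raw, conn)
```

The upsert keyed on (date, hour) is what provides the constant updating and reliability of the stored data described above: corrected figures republished by the operator overwrite the stale rows.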