982 results for Computer techniques


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Fine-needle aspiration cytology (FNAC) of serous membrane effusions may fulfil a challenging role in the diagnostic analysis of both primary and metastatic disease. From this perspective, liquid-based cytology (LBC) represents a feasible and reliable method for enhancing the performance of ancillary techniques (i.e., immunocytochemistry and molecular testing) with high diagnostic accuracy. METHODS: In total, 3171 LBC pleural and pericardial effusions were appraised between January 2000 and December 2013. They were classified as negative for malignancy (NM), suspicious for malignancy (SM), or positive for malignancy (PM). RESULTS: The cytologic diagnoses included 2721 NM effusions (2505 pleural and 216 pericardial), 104 SM effusions (93 pleural and 11 pericardial), and 346 PM effusions (321 pleural and 25 pericardial). The malignant pleural series included 76 unknown malignancies (36 SM and 40 PM effusions), 174 metastatic lesions (85 SM and 89 PM effusions), 14 lymphomas (3 SM and 11 PM effusions), 16 mesotheliomas (5 SM and 11 PM effusions), and 3 myelomas (all SM effusions). The malignant pericardial category included 20 unknown malignancies (5 SM and 15 PM effusions), 15 metastatic lesions (1 SM and 14 PM effusions), and 1 lymphoma (1 PM effusion). There were 411 conclusive immunocytochemical analyses and 47 molecular analyses, and the authors documented 88% sensitivity, 100% specificity, 98% diagnostic accuracy, 98% negative predictive value, and 100% positive predictive value for FNAC. CONCLUSIONS: FNAC represents a primary diagnostic tool for effusions and a reliable approach with which to determine the correct follow-up. Furthermore, LBC is useful for ancillary techniques, such as immunocytochemistry and molecular analysis, with feasible diagnostic and predictive utility.
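The diagnostic metrics reported above all derive from a 2x2 confusion matrix. A minimal sketch of that arithmetic follows; the counts used are illustrative (chosen only so that specificity and PPV come out at 100%, as reported), not the study's actual data.

```python
# Diagnostic test metrics from confusion-matrix counts.
# tp/fp/tn/fn = true/false positives and negatives.

def diagnostic_metrics(tp, fp, tn, fn):
    """Return sensitivity, specificity, accuracy, NPV and PPV."""
    sensitivity = tp / (tp + fn)            # positives detected among diseased
    specificity = tn / (tn + fp)            # negatives detected among healthy
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    npv = tn / (tn + fn)                    # negative predictive value
    ppv = tp / (tp + fp)                    # positive predictive value
    return sensitivity, specificity, accuracy, npv, ppv

# Illustrative counts (hypothetical, not the paper's data):
sens, spec, acc, npv, ppv = diagnostic_metrics(tp=88, fp=0, tn=500, fn=12)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} accuracy={acc:.0%}")
# prints "sensitivity=88% specificity=100% accuracy=98%"
```

Note that with zero false positives, specificity and PPV are both exactly 100%, matching the pattern of figures reported in the abstract.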

Relevance:

20.00%

Publisher:

Abstract:

Integrated master's dissertation in Industrial Electronics and Computer Engineering

Relevance:

20.00%

Publisher:

Abstract:

Integrated master's dissertation in Biomedical Engineering (specialization in Clinical Engineering)

Relevance:

20.00%

Publisher:

Abstract:

The research aimed to establish tyre-road noise models by using a Data Mining approach that allowed the construction of a predictive model and an assessment of the importance of the tested input variables. The data modelling took into account three learning algorithms and three metrics to define the best predictive model. The variables tested included basic properties of pavement surfaces, macrotexture, megatexture, and unevenness and, for the first time, damping. The importance of those variables was also measured by using a sensitivity analysis procedure. Two types of models were set: one with basic variables and another with complex variables, such as megatexture and damping, all as a function of vehicle speed. More detailed models were additionally set per speed level. As a result, several models with very good tyre-road noise predictive capacity were achieved. The most relevant variables were Speed, Temperature, Aggregate size, Mean Profile Depth, and Damping, which had the highest importance, even though it was influenced by speed. Megatexture and IRI had the lowest importance. The models developed in this work are applicable to truck tyre-road noise prediction, represented by the AVON V4 test tyre, at the early stage of road pavement use. Therefore, the obtained models are highly useful for the design of pavements and for noise prediction by road authorities and contractors.
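One common way to measure input importance, as the abstract describes, is a one-at-a-time sensitivity analysis: vary a single input over its range while holding the others at baseline values, and record the spread of the model's predictions. A minimal sketch follows; the toy noise model and its coefficients are hypothetical, not the paper's fitted model.

```python
import math

def sensitivity(model, baseline, ranges, steps=10):
    """Rank inputs by the prediction spread each induces on its own."""
    importance = {}
    for name, (lo, hi) in ranges.items():
        preds = []
        for i in range(steps + 1):
            x = dict(baseline)                      # hold others at baseline
            x[name] = lo + (hi - lo) * i / steps    # sweep this input
            preds.append(model(x))
        importance[name] = max(preds) - min(preds)
    return dict(sorted(importance.items(), key=lambda kv: -kv[1]))

# Hypothetical tyre-road noise model: dB grows with log speed, drops with damping.
def noise_db(x):
    return 60 + 25 * math.log10(x["speed"] / 50) - 8 * x["damping"]

ranks = sensitivity(noise_db,
                    baseline={"speed": 80, "damping": 0.3},
                    ranges={"speed": (50, 110), "damping": (0.1, 0.6)})
print(ranks)  # speed dominates in this toy model
```

In this sketch, speed induces a prediction spread of roughly 8.6 dB against 4 dB for damping, so it ranks first, mirroring the paper's finding that Speed carried the highest importance.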

Relevance:

20.00%

Publisher:

Abstract:

Given the limitations of different types of remote sensing images, automated land-cover classifications of the Amazon várzea may yield poor accuracy indexes. One way to improve accuracy is through the combination of images from different sensors, by either image fusion or multi-sensor classifications. Therefore, the objective of this study was to determine which classification method is more efficient in improving land cover classification accuracies for the Amazon várzea and similar wetland environments - (a) synthetically fused optical and SAR images or (b) multi-sensor classification of paired SAR and optical images. Land cover classifications based on images from a single sensor (Landsat TM or Radarsat-2) are compared with multi-sensor and image fusion classifications. Object-based image analyses (OBIA) and the J.48 data-mining algorithm were used for automated classification, and classification accuracies were assessed using the kappa index of agreement and the recently proposed allocation and quantity disagreement measures. Overall, optical-based classifications had better accuracy than SAR-based classifications. Once both datasets were combined using the multi-sensor approach, there was a 2% decrease in allocation disagreement, as the method was able to overcome part of the limitations present in both images. Accuracy decreased when image fusion methods were used, however. We therefore concluded that the multi-sensor classification method is more appropriate for classifying land cover in the Amazon várzea.
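The kappa index of agreement used above to assess classification accuracy is observed agreement corrected for the agreement expected by chance, computed from a confusion matrix. A minimal sketch, with an illustrative matrix rather than the study's data:

```python
def kappa(matrix):
    """Cohen's kappa from a square confusion matrix
    (rows: reference classes, columns: classified classes)."""
    n = sum(sum(row) for row in matrix)
    observed = sum(matrix[i][i] for i in range(len(matrix))) / n
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(col) for col in zip(*matrix)]
    expected = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative 3-class matrix (e.g. forest / water / grassland):
m = [[50, 3, 2],
     [4, 40, 1],
     [2, 2, 46]]
print(f"kappa = {kappa(m):.3f}")  # ~0.859
```

A kappa of 1.0 means perfect agreement; 0 means agreement no better than chance, which is why kappa is preferred over raw percent agreement for comparing classifications.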

Relevance:

20.00%

Publisher:

Abstract:

ABSTRACT The spatial distribution of forest biomass in the Amazon is heterogeneous, varying in both time and space, especially in relation to the different vegetation types of this biome. Biomass estimates for this region vary significantly depending on the approach applied and the dataset used for modelling. In this context, this study aimed to evaluate three different geostatistical techniques for estimating the spatial distribution of aboveground biomass (AGB). The selected techniques were: 1) ordinary least-squares regression (OLS), 2) geographically weighted regression (GWR), and 3) geographically weighted regression-kriging (GWR-K). These techniques were applied to the same field dataset, using the same environmental variables derived from cartographic information and high-resolution remote sensing data (RapidEye). The study was conducted in the Amazon rainforest of Sucumbíos, Ecuador. The results showed that GWR-K, a hybrid technique, provided statistically satisfactory estimates with the lowest prediction error of the three techniques. Furthermore, we observed that 75% of the AGB was explained by the combination of remote sensing data and environmental variables, with forest type being the most important variable for estimating AGB. It should be noted that while the use of high-resolution images significantly improves the estimation of the spatial distribution of AGB, processing this information is computationally demanding.
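The difference between global OLS and GWR can be sketched compactly: GWR refits a weighted least-squares model at each location, with weights decaying with distance (here a Gaussian kernel). The one-predictor toy data below, where the regression slope drifts with location, are purely illustrative, not the study's AGB data.

```python
import math
import random

def wls_1d(xs, ys, ws):
    """Weighted least squares for y = a + b*x; returns (a, b)."""
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    my = sum(w * y for w, y in zip(ws, ys)) / sw
    num = sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys))
    den = sum(w * (x - mx) ** 2 for w, x in zip(ws, xs))
    b = num / den
    return my - b * mx, b

def gwr_predict(locs, xs, ys, loc0, x0, bandwidth):
    """GWR prediction at loc0: Gaussian distance weights, then local WLS."""
    ws = [math.exp(-((l - loc0) / bandwidth) ** 2) for l in locs]
    a, b = wls_1d(xs, ys, ws)
    return a + b * x0

# Synthetic data whose slope drifts with location (1 + 2*loc),
# something a single global OLS fit cannot capture:
random.seed(0)
locs = [i / 50 for i in range(50)]
xs = [random.uniform(0, 1) for _ in locs]
ys = [(1 + 2 * l) * x + random.gauss(0, 0.05) for l, x in zip(locs, xs)]

_, b_global = wls_1d(xs, ys, [1.0] * len(xs))           # global OLS slope
local = gwr_predict(locs, xs, ys, loc0=0.9, x0=1.0, bandwidth=0.1)
print(f"OLS slope={b_global:.2f}, GWR prediction at loc 0.9: {local:.2f}")
```

The global slope averages the spatial drift away (near 2.0 here), while the local GWR fit at location 0.9 recovers a prediction near the true local value of about 2.8, which is the behaviour that makes GWR (and its kriging hybrid) attractive for spatially heterogeneous biomass.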

Relevance:

20.00%

Publisher:

Abstract:

Commercial stents, especially metallic ones, present several disadvantages, which gives rise to the need to produce or coat stents with different materials, such as natural polymers, in order to improve their biocompatibility and minimize the disadvantages of metallic stents. This review paper discusses some applications of natural-based polymers in stents, namely polylactic acid (PLA) for stent development and chitosan for biocompatible stent coatings. Furthermore, some effective stent functionalization techniques are discussed, namely the layer-by-layer (LbL) technique.

Relevance:

20.00%

Publisher:

Abstract:

Integrated master's dissertation in Industrial Electronics and Computer Engineering

Relevance:

20.00%

Publisher:

Abstract:

PhD thesis in Electronics and Computer Engineering

Relevance:

20.00%

Publisher:

Abstract:

In the trend towards tolerating hardware unreliability, accuracy is exchanged for cost savings. Running on less reliable machines, functionally correct code becomes risky, and one needs to know how risk propagates so as to mitigate it. Risk estimation, however, seems to live outside the average programmer's technical competence and core practice. In this paper we propose that program design by source-to-source transformation be risk-aware, in the sense of making probabilistic faults visible and supporting equational reasoning on the probabilistic behaviour of programs caused by faults. This reasoning is carried out in a linear-algebra extension of the standard algebra of programming, à la Bird-de Moor. This paper studies, in particular, the propagation of faults across standard program transformation techniques known as tupling and fusion, enabling the fault of the whole to be expressed in terms of the faults of its parts.
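The tupling transformation mentioned above can be sketched concretely: two interdependent recursive computations are combined into one function returning a pair, so shared work is done once. Fibonacci is the textbook instance; the paper works at the level of algebraic program calculation, not any particular language, so this Python rendering is illustrative only.

```python
def fib_naive(n):
    """Exponential time: recomputes shared subproblems."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_pair(n):
    """Tupled version: returns (fib(n), fib(n+1)) in one linear recursion."""
    if n == 0:
        return (0, 1)
    a, b = fib_pair(n - 1)
    return (b, a + b)       # shift the pair one step forward

def fib(n):
    return fib_pair(n)[0]

# Both definitions agree; only their cost differs:
assert all(fib(n) == fib_naive(n) for n in range(15))
print(fib(30))  # 832040
```

Equational reasoning is exactly what justifies replacing `fib_naive` by `fib`: the two are provably equal, while the transformation changes cost, and, in the paper's setting, how faults in the parts propagate to the whole.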

Relevance:

20.00%

Publisher:

Abstract:

Temporal logics targeting real-time systems are traditionally undecidable. Based on a restricted fragment of MTL-R, we propose a new approach for the runtime verification of hard real-time systems. The novelty of our technique is that it is based on incremental evaluation, allowing us to effectively treat duration properties (which play a crucial role in real-time systems). We describe the two levels of operation of our approach: offline simplification by quantifier removal techniques; and online evaluation of a three-valued interpretation for formulas of our fragment. Our experiments show the applicability of this mechanism as well as the validity of the provided complexity results.
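The idea of three-valued online evaluation can be illustrated with a tiny monitor for a bounded-eventually property ("p occurs within the next N steps") that emits True, False, or None (inconclusive) as events arrive. This is a sketch of the general mechanism, not the paper's MTL-R fragment or its duration properties.

```python
class BoundedEventually:
    """Incremental monitor for 'p holds within `bound` steps'.
    Verdict after each step: True once p is seen, False once the
    deadline passes without p, None while still inconclusive."""

    def __init__(self, bound):
        self.bound = bound
        self.steps = 0
        self.verdict = None

    def step(self, p):
        if self.verdict is not None:
            return self.verdict          # verdicts are final
        self.steps += 1
        if p:
            self.verdict = True
        elif self.steps >= self.bound:
            self.verdict = False
        return self.verdict

mon = BoundedEventually(bound=3)
print([mon.step(p) for p in [False, True, False]])  # [None, True, True]
```

The point of the three-valued interpretation is that the monitor can commit to a definitive verdict as early as the trace allows, while honestly reporting "inconclusive" until then, which is what makes incremental evaluation sound for online use.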

Relevance:

20.00%

Publisher:

Abstract:

Integrated master's dissertation in Civil Engineering (specialization in Structures and Geotechnics)

Relevance:

20.00%

Publisher:

Abstract:

This book was produced in the scope of a research project entitled “Navigating with ‘Magalhães’: Study on the Impact of Digital Media in Schoolchildren”. The study was conducted between May 2010 and May 2013 at the Communication and Society Research Centre, University of Minho, Portugal, and was funded by the Portuguese Foundation for Science and Technology (PTDC/CCI-COM/101381/2008).

Relevance:

20.00%

Publisher:

Abstract:

(Excerpt) In times past, learning to read, write and do arithmetic was to get on course to earn the “writ of emancipation” in society. These skills are still essential today, but are not enough to live in society. Reading and critically understanding the world we live in, with all its complexity, difficulties and challenges, require not only other skills (learning to search for and validate information, reading with new codes and grammar, etc.) but also, to a certain extent, metaskills, matrices and mechanisms that are transversal to the different and new literacies. They are needed not just to interpret but equally to communicate and participate in the little worlds that make up our everyday activities as well as, in a broader sense, in the world of the polis, which today is a global world.