897 results for Method of Theoretical Images
Abstract:
Hyperspectral imaging can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems of hyperspectral data analysis is the presence of mixed pixels, due to the low spatial resolution of such images, which means that several spectrally pure signatures (endmembers) are combined into the same mixed pixel. Linear spectral unmixing follows an unsupervised approach that aims at inferring pure spectral signatures and their material fractions at each pixel of the scene. The huge data volumes acquired by such sensors put stringent requirements on processing and unmixing methods. This paper proposes an efficient implementation of an unsupervised linear unmixing method, simplex identification via split augmented Lagrangian (SISAL), on GPUs using CUDA. The method finds the smallest simplex by solving a sequence of nonsmooth convex subproblems, using variable splitting to obtain a constrained formulation and then applying an augmented Lagrangian technique. The parallel implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses. The results presented here indicate that the GPU implementation can significantly accelerate the method's execution over big datasets while maintaining the method's accuracy.
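For context, linear unmixing assumes each observed pixel spectrum y is a convex combination of p endmember signatures (the columns of M) plus noise:

$$y = Ma + n, \qquad a_j \ge 0, \quad \sum_{j=1}^{p} a_j = 1.$$

SISAL searches for the minimum-volume simplex enclosing the data; writing Q = M^{-1} after projecting the data Y onto a p-dimensional subspace, its objective has, up to the constraint handling detailed in the original paper, the form

$$\min_{Q}\; -\log\lvert \det Q \rvert \;+\; \lambda\,\lVert QY \rVert_h, \qquad \lVert X \rVert_h = \sum_{i,j}\max(-x_{ij},\,0),$$

where the hinge term softly penalizes negative abundances and minimizing $-\log\lvert\det Q\rvert$ shrinks the simplex volume.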
Abstract:
One of the main problems of hyperspectral data analysis is the presence of mixed pixels due to the low spatial resolution of such images. Linear spectral unmixing aims at inferring pure spectral signatures and their fractions at each pixel of the scene. The huge data volumes acquired by hyperspectral sensors put stringent requirements on processing and unmixing methods. This letter proposes an efficient implementation of the method called simplex identification via split augmented Lagrangian (SISAL), which exploits the graphics processing unit (GPU) architecture at a low level using the Compute Unified Device Architecture (CUDA). SISAL aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The proposed implementation proceeds in a pixel-by-pixel fashion, using coalesced memory accesses and exploiting shared memory to store temporary data. Furthermore, the kernels have been optimized to minimize thread divergence, thereby achieving high GPU occupancy. The experimental results obtained for simulated and real hyperspectral data sets reveal speedups of up to 49 times, which demonstrates that the GPU implementation can significantly accelerate the method's execution over big data sets while maintaining the method's accuracy.
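The abstract does not include the kernels themselves; the following is a minimal sketch of the pixel-per-thread pattern it describes, with the image stored band-major so that consecutive threads touch consecutive addresses (coalesced) and the small projected matrix staged in shared memory. All names and layouts here are illustrative assumptions, not the paper's code.

#include <cuda_runtime.h>

// Apply the p x p matrix Q to every pixel spectrum: s(:,pix) = Q * y(:,pix).
// y and s are stored band-major (p x n), so y[b * n + pix] is coalesced
// across consecutive threads.
__global__ void apply_Q(const float* __restrict__ y,  // p x n input pixels
                        const float* __restrict__ q,  // p x p matrix
                        float* __restrict__ s,        // p x n output
                        int n, int p)
{
    extern __shared__ float qs[];            // shared-memory copy of Q
    for (int k = threadIdx.x; k < p * p; k += blockDim.x)
        qs[k] = q[k];
    __syncthreads();

    int pix = blockIdx.x * blockDim.x + threadIdx.x;
    if (pix >= n) return;                    // one thread per pixel

    for (int r = 0; r < p; ++r) {
        float acc = 0.0f;
        for (int b = 0; b < p; ++b)
            acc += qs[r * p + b] * y[b * n + pix];
        s[r * n + pix] = acc;
    }
}

A launch such as apply_Q<<<(n + 255) / 256, 256, p * p * sizeof(float)>>>(d_y, d_q, d_s, n, p) supplies the dynamic shared-memory staging area for Q.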
Abstract:
Dissertation to Obtain the Degree of Master in Biomedical Engineering
Abstract:
Dissertation to obtain the degree of Master in Geological Engineering (Georesources)
Abstract:
Dissertation to obtain the degree of Master in Biomedical Engineering
Abstract:
Breast cancer is the most common cancer among women and a major public health problem. Worldwide, X-ray mammography is the current gold standard for medical imaging of breast cancer. However, it has some well-known limitations: false-negative rates of up to 66% in symptomatic women and false-positive rates of up to 60% are a continued source of concern and debate. These drawbacks have prompted the development of other imaging techniques for breast cancer detection, among them Digital Breast Tomosynthesis (DBT). DBT is a 3D radiographic technique that reduces the obscuring effect of tissue overlap and appears to address both the false-negative and the false-positive rates. The 3D images in DBT are obtained only through image reconstruction methods, which play an important role in a clinical setting, since the reconstruction process must be both accurate and fast. This dissertation deals with the optimization of iterative algorithms through parallel computing on Graphics Processing Units (GPUs), using the Compute Unified Device Architecture (CUDA) to make the 3D reconstruction faster. Iterative algorithms have been shown to produce the highest-quality DBT images, but their computational cost currently prevents clinical use. These algorithms also have the potential to reduce patient dose in DBT scans. A method of integrating CUDA into Interactive Data Language (IDL) is proposed in order to accelerate the DBT image reconstructions; this method has never before been attempted for DBT. In this work, the system matrix calculation, the most computationally expensive part of iterative algorithms, is accelerated. A speedup of 1.6 is achieved, showing that GPUs can accelerate the IDL implementation.
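The dissertation's actual interface is not given in the abstract; one common way to bridge IDL and CUDA, and a plausible reading of the proposal, is a shared library whose C-linkage entry point IDL invokes via CALL_EXTERNAL (IDL's portable convention passes arguments as an argc/argv pair). The kernel and the library name 'libdbt.so' below are placeholders, not the system-matrix code:

#include <cuda_runtime.h>

// Placeholder kernel standing in for the system-matrix computation.
__global__ void scale(float* a, int n, float w)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) a[i] *= w;
}

// Entry point callable from IDL, e.g.:
//   r = CALL_EXTERNAL('libdbt.so', 'idl_scale', data, n, w)
extern "C" int idl_scale(int argc, void* argv[])
{
    // CALL_EXTERNAL delivers all arguments as an array of pointers.
    float* host = (float*)argv[0];
    int    n    = *(int*)argv[1];
    float  w    = *(float*)argv[2];

    float* dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
    scale<<<(n + 255) / 256, 256>>>(dev, n, w);
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    return 0;
}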
Abstract:
Doctoral thesis in Architecture / Architectural Culture.
Abstract:
Fluorescence in situ hybridization (FISH) is based on the use of fluorescent staining dyes; however, the signal intensity of the images obtained by microscopy is seldom quantified accurately by the researcher. The development of innovative digital image processing programs and tools has sought to overcome this problem, yet the determination of fluorescence intensity in microscopy images still suffers from imprecise results and the complexity of existing software. This work presents FISHji, a set of new ImageJ methods for automated quantification of fluorescence in images obtained by epifluorescence microscopy. To validate the methods, results obtained by FISHji were compared with results obtained by flow cytometry. The mean correlation between FISHji and flow cytometry was high and significant, showing that the imaging methods are able to accurately assess the signal intensity of fluorescence images. FISHji is available for non-commercial use at http://paginas.fe.up.pt/nazevedo/.
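FISHji itself is built on ImageJ, so the following is only an illustrative sketch of the underlying quantity such tools report: the mean fluorescence intensity over a segmented region, computed here as a masked reduction (names and image layout are assumptions).

#include <cuda_runtime.h>

// Accumulate intensity and pixel count over the segmented region; the mean
// fluorescence intensity is sum / cnt after copying both back to the host.
__global__ void masked_stats(const unsigned short* img,  // raw intensities
                             const unsigned char* mask,  // 1 inside cells
                             int n,
                             unsigned long long* sum,
                             unsigned long long* cnt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n && mask[i]) {
        atomicAdd(sum, (unsigned long long)img[i]);
        atomicAdd(cnt, 1ULL);
    }
}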
Abstract:
This research proposal is framed within the meetings and discussions taking place in the School of Arts of the Faculty of Philosophy and Humanities (CEPIA, CIFFyH and SeCyT, of the National University of Córdoba) concerning the problems of research in the Arts. The meetings held on various occasions throughout 2009 demonstrate the need to establish a particular field of research and its relation to artistic practice. This project is born from these concerns and establishes certain lines of work that make it possible to put imagined sketches into practice. In this sense, our proposal focuses on defining scenic construction itself as the object of study, delimiting within it the problem of the real in the work of fiction, and proposing a team able to investigate, through its own creative development, the various variables that come into play. The work will thus take the form of a scenic laboratory in which theoretical questions cross over into practice and, in turn, observation of that practice allows a reworking and deepening of contemporary thinking on the problem of execution/representation, from the various orders in which it intervenes. The idea of theatrical representation put forward by Aristotle holds that actions necessarily result in the definition of the characters' nature. This concept is key to the development of Western theatre and hence to the different conceptions of the actor. The definition of action given by Aristotle is problematic for part of contemporary theatre, since it assumes that every action is mimetic. It also takes for granted that characters are formed in the theatre and that narrative unity is given by a programme defined by the action. The present research sets out to investigate the relationship between the execution of an action and its representation in the development of a theatrical laboratory, which means it is necessarily applied to scenic work. Our starting hypothesis is that the conflictive relationship between action, its execution, and representation arises from material operation on the real. These intersections could be thought of as a theory of montage in which corporality is the necessary and irreducible principle of construction. The intersection of the real is a problem that allows us to examine the procedures by which the scene is constructed. The idea of a material theatre forces us to think about the procedures with which fiction is constructed. A theatre that seeks to anchor itself in the "real" as a way of pointing to the thing itself sets out, from the perspective of production, to investigate the mechanisms of its own construction (procedures), and thus supposes that theatrical activity can account for the processes through which it is realized. The performance of an action can refer to itself, generating a "strange" and ambiguous relation with the world of reference. The action in itself calls the idea of a model into question and reveals a crisis of representation. Theory has tried to divide and systematize theatrical manifestation in binary fashion, Representational Theatre versus Performance Theatre, distinguishing a theatre linked to the creation of characters from a theatre of execution.
Nevertheless, we think it is possible to find in scenic production intersections of the real that mediate between the world of representation and that of performance in the construction of fiction. Our basic hypothesis is that if we intervene in the plane of the actor's execution, representation substantially alters its mechanisms of meaning production. This first proposition is not conflictive until the real becomes manifest.
Abstract:
The primary purpose of this exploratory empirical study is to examine the structural stability of a limited number of alternative explanatory factors of strategic change. On the basis of theoretical arguments and prior empirical evidence from two traditional perspectives, we propose an original empirical framework for analysing whether these potential explanatory factors have remained stable over time in a highly turbulent environment. This question is explored in a particular setting: the population of Spanish private banks. Firms in this industry have experienced a high level of strategic mobility as a consequence of fundamental changes in their environmental conditions over the last two decades (mainly changes related to the new banking and financial regulation process). Our results consistently show that the effect of most of the explanatory factors of strategic mobility considered did not remain stable over the whole period of analysis. From this point of view, the study sheds new light on major debates and dilemmas in the field of strategy regarding why firms change their competitive patterns over time and, hence, the extent to which the "context-dependency" of alternative views of strategic change, and their relative validity, can vary over time for a given population. Methodologically, this research makes two major contributions to the study of potential determinants of strategic change. First, strategic change is defined and measured using a new grouping method, the model-based cluster method (MCLUST). Second, in order to assess the possible effect of determinants of strategic mobility, we control for non-observable heterogeneity using logistic regression models for panel data.
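The abstract does not spell out the specification; the standard form of a panel logit that controls for unobservable firm heterogeneity, which is presumably what is meant, models the probability that firm i changes strategic group in period t as

$$\Pr(y_{it}=1 \mid x_{it}, \alpha_i) \;=\; \frac{\exp(x_{it}'\beta + \alpha_i)}{1 + \exp(x_{it}'\beta + \alpha_i)},$$

where $x_{it}$ collects the candidate explanatory factors and the firm-specific effect $\alpha_i$ absorbs unobserved, time-invariant heterogeneity.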
Abstract:
Intraoperative cardiac imaging plays a key role during transcatheter aortic valve replacement. In recent years, new techniques and tools for improved image quality and virtual navigation have been proposed in order to simplify and standardize stent-valve positioning and implantation. But routine performance of the new techniques may require major economic investments or specific knowledge and skills, and for this reason they may not be accessible to the majority of cardiac centres involved in transcatheter valve replacement projects. Additionally, they still require injections of contrast medium to obtain computed images. We have therefore developed, and describe here, a very simple and intuitive method of positioning balloon-expandable stent valves, which represents the evolution of the 'dumbbell' technique for echocardiography-guided transcatheter valve replacement without angiography. This method, based on partial inflation of the balloon catheter during positioning, traps the crimped valve in the aortic valve orifice and, consequently, very near the ideal landing zone. It does not require specific echocardiographic knowledge; it does not require angiographies, which increase the risk of postoperative kidney failure in elderly patients; and it can also be performed in centres not equipped with a hybrid operating room.
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being used increasingly by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of the theory, one that is sensitive to, and self-reflective of, the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes that have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formulation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
The sensitivity of altitudinal and latitudinal tree-line ecotones to climate change, particularly that of temperature, has received much attention. To improve our understanding of the factors affecting tree-line position, we used the spatially explicit dynamic forest model TreeMig. Although well suited because of its landscape dynamics functions, TreeMig features a parabolic temperature growth response curve, which has recently been questioned, and its species parameters are not specifically calibrated for cold temperatures. Our main goals were to improve the theoretical basis of the temperature growth response curve in the model and to develop a method for deriving that curve's parameters from tree-ring data. We replaced the parabola with an asymptotic curve, calibrated for the main species at the subalpine (Swiss Alps: Pinus cembra, Larix decidua, Picea abies) and boreal (Fennoscandia: Pinus sylvestris, Betula pubescens, P. abies) tree-lines. After fitting new parameters, the growth curve matched observed tree-ring widths better. For the subalpine species, the minimum degree-day sum allowing growth (kDDMin) was lowered by around 100 degree-days; in the case of Larix, the maximum potential ring-width was increased to 5.19 mm. At the boreal tree-line, the kDDMin for P. sylvestris was lowered by 210 degree-days and its maximum ring-width increased to 2.943 mm; for Betula (new in the model), kDDMin was set to 325 degree-days and the maximum ring-width to 2.51 mm; the values from the only boreal sample site for Picea were similar to the subalpine ones, so the same parameters were used. However, adjusting the growth response alone did not improve the model's output concerning species' distributions and their relative importance at the tree-line. Minimum winter temperature (MinWiT, mean of the coldest winter month), which controls seedling establishment in TreeMig, proved more important for determining distribution. Picea, P. sylvestris and Betula did not previously have minimum winter temperature limits, so these values were set to the 95th percentile of each species' coldest MinWiT site (respectively -7, -11 and -13). In a case study for the Alps, the original and newly calibrated versions of TreeMig were compared with biomass data from the National Forest Inventory (NFI). Both models gave similar, reasonably realistic results. In conclusion, this method of deriving temperature responses from tree-rings works well. However, regeneration and its underlying factors seem more important for controlling species' distributions than previously thought. More research on regeneration ecology, especially at the upper limit of forests, is needed to further improve predictions of tree-line responses to climate change.
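The abstract names the calibrated quantities, a minimum degree-day sum kDDMin below which growth stops and a maximum potential ring-width; an asymptotic response consistent with that description (the exact equation used in TreeMig is not given here) would be

$$g(\mathrm{DD}) \;=\; g_{\max}\left(1 - e^{-k\,(\mathrm{DD}-\mathrm{kDDMin})}\right) \;\text{ for } \mathrm{DD} \ge \mathrm{kDDMin}, \qquad g(\mathrm{DD}) = 0 \text{ otherwise},$$

where DD is the annual degree-day sum, $g_{\max}$ the maximum potential ring-width, and $k$ a fitted rate controlling how quickly growth saturates.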
Abstract:
Suspected catheter-related bacteremia (CRB) requires removal of the catheter, yet the suspicion is confirmed a posteriori in only 15-25% of cases. The differential time to positivity of blood cultures (DTP) has proven to be a reliable method for diagnosing CRB while avoiding catheter removal. To test the clinical utility of DTP, we compared it with a standard diagnostic method. We included 133 patients admitted to an intensive care unit with central venous catheters, of whom 56 were randomized. We found no significant differences in morbidity or mortality between the two groups, while 70% of unnecessary catheter removals were avoided in the DTP group.
Abstract:
A recent trend in digital mammography is computer-aided diagnosis systems, computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity decreases significantly as the density of the breast increases, and this dependence is method specific. In this paper we propose a new approach to the classification of mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast-tissue texture. Classification performance was evaluated on a large set of digitised mammograms, using several different classifiers and a leave-one-out methodology. The results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.
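For reference, leave-one-out evaluation trains the classifier n times, each time holding out a single mammogram, and scores accuracy as

$$\widehat{\mathrm{Acc}} \;=\; \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\big[f_{-i}(x_i) = y_i\big],$$

where $f_{-i}$ denotes the classifier trained on all cases except the $i$-th, $x_i$ the features of that case, and $y_i$ its true density class.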