975 results for post-processing


Relevance: 70.00%

Abstract:

We present an algorithm for estimating dense image correspondences. Our versatile approach lends itself to various tasks typical of video post-processing, including image morphing, optical flow estimation, stereo rectification, disparity/depth reconstruction, and baseline adjustment. We incorporate recent advances in feature matching, energy minimization, stereo vision, and data clustering into our approach. At the core of our correspondence estimation we use Efficient Belief Propagation for energy minimization. While state-of-the-art algorithms only work on thumbnail-sized images, our novel feature downsampling scheme, in combination with a simple yet efficient data term compression, can cope with high-resolution data. The incorporation of SIFT (Scale-Invariant Feature Transform) features into the data term computation further resolves matching ambiguities, making long-range correspondence estimation possible. We detect occluded areas by evaluating the correspondence symmetry, and we further apply Geodesic matting to automatically determine plausible values in these regions.
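The occlusion detection step mentioned above, flagging pixels whose forward and backward correspondences disagree, can be illustrated with a forward-backward consistency check. This is a minimal numpy sketch, not the paper's implementation; the function name and the `tol` threshold are assumptions:

```python
import numpy as np

def occlusion_mask(flow_fwd, flow_bwd, tol=1.0):
    """Flag pixels whose forward/backward correspondences disagree.

    flow_fwd, flow_bwd: (h, w, 2) arrays of (dx, dy) displacements.
    Returns a boolean mask, True where the round trip fails (occlusion)."""
    h, w, _ = flow_fwd.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Follow the forward flow, then look up the backward flow there
    xt = np.clip(np.round(xs + flow_fwd[..., 0]).astype(int), 0, w - 1)
    yt = np.clip(np.round(ys + flow_fwd[..., 1]).astype(int), 0, h - 1)
    round_trip = flow_fwd + flow_bwd[yt, xt]
    # Consistent pixels map (approximately) back onto themselves
    return np.linalg.norm(round_trip, axis=-1) > tol
```

Pixels flagged by such a mask are then candidates for filling, e.g. by the Geodesic matting step the abstract describes.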

Relevance: 70.00%

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms, and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field-programmable gate arrays (FPGAs). However, graphics processing unit (GPU) based data processing methods have recently been developed to minimise this data processing and rendering time.
These processing techniques include standard processing methods, which comprise a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented into a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine tuning of the operating conditions of the OCT system. Currently, investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
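The standard processing chain mentioned above, from raw interference spectra to A-scans, can be sketched in a few lines. This is a minimal numpy illustration of the generic FD-OCT steps (DC removal, spectral windowing, inverse FFT, log compression), not the thesis's GPU implementation; the function name and the choice of a Hanning window are assumptions:

```python
import numpy as np

def spectra_to_ascans(spectra):
    """Sketch of the standard FD-OCT chain: DC removal, spectral
    windowing, inverse FFT to depth space, log-magnitude compression.
    `spectra` holds one raw interference spectrum per row."""
    dc = spectra.mean(axis=1, keepdims=True)         # per-spectrum DC offset
    windowed = (spectra - dc) * np.hanning(spectra.shape[1])
    depth = np.fft.ifft(windowed, axis=1)            # spectrum -> depth profile
    half = spectra.shape[1] // 2                     # keep the non-mirrored half
    return 20 * np.log10(np.abs(depth[:, :half]) + 1e-12)

# A spectral fringe of frequency k maps to a reflector at depth bin k
```

Each of these steps is embarrassingly parallel across A-scans, which is why GPU implementations such as the one described above reach real-time throughput.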

Relevance: 60.00%

Abstract:

Diabetic Retinopathy (DR) is a complication of diabetes that can lead to blindness if not readily discovered. Automated screening algorithms have the potential to improve identification of patients who need further medical attention. However, the identification of lesions must be accurate to be useful for clinical application. The bag-of-visual-words (BoVW) algorithm employs a maximum-margin classifier in a flexible framework that is able to detect the most common DR-related lesions, such as microaneurysms, cotton-wool spots and hard exudates. BoVW makes it possible to bypass the need for pre- and post-processing of the retinographic images, as well as the need for specific ad hoc techniques for the identification of each type of lesion. An extensive evaluation of the BoVW model was performed using three large retinal image datasets (DR1, DR2 and Messidor) with different resolutions, collected by different healthcare personnel. The results demonstrate that the BoVW classification approach can identify different lesions within an image without having to use a different algorithm for each lesion, reducing processing time and providing a more flexible diagnostic system. Our BoVW scheme is based on sparse low-level feature detection with a Speeded-Up Robust Features (SURF) local descriptor, and on mid-level features based on semi-soft coding with max pooling. The best BoVW representation for retinal image classification achieved an area under the receiver operating characteristic curve (AUC-ROC) of 97.8% (exudates) and 93.5% (red lesions) under a cross-dataset validation protocol. In assessing the accuracy of detecting cases that require referral within one year, the sparse extraction technique associated with semi-soft coding and max pooling obtained an AUC of 94.2 ± 2.0%, outperforming current methods.
These results indicate that, for retinal image classification tasks in clinical practice, BoVW equals and, in some instances, surpasses the results obtained using dense detection (widely believed to be the best choice in many vision problems) for the low-level descriptors.
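The mid-level step above, coding of local descriptors against a visual codebook followed by max pooling, can be sketched as follows. This is a minimal illustration that substitutes plain soft assignment for the paper's semi-soft coding; the function name and the `beta` kernel width are assumptions:

```python
import numpy as np

def bovw_encode(descriptors, codebook, beta=1.0):
    """Soft-assignment coding of local descriptors followed by max
    pooling: one activation per visual word for the whole image."""
    # Pairwise squared distances, shape (num_descriptors, num_words)
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    codes = np.exp(-beta * d2)                 # Gaussian-kernel assignment
    codes /= codes.sum(axis=1, keepdims=True)  # normalize per descriptor
    return codes.max(axis=0)                   # max pooling over descriptors
```

The pooled vector is what would be fed to the maximum-margin classifier; max pooling keeps the strongest response per word rather than summing occurrences.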

Relevance: 60.00%

Abstract:

The aim of this study was to evaluate the stress distribution in the cervical region of a sound upper central incisor in two clinical situations, standard and maximum masticatory forces, by means of a 3D model with the highest possible level of fidelity to the anatomic dimensions. Two models with 331,887 linear tetrahedral elements, representing a sound upper central incisor with periodontal ligament and cortical and trabecular bone, were loaded at 45° to the tooth's long axis. All structures were considered homogeneous and isotropic, with the exception of the enamel (anisotropic). A standard masticatory force (100 N) was simulated on one of the models, while a maximum masticatory force (235.9 N) was simulated on the other. The software used was PATRAN for pre- and post-processing and NASTRAN for processing. In the cementoenamel junction area, tensile stresses reached 14.7 MPa in the 100 N model and 40.2 MPa in the 235.9 N model, the latter exceeding the enamel's tensile strength (16.7 MPa). The fact that the stress concentration in the amelodentinal junction exceeded the enamel's tensile strength under simulated conditions of maximum masticatory force suggests the possibility of the occurrence of non-carious cervical lesions such as abfractions.

Relevance: 60.00%

Abstract:

Tailoring specified vibration modes is a requirement for designing piezoelectric devices aimed at dynamic-type applications. A technique for designing the shape of specified vibration modes is the topology optimization method (TOM), which finds an optimum material distribution inside a design domain to obtain a structure that vibrates according to specified eigenfrequencies and eigenmodes. Nevertheless, when the TOM is applied to dynamic problems, the well-known grayscale or intermediate-material problem arises, which can invalidate the post-processing of the optimal result. Thus, a more natural way of solving dynamic problems with the TOM is to allow intermediate material values. This idea leads to the functionally graded material (FGM) concept. In fact, FGMs are materials whose properties and microstructure change continuously along a specific direction. Therefore, in this paper, an approach is presented for tailoring user-defined vibration modes by applying the TOM and FGM concepts to design functionally graded piezoelectric transducers (FGPT) and non-piezoelectric structures (functionally graded structures, FGS), in order to achieve maximum and/or minimum vibration amplitudes at certain points of the structure by simultaneously finding the topology and the material gradation function. The optimization problem is solved using sequential linear programming. Two-dimensional results are presented to illustrate the method.

Relevance: 60.00%

Abstract:

In natural estuaries, contaminant transport is driven by turbulent momentum mixing, yet scalar dispersion can rarely be predicted accurately because of a lack of fundamental understanding of the turbulence structure in estuaries. Herein, detailed turbulence field measurements were conducted at high frequency, continuously for up to 50 hours per investigation, in a small subtropical estuary with semi-diurnal tides. Acoustic Doppler velocimetry was deemed the most appropriate measurement technique for such small estuarine systems with shallow water depths (less than 0.5 m at low tide), and a thorough post-processing technique was applied. The estuarine flow is always a fluctuating process. The bulk flow parameters fluctuated with periods comparable to the tidal cycles and other large-scale processes, but the turbulence properties depended upon the instantaneous local flow properties. They were little affected by the flow history, but their structure and temporal variability were influenced by a variety of mechanisms. This resulted in behaviour that deviated from that of an equilibrium turbulent boundary layer induced by velocity shear only. A striking feature of the data sets is the large fluctuation of all turbulence characteristics during the tidal cycle. This feature has rarely been documented, and an important difference between the data sets used in this study and earlier reported measurements is that the present data were collected continuously at high frequency over relatively long periods. The findings shed new light on the fluctuating nature of the momentum exchange coefficients and of the integral time and length scales; these turbulent properties should not be assumed constant.
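As an illustration of the kind of turbulence statistic discussed above, the integral time scale can be estimated from a velocity record by integrating the autocorrelation function of the velocity fluctuations up to its first zero crossing. A minimal numpy sketch, assuming a uniformly sampled record with time step `dt` (not the study's actual post-processing technique; the first-zero-crossing cutoff is one common convention):

```python
import numpy as np

def integral_time_scale(u, dt):
    """Integral time scale of a velocity record: trapezoidal
    integration of the autocorrelation up to its first zero crossing."""
    up = u - u.mean()                      # velocity fluctuation u'
    var = np.mean(up * up)
    n = len(up)
    rho = [np.mean(up[:-k] * up[k:]) / var for k in range(1, n // 2)]
    T = dt / 2                             # lag-0 contribution (rho = 1)
    prev = 1.0
    for r in rho:
        if r <= 0:                         # stop at first zero crossing
            break
        T += dt * (prev + r) / 2           # trapezoidal rule on rho(tau)
        prev = r
    return T
```

Applied over a sliding window of a long tidal record, such an estimate would itself fluctuate through the tidal cycle, which is the point the abstract makes about not assuming these properties constant.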

Relevance: 60.00%

Abstract:

In natural estuaries, scalar dispersion is rarely predicted accurately because of a lack of fundamental understanding of the turbulence structure in estuaries. Herein, detailed turbulence field measurements were conducted continuously at high frequency for 50 hours in the upper zone of a small subtropical estuary with semi-diurnal tides. Acoustic Doppler velocimetry was deemed the most appropriate measurement technique for such shallow water depths (less than 0.4 m at low tide), and a thorough post-processing technique was applied. In addition, some experiments were conducted in the laboratory under controlled conditions, using water and soil samples collected in the estuary, to test the relationship between acoustic backscatter strength and suspended sediment load. A striking feature of the field data set was the large fluctuation of all turbulence characteristics during the tidal cycle, including the suspended sediment flux. This feature has rarely been documented.

Relevance: 60.00%

Abstract:

Land-related information about the Earth's surface is commonly found in two forms: (1) map information and (2) satellite image data. Satellite imagery provides a good visual picture of what is on the ground, but complex image processing is required to interpret features in an image scene. Increasingly, methods are being sought to integrate the knowledge embodied in map information into the interpretation task or, alternatively, to bypass interpretation and perform biophysical modeling directly on derived data sources. A cartographic modeling language, as a generic map analysis package, is suggested as a means to integrate geographical knowledge and imagery in a process-oriented view of the Earth. Specialized cartographic models, which incorporate mapping information in performing land classification, may be developed by users. In addition, a cartographic modeling language may be enhanced with operators suited to processing remotely sensed imagery. We demonstrate the usefulness of a cartographic modeling language for pre-processing satellite imagery, and define two new cartographic operators that evaluate image neighborhoods as post-processing operations to interpret thematic map values. The language and operators are demonstrated with an example image classification task.
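Neighborhood-evaluating post-processing operators of the kind described above can be illustrated with a focal majority filter, a standard map-algebra operation for smoothing a classified image. This is a minimal sketch, not the paper's two operators; the function name and `radius` parameter are assumptions:

```python
import numpy as np

def focal_majority(classmap, radius=1):
    """Neighborhood (focal) majority filter: each cell takes the most
    frequent class value in its (2*radius+1)^2 window, removing
    isolated misclassified cells from a thematic map."""
    h, w = classmap.shape
    out = classmap.copy()
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            window = classmap[y0:y1, x0:x1].ravel()
            vals, counts = np.unique(window, return_counts=True)
            out[y, x] = vals[np.argmax(counts)]
    return out
```

In a cartographic modeling language this would be one statement applied to a classified layer, typically after per-pixel classification of the imagery.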

Relevance: 60.00%

Abstract:

Introduction – Mammography is the gold standard for screening and imaging diagnosis of breast disease, and is the imaging modality recommended by screening programmes in several European countries and in the United States. The implementation of digital technology changed mammography practice and triggered the need to adapt quality control programmes. Aims – Characterise the mammography technology installed in Portugal. Assess the practices in use in mammography, their harmonisation and their compliance with international guidelines. Identify optimisation opportunities to promote an effective and safe use of digital mammography to its full potential. Methodology – A literature review was performed. Data on the number and specifications of the mammography equipment installed in Portugal were collected from official sources (governmental bodies, mammography healthcare providers and the medical imaging industry). Three questionnaires, targeted at radiologists, breast radiographers and chief radiographers, were designed to collect data on technical and clinical practices in mammography. The questionnaires were delivered to a sample of 65 mammography providers selected according to geographical criteria, type of technology installed and institution profile. Results – 441 mammography systems were identified in Portugal. The most frequent technology type (62%) is computed radiography (CR), in which a photostimulable image plate housed in a support cassette is read by an optical processing system; most of these systems (78%) are installed in private providers. Approximately 12% of the installed equipment are direct digital radiography (DDR) systems. The criteria for selecting exposure parameters vary between institutions, with the majority (65%) following the equipment manufacturers' recommendations. The use of the available post-processing tools is limited; those most frequently reported by radiologists are contrast/brightness adjustment and full and/or localised image magnification. Fifteen participating institutions (out of 19) have implemented a quality control programme. Conclusions – The mammography equipment installed in Portugal is heterogeneous and includes both obsolete and state-of-the-art technology. International guidelines (European or American) are not formally adopted by most institutions as the basis for mammography practice, with the equipment manufacturers' recommendations being the dominant guidance. Radiographers and radiologists identified needs for specialised training, namely in breast intervention, dose optimisation and quality control. Most respondents agree with the need for certification of mammography practice in Portugal and would participate in a voluntary programme.

Relevance: 60.00%

Abstract:

Master's degree in Radiation Applied to Health Technologies. Specialization: Magnetic Resonance.

Relevance: 60.00%

Abstract:

Frame rate upconversion (FRUC) is an important post-processing technique for enhancing the visual quality of low frame rate video. A major recent advance in this area is FRUC based on trilateral filtering, whose novelty mainly derives from the combination of an edge-based motion estimation block matching criterion with the trilateral filter. However, there is still room for improvement, notably towards reducing the size of the uncovered regions in the initial estimated frame, that is, the estimated frame before trilateral filtering. In this context, an improved motion estimation block matching criterion is proposed in which a combined luminance and edge error metric is weighted according to the motion vector components, notably to regularise the motion field. Experimental results confirm that significant improvements are achieved in the final interpolated frames, reaching PSNR gains of up to 2.73 dB, on average, over recent alternative solutions, for video content with varied motion characteristics.
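The matching criterion described above, a luminance plus edge error weighted by the motion vector components, can be sketched as a per-candidate cost function. This is a minimal numpy illustration of the general idea, not the letter's exact metric; the weights `lam` and `alpha` and the gradient-magnitude edge map are assumptions:

```python
import numpy as np

def match_cost(block_a, block_b, mv, lam=0.1, alpha=0.5):
    """Block matching cost combining a luminance SAD and an edge SAD,
    plus a penalty on motion vector magnitude to regularise the field."""
    sad_lum = np.abs(block_a - block_b).mean()

    def edges(b):
        # Simple gradient magnitude as the edge map
        gy, gx = np.gradient(b.astype(float))
        return np.hypot(gx, gy)

    sad_edge = np.abs(edges(block_a) - edges(block_b)).mean()
    penalty = lam * np.hypot(*mv)          # larger vectors cost more
    return (1 - alpha) * sad_lum + alpha * sad_edge + penalty
```

A block matcher would evaluate this cost over all candidate vectors in a search window and keep the minimiser; the penalty term biases the search towards smaller, smoother motion.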

Relevance: 60.00%

Abstract:

Project work submitted for the Master's degree in Informatics and Computer Engineering

Relevance: 60.00%

Abstract:

Discrete data representations are necessary, or at least convenient, in many machine learning problems. While feature selection (FS) techniques aim at finding relevant subsets of features, the goal of feature discretization (FD) is to find concise (quantized) data representations adequate for the learning task at hand. In this paper, we propose two incremental methods for FD. The first method belongs to the filter family, in which the quality of the discretization is assessed by a (supervised or unsupervised) relevance criterion. The second method is a wrapper, in which discretized features are assessed using a classifier. Both methods can be coupled with any static (unsupervised or supervised) discretization procedure and can be used to perform FS as a pre-processing or post-processing stage. The proposed methods attain efficient representations suitable for binary and multi-class problems with different types of data, and are competitive with existing methods. Moreover, using well-known FS methods with the features discretized by our techniques leads to better accuracy than with the features discretized by other methods or with the original features.
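The filter-style incremental idea above, growing the discretization while a relevance criterion still improves, can be sketched as follows. This is a minimal illustration assuming uniform quantization and mutual information as the supervised relevance criterion; the function names, the bit-depth search and the `tol` stopping threshold are all assumptions, not the paper's method:

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information (nats) of two discrete sequences."""
    n, mi = len(x), 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            pxy = np.mean((x == a) & (y == b))
            if pxy > 0:
                mi += pxy * np.log(pxy / (np.mean(x == a) * np.mean(y == b)))
    return mi

def incremental_discretize(feature, labels, max_bits=6, tol=1e-3):
    """Filter-style incremental FD sketch: increase the number of
    uniform quantization levels until the relevance criterion
    (MI between the quantized feature and the labels) stops improving."""
    lo, hi = feature.min(), feature.max()
    best_bits, best_mi = 1, -np.inf
    for bits in range(1, max_bits + 1):
        q = np.floor((feature - lo) / (hi - lo + 1e-12) * 2 ** bits).astype(int)
        mi = mutual_information(q, labels)
        if mi <= best_mi + tol:
            break                          # no further relevance gain: stop
        best_bits, best_mi = bits, mi
    return best_bits
```

Swapping the MI criterion for a classifier's validation accuracy would turn this filter sketch into a wrapper, mirroring the paper's second method.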

Relevance: 60.00%

Abstract:

Master's degree in Electrical and Computer Engineering - Autonomous Systems branch

Relevance: 60.00%

Abstract:

The methods used by modern Medicine in the field of Molecular Imaging, with its ability to diagnose from "organ function" rather than "organ morphology", have given the fundamental component of this medical imaging modality, Nuclear Medicine, an increased importance, reflected in a significant growth in its use across its different clinical applications. Beyond the purely clinical aspects, which by themselves would be enough to fill several dissertations such as this one, the very nature of this imaging technique, with its inherently low resolution and long acquisition times, has raised additional concerns regarding productivity (number of studies performed per unit of time), quality (increase in the resolution of the acquired image) and the levels of radioactive activity injected into patients (effective radiation dose to the population). Given the known technological limitations associated with the design of the equipment used for data acquisition in Nuclear Medicine, which despite the advances introduced keeps the basic operating concepts of a Gamma Camera more or less unchanged, a significant change of the acquisition parameters (time, resolution, activity) was envisaged, acting not on the technical and mechanical conditions of that acquisition, but essentially on the post-processing of data acquired by the traditional methods that still constitute the state of the art of this modality. The aim of this work is therefore to explain, in some detail, the technological foundations that have always supported the systems used to perform Nuclear Medicine examinations, and above all to present the differences introduced by the innovative methods which, by applying essentially knowledge (software), make it possible to address the questions raised above.