33 results for Distributed parameters
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second mode corresponds to a motion compensated quality enhancement (MCQE) technique in which a low quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. The novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low quality Intra coding blocks, for blocks where the MCI produces SI with lower correlation. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
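As a hedged illustration of the block-level mode decision described above, the following Python sketch picks MCI or MCQE per block from an estimated correlation value; the threshold rule and all names are illustrative assumptions, since the abstract does not state the actual decision criterion.

def choose_si_mode(mci_correlation, threshold=0.5):
    """Pick the SI generation mode for one block.

    mci_correlation: estimated correlation between the MCI-interpolated
    block and the source; low values mean MCI would yield poor SI.
    """
    if mci_correlation >= threshold:
        # Interpolate from past/future reference frames only.
        return "MCI"
    # Otherwise, spend some rate on a low quality Intra block and refine
    # it at the decoder by motion estimation against the reference frames.
    return "MCQE"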
Abstract:
Motion compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion compensated errors for some frame regions and rather small errors for others, depending on the video content. In this paper, a low complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and help the decoder with some reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements of up to 1.2 dB over a WZ-only coding mode solution.
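A minimal sketch of this rate-based mode decision, assuming placeholder rate estimators (the abstract does not specify how the Intra and WZ rates are actually estimated):

def select_coding_mode(block, estimate_intra_rate, estimate_wz_rate):
    """Return 'INTRA' for 'critical' blocks, 'WZ' otherwise."""
    r_intra = estimate_intra_rate(block)  # estimated bits if coded Intra
    r_wz = estimate_wz_rate(block)        # estimated bits if Wyner-Ziv coded
    return "INTRA" if r_intra < r_wz else "WZ"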
Abstract:
This thesis aims to contribute to the study and analysis of the factors related to digital radiographic image acquisition techniques, diagnostic quality, and radiation dose management in digital radiology systems. The methodology is organized into two components. The observational component is based on a retrospective, cross-sectional study design. Data collected from CR and DR systems allowed the evaluation of the technical exposure parameters used in digital radiography, of the absorbed dose, and of the detector exposure index. Within this methodological classification (retrospective and cross-sectional), it was also possible to develop diagnostic quality studies on digital systems: observer studies based on images archived in the PACS system. The experimental component of the thesis was based on phantom experiments to evaluate the relationship between dose and image quality. These experiments allowed the physical properties of digital radiology systems to be characterized by manipulating the variables related to the exposure parameters and assessing their influence on dose and image quality. Using a contrast-detail phantom, anthropomorphic phantoms, and an animal bone phantom, it was possible to obtain objective measures of diagnostic quality and of object detectability. From the research carried out, several conclusions stand out. Quantitative measurements of detector performance are the basis of the optimization process, allowing the physical parameters of digital radiology systems to be measured and determined. The exposure parameters used in clinical practice show that practice does not comply with the European guidelines. There is a need to evaluate, improve, and implement a reference standard for the optimization process, through new good-practice references adjusted to digital systems. Exposure parameters influence patient dose, but the perceived quality of digital images does not appear to be affected by variations in exposure. The studies carried out, involving both phantom and patient images, show that overexposure is a potential risk in digital radiography. The assessment of diagnostic image quality showed no substantial degradation of image quality when dose reduction was performed. The study and implementation of new diagnostic reference levels adjusted to digital radiology systems are proposed. As a contribution of this thesis, a model (STDI) for the optimization of digital radiology systems is proposed.
Evaluation of exposure parameters in plain radiography: a comparative study with European guidelines
Abstract:
The typical distribution of exposure parameters in plain radiography is unknown in Portugal. This study aims to identify the exposure parameters being used in plain radiography in the Lisbon area and to compare the collected data with European references [Commission of European Communities (CEC) guidelines]. The results show that in four examinations (skull, chest, lumbar spine and pelvis) there is a strong tendency to use exposure times above the European recommendation. The X-ray tube potential values (in kV) are below the values recommended by the CEC guidelines. This study shows that, at a local level (Lisbon region), radiographic practice does not comply with CEC guidelines concerning exposure techniques. Further national/local studies are recommended with the objective of improving exposure optimisation and technical procedures in plain radiography. This study also suggests the need to establish national/local diagnostic reference levels and to carry out effective measurements for exposure optimisation.
Abstract:
This work reports on the synthesis of CrO2 thin films by atmospheric pressure CVD using chromium trioxide (CrO3) and oxygen. Highly oriented (100) CrO2 films containing highly oriented (0001) Cr2O3 were grown onto Al2O3(0001) substrates. The films display a sharp magnetic transition at 375 K and a saturation magnetization of 1.92 μB/f.u., close to the bulk value of 2 μB/f.u. for CrO2.
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. Thus, it is proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion compensated frame interpolation and fused together in a novel iterative LDPC based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
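A hedged sketch of one way to fuse per-bit soft information from several SI hypotheses before LDPC syndrome decoding; the weighted LLR sum below assumes conditionally independent hypotheses and is only a simplification of the iterative exchange the paper describes:

import numpy as np

def fuse_llrs(llr_per_hypothesis, weights=None):
    """llr_per_hypothesis: K x N array of per-bit log-likelihood ratios,
    one row per SI hypothesis, N source bits per row."""
    llr = np.asarray(llr_per_hypothesis, dtype=float)
    if weights is None:
        weights = np.ones(llr.shape[0])
    # Weighted LLR sum ~ product of likelihoods under independence.
    return np.average(llr, axis=0, weights=weights)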
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is, consequently, imperative to estimate the value of LVEF precisely, which can be done with myocardial perfusion scintigraphy. Therefore, the present study aimed to establish and compare the estimation performance of the quantitative parameters of the reconstruction methods filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). Methods: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. Results: With FBP, the values of the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is verified with the OSEM reconstruction. However, with OSEM there is a more precise estimation of the quantitative parameters, especially with the combinations 2 iterations × 10 subsets and 2 iterations × 12 subsets. Conclusion: The OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM and a cutoff frequency of 0.5 cycles per pixel with orders 5, 10, or 15 for FBP as the best estimations for the left ventricular volumes and ejection fraction quantification in myocardial perfusion scintigraphy.
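For reference, the Butterworth filter used in these FBP reconstructions attenuates each spatial frequency f according to its cutoff frequency f_c and order n (standard form; the abstract itself does not write the filter out):

H(f) = \frac{1}{\sqrt{1 + (f/f_c)^{2n}}}

Raising f_c lets more high-frequency content through (hence the rising volume estimates), while a larger n sharpens the roll-off around the cutoff.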
Abstract:
Myocardial Perfusion Gated Single Photon Emission Tomography (Gated-SPET) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. However, standard protocols for Gated-SPECT studies require long acquisition times for each study. It is therefore important to reduce the total duration of image acquisition as much as possible. However, it is known that this reduction decreases the count statistics per projection and raises doubts about the validity of the functional parameters determined by Gated-SPECT. This analysis is difficult to carry out in real patients; for ethical, logistical, and economic reasons, simulated studies may be required. Objective: Evaluate the influence of the total number of counts acquired from the myocardium on the calculation of myocardial functional parameters (LVEF - left ventricular ejection fraction, EDV - end-diastolic volume, ESV - end-systolic volume) using routine software procedures.
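The ejection fraction follows directly from the two volumes the software reports (standard definition, not specific to this paper):

\mathrm{LVEF} = \frac{\mathrm{EDV} - \mathrm{ESV}}{\mathrm{EDV}} \times 100\%

For example, EDV = 120 mL and ESV = 48 mL give LVEF = 60%.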
Abstract:
This paper presents a distributed model predictive control (DMPC) scheme for indoor thermal comfort that simultaneously optimizes the consumption of a limited shared energy resource. The control objective of each subsystem is to minimize the heating/cooling energy cost while keeping the indoor temperature and the power used within bounds. In a distributed coordinated environment, the control uses multiple dynamically decoupled agents (one for each subsystem/house) aiming to satisfy the coupling constraints. According to its hourly power demand profile, each house assigns a priority level that indicates how much it is willing to bid in the auction to consume the limited clean resource. This procedure allows the bid value to vary hourly; consequently, the order in which the agents access the clean energy also varies. In addition to the power constraints, all houses also have thermal comfort constraints that must be fulfilled. The system is simulated with several houses in a distributed environment.
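A minimal sketch of the hourly auction step described above; the proportional bid rule and all names are illustrative assumptions, not the paper's exact mechanism:

def allocate_clean_energy(demands, capacity):
    """demands: {house: requested_power_kW}; capacity: shared clean-energy limit."""
    total = sum(demands.values())
    # Assumed rule: each house bids proportionally to its demand this hour.
    bids = {h: d / total for h, d in demands.items()}
    allocation, remaining = {}, capacity
    # Higher bidders access the limited clean resource first.
    for house in sorted(bids, key=bids.get, reverse=True):
        allocation[house] = min(demands[house], remaining)
        remaining -= allocation[house]
    return allocation

# Example: 10 kW of shared clean power split among three houses in one hour.
print(allocate_clean_energy({"A": 6.0, "B": 4.0, "C": 3.0}, capacity=10.0))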
Abstract:
In distributed video coding, motion estimation is typically performed at the decoder to generate the side information, increasing the decoder complexity while providing low complexity encoding in comparison with predictive video coding. Motion estimation can be performed once to create the side information or several times to refine the side information quality along the decoding process. In this paper, motion estimation is performed at the decoder side to generate multiple side information hypotheses which are adaptively and dynamically combined whenever additional decoded information is available. The proposed iterative side information creation algorithm is inspired by video denoising filters and requires some statistics of the virtual channel between each side information hypothesis and the original data. With the proposed denoising algorithm for side information creation, an RD performance gain of up to 1.2 dB is obtained for the same bitrate.
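A hedged sketch of the denoising-style combination: fuse the hypotheses pixel-wise, weighting each by the inverse of its estimated virtual-channel noise variance. This inverse-variance rule is a common denoising heuristic assumed here for illustration; the paper's adaptive, dynamic combination is not spelled out in the abstract:

import numpy as np

def fuse_side_information(hypotheses, noise_vars):
    """hypotheses: list of HxW arrays; noise_vars: per-hypothesis
    virtual-channel noise variance estimates."""
    weights = np.array([1.0 / v for v in noise_vars])
    weights /= weights.sum()  # normalize the weights to sum to one
    return sum(w * h for w, h in zip(weights, hypotheses))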
Abstract:
Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit error rate performance and low complexity. In order to exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to change the compression ratio of the source (DCT coefficient bitplane) gracefully, according to the correlation between the original and the side information. The proposed LDPC codes reach a good performance for a wide range of source correlations and achieve a better RD performance when compared to the popular turbo codes.
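To make the row-reduction idea concrete: merging two check nodes replaces the corresponding rows of the binary parity-check matrix H with their modulo-2 sum, so one fewer syndrome bit is transmitted and the compression ratio increases. A toy sketch (the matrix and the choice of rows to merge are illustrative, not from the paper):

import numpy as np

def merge_checks(H, i, j):
    """Merge rows i and j of a binary parity-check matrix H (GF(2))."""
    merged = (H[i] + H[j]) % 2  # modulo-2 row sum = merged check node
    keep = [r for r in range(H.shape[0]) if r not in (i, j)]
    return np.vstack([H[keep], merged])

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1]])
print(merge_checks(H, 0, 1))  # 2 syndrome bits instead of 3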
Abstract:
Processes are a central entity in enterprise collaboration. Collaborative processes need to be executed and coordinated on a distributed computational platform where computers are connected through heterogeneous networks and systems. Life cycle management of such collaborative processes requires a framework able to handle their diversity, based on different computational and communication requirements. This paper proposes a rationale for such a framework, points out key requirements, and proposes a strategy for a supporting technological infrastructure. Beyond the portability of collaborative process definitions among different technological bindings, a framework to handle the different life cycle phases of those definitions is presented and discussed.
Abstract:
Nickel-copper metallic foams were electrodeposited from an acidic electrolyte, using hydrogen bubble evolution as a dynamic template. Their morphology and chemical composition were studied by scanning electron microscopy and related to the deposition parameters (applied current density and deposition time). For high current densities (above 1 A cm(-2)), the nickel-copper deposits have a three-dimensional foam-like morphology with randomly distributed, nearly circular pores whose walls present an open dendritic structure. The nickel-copper foams are crystalline and composed of pure nickel and a copper-rich phase containing nickel in solid solution. The electrochemical behaviour of the material was studied by cyclic voltammetry and chronopotentiometry (charge-discharge curves), aiming at its application as a positive electrode for supercapacitors. Cyclic voltammograms showed that the Ni-Cu foams have a pseudocapacitive behaviour. The specific capacitance was calculated from charge-discharge data, and the best value (105 F g(-1) at 1 mA cm(-2)) was obtained for nickel-copper foams deposited at 1.8 A cm(-2) for 180 s. The cycling stability of these foams was also assessed, and they present 90% capacitance retention after 10,000 cycles at 10 mA cm(-2).
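For context, the specific capacitance from galvanostatic charge-discharge data is conventionally obtained as (standard relation; the abstract does not state the exact evaluation procedure):

C_{sp} = \frac{I \, \Delta t}{m \, \Delta V}

where I is the discharge current, \Delta t the discharge time, m the mass of active material, and \Delta V the potential window of the discharge.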
Abstract:
Master's degree in Cardiovascular Diagnostic and Intervention Technology - Specialization branch: Cardiovascular Intervention
Abstract:
Myocardial Perfusion Gated Single Photon Emission Tomography (Gated-SPET) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. The purpose of this study is to evaluate the influence of the total number of counts acquired from the myocardium on the calculation of myocardial functional parameters using routine software procedures. Methods: Gated-SPET studies were simulated using the Monte Carlo GATE package and the NURBS phantom. Simulated data were reconstructed and processed using the commercial software package Quantitative Gated-SPECT. The Bland-Altman and Mann-Whitney-Wilcoxon tests were used to analyze the influence of the total number of counts on the calculation of LV myocardium functional parameters. Results: In studies simulated with 3 MBq in the myocardium, there were significant differences in the functional parameters left ventricular ejection fraction (LVEF), end-systolic volume (ESV), motility, and thickness between studies acquired with 15 s/projection and 30 s/projection. Simulations with 4.2 MBq show significant differences in LVEF, end-diastolic volume (EDV), and thickness, while in the simulations with 5.4 MBq and 8.4 MBq the differences were statistically significant for motility and thickness. Conclusion: The total number of counts per simulation does not significantly interfere with the determination of Gated-SPET functional parameters for an administered average activity of 450 MBq, corresponding to 5.4 MBq in the myocardium.