11 results for Error correction methods

in Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

100.00%

Abstract:

Tomographic images can be degraded, in part by patient-based attenuation. The aim of this paper is to quantitatively assess the effects of the Chang and CT-based attenuation correction methods in 111In studies through the analysis of profiles from abdominal SPECT corresponding to an organ with uniform radionuclide uptake, the left kidney.
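
For reference, a minimal first-order Chang correction sketch in Python follows; the uniform attenuation coefficient `mu` (expressed per pixel, so it must be scaled by the pixel size) and the body-contour mask are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def chang_first_order(image, body_mask, mu=0.12, n_angles=32):
    """First-order Chang attenuation correction (illustrative sketch).

    Each pixel inside the body contour is divided by the average of the
    attenuation factors exp(-mu * d) over the projection angles, where d
    is the path length (here in pixels) from the pixel to the contour.
    """
    ny, nx = image.shape
    corrected = image.astype(float)
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    for iy in range(ny):
        for ix in range(nx):
            if not body_mask[iy, ix]:
                continue
            factors = []
            for a in angles:
                d, x, y = 0.0, float(ix), float(iy)
                # step outward along the ray until leaving the body contour
                while (0 <= int(round(y)) < ny and 0 <= int(round(x)) < nx
                       and body_mask[int(round(y)), int(round(x))]):
                    x += np.cos(a)
                    y += np.sin(a)
                    d += 1.0
                factors.append(np.exp(-mu * d))
            corrected[iy, ix] = image[iy, ix] / np.mean(factors)
    return corrected
```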

Relevance:

90.00%

Abstract:

In video communication systems, video signals are typically compressed and sent to the decoder through an error-prone transmission channel that may corrupt the compressed signal, degrading the final decoded video quality. In this context, it is possible to enhance the error resilience of typical predictive video coding schemes by drawing on principles and tools from an alternative video coding approach, so-called Distributed Video Coding (DVC), which is based on Distributed Source Coding (DSC) theory. Further improvements in the decoded video quality after error-prone transmission may also be obtained by considering the perceptual relevance of the video content, as distortions occurring in different regions of a picture have a different impact on the user's final experience. In this context, this paper proposes a Perceptually Driven Error Protection (PDEP) video coding solution that enhances the error resilience of a state-of-the-art H.264/AVC predictive video codec using DSC principles and perceptual considerations. To increase the H.264/AVC error resilience performance, the main technical novelties brought by the proposed video coding solution are: (i) an improved compressed-domain perceptual classification mechanism; (ii) an improved transcoding tool for the DSC-based protection mechanism; and (iii) the integration of a perceptual classification mechanism into an H.264/AVC-compliant codec with a DSC-based error protection mechanism. The performance results show that the proposed PDEP video codec provides a better-performing alternative to traditional error protection video coding schemes, notably Forward Error Correction (FEC)-based schemes. (C) 2013 Elsevier B.V. All rights reserved.
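
To make the FEC baseline that the paper compares against concrete, here is a toy sketch of unequal error protection built from simple XOR parity packets; the `protect`/`xor_parity` helpers and the grouping sizes are hypothetical stand-ins for real FEC codes such as Reed-Solomon, not the paper's DSC-based mechanism.

```python
import secrets

def xor_parity(packets):
    """Single XOR parity packet over a group of equal-length packets,
    as in simple media FEC; it can recover one lost packet per group."""
    parity = bytearray(len(packets[0]))
    for p in packets:
        for i, b in enumerate(p):
            parity[i] ^= b
    return bytes(parity)

def protect(slices, perceptual_class):
    """Toy unequal error protection: perceptually important slices go
    into smaller FEC groups, i.e. more parity bytes per payload byte."""
    group_size = {"high": 2, "medium": 4, "low": 8}[perceptual_class]
    protected = []
    for i in range(0, len(slices), group_size):
        group = slices[i:i + group_size]
        protected.append((group, xor_parity(group)))
    return protected

# usage: four equal-length dummy slices classified as perceptually important
slices = [secrets.token_bytes(8) for _ in range(4)]
blocks = protect(slices, "high")
```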

Relevance:

80.00%

Abstract:

Nowadays there is ever more audiovisual information, and multimedia streams or files can be shared easily and efficiently. However, the tampering of video content, such as financial information, news, or videoconference sessions used in court, can have serious consequences given the importance of this kind of information. Hence the need to ensure the authenticity and integrity of audiovisual information. This dissertation proposes an H.264/Advanced Video Coding (AVC) video authentication system, called Autenticação de Fluxos utilizando Projecções Aleatórias (AFPA, stream authentication using random projections), whose authentication procedures are performed at the level of each video frame. This scheme allows a more flexible kind of authentication, since it makes it possible to define a maximum limit on the modifications between two frames. Authentication relies on a new image authentication technique that combines random projections with an error correction mechanism applied to the data. It is thus possible to authenticate each video frame with a reduced set of parity bits of the corresponding random projection. Since video information is typically transported over unreliable protocols, it may suffer packet losses. To reduce the effect of packet losses on video quality and on the authentication rate, Unequal Error Protection (UEP) is used. For validation and comparison of results, a classical system was implemented that authenticates video streams in the typical way, i.e., using digital signatures and hash codes. Both schemes were evaluated with respect to the overhead introduced and the authentication rate. The results show that the AFPA system, using a high-quality video, reduces the authentication overhead by a factor of four compared with the scheme based on digital signatures and hash codes.
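
A much-simplified sketch of the frame authentication idea, assuming a shared key that seeds the random projections; the dissertation transmits parity bits of an error-correcting code over the projection, which is approximated here by tolerating a bounded Hamming distance (`max_flips` is an illustrative parameter, not one from the dissertation).

```python
import numpy as np

def projection_bits(frame, key, n_bits=256):
    """Sign bits of keyed random projections of a frame (illustrative)."""
    rng = np.random.default_rng(key)  # shared secret seed
    proj = rng.standard_normal((n_bits, frame.size))
    return (proj @ frame.ravel().astype(float)) >= 0.0

def authenticate(received_frame, reference_bits, key, max_flips=8):
    """Accept the frame if its projection bits differ from the sender's
    reference bits in at most max_flips positions; the tolerance plays
    the role of the error-correction capability in the AFPA scheme."""
    bits = projection_bits(received_frame, key, len(reference_bits))
    return int(np.sum(bits != reference_bits)) <= max_flips

# usage with a dummy 64x64 frame
frame = np.random.default_rng(1).integers(0, 256, (64, 64))
ref = projection_bits(frame, key=42)
assert authenticate(frame, ref, key=42)
```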

Relevance:

80.00%

Abstract:

Introduction – The estimation of relative renal function (RRF) by renal scintigraphy (RS) with technetium-99m-labelled dimercaptosuccinic acid (99mTc-DMSA) can be influenced by renal depth (RD), owing to the attenuation effect of the soft tissues surrounding the kidneys. Since the RD is rarely known, different attenuation correction (AC) methods have been developed, namely those using empirical formulas, such as the Raynaud, Taylor, and Tonnesen methods, or the direct application of the geometric mean (GM). Objectives – To identify the influence of the different AC methods on the quantification of relative renal function by RS with 99mTc-DMSA and to evaluate the corresponding variability of the RD results. Methodology – Thirty-one patients referred for RS with 99mTc-DMSA underwent the same acquisition protocol. Processing was performed by two independent operators, three times per examination, varying, for the same processing, the method used to determine the RRF: Raynaud, Taylor, Tonnesen, GM, or no attenuation correction (NAC). The Friedman test was applied to study the influence of the different AC methods, and Pearson correlation was used to assess the association and significance of the RD values with the variables age, weight, and height. Results – The Friedman test revealed statistically significant differences between the various methods (p < 0.001), except for the comparisons NAC/Raynaud, Tonnesen/GM, and Taylor/GM (p = 1.000) for both kidneys. Pearson correlation shows that weight has a strong positive correlation with all RD calculation methods. Conclusions – Of the three RD calculation methods, the Taylor method yields RRF values closest to the GM. The choice of AC method significantly influences the quantitative RRF parameters.
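
For context, the geometric-mean route to relative renal function can be sketched in a few lines; the counts below are assumed to be background-corrected, and the empirical depth formulas (Raynaud, Taylor, Tonnesen) are omitted because their coefficients are not given in the abstract.

```python
import numpy as np

def geometric_mean_counts(anterior, posterior):
    """Geometric mean of anterior and posterior counts, the standard way
    to reduce the depth dependence of planar kidney counts."""
    return np.sqrt(anterior * posterior)

def relative_renal_function(left_ant, left_post, right_ant, right_post):
    """Relative renal function (%) from background-corrected kidney counts."""
    left = geometric_mean_counts(left_ant, left_post)
    right = geometric_mean_counts(right_ant, right_post)
    total = left + right
    return 100.0 * left / total, 100.0 * right / total

# usage with illustrative background-corrected counts
print(relative_renal_function(48000, 52000, 44000, 46000))
```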

Relevance:

80.00%

Abstract:

Master's degree in Nuclear Medicine - Specialization area: Positron Emission Tomography

Relevance:

30.00%

Abstract:

A previously developed model is used to numerically simulate real clinical cases of the surgical correction of scoliosis. This model consists of one-dimensional finite elements with spatial deformation in which (i) the column is represented by its axis; (ii) the vertebrae are assumed to be rigid; and (iii) the deformability of the column is concentrated in springs that connect the successive rigid elements. The metallic rods used for the surgical correction are modeled by beam elements with linear elastic behavior. To obtain the forces at the connections between the metallic rods and the vertebrae, geometrically non-linear finite element analyses are performed. The tightening sequence determines the magnitude of the forces applied to the patient's column, and it is desirable to keep those forces as small as possible. In this study, a Genetic Algorithm optimization is applied to this model in order to determine the sequence that minimizes the corrective forces applied during surgery. This amounts to finding the optimal permutation of the integers 1, ..., n, n being the number of vertebrae involved. As such, we are faced with a combinatorial optimization problem isomorphic to the Traveling Salesman Problem. The fitness evaluation requires one computationally intensive Finite Element Analysis per candidate solution; thus, a parallel implementation of the Genetic Algorithm is developed.
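
A minimal sketch of the permutation-encoded Genetic Algorithm described above, using order crossover and swap mutation; the fitness function here is a cheap hypothetical stand-in for the geometrically non-linear finite element analysis that the paper evaluates (and parallelizes) for each candidate sequence.

```python
import random

def order_crossover(p1, p2):
    """OX crossover: copy a slice from p1, fill the rest in p2's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child[a:b]]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga_tightening_sequence(n_vertebrae, fitness, pop=30, gens=50):
    """Minimize fitness over permutations of 1..n (tightening order)."""
    popn = [random.sample(range(1, n_vertebrae + 1), n_vertebrae)
            for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness)
        elite = popn[: pop // 2]
        children = []
        while len(elite) + len(children) < pop:
            c = order_crossover(*random.sample(elite, 2))
            i, j = random.sample(range(n_vertebrae), 2)
            c[i], c[j] = c[j], c[i]  # swap mutation
            children.append(c)
        popn = elite + children
    return min(popn, key=fitness)

# usage with a toy objective standing in for the FEA-based corrective forces
best = ga_tightening_sequence(
    8, fitness=lambda s: sum(abs(a - b) for a, b in zip(s, s[1:])))
```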

Relevance:

30.00%

Abstract:

Microarrays allow thousands of genes to be monitored simultaneously, quantifying the abundance of transcripts under the same experimental condition at the same time. Among the various available array technologies, two-channel cDNA microarray experiments have arisen in numerous technical protocols associated with genomic studies, and they are the focus of this work. Microarray experiments involve many steps, and each one can affect the quality of the raw data. Background correction and normalization are preprocessing techniques for cleaning and correcting the raw data when undesirable fluctuations arise from technical factors. Several recent studies have shown that no preprocessing strategy outperforms the others in all circumstances, so it seems difficult to provide general recommendations. In this work, we propose using exploratory techniques to visualize the effects of preprocessing methods on the statistical analysis of two-channel cancer microarray data sets in which the cancer types (classes) are known. The arrow plot was used for selecting differentially expressed genes, and the graph of profiles resulting from correspondence analysis was used for visualizing the results. Six background correction methods and six normalization methods were used, yielding 36 preprocessing combinations, which were analyzed on a published cDNA microarray database (Liver), available at http://genome-www5.stanford.edu/, whose microarrays were already classified by cancer type. All statistical analyses were performed using the R statistical software.
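
The paper's analyses were done in R; purely to illustrate the kind of combination being compared, here is one background-correction-plus-normalization pipeline (simple subtraction followed by global median centring of log-ratios) sketched in Python with simulated spot intensities.

```python
import numpy as np

def preprocess_two_channel(red_fg, red_bg, green_fg, green_bg):
    """One of many possible preprocessing pipelines for two-channel data:
    background subtraction, then global median normalization of the
    log-ratios (the paper compares 36 such combinations in R)."""
    r = np.maximum(red_fg - red_bg, 1.0)  # avoid log of non-positive values
    g = np.maximum(green_fg - green_bg, 1.0)
    m = np.log2(r / g)                    # log-ratio per spot
    a = 0.5 * np.log2(r * g)              # mean log-intensity per spot
    m_normalized = m - np.median(m)       # center log-ratios at zero
    return m_normalized, a

# usage with simulated spot intensities
rng = np.random.default_rng(0)
fg = rng.uniform(200, 5000, size=(2, 1000))
bg = rng.uniform(50, 150, size=(2, 1000))
m, a = preprocess_two_channel(fg[0], bg[0], fg[1], bg[1])
```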

Relevance:

30.00%

Abstract:

Introduction: Visual anomalies that affect school-age children represent an important public health problem. Prevalence data are lacking in Portugal but are needed for planning vision services. This study was conducted to determine the prevalence of strabismus, decreased visual acuity, and uncorrected refractive error in Portuguese children aged 6 to 11 years. Methods and materials: A cross-sectional study was carried out on a sample of 672 school-age children (7.69 ± 1.19 years). Children received an orthoptic assessment (visual acuity, ocular alignment, and ocular movements) and non-cycloplegic autorefraction. Results: After the orthoptic assessment, 13.8% of the children were considered abnormal (n = 93). Manifest strabismus was found in 4% of the children. The rate of esotropia (2.1%) was slightly higher than that of exotropia (1.8%). Strabismus rates did not differ significantly by sex (p = 0.681) or grade (p = 0.228). Decreased distance visual acuity was present in 11.3% of the children. Visual acuity ≤ 20/66 (0.5 logMAR) was found in 1.3% of the children. We also found that 10.3% of the children had an uncorrected refractive error. Conclusions: Strabismus affects a small proportion of Portuguese school-age children. Decreased visual acuity and uncorrected refractive error affected a significant proportion of school-age children. New policies need to be developed to address this public health problem.

Relevance:

30.00%

Abstract:

Introduction: The Standardized Uptake Value (SUV) is a measurement of the uptake in a tumour normalized on the basis of a distribution volume; it is used to quantify 18F-fluorodeoxyglucose (FDG) uptake in tumours, such as primary lung tumours. Several sources of error can affect its accuracy. Normalization can be based on body weight, body surface area (BSA), or lean body mass (LBM). The aim of this study is to compare the influence of three normalization volumes in the calculation of SUV: body weight (SUVW), BSA (SUVBSA), and LBM (SUVLBM), with and without glucose correction, in patients with a known primary lung tumour. The correlation between SUV and weight, height, blood glucose level, injected activity, and the time between injection and image acquisition is evaluated. Methods: The sample included 30 subjects (8 female and 22 male) with primary lung tumours and a clinical indication for 18F-FDG Positron Emission Tomography (PET). Images were acquired on a Siemens Biograph according to the department's protocol. The maximum-pixel SUVW was obtained for each abnormal uptake focus through a semiautomatic VOI with a quantification 3D isocontour (threshold 2.5). From the concentration of radioactivity (kBq/ml) and the SUVW, the SUVBSA, SUVLBM, and glucose-corrected SUVs were obtained mathematically. Results: Statistically significant differences between SUVW, SUVBSA, and SUVLBM, and between SUVWgluc, SUVBSAgluc, and SUVLBMgluc, were observed (p < 0.001). The blood glucose level showed significant positive correlations with SUVW (r = 0.371; p = 0.043) and SUVLBM (r = 0.389; p = 0.034). SUVBSA was independent of variations in the blood glucose level. Conclusion: The measurement of radiopharmaceutical tumour uptake normalized on the basis of different distribution volumes is still variable. Further investigation on this subject is recommended.
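
A sketch of the three normalizations and the glucose correction, using the commonly cited DuBois BSA and James LBM formulas; the abstract does not state which exact formulas the study adopted, so these are assumptions.

```python
def suv_variants(c_tumour_kbq_ml, injected_mbq, weight_kg, height_cm,
                 sex, glucose_mg_dl):
    """SUV normalized by body weight, BSA (DuBois) and LBM (James),
    with the usual glucose correction to a 100 mg/dL reference.
    Formulas are the commonly used ones, not necessarily the exact
    ones adopted in the study."""
    injected_kbq = injected_mbq * 1000.0
    suv_w = c_tumour_kbq_ml * (weight_kg * 1000.0) / injected_kbq  # g/ml
    bsa_m2 = 0.007184 * height_cm**0.725 * weight_kg**0.425        # DuBois
    suv_bsa = c_tumour_kbq_ml * (bsa_m2 * 10000.0) / injected_kbq  # cm^2/ml
    if sex == "male":                                              # James LBM
        lbm_kg = 1.10 * weight_kg - 128.0 * (weight_kg / height_cm) ** 2
    else:
        lbm_kg = 1.07 * weight_kg - 148.0 * (weight_kg / height_cm) ** 2
    suv_lbm = c_tumour_kbq_ml * (lbm_kg * 1000.0) / injected_kbq
    gluc = glucose_mg_dl / 100.0  # scale by glucose relative to 100 mg/dL
    return {"SUVW": suv_w, "SUVBSA": suv_bsa, "SUVLBM": suv_lbm,
            "SUVWgluc": suv_w * gluc, "SUVBSAgluc": suv_bsa * gluc,
            "SUVLBMgluc": suv_lbm * gluc}

# usage with illustrative values
print(suv_variants(12.0, injected_mbq=350.0, weight_kg=75.0,
                   height_cm=172.0, sex="male", glucose_mg_dl=110.0))
```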

Relevance:

30.00%

Abstract:

A SiC optical processor for error detection and correction is realized by using a double pin/pin a-SiC:H photodetector with front and back biased optical gating elements. The data show that the background acts as a selector that picks one or more states by splitting portions of the input multi-optical signals across the front and back photodiodes. Boolean operations such as exclusive OR (EXOR) and three-bit addition are demonstrated optically with a combination of such switching devices: when one or all of the inputs are present the output is amplified and the system behaves as an XOR gate representing the SUM, whereas when two or three inputs are on, the system acts as an AND gate indicating the presence of the CARRY bit. Additional parity logic operations are performed by using the four incoming pulsed communication channels, which are transmitted and checked for errors together. As a simple example of this approach, we describe an all-optical processor for error detection and correction and then provide an experimental demonstration of this fault-tolerant reversible system in emerging nanotechnology.
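
The described optical behaviour maps onto a conventional full adder plus a parity check, which the following sketch reproduces in software for comparison; it is an electronic analogue, not a model of the a-SiC:H device itself.

```python
def full_adder(a, b, c):
    """Electronic counterpart of the described optical behaviour:
    SUM is 1 when an odd number of inputs is on (XOR), CARRY is 1
    when at least two inputs are on (majority)."""
    total = a + b + c
    sum_bit = total % 2           # 1 for one or three active inputs
    carry_bit = 1 if total >= 2 else 0
    return sum_bit, carry_bit

def even_parity(channels):
    """Parity bit over the four pulsed channels checked together."""
    bit = 0
    for ch in channels:
        bit ^= ch
    return bit  # 0 if the word has even parity

# usage: full truth table, then a parity check over four channels
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print(a, b, c, full_adder(a, b, c))
print("parity bit:", even_parity([1, 0, 1, 1]))
```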

Relevance:

30.00%

Abstract:

In Czech schools, two methods of teaching reading are used: the analytic-synthetic (conventional) method and the genetic method (created in the 1990s). They differ in their theoretical foundations and in their methodology. The aim of this paper is to describe the above-mentioned theoretical approaches and to present the results of a study that examined the differences in the development of initial reading skills between these methods. A total of 452 first-grade children (aged 6-8) were assessed with a battery of reading tests at the beginning and at the end of the first grade and at the beginning of the second grade; 350 pupils participated all three times. Based on the data analysis, the developmental dynamics of reading skills under both methods and the main differences in several aspects of reading ability (e.g., reading speed, reading technique, error rate in reading) are described. The main focus is on the development of reading comprehension. The results show that pupils instructed using the genetic approach scored significantly better on the reading comprehension tests used, especially in the first grade. Statistically significant differences occurred between classes independently of the method. Therefore, other factors, such as the teacher's role and class composition, are discussed.