Abstract:
The male hypermethylated (MHM) region, located near the middle of the short arm of the chicken Z chromosome, consists of approximately 210 tandem repeats of a 2.2-kb BamHI sequence unit. The cytosines of the CpG dinucleotides in this region are extensively methylated on the two Z chromosomes in the male but much less methylated on the single Z chromosome in the female. The methylation state of the MHM region is established after fertilization, by about the 1-day embryonic stage. The MHM region is transcribed only in the female, from one particular strand, into a heterogeneous, high-molecular-mass, non-coding RNA that accumulates in the nucleus at the site of transcription, adjacent to the DMRT1 locus. The transcriptional silence of the MHM region in the male is most likely caused by CpG methylation, since treatment of male embryonic fibroblasts with 5-azacytidine results in hypomethylation and active transcription of the region. In ZZW triploid chickens, the MHM regions on the two Z chromosomes are hypomethylated and transcribed, whereas in ZZZ triploid chickens the MHM regions on the three Z chromosomes are hypermethylated and transcriptionally inactive, suggesting a possible role of the W chromosome in determining the state of the MHM region.
Abstract:
A major limitation in any high-performance digital communication system is the linearity region of the transmitting amplifier. Nonlinearities typically lead to signal clipping. Efficient communication in such conditions requires maintaining a low peak-to-average power ratio (PAR) in the transmitted signal while achieving a high data throughput. Excessive PAR leads either to frequent clipping or to inadequate resolution in the analog-to-digital or digital-to-analog converters. Currently proposed signaling schemes for future-generation wireless communications suffer from a high PAR. This paper presents a new signaling scheme for channels with clipping which achieves a PAR as low as 3. For a given linear range in the transmitter's digital-to-analog converter, this scheme achieves a lower bit-error rate than existing multicarrier schemes, owing to increased separation between constellation points. We present the theoretical basis for this new scheme, approximations for the expected bit-error rate, and simulation results. (C) 2002 Elsevier Science (USA).
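As a minimal illustration of the PAR quantity at the heart of this abstract, the sketch below computes the peak-to-average power ratio of a short sample sequence and shows that hard clipping lowers it. The sample values and clipping limit are invented for illustration and are unrelated to the paper's signaling scheme.

```python
def par(samples):
    """Peak-to-average power ratio: peak instantaneous power over mean power."""
    powers = [abs(x) ** 2 for x in samples]
    return max(powers) / (sum(powers) / len(powers))

def clip(samples, limit):
    """Hard-clip each sample's magnitude to `limit`, keeping its sign/phase."""
    return [x if abs(x) <= limit else x * (limit / abs(x)) for x in samples]

# illustrative baseband samples with one large peak
signal = [0.5, -1.0, 2.0, 0.3, -0.4, 1.0]
print(par(signal))             # the peak at 2.0 dominates the ratio
print(par(clip(signal, 1.0)))  # clipping the peak lowers the PAR
```

A low-PAR scheme avoids this trade-off at the source: if the transmitted signal never has large peaks, neither clipping distortion nor extra converter headroom is needed.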
Abstract:
A plasmid DNA directing transcription of the infectious full-length RNA genome of Kunjin (KUN) virus in vivo from a mammalian expression promoter was used to vaccinate mice intramuscularly. The KUN viral cDNA encoded in the plasmid contained a mutation in the NS1 protein (Pro-250 to Leu) previously shown to attenuate KUN virus in weanling mice. KUN virus was isolated from the blood of immunized mice 3-4 days after DNA inoculation, demonstrating that infectious RNA was being transcribed in vivo; however, no symptoms of virus-induced disease were observed. By 19 days postimmunization, neutralizing antibody was detected in the serum of immunized animals. On intracerebral or intraperitoneal challenge with lethal doses of the virulent New York strain of West Nile (WN) virus or wild-type KUN virus, mice immunized with as little as 0.1-1 µg of KUN plasmid DNA were solidly protected against disease. This finding correlated with in vitro neutralization data showing that serum from KUN DNA-immunized mice neutralized KUN and WN viruses with similar efficiencies. The results demonstrate that delivery of an attenuated but replicating KUN virus via a plasmid DNA vector may provide an effective vaccination strategy against virulent strains of WN virus.
Abstract:
Objective: To develop a 'quality use of medicines' coding system for the assessment of pharmacists' medication reviews and to apply it to an appropriate cohort. Method: A 'quality use of medicines' coding system was developed based on findings in the literature. These codes were then applied to 216 (111 intervention, 105 control) veterans' medication profiles by an independent clinical pharmacist, supported by a clinical pharmacologist, with the aim of assessing the appropriateness of pharmacy interventions. The profiles were provided for veterans participating in a randomised controlled trial in private hospitals evaluating the effect of medication review and discharge counselling. The reliability of the coding was tested by two independent clinical pharmacists on a random sample of 23 veterans from the study population. Main outcome measure: Interrater reliability was assessed by applying Cohen's kappa score to aggregated codes. Results: The coding system based on the literature consisted of 19 codes. The results from the three clinical pharmacists suggested that the original coding system had two major problems: (a) a lack of discrimination for certain recommendations, e.g. adverse drug reactions, toxicity and mortality may be seen as variations in degree of a single effect, and (b) certain codes, e.g. essential therapy, had low prevalence. The interrater reliability for an aggregation of all codes into positive, negative and clinically non-significant codes ranged from 0.49 to 0.58 (good to fair). The interrater reliability increased to 0.72-0.79 (excellent) when all negative codes were excluded. Analysis of the sample of 216 profiles showed that the most prevalent recommendations from the clinical pharmacists were a positive impact in reducing adverse responses (31.9%), an improvement in good clinical pharmacy practice (25.5%) and a positive impact in reducing drug toxicity (11.1%). Most medications were assigned the clinically non-significant code (96.6%). In fact, the interventions led to a statistically significant difference in pharmacist recommendations in the categories of adverse response, toxicity and good clinical pharmacy practice, as measured by the quality use of medicines coding system. Conclusion: It was possible to use the quality use of medicines coding system to rate the quality and potential health impact of pharmacists' medication reviews, and the system did pick up differences between intervention and control patients. The interrater reliability for the summarised coding system was fair, but a larger sample of medication regimens is needed to assess the non-summarised quality use of medicines coding system.
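The interrater reliability measure used above, Cohen's kappa on aggregated codes, can be sketched as follows. The rater labels are invented to illustrate the computation only; they are not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # observed agreement: fraction of items both raters coded identically
    p_o = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # chance agreement expected from the two raters' independent marginals
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# hypothetical aggregated codes: positive / negative / non-significant
a = ['pos', 'pos', 'neg', 'ns', 'pos', 'neg']
b = ['pos', 'neg', 'neg', 'ns', 'pos', 'pos']
print(round(cohens_kappa(a, b), 3))
```

Aggregating the 19 fine-grained codes into three classes before computing kappa, as done in the study, raises agreement by removing distinctions (e.g. degrees of a single adverse effect) on which raters predictably differ.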
Abstract:
The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimation of the original frame created at the decoder. This paper characterizes the WZVC efficiency when motion compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC especially because the decoder only has some decoded reference frames available. The proposed WZVC compression efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. From this model, some interesting conclusions may be derived regarding the impact on compression performance of the motion field smoothness and of its correlation with the true motion trajectories.
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second mode corresponds to a motion compensated quality enhancement (MCQE) technique in which a low-quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. For blocks where MCI produces SI with lower correlation, the novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low-quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
Abstract:
Motion compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion compensation errors for some frame regions and rather small errors for others, depending on the video content. In this paper, a low-complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and help the decoder with some reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra-based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements of up to 1.2 dB compared with a WZ-coding-mode-only solution.
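As a rough illustration of block-level mode selection, the sketch below picks the Intra or WZ mode by comparing crude zeroth-order entropy estimates of a block and of its residual against the side information. This proxy, and all sample values, are assumptions for illustration; they are not the rate estimation method of the paper.

```python
import math
from collections import Counter

def entropy_bits(values):
    """Zeroth-order entropy estimate, in bits, for a block of samples."""
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values()) * n

def select_mode(block, si_block):
    """Pick Intra when the block itself looks cheaper to code than its
    residual against the side information (a 'critical' block)."""
    residual = [a - b for a, b in zip(block, si_block)]
    return 'intra' if entropy_bits(block) < entropy_bits(residual) else 'wz'

print(select_mode([10, 12, 10, 12], [10, 12, 10, 12]))  # SI predicts well
print(select_mode([10, 10, 10, 10], [3, 7, 1, 9]))      # SI predicts badly
```

The same comparison drives the real algorithm's intuition: where the SI correlates well, the WZ residual is cheap; where it does not, spending rate on an Intra block is the better deal.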
Abstract:
Wyner-Ziv (WZ) video coding is a particular case of distributed video coding, the recent video coding paradigm based on the Slepian-Wolf and Wyner-Ziv theorems that exploits the source correlation at the decoder, and not at the encoder as in predictive video coding. Although many improvements have been made over recent years, the performance of state-of-the-art WZ video codecs has still not reached that of state-of-the-art predictive video codecs, especially for high and complex motion video content. This is also true in terms of subjective image quality, mainly because of the considerable amount of blocking artefacts present in the decoded WZ video frames. This paper proposes an adaptive deblocking filter to improve both the subjective and objective quality of the WZ frames in a transform domain WZ video codec. The proposed filter is an adaptation to a WZ video codec of the advanced deblocking filter defined in the H.264/AVC (advanced video coding) standard. The results obtained confirm the subjective quality improvement and show objective quality gains of up to 0.63 dB overall for sequences with high motion content when large group of pictures (GOP) sizes are used.
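A minimal sketch of the adaptive decision behind such a deblocking filter: smooth a step across a block boundary only when it is small enough to be a coding artefact, and leave large steps (true edges) untouched. The thresholds and the filter arithmetic are simplified placeholders, not the H.264/AVC standard's actual tables and strength rules.

```python
def deblock_boundary(p1, p0, q0, q1, alpha=10, beta=4):
    """Filter the two pixels p0|q0 astride a block boundary.

    p1, p0 lie left of the boundary and q0, q1 right of it. The step is
    smoothed only when |p0-q0| and the side gradients are small; a large
    step is assumed to be real image content and is preserved.
    """
    if abs(p0 - q0) < alpha and abs(p1 - p0) < beta and abs(q1 - q0) < beta:
        # simplified low-pass correction pulling p0 and q0 together
        delta = ((q0 - p0) * 4 + (p1 - q1) + 4) // 8
        return p0 + delta, q0 - delta
    return p0, q0

print(deblock_boundary(80, 82, 88, 86))    # small step: smoothed
print(deblock_boundary(50, 50, 150, 150))  # strong edge: unchanged
```

Making the thresholds depend on the quantisation level is what turns this into an adaptive filter: coarser quantisation produces larger artefacts, so larger steps must still be treated as artefacts.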
Abstract:
Wyner-Ziv (WZ) video coding is a particular case of distributed video coding (DVC), the recent video coding paradigm based on the Slepian-Wolf and Wyner-Ziv theorems which exploits the source temporal correlation at the decoder, and not at the encoder as in predictive video coding. Although some progress has been made in recent years, WZ video coding is still far from the compression performance of predictive video coding, especially for high and complex motion contents. The WZ video codec adopted in this study is based on a transform domain WZ video coding architecture with feedback channel-driven rate control, whose modules have been improved with some recent coding tools. This study proposes a novel motion learning approach to successively improve the rate-distortion (RD) performance of the WZ video codec as the decoding proceeds, making use of the already decoded transform bands to improve the decoding process for the remaining transform bands. The results obtained reveal gains of up to 2.3 dB in the RD curves, compared with the same codec without the proposed motion learning approach, for high motion sequences and long group of pictures (GOP) sizes.
Abstract:
The offer of services based on wireless communications has grown exponentially over the last decade. Ever higher transmission rates are demanded, as well as better QoS, without compromising the transmission power or the available bandwidth. MIMO technology can offer an increase in the capacity of these systems without requiring more bandwidth or transmitted power. The work developed in this dissertation consisted of the study of MIMO systems, characterised by the use of multiple antennas to transmit and receive information. With a system of this kind, a spatial diversity gain can be obtained using space-time codes, which exploit the spatial and time domains simultaneously. This dissertation places special emphasis on Alamouti space-time block coding, which is implemented in an FPGA, namely the reception part. This implementation targets a 2x1 antenna configuration, uses floating point, and covers three modulation types: BPSK, QPSK and 16-QAM. Finally, the relation between the precision achieved in the numerical representation of the results and the resources consumed by the FPGA is analysed. With the adopted architecture, throughputs in the order of 29.141 Msymbol/s (without pipelines) to 262.674 Msymbol/s (with pipelines) are obtained for BPSK modulation.
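The receiver-side combining at the heart of this dissertation's FPGA design can be sketched as follows for the 2x1 Alamouti scheme. The channel gains and symbols are illustrative, and noise is omitted for clarity.

```python
def alamouti_combine(r1, r2, h1, h2):
    """Linear combining for the 2x1 Alamouti space-time block code.

    Slot 1: the two antennas send (s1, s2); slot 2 they send (-s2*, s1*).
    With channel gains h1, h2 constant over both slots, combining yields
    estimates proportional to (|h1|^2 + |h2|^2) * (s1, s2).
    """
    s1_hat = h1.conjugate() * r1 + h2 * r2.conjugate()
    s2_hat = h2.conjugate() * r1 - h1 * r2.conjugate()
    return s1_hat, s2_hat

# BPSK symbols and a fixed, illustrative complex channel
s1, s2 = 1 + 0j, -1 + 0j
h1, h2 = 0.8 + 0.3j, 0.4 - 0.5j
r1 = h1 * s1 + h2 * s2                            # received in slot 1
r2 = -h1 * s2.conjugate() + h2 * s1.conjugate()   # received in slot 2
est1, est2 = alamouti_combine(r1, r2, h1, h2)
# BPSK decision: the sign of the real part recovers each symbol
print(est1.real > 0, est2.real > 0)
```

The cross terms cancel exactly, which is why the scheme achieves full diversity with only linear processing at the receiver, a property that makes it well suited to a compact FPGA datapath.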
Abstract:
Nowadays there is ever more audiovisual information, and multimedia streams or files can be shared easily and efficiently. However, the tampering of video content, such as financial information, news or videoconference sessions used in court, can have serious consequences given the importance of this type of information. Hence the need to ensure the authenticity and integrity of audiovisual information. This dissertation proposes an H.264/Advanced Video Coding (AVC) video authentication system, called Stream Authentication using Random Projections (AFPA, from the Portuguese), whose authentication procedures are carried out at the level of each video frame. This scheme allows a more flexible kind of authentication, since it makes it possible to define a maximum limit of modifications between two frames. Authentication relies on a new image authentication technique that combines random projections with an error-correction mechanism applied to the data, so that each video frame can be authenticated with a reduced set of parity bits of the respective random projection. Since video information is typically transported by unreliable protocols, it may suffer packet losses. To reduce the effect of packet losses on the video quality and on the authentication rate, Unequal Error Protection (UEP) is used. For validation and comparison of the results, a classical system was implemented that authenticates video streams in the usual way, i.e. using digital signatures and hash codes. Both schemes were evaluated with respect to the overhead introduced and the authentication rate. The results show that the AFPA system, using a high-quality video, reduces the authentication overhead fourfold compared with the scheme based on digital signatures and hash codes.
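The core idea of authenticating a frame through quantised random projections can be sketched as below. Here a plain Hamming-distance tolerance stands in for the parity-bit/error-correction mechanism of the proposed AFPA scheme, and the seed, projection count and mismatch threshold are all invented for illustration.

```python
import random

def projection_bits(frame, n_proj=16, seed=42):
    """One bit per random projection: the sign of <frame, random ±1 vector>."""
    rng = random.Random(seed)  # the seed plays the role of a shared secret
    bits = []
    for _ in range(n_proj):
        vec = [rng.choice((-1, 1)) for _ in frame]
        dot = sum(p * v for p, v in zip(frame, vec))
        bits.append(1 if dot >= 0 else 0)
    return bits

def authenticate(frame, reference_bits, max_mismatches=2):
    """Accept the frame if its projection bits are close to the reference."""
    bits = projection_bits(frame)
    return sum(x != y for x, y in zip(bits, reference_bits)) <= max_mismatches

original = [10, 200, 30, 90, 120, 60, 45, 220]   # toy 'frame' of pixels
ref = projection_bits(original)
slightly_noisy = [p + 1 for p in original]       # e.g. mild recompression
tampered = [255 - p for p in original]           # content heavily altered
print(authenticate(slightly_noisy, ref), authenticate(tampered, ref))
```

Allowing a bounded number of bit mismatches is what gives the flexible "maximum limit of modifications" between frames; in the actual scheme this tolerance is realised by sending parity bits of the projection rather than comparing bits directly.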
Abstract:
Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfil novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated in a transform domain turbo coding based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it allows outperforming H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
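The kind of motion compensated frame interpolation described above can be sketched in one dimension: match blocks symmetrically between the past and future reference frames, preferring small displacements (a crude stand-in for the motion-field regularisation the paper builds on), then average the two matched blocks at the midpoint of the trajectory. Block size, search range and the toy "frames" are illustrative, not the paper's algorithm.

```python
def interpolate(past, future, block=2, search=2):
    """Estimate the frame halfway between two 1-D reference frames."""
    si = []
    for start in range(0, len(past) - block + 1, block):
        best_sad, best_d = float('inf'), 0
        # symmetric search: past block at start-d vs future block at start+d;
        # visiting small |d| first biases ties toward smooth, small motion
        for d in sorted(range(-search, search + 1), key=abs):
            if not (0 <= start - d and start - d + block <= len(past)):
                continue
            if not (0 <= start + d and start + d + block <= len(future)):
                continue
            sad = sum(abs(past[start - d + i] - future[start + d + i])
                      for i in range(block))
            if sad < best_sad:
                best_sad, best_d = sad, d
        d = best_d
        # place the average of the matched blocks at the trajectory midpoint
        si.extend((past[start - d + i] + future[start + d + i]) // 2
                  for i in range(block))
    return si

past   = [0, 0, 9, 9, 0, 0, 0, 0]   # an 'object' at positions 2-3
future = [0, 0, 0, 0, 9, 9, 0, 0]   # the object has moved right by 2
print(interpolate(past, future))    # side information for the mid frame
```

With linear motion the estimate lands the object at positions 3-4, exactly where the skipped middle frame would have it; this is the side information the WZ decoder then corrects with parity bits.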
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. Thus, it is proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
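Syndrome-based Slepian-Wolf coding, the mechanism underlying the LDPC syndrome decoders discussed above, can be shown on a toy binary code small enough for exhaustive decoding. The parity-check matrix, source and side information are invented; real DVC codecs use long LDPC codes with belief propagation instead of a brute-force search.

```python
from itertools import product

# parity-check matrix of a toy length-4 binary code (2 parity checks)
H = [[1, 1, 0, 1],
     [0, 1, 1, 1]]

def syndrome(x):
    """Syndrome of bit tuple x under H, over GF(2)."""
    return tuple(sum(h * b for h, b in zip(row, x)) % 2 for row in H)

def encode(source):
    """Slepian-Wolf encoder: send only the syndrome (2 bits, not 4)."""
    return syndrome(source)

def decode(s, side_info):
    """Among all sequences with syndrome s, pick the one closest to the
    side information (the decoder's correlated estimate of the source)."""
    candidates = [x for x in product((0, 1), repeat=4) if syndrome(x) == s]
    return min(candidates,
               key=lambda x: sum(a != b for a, b in zip(x, side_info)))

source = (1, 0, 1, 1)
side_info = (1, 0, 0, 1)   # correlated estimate: one bit differs
print(decode(encode(source), side_info))
```

The compression comes from the syndrome being shorter than the source; the better the SI correlates with the source (here, one bit flip), the shorter the syndrome can be while still pinning down the right coset member, which is why fusing multiple SI hypotheses saves rate.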