958 results for Memory space
Abstract:
A specialised reconfigurable architecture is targeted at wireless base-band processing. Built to cater for multiple wireless standards, it consumes less power than processor-based solutions and can be scaled to run in parallel for processing multiple channels. Test resources and testing strategies are embedded in the architecture. The architecture is functionally partitioned according to operations common to wireless standards, such as CRC error correction, convolution and interleaving. These modules are linked via Virtual Wire hardware modules and route-through switch matrices, an interconnect structure that allows data to be processed in any order. Virtual Wire provides the same flexibility as conventional interconnects while reducing both the area occupied and the number of switches needed. The testing algorithm exhaustively scans all possible paths within the interconnection network and searches for faults in the processing modules. It starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by forming loops from the shared memory to each switch; the local switch matrix is tested in the same way. Next, the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. The paper compares various base-band processing solutions, describes the proposed platform and its implementation, outlines the test resources and algorithm, and concludes with the mapping of Bluetooth and GSM base-band processing onto the platform.
Abstract:
A specialised reconfigurable architecture for telecommunication base-band processing is augmented with testing resources. The routing network is linked via Virtual Wire hardware modules to reduce the area occupied by connecting buses. The number of switches within the routing matrices is also minimised, which increases throughput without sacrificing flexibility. The testing algorithm systematically searches for faults in the processing modules and in the flexible high-speed routing network of the architecture. It starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by forming loops from the shared memory to each switch; the local switch matrix is tested in the same way. Next, the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. The algorithm exhaustively scans all possible paths within the interconnection network and reports all faults. Strategies can be inserted to bypass minor faults.
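The staged test sequence described in the abstract (external memory, master controller, switch loopbacks, local memory, module test vectors) can be sketched in a few lines. The following Python sketch is illustrative only: the `MockPlatform` class and its method names are hypothetical stand-ins for the architecture's real test resources, not an API from the paper.

```python
class MockPlatform:
    """Hypothetical stand-in for the platform's embedded test resources."""
    def __init__(self, broken=()):
        self.broken = set(broken)                      # units simulated as faulty
        self.switches = [f"sw{i}" for i in range(4)]   # route-through switch matrix
        self.local_switches = [f"lsw{i}" for i in range(2)]
        self.modules = ["crc", "convolution", "interleaver"]

    def ok(self, unit):
        return unit not in self.broken


def run_self_test(p):
    """Bootstrapped sequence from the abstract: memory and controller first,
    then every switch via shared-memory loopback, then local memory,
    finally the processing modules with pre-defined test vectors."""
    faults = []
    if not p.ok("external_memory"):     # 1. scan externally addressable memory
        faults.append("external_memory")
    if not p.ok("master_controller"):   # 2. test the master controller
        faults.append("master_controller")
    for sw in p.switches:               # 3. loop shared memory -> each switch
        if not p.ok(sw):
            faults.append(sw)
    for sw in p.local_switches:         # 4. local switch matrix, same loopback
        if not p.ok(sw):
            faults.append(sw)
    if not p.ok("local_memory"):        # 5. scan local memory
        faults.append("local_memory")
    for m in p.modules:                 # 6. apply pre-defined test vectors
        if not p.ok(m):
            faults.append(m)
    return faults
```

The ordering matters: each stage relies on the units verified by the previous stages, which is why the controller is tested before it drives the switch loopbacks.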
Abstract:
The digital image acquired by the photostimulable phosphor plate system is displayed on the computer monitor in a format called DICOM. This format occupies a large amount of storage space, which hampers archiving and transmission of the image over the Internet. The aim of this study was to evaluate the influence of JPEG compression at Quality Factors 100, 80 and 60 on the reproducibility of cephalometric landmark identification on lateral cephalometric radiographs compared with the DICOM format. The sample consisted of 120 lateral cephalometric images obtained from 30 individuals, from each of whom one digital radiograph in DICOM format was obtained. These images were converted to JPEG format. After blinding and randomisation of the sample, three calibrated orthodontists marked the location of 12 cephalometric landmarks on each image using an X-Y coordinate system. The procedure was repeated after one month. Intra- and inter-observer reproducibility was calculated using the intraclass correlation test. One-way repeated-measures analysis of variance (ANOVA) was used to compare the compression groups with DICOM regarding landmark reproducibility. The results showed that the cephalometric landmark identifications were highly reproducible, except for the Orbitale point on the X coordinate. The different file formats proved statistically equal for each landmark and axis measured. The JPEG compressions studied had no effect on the reproducibility of the cephalometric landmark identification on lateral cephalometric images.
Abstract:
The digital radiographic image acquired by the photostimulable phosphor plate system is normally displayed on the computer monitor in the DICOM format, characterised by high image resolution. Because this format occupies a large amount of storage space, digital images are submitted to compression, which optimises computer storage capacity and reduces transmission time over the Internet. The aim of this study was to evaluate the influence of TIFF and JPEG compression on intra- and inter-examiner reproducibility of cephalometric landmark identification on lateral cephalometric radiographs compared with the DICOM format. The sample consisted of 90 cephalometric images obtained from 30 individuals, from each of whom one digital radiograph displayed in DICOM format was obtained. These images were converted to JPEG, with Quality Factor 80, and TIFF formats. After blinding and randomisation of the sample, three calibrated orthodontists marked the location of 15 cephalometric landmarks on each image using an x-y coordinate system. The results showed that the cephalometric landmark identifications agreed in both intra- and inter-examiner reproducibility, except for the points Go, Po, Or, B and Pog. The different file formats showed statistically similar results for each landmark and axis measured. The JPEG and TIFF compressions studied had no effect on the intra- and inter-examiner reproducibility of the cephalometric landmark identification on lateral cephalometric images.
Abstract:
The focus of this thesis is text data compression based on the fundamental coding scheme known as the American Standard Code for Information Interchange, or ASCII. The research objective is the development of software algorithms that yield significant compression of text data. Past and current compression techniques have been thoroughly reviewed to ensure a proper contrast between the compression results of the proposed technique and those of existing ones. The research problem stems from the need to achieve higher compression of text files in order to save valuable memory space and increase the transmission rate of these files. It was deemed necessary that the compression algorithm be effective even for small files and able to contend with uncommon words, which are dynamically added to the dictionary once they are encountered. A critical design aspect of this compression technique is its compatibility with existing compression techniques: the developed algorithm can be used in conjunction with existing techniques to yield even higher compression ratios. This thesis demonstrates such capabilities and outcomes, and the research objective of achieving a higher compression ratio is attained.
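The idea of dynamically adding uncommon words to a dictionary as they are encountered can be illustrated with a toy word-level coder. This Python sketch is not the thesis's algorithm; it only demonstrates the general mechanism: the first occurrence of a word is emitted literally and registered, and later occurrences are replaced by a small integer index.

```python
def encode(text):
    """Word-level dynamic dictionary coder (illustrative only):
    first occurrence emits the literal word and adds it to the
    dictionary; repeats emit the word's index instead."""
    dictionary = {}
    tokens = []
    for word in text.split(" "):
        if word in dictionary:
            tokens.append(dictionary[word])   # repeat -> index
        else:
            dictionary[word] = len(dictionary)
            tokens.append(word)               # new word -> literal
    return tokens


def decode(tokens):
    """Rebuild the text by replaying the same dictionary construction."""
    dictionary = []
    words = []
    for t in tokens:
        if isinstance(t, int):
            words.append(dictionary[t])
        else:
            dictionary.append(t)
            words.append(t)
    return " ".join(words)
```

Because the decoder rebuilds the dictionary in the same order as the encoder, no dictionary needs to be transmitted; this is the property that lets such schemes work even for small files.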
Abstract:
With advances in device technology and in the ways energy is generated and used, power quality parameters have come to influence the various kinds of power consumers more significantly. There are currently many types of devices that analyse power quality. However, there is a need for devices that perform measurements, calculate parameters, find faults, suggest changes and support the management of the installation; in addition, such devices must remain affordable. To maintain this balance, a measuring method should be used that does not require great processing or memory resources. This work shows that applying the Goertzel algorithm, compared with the commonly used FFT, allows measurements to be made with far fewer hardware resources, leaving memory space available to implement management functions. The first part of the work surveys the problems most common among low-voltage consumers. A functional diagram is then proposed, indicating what will be measured and calculated, which problems will be detected and which solutions can be found. Through simulation of the Goertzel algorithm using Scilab, the frequency components of a distorted signal can be calculated with satisfactory results. Finally, the prototype is assembled and tests are carried out, adjusting the parameters necessary to keep the device reliable without increasing its cost.
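The memory advantage claimed above comes from the structure of the Goertzel algorithm: it evaluates a single DFT bin with a two-state recurrence, in O(N) time and O(1) extra memory, instead of computing and storing a full FFT. A minimal Python sketch (the abstract's simulations use Scilab; this function and its interface are our own illustration):

```python
import math

def goertzel(samples, sample_rate, target_freq):
    """Magnitude of one frequency component of `samples` via the
    Goertzel algorithm: O(N) time, only two state variables."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2      # second-order recurrence
        s_prev2, s_prev = s_prev, s
    # |X[k]|^2 from the final two states
    power = s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2
    return math.sqrt(max(power, 0.0))
```

For a pure sinusoid of amplitude A landing exactly on bin k, the returned magnitude is A·N/2, matching the corresponding FFT bin; a power quality monitor would call this once per harmonic of interest rather than transforming the whole spectrum.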
Abstract:
Cryptography is the main means of obtaining security in any network. Even in networks with severe energy consumption restrictions and processing and memory limitations, such as Wireless Sensor Networks (WSNs), this is no different. Aiming to improve cryptographic performance, security and the lifetime of these networks, we propose a new cryptographic algorithm developed through Genetic Programming (GP) techniques. To develop the fitness criteria for the cryptographic algorithm established by the GP, nine cryptographic algorithms were tested: AES, Blowfish, DES, RC6, Skipjack, Twofish, T-DES, XTEA and XXTEA. From these tests, fitness functions were built taking into account execution time, occupied memory space, maximum deviation, irregular deviation and correlation coefficient. After running the GP, CRYSEED and CRYSEED2 were created: algorithms for 8-bit devices, optimised for WSNs, i.e., with low complexity, low memory consumption and good security for sensing and instrumentation applications.
Abstract:
Acknowledgements This work was supported by a grant from the Netherlands Institute for Advanced Study in the Humanities and Social Sciences (NIAS) and The Carnegie Trust for the Universities of Scotland [31860].
Abstract:
Mirroring the modern city, the flâneur finds himself caught between two times: the present, which is continually transformed, and the past, which remains in the form of material vestiges. Since its appearance in literature, the trajectory of the flâneur has been tied to the major historical, social and cultural changes, which take shape in different urban textures. In all of them he serves as a link between the old city and the new one superimposed upon it. The aim of this article is to analyse the evolution of this historical and literary figure as a starting point for interpreting the prose work Tetralogie der Erinnerung (1992-2004) by Dieter Forte.
Abstract:
Recently there has been considerable interest in solving viscoelastic problems in 3D, particularly with the improvement in modern computing power. In many applications the emphasis has been on economical algorithms that can cope with the extra complexity the third dimension brings: storage and computer time are of the essence. The advantage of the finite volume formulation is that it does not require a large amount of memory space, and iterative methods rather than direct methods can be used to solve the resulting linear systems efficiently.
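The memory argument can be made concrete with a matrix-free Krylov solver: an iterative method only needs the action of the matrix on a vector, so the system matrix never has to be assembled or stored. The conjugate gradient sketch below is one such iterative method, chosen by us as an example (the abstract does not name a specific solver), shown here on a 1D model operator.

```python
def conjugate_gradient(apply_A, b, tol=1e-12, max_iter=1000):
    """Matrix-free conjugate gradients for symmetric positive definite
    systems: stores only a handful of vectors, never the matrix."""
    x = [0.0] * len(b)
    r = b[:]                # residual b - A x  (x = 0 initially)
    p = r[:]                # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = apply_A(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x


def laplacian_1d(v):
    """Action of the tridiagonal [-1, 2, -1] operator: a stand-in for a
    discretised PDE operator that is applied, not stored."""
    n = len(v)
    return [2 * v[i]
            - (v[i - 1] if i > 0 else 0.0)
            - (v[i + 1] if i < n - 1 else 0.0)
            for i in range(n)]
```

A direct method on a 3D grid would fill in the sparse factors and dominate memory; here the operator lives only as code, which is exactly the trade the finite volume formulation exploits.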
Abstract:
The project "Minas e Memórias da Urgeiriça" (Mines and Memories of Urgeiriça) focuses on the contributions of art as a vector of environmental awareness, understood as a privileged vehicle for promoting more sustainable, responsible and civic behaviours and attitudes, calling for change or for the reinforcement of good environmental practices. The project was implemented in the municipality of Nelas, district of Viseu, drawing on the socio-environmental history of the Urgeiriça Mines, located in the parish of Canas de Senhorim. It comprised a set of coherently interlinked pedagogical and artistic activities, beginning with the seminar "Urgeiriça: Antes, Agora e Depois?", followed by the performance "Escuridão", complemented by the singing of hymns by former workers of the Urgeiriça Mines, and culminating in a visit to the art installation "Escavações". The results rest on two fundamental premises: on the one hand, the real testimonies of the former mine workers expressed the excessive and irresponsible influence of human action on nature, which continues to bring harmful consequences for human beings; on the other, they demonstrated the interventions the site has undergone (from unbridled exploitation of resources to progressive rehabilitation). To this end, the memories of the space (physical and mental) were revitalised, reinforced and materialised through art as a way of raising the local community's awareness of the actions resulting from the interaction between humankind and the environment, regarded as a privileged means of community development, generating skills and change.
Abstract:
This thesis is a study of how the contents of volatile memory on the Windows operating system can be better understood and utilised for the purposes of digital forensic investigations. It proposes several techniques to improve the analysis of memory, with a focus on improving the detection of unknown code such as malware. These contributions allow the creation of a more complete reconstruction of the state of a computer at acquisition time, including whether or not the computer has been infected by malicious code.
Abstract:
The paper examines the role of shared spaces in divided cities in promoting future sustainable communities and spaces described as inclusive to all. It addresses the current challenges that prevent such inclusiveness and suggests future development trends that would benefit the wider city community. It explains how spaces in divided cities are carved up into perceived ownerships and territorialized areas, which increases tension on the shared space between territories, the control of which can often lead to inter-community disputes. The paper reports that common shared space between conflicting communities takes on increased importance, since the nature of the conflict places emphasis on communities' political and social confidence, while also highlighting the necessity for confidence in inclusion and for feeling secure in the public domain. In order to achieve sustainable environments, strategies to promote shared spaces require further focus on the significance of everyday dynamics as essential aspects of future integration and conflict resolution.