988 results for "Custo de bit" (bit cost)
Abstract:
We investigate the use of different direct-detection modulation formats in a wavelength-switched optical network. We determine the minimum time a tunable sampled-grating distributed Bragg reflector laser needs to recover after switching from one wavelength channel to another for each modulation format. The recovery time is investigated using a field-programmable gate array operating as a time-resolved bit error rate detector. The detector offers 93 ps resolution at 10.7 Gb/s and allows all of the received data to contribute to the measurement, so low bit error rates can be measured at high speed. The recovery times for 10.7 Gb/s non-return-to-zero on–off keyed, 10.7 Gb/s differential phase-shift keyed and 21.4 Gb/s differential quadrature phase-shift keyed formats can be as low as 4 ns, 7 ns and 40 ns, respectively. The time-resolved phase noise associated with laser settling is simultaneously measured for the 21.4 Gb/s differential quadrature phase-shift keyed data and shows that phase noise coupled with frequency error is the primary limitation on transmitting immediately after a laser switching event.
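A time-resolved BER measurement of this kind can be approximated in post-processing by binning bit errors against their delay from the switching trigger. The sketch below is a minimal illustration, assuming a hypothetical array of per-bit error flags aligned to the switching event and a user-chosen 93 ps bin width; it is not the FPGA implementation described in the abstract, where many switching events would be accumulated per bin.

```python
import numpy as np

def time_resolved_ber(error_flags, bit_rate=10.7e9, bin_width=93e-12):
    """Bin per-bit error flags (0/1) by time since the switching event and
    return the BER in each time bin. `error_flags` is a hypothetical 1-D
    array in which index 0 is the first bit after the switch."""
    t = np.arange(error_flags.size) / bit_rate          # arrival time of each bit
    n_bins = int(np.ceil(t[-1] / bin_width)) + 1
    bins = (t // bin_width).astype(int)
    errors = np.bincount(bins, weights=error_flags, minlength=n_bins)
    counts = np.bincount(bins, minlength=n_bins)
    ber = errors / np.maximum(counts, 1)                 # avoid divide-by-zero
    return bin_width * np.arange(n_bins), ber

# Illustrative use: a burst of errors right after the switch, then clean data.
flags = np.concatenate([np.random.rand(500) < 0.3, np.zeros(5000)])
centres, ber = time_resolved_ber(flags)
print(ber[:5])
```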
Bit-error rate performance of 20 Gbit/s WDM RZ-DPSK non-slope matched submarine transmission systems
Abstract:
Applying direct error counting, we assess the performance of 20 Gbit/s wavelength-division multiplexed return-to-zero differential phase-shift keying (RZ-DPSK) transmission at 0.4 bit/s/Hz spectral efficiency for application to installed transoceanic submarine systems based on non-zero dispersion-shifted fibre. The impact of the pulse duty cycle on system performance is investigated, and the reliability of the existing theoretical approaches to BER estimation for the RZ-DPSK format is discussed.
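One theoretical approach whose reliability is commonly questioned for DPSK is the Gaussian Q-factor estimate, which maps a measured Q value to a BER through the complementary error function. The snippet below is a minimal sketch of that conventional estimate, shown only to contrast it with the direct error counting used in the abstract; the sample Q values are illustrative assumptions.

```python
from math import erfc, sqrt

def ber_from_q(q):
    """Conventional Gaussian approximation: BER ~= 0.5 * erfc(Q / sqrt(2)).
    For DPSK the decision statistics are not strictly Gaussian, which is why
    direct error counting is used as the reference in the study above."""
    return 0.5 * erfc(q / sqrt(2))

for q in (6.0, 7.0, 8.0):          # illustrative Q values (linear units)
    print(f"Q = {q:.1f}  ->  BER ~ {ber_from_q(q):.2e}")
```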
Abstract:
The distribution of the secret key is the weakest link of many data encryption systems. Quantum key distribution (QKD) schemes provide attractive solutions [1]; however, their implementation remains challenging and their range and bit rate are limited. Moreover, practical QKD systems employ real-life components and are therefore vulnerable to diverse attack schemes [2]. Ultra-long fiber lasers (UFLs) have been drawing much attention recently because of their fundamentally different properties compared to conventional lasers, as well as their unique applications [3]. Here, we demonstrate 100 bps, practically secure key distribution over a 500 km link employing a Raman-gain UFL. Fig. 1(a) depicts a schematic of the UFL system. Each user has an identical set of two wavelength-selective mirrors centered at λ0 and λ1. To exchange a key bit, each user independently chooses one of these mirrors and introduces it as a laser reflector at their end. If both users choose identical mirrors, a clear signal develops, and the bits in these cases are discarded. However, if they choose complementary mirrors (the 1,0 or 0,1 states), the UFL remains below the lasing threshold and no signal evolves. In these cases, an eavesdropper can only detect noise and is unable to determine the mirror choice of the users, where the choice of mirrors represents a single key bit (e.g. Alice's choice of mirror is the key bit). These bits are kept and added to the key. The absence of signal in the secure states facilitates fast measurements to distinguish between the non-secure and the secure states and to determine the key bit in the latter case. Sequentially repeating the single-bit exchange protocol generates a key of any desired length. © 2013 IEEE.
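The mirror-choice protocol lends itself to a very small simulation: both users pick a bit at random, rounds with identical choices are discarded, and complementary rounds contribute Alice's bit to the key. The sketch below is a toy model under that reading of the abstract; it ignores the optics entirely and only illustrates the sifting logic and the expected ~50% discard rate.

```python
import random

def exchange_key(n_rounds):
    """Toy model of the UFL single-bit exchange: keep a round only when the
    users' mirror choices are complementary (laser stays below threshold)."""
    key = []
    for _ in range(n_rounds):
        alice = random.randint(0, 1)   # Alice's mirror choice = candidate key bit
        bob = random.randint(0, 1)
        if alice != bob:               # complementary mirrors: secure, keep the bit
            key.append(alice)
        # identical mirrors: laser reaches threshold, round is discarded
    return key

key = exchange_key(1000)
print(len(key), "key bits from 1000 rounds")   # roughly 500 on average
```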
Abstract:
We demonstrate an accurate BER estimation method for QPSK CO-OFDM transmission based on the probability density function of the received QPSK symbols. Using a 112 Gb/s QPSK CO-OFDM transmission as an example, we show that this method offers the most accurate estimate of the system's performance in comparison with other known approaches.
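A common way to realise a PDF-based estimate is to fit the noise statistics of the received symbols (for instance a circular Gaussian around each QPSK constellation point) and integrate the fitted density over the error region, rather than counting the few errors that actually occur. The sketch below follows that generic recipe with synthetic data; it is an assumption about the spirit of such estimators, not the specific method of the abstract.

```python
import numpy as np
from math import erfc, sqrt

def estimate_ber_from_pdf(rx, tx):
    """Fit a circular Gaussian to the received QPSK symbols and integrate it
    across the decision boundaries instead of counting errors directly.
    `rx` are received complex symbols, `tx` the transmitted ones (unit power)."""
    noise = rx - tx
    sigma = np.std(np.concatenate([noise.real, noise.imag]))   # per-quadrature std
    a = np.mean(np.abs(tx.real))               # distance to boundary (1/sqrt(2))
    return 0.5 * erfc(a / (sigma * sqrt(2)))   # Gray-coded QPSK bit-error probability

# Illustrative use with synthetic data at roughly 10 dB SNR per symbol.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, (10000, 2))
tx = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / sqrt(2)
rx = tx + rng.normal(0, 0.22, tx.size) + 1j * rng.normal(0, 0.22, tx.size)
print(f"estimated BER ~ {estimate_ber_from_pdf(rx, tx):.2e}")
```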
Abstract:
Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system. This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon-entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce. The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs, and hence a bound on the maximum leakage. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns; it efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, both in terms of efficiency and accuracy, is shown through a number of case studies found in recent literature.
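For deterministic programs, the maximum leakage under these measures is log2 of the number of feasible outputs, so any over-approximation of the output count yields an upper bound on leakage. The sketch below is a simplified illustration of that idea, assuming the two-bit patterns have already been extracted (here as equality/inequality constraints between output bits); it brute-force counts the assignments consistent with them, which is only practical for small bit widths and is not the prover- and implication-graph-based implementation described in the dissertation.

```python
from itertools import product
from math import log2

def leakage_upper_bound(n_bits, patterns):
    """Count output bit-vectors consistent with the given two-bit patterns and
    return log2(count), an upper bound on the maximum leakage (in bits) of a
    deterministic program with n_bits of observable output.
    `patterns` is a list of (i, j, relation) with relation 'eq' or 'neq'."""
    count = 0
    for bits in product((0, 1), repeat=n_bits):
        if all((bits[i] == bits[j]) == (rel == 'eq') for i, j, rel in patterns):
            count += 1
    return log2(count) if count else 0.0

# Hypothetical example: a 4-bit output where bit 0 must equal bit 1 and bit 2
# must differ from bit 3; only 4 of the 16 outputs remain feasible.
print(leakage_upper_bound(4, [(0, 1, 'eq'), (2, 3, 'neq')]))  # -> 2.0
```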
Abstract:
The purpose of this research is to investigate emerging data-security methodologies that will work with the most suitable applications in academic, industrial and commercial environments. Of the several methodologies considered for the Advanced Encryption Standard (AES), MARS, a block cipher developed by IBM, was selected. Its design takes advantage of the powerful capabilities of modern computers to allow a much higher level of performance than can be obtained from less optimized algorithms such as the Data Encryption Standard (DES). MARS is unique in combining virtually every design technique known to cryptographers in one algorithm. The thesis presents the performance and flexibility of a 128-bit cipher, which is a scaled-down version of the MARS algorithm. The cryptosystem used showed performance comparable to that of the original algorithm in speed, flexibility and security. The algorithm is considered to be very secure and robust and is expected to be implemented for most of the applications.
Abstract:
The drilling of oil and natural gas wells generates solid, liquid and gaseous waste. The solid fragments, known as cuttings, are carried to the surface by the drilling fluid. This fluid also serves to cool the bit and to maintain the internal pressure of the well, among other functions. The solid residue is highly polluting because, in addition to the drilling fluid, which contains several chemical additives harmful to the environment, it incorporates heavy metals such as lead. To minimize the residue generated, numerous techniques are currently being studied to mitigate the problems that such waste can cause to the environment, such as adding cuttings to the composition of soil-cement bricks for masonry construction, adding cuttings to the clay matrix for the manufacture of solid masonry bricks and ceramic blocks, and co-processing the cuttings in cement. The main objective of this work is therefore the incorporation of cuttings from the drilling of oil wells into the cement slurry used in the well cementing operation. The cuttings used in this study, from the Pendências formation, were milled and passed through a 100-mesh sieve. After grinding, they had a mean particle size of about 86 µm and a crystal structure containing quartz and calcite phases, characteristic of Portland cement. Cement slurries with a density of 13 lb/gal were formulated and prepared containing different concentrations of cuttings, and characterization tests according to API SPEC 10A and RP 10B were carried out. Free-water tests showed values below 5.9%, and the rheological model that best described the behavior of the mixtures was the power-law model. The results for compressive strength (10.3 MPa) and stability (Δρ < 0.5 lb/gal) were within the limits set by the operational procedures. Thus, the cuttings from the drilling operation may be used as an addition to oil-well Portland cement, in order to reuse this waste and reduce the cost of the cement slurry.
Abstract:
Motion capture is a key tool for quantitative motion analysis. Since the nineteenth century, several motion capture systems have been developed for biomechanics studies, animation, games and movies. Biomechanics and kinesiology involve and depend on knowledge from distinct fields, engineering and the health sciences, and precise human motion analysis requires knowledge from both. Didactic tools and methods are therefore needed to support research and teaching. The motion analysis and capture devices currently found on the market and in educational institutions present difficulties for didactic practice: they are hard to transport, expensive, and give the user limited freedom in data acquisition. As a result, motion analysis is either performed qualitatively or performed quantitatively in highly complex laboratories. Based on these problems, this work presents the development of a motion capture system for didactic use: a cheap, light, portable and easy-to-use device with free software. The design includes the selection of the device, the development of software for it, and tests. The developed system uses Microsoft's Kinect device, chosen for its low cost, low weight, portability and ease of use, and it delivers three-dimensional data with only one peripheral device. The proposed programs use the hardware to capture motion, store and reproduce the recordings, process the motion data and present the data graphically.
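Once three-dimensional joint positions are available from the sensor, a typical didactic processing step is to compute joint angles over time from three tracked points. The sketch below assumes hypothetical arrays of hip, knee and ankle positions (in metres) already exported from the capture software; it is not part of the system described above, only an illustration of the kind of quantitative analysis it enables.

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` formed by the segments to `proximal` and
    `distal`; each argument is an (N, 3) array of 3-D positions per frame."""
    u = proximal - joint
    v = distal - joint
    cos_a = np.sum(u * v, axis=1) / (np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Hypothetical three-frame example: the knee flexing slightly.
hip   = np.array([[0.0, 1.0, 0.0]] * 3)
knee  = np.array([[0.0, 0.5, 0.0]] * 3)
ankle = np.array([[0.00, 0.0, 0.0], [0.05, 0.0, 0.0], [0.10, 0.0, 0.0]])
print(joint_angle(hip, knee, ankle))   # roughly [180.0, 174.3, 168.7] degrees
```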
Abstract:
This study evaluates the cost efficiency of municipalities in the state of Rio Grande do Norte in the execution of basic-education spending carried out in 2011, and analyzes the determinants of their inefficiency. Two methodological approaches were used: (i) a stochastic cost frontier, and (ii) data envelopment analysis (DEA), which identifies the efficient frontier of the analyzed municipalities non-parametrically. The results show that the municipalities under review achieved low efficiency scores under the stochastic cost frontier, while under the DEA method they achieved higher scores, with nineteen of them reaching full efficiency. The results suggest that a significant portion of the Potiguar municipalities should review their administrative practices, especially the way resources are allocated. Regarding the determinants of efficiency, distinct results were observed for the two methods.
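A DEA efficiency score of the kind mentioned here is typically obtained by solving one small linear program per municipality; in the input-oriented, constant-returns-to-scale (CCR) formulation, a unit is fully efficient when no convex combination of peers can produce at least its outputs with a smaller share of its inputs. The sketch below is a generic illustration with made-up spending and outcome figures, not the study's dataset or its exact model.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA. X: (n, m) inputs, Y: (n, s) outputs per DMU.
    Returns an efficiency score in (0, 1] for each DMU (1 = efficient)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.zeros(n + 1); c[0] = 1.0                  # minimise theta
        A_ub = np.zeros((m + s, n + 1)); b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[o]                              # theta scales DMU o's inputs
        A_ub[:m, 1:] = X.T                               # sum_j lambda_j * x_j <= theta * x_o
        A_ub[m:, 1:] = -Y.T                              # sum_j lambda_j * y_j >= y_o
        b_ub[m:] = -Y[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Made-up example: spending per pupil (input) and an education outcome index (output).
X = np.array([[4.0], [5.0], [6.0], [8.0]])
Y = np.array([[60.0], [70.0], [66.0], [72.0]])
print(dea_ccr_input(X, Y).round(3))   # first unit efficient, others below 1
```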
Abstract:
Reconstructing past ocean salinity is important for assessing paleoceanographic change and therefore past climatic dynamics. Commonly, seawater salinity reconstruction is based on foraminiferal oxygen isotope ratios combined with sea surface temperature reconstructions. However, this approach relies on multiple proxies, resulting in relatively large uncertainty and, consequently, relatively low accuracy of the salinity estimates. An alternative tool for past ocean salinity reconstruction is the hydrogen isotope composition of long-chain (C37) alkenones (δDalkenone). Here, we applied δDalkenone to a 39 ka long coastal sediment record from the eastern South African continental shelf in the Mozambique Channel, close to the Zambezi River mouth. Despite changes in global seawater δD related to glacial–interglacial ice volume effects, no clear changes were observed in the δDalkenone record throughout the entire 39 ka. The BIT index record from the same core showed high BIT values during the glacial and low values during the Holocene. This indicates a more pronounced freshwater influence at the core location during the glacial, resulting in alkenones depleted in deuterium during that time, and thereby explains the lack of a clear glacial–interglacial alkenone δD shift. The correlation between the BIT index and δDalkenone during the glacial period suggests that increased continental runoff potentially changed the growth conditions of the alkenone-producing haptophytes, promoting coastal haptophyte species with generally more enriched δDalkenone values. We therefore suggest that the application of δDalkenone for reconstructing past salinity in coastal settings may be complicated by changes in the alkenone-producing haptophyte community.
Abstract:
Topological quantum error correction codes are currently among the most promising candidates for efficiently dealing with the decoherence effects inherently present in quantum devices. Numerically, their theoretical error threshold can be calculated by mapping the underlying quantum problem to a related classical statistical-mechanical spin system with quenched disorder. Here, we present results for the general fault-tolerant regime, where we consider both qubit and measurement errors. However, unlike in previous studies, here we vary the strength of the different error sources independently. Our results highlight peculiar differences between toric and color codes. This study complements previous results published in New J. Phys. 13, 083006 (2011).
Abstract:
The most recent forecasts concerning the water "crisis" indicate that by the year 2025 almost two thirds of humanity may suffer from a shortage of drinking water, affecting practically every country in the world, including developed countries, unless demand is reduced and/or new sources of drinking water are developed. Given the current worldwide scarcity of drinking water, due to climate change, poor management and exploitation of water resources, and growing demand for water for agriculture and for domestic and industrial consumption, seawater desalination presents itself as a safe and reliable solution to this problem. This dissertation describes the various seawater desalination techniques in current use, as well as new technologies and future applications. Of all the techniques, Reverse Osmosis (RO) is treated in greatest detail in this study, because it is among the most widely used techniques worldwide and because it is the most promising in cost-benefit terms, even in regions where its application was previously unthinkable. The dissertation focuses on important aspects of the desalination process, namely the environmental impacts and the cost-benefit relationship associated with the process, and future prospects. Finally, the desalination process is analysed as applied to the situation of Cabo Verde, where drinking-water resources are scarce and desalination is the main source of drinking water for public supply.