6 results for P-Sequential Space
in Aston University Research Archive
Abstract:
The aim of this work is the implementation of a low-temperature reforming (LT reforming) unit downstream of the Haloclean pyrolyser in order to enhance the heating value of the pyrolysis gas. Attaining synthesis-gas quality for further use was outside the focus of this work. Temperatures between 400 °C and 500 °C were applied. A commercial nickel-based pre-reforming catalyst from Südchemie was chosen for LT reforming. Wheat straw was used as the biogenic feedstock. Pyrolysis of wheat straw at 450 °C by means of Haloclean pyrolysis leads to 28% char, 50% condensate and 22% gas. The condensate separates into a water phase and an organic phase. The organic phase is liquid but contains viscous compounds. These compounds could undergo ageing and could lead to solid tars, which can cause post-processing problems. The implementation of a catalytic reformer is therefore not only of interest from an energetic point of view; it is generally interesting for tar-conversion purposes downstream of pyrolysis applications. By using a fixed-bed reforming unit at 450–490 °C and space velocities of about 3000 l/h, the pyrolysis gas volume flow could be increased by about 58%. This corresponds to a decrease in the condensate yields, by means of catalysis, of up to 17%; the char yield remains unchanged, since the pyrolysis conditions are the same. The heating value of the pyrolysis gas could be increased by a factor of 1.64. Hydrogen concentrations of up to 14% could be realised.
Abstract:
Classification is the most basic method for organizing resources in the physical space, cyber space, socio space and mental space. Creating a unified model that can effectively manage resources in these different spaces is a challenge. The Resource Space Model (RSM) manages versatile resources through a multi-dimensional classification space and supports generalization and specialization on multi-dimensional classifications. This paper introduces the basic concepts of RSM and proposes the Probabilistic Resource Space Model, P-RSM, to deal with uncertainty in managing various resources across the different spaces of the cyber-physical society. P-RSM's normal forms, operations and integrity constraints are developed to support effective management of the resource space. Characteristics of P-RSM are analyzed through experiments. The model also enables various services to be described, discovered and composed from multiple dimensions and abstraction levels with normal-form and integrity guarantees. Some extensions and applications of P-RSM are introduced.
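The core idea of a probabilistic classification space can be illustrated with a small sketch. This is a minimal toy model assuming the abstract's description only; the class and method names (`ProbabilisticResourceSpace`, `place`, `select`) are illustrative assumptions, not the paper's actual operations or API.

```python
class ProbabilisticResourceSpace:
    """Resources indexed by coordinates in a multi-dimensional
    classification space, each placement carrying a membership
    probability in [0, 1] to express classification uncertainty."""

    def __init__(self, dimensions):
        self.dimensions = dimensions   # e.g. ["topic", "medium"]
        self.points = {}               # coordinate tuple -> {resource: probability}

    def place(self, resource, coord, prob):
        assert len(coord) == len(self.dimensions)
        assert 0.0 <= prob <= 1.0
        self.points.setdefault(coord, {})[resource] = prob

    def select(self, coord, threshold=0.5):
        """Point query: resources classified at `coord` with
        membership probability at least `threshold`."""
        return {r for r, p in self.points.get(coord, {}).items()
                if p >= threshold}

space = ProbabilisticResourceSpace(["topic", "medium"])
space.place("paper-42", ("physics", "pdf"), 0.9)    # confident placement
space.place("paper-42", ("chemistry", "pdf"), 0.3)  # uncertain placement
space.place("note-7", ("physics", "pdf"), 0.6)

print(space.select(("physics", "pdf")))    # both resources pass 0.5
print(space.select(("chemistry", "pdf")))  # paper-42 falls below 0.5
```

Lowering the threshold in `select` recovers the uncertain placements, which is the kind of query a deterministic classification space cannot express.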
Abstract:
We describe a free-space quantum cryptography system designed to allow continuous unattended key exchanges for periods of several days, and over ranges of a few kilometres. The system uses a four-laser faint-pulse transmission system running at a pulse rate of 10 MHz to generate the four required alternative polarization states. The receiver module similarly automatically selects a measurement basis and performs polarization measurements with four avalanche photodiodes. The controlling software can implement the full key exchange, including the sifting, error correction and privacy amplification required to generate a secure key.
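The sifting step mentioned above can be sketched in a few lines. This is a toy simulation of basis sifting in a four-state (BB84-style) protocol over an ideal channel; it is an assumption-level illustration, not the paper's implementation, and it omits the error correction and privacy amplification the abstract also names.

```python
import random

def bb84_sift(n_pulses, seed=0):
    """Simulate basis sifting: sender and receiver each pick a random
    basis per pulse; only pulses where the bases agree contribute a
    sifted key bit (on an ideal channel the bit arrives intact)."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_pulses)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_pulses)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_pulses)]

    # Mismatched-basis measurements give random outcomes, so they are
    # announced and discarded; matching-basis bits form the sifted key.
    sifted = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
    return sifted

key = bb84_sift(1000)
print(len(key))   # roughly half the pulses survive sifting
```

Since the bases agree with probability 1/2, a 10 MHz pulse rate yields on the order of half that rate in sifted bits before detection losses, which is why faint-pulse systems run the source as fast as the detectors allow.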
Abstract:
A ground-based laser system for space-debris cleaning will use powerful laser pulses that can self-focus while propagating through the atmosphere. We demonstrate that for the relevant laser parameters, this self-focusing can noticeably decrease the laser intensity on the target. We show that the detrimental effect can be, to a great extent, compensated for by applying the optimal initial beam defocusing. The effect of laser elevation on the system performance is discussed.
Abstract:
This research focuses on automatically adapting a search engine's size in response to fluctuations in query workload. Deploying a search engine in an Infrastructure as a Service (IaaS) cloud facilitates allocating computer resources to, or deallocating them from, the engine. Our solution is an adaptive search engine that repeatedly re-evaluates its load and, when appropriate, switches over to a different number of active processors. We focus on three aspects, broken out into three sub-problems: Continually determining the Number of Processors (CNP), the New Grouping Problem (NGP) and the Regrouping Order Problem (ROP). CNP is the problem of determining, in the light of changes in the query workload, the ideal number of processors p to have active at any given time. NGP arises once a change in the number of processors has been decided: it must then be determined how the groups of search data will be distributed across the processors. ROP is the problem of redistributing this data onto the processors while keeping the engine responsive and minimising both the switchover time and the incurred network load. We propose solutions for these sub-problems. For NGP we propose an algorithm for incrementally adjusting the index to fit the varying number of virtual machines. For ROP we present an efficient method for redistributing data among processors while keeping the search engine responsive. For CNP, we propose an algorithm that determines the new size of the search engine by re-evaluating its load. We tested the solution's performance using a custom-built prototype search engine deployed in the Amazon EC2 cloud. Our experiments show that, compared with computing the index from scratch, the incremental NGP algorithm speeds up index computation 2–10 times while maintaining similar search performance.
The chosen redistribution method is 25% to 50% faster than other methods and reduces the network load by around 30%. For CNP we present a deterministic algorithm that shows a good ability to determine the new size of the search engine. Combined, these algorithms yield an adaptive algorithm able to adjust the search engine's size under a variable workload.
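The CNP idea of re-evaluating load and switching processor counts can be sketched as a simple deterministic controller. All numbers here (per-processor capacity, target utilisation, hysteresis band) are illustrative assumptions, not values from the thesis; the sketch only shows the shape of such a controller.

```python
import math

def processors_needed(queries_per_sec, capacity_per_proc=100.0,
                      target_utilisation=0.7, p_min=1, p_max=64):
    """Smallest processor count p that keeps per-processor utilisation
    at or below the target, clamped to the allowed range."""
    p = math.ceil(queries_per_sec / (capacity_per_proc * target_utilisation))
    return max(p_min, min(p_max, p))

def step(current_p, measured_load, hysteresis=1):
    """Re-evaluate the load and switch only when the ideal count moves
    outside a hysteresis band around the current one, so that small
    workload fluctuations do not trigger costly regroupings."""
    ideal = processors_needed(measured_load)
    if abs(ideal - current_p) > hysteresis:
        return ideal
    return current_p

print(processors_needed(350))   # 5 processors at 70% target utilisation
print(step(5, 360))             # small fluctuation: stays at 5
print(step(5, 900))             # large jump: scales up to 13
```

The hysteresis band is the design choice that matters: every switch triggers the NGP/ROP machinery (regrouping the index and redistributing data), so the controller should be deliberately reluctant to change size.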
Abstract:
We develop a framework for estimating the quality of transmission (QoT) of a new lightpath before it is established, as well as for calculating the expected degradation it will cause to existing lightpaths. The framework correlates the QoT metrics of established lightpaths, which are readily available from coherent optical receivers that can be extended to serve as optical performance monitors. Past similar studies used only space (routing) information and thus neglected spectrum, and they focused on old-generation non-coherent networks. The proposed framework accounts for correlation in both the space and spectrum domains and can be applied to both fixed-grid wavelength division multiplexing (WDM) and elastic optical networks. It is based on a graph transformation that exposes and models the interference between spectrum-neighboring channels. Our results indicate that our QoT estimates are very close to the actual performance data, that is, to having perfect knowledge of the physical layer. The proposed estimation framework is shown to provide up to 4 × 10⁻² lower pre-forward-error-correction bit error ratio (BER) compared to the worst-case interference scenario, which overestimates the BER. The higher accuracy can be harvested when lightpaths are provisioned with low margins; our results showed up to a 47% reduction in required regenerators, a substantial saving in equipment cost.
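The space-and-spectrum correlation idea can be illustrated with a small sketch: estimate a new lightpath's BER from the monitored BER of established lightpaths that both share a fibre link (space) and sit on an adjacent channel (spectrum). The interference model here (worst observed neighbour BER plus a per-neighbour penalty) is a stand-in assumption for illustration only, not the paper's actual graph transformation.

```python
def estimate_ber(new_links, new_channel, established,
                 clean_floor=1e-5, penalty=1e-4):
    """new_links: set of fibre-link ids for the candidate lightpath.
    established: list of (links: set, channel: int, measured_ber: float)
    tuples reported by the coherent receivers acting as monitors.
    A lightpath is a spectrum neighbour if it shares at least one link
    AND occupies an adjacent channel slot."""
    neighbours = [ber for links, ch, ber in established
                  if links & new_links and abs(ch - new_channel) == 1]
    # Correlate with what the monitors actually see on adjacent
    # channels, rather than assuming worst-case interference everywhere.
    return max(neighbours, default=clean_floor) + penalty * len(neighbours)

established = [
    ({"A-B", "B-C"}, 3, 2.0e-4),  # shares links A-B, B-C; adjacent channel
    ({"C-D"},        7, 5.0e-4),  # disjoint in both space and spectrum
]
print(estimate_ber({"A-B", "B-C"}, 4, established))  # ~3.0e-4: one neighbour counted
```

The worst-case alternative would charge the new lightpath for interference on every shared link regardless of spectral distance; filtering by channel adjacency is what lets the estimate stay close to the measured physical layer and supports low-margin provisioning.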