908 results for Static-order-trade-off


Relevance:

100.00%

Publisher:

Abstract:

Currently, the main source for the production of liquid transportation fuels is petroleum, the continued use of which faces many challenges, including depleting oil reserves, significant oil price rises, and environmental concerns over global warming, which is widely attributed to CO2 and other greenhouse gas emissions derived from fossil fuels. In this respect, lignocellulosic or plant biomass is a particularly interesting resource, as it is the only renewable source of organic carbon that can be converted into liquid transportation fuels. The gasification of biomass produces syngas, which can then be converted into synthetic liquid hydrocarbon fuels by means of Fischer-Tropsch (FT) synthesis. This process is widely considered an attractive option for producing clean liquid hydrocarbon fuels from biomass, fuels that have been identified as promising alternatives to conventional fossil fuels such as diesel and kerosene. The product composition in FT synthesis is influenced by the type of catalyst and the reaction conditions used in the process. One of the issues facing this conversion process is the development of a technology that can be scaled down, including operation at lower pressures, to match the scattered nature of biomass resources without compromising liquid product composition. The primary aims of this work were to experimentally explore FT synthesis at low pressures for the purpose of process down-scaling and cost reduction, and to investigate the potential for obtaining an intermediate FT synthetic crude liquid product that can be integrated into existing refineries under the range of process conditions employed. Two different fixed-bed micro-reactors were used for FT synthesis: a 2 cm3 reactor at the Federal University of Rio de Janeiro (UFRJ) and a 20 cm3 reactor at Aston University. The experimental work first involved the selection of a suitable catalyst from the three that were available. A parameter study was then carried out on the 20 cm3 reactor using the selected catalyst to investigate the influence of reactor temperature, reactor pressure, space velocity, the H2/CO molar ratio in the feed syngas and catalyst loading on the reaction performance, measured as CO conversion, catalyst stability, product distribution, product yields and liquid hydrocarbon product composition. From this parameter study a set of preferred operating conditions was identified for low-pressure FT synthesis. The three catalysts were characterised using BET, XRD, TPR and SEM; the catalyst selected was an unpromoted Co/Al2O3 catalyst. FT synthesis runs on the 20 cm3 reactor at Aston were conducted for 48 hours. Permanent gases and light hydrocarbons (C1-C5) were analysed by online GC-TCD/FID at hourly intervals, and the liquid hydrocarbons collected were analysed offline by GC-MS to determine fuel composition. The parameter study showed that CO conversion and liquid hydrocarbon yields increase with increasing reactor pressure up to around 8 bar, above which the effect of pressure is small. The parameters with the most significant influence on CO conversion, product selectivity and liquid hydrocarbon yields were reactor temperature and catalyst loading. The preferred reaction conditions identified for this research were: T = 230 °C, P = 10 bar, H2/CO = 2.0, WHSV = 2.2 h-1, and catalyst loading = 2.0 g.
Operation in the low range of pressures studied resulted in low CO conversions and liquid hydrocarbon yields, indicating that low-pressure BTL-FT operation may not be industrially viable: the trade-off of lower CO conversions and once-through liquid hydrocarbon yields has to be weighed carefully against the potential cost savings of operating the process at lower pressures.
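As an illustration of the performance measures used in the parameter study above, the following minimal sketch computes CO conversion, carbon selectivity and WHSV from hypothetical GC-derived molar flow rates; the function names and numbers are illustrative assumptions, not the thesis's actual data-reduction code.

```python
# Illustrative calculation of the FT performance measures discussed above.
# Assumes inlet/outlet molar flow rates (mol/h) are available from the online
# GC-TCD/FID analysis; names and numbers are hypothetical.

def co_conversion(f_co_in: float, f_co_out: float) -> float:
    """Fractional CO conversion: (CO_in - CO_out) / CO_in."""
    return (f_co_in - f_co_out) / f_co_in

def carbon_selectivity(f_co_in: float, f_co_out: float,
                       product_carbon_flows: dict[str, float]) -> dict[str, float]:
    """Selectivity of each product on a carbon-atom basis, relative to the CO converted."""
    co_converted = f_co_in - f_co_out
    return {name: c_flow / co_converted
            for name, c_flow in product_carbon_flows.items()}

def whsv(feed_mass_flow_g_per_h: float, catalyst_mass_g: float) -> float:
    """Weight hourly space velocity in h^-1."""
    return feed_mass_flow_g_per_h / catalyst_mass_g

if __name__ == "__main__":
    x_co = co_conversion(f_co_in=1.00, f_co_out=0.72)   # hypothetical 28% conversion
    sel = carbon_selectivity(1.00, 0.72, {"CH4": 0.03, "C2-C4": 0.05, "C5+": 0.20})
    print(f"CO conversion: {x_co:.1%}")
    print({k: f"{v:.1%}" for k, v in sel.items()})
    print(f"WHSV: {whsv(4.4, 2.0):.1f} h^-1")           # 2.2 h^-1 with a 2.0 g catalyst load
```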

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a large-scale numerical investigation of heterogeneous terrestrial optical communications systems and of the upgrade of fourth-generation terrestrial core-to-metro legacy interconnects to fifth-generation transmission system technologies. Retrofitting (without changing the installed infrastructure) is considered for commercial applications. ROADMs are crucial enabling components for future core network developments; however, their re-routing ability means signals can be switched mid-link onto sub-optimally configured paths, which raises new challenges in network management. System performance is determined by a trade-off between nonlinear impairments and noise, where the nonlinear signal distortions depend critically on the deployed dispersion maps. This thesis presents a comprehensive numerical investigation into the implementation of phase-modulated signals in transparent, reconfigurable, wavelength division multiplexed, heterogeneous terrestrial fibre optic communication networks. A key issue during system upgrades is whether differential phase encoded modulation formats are compatible with the cost-optimised dispersion schemes employed in current 10 Gb/s systems. We explore how robust transmission is to inevitable variations in the dispersion mapping and how large the margins are when suboptimal dispersion management is applied. We show that a DPSK transmission system is not drastically affected by reconfiguration from periodic dispersion management to lumped dispersion mapping. A novel DPSK dispersion map optimisation methodology is also presented which drastically reduces the optimisation parameter space and the number of ways to deploy dispersion maps, alleviating the strenuous computing requirements of the optimisation calculations. This thesis thereby provides a very efficient and robust way to identify high-performing lumped dispersion compensating schemes for use in heterogeneous RZ-DPSK terrestrial meshed networks with ROADMs. A modified search algorithm which further reduces the number of configuration combinations is also presented. The results of an investigation into the feasibility of detouring signals locally in multi-path heterogeneous ring networks are also presented.
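As a hedged illustration of how a lumped dispersion map can be described by only a few scalars, shrinking the optimisation parameter space, the following toy sketch screens candidate (pre-compensation, residual-dispersion-per-span) pairs for an acceptable net residual at the receiver. The two-parameter description, the values and the tolerance are assumptions made for illustration, not the thesis's actual methodology.

```python
# A minimal sketch of shrinking a dispersion-map search space: rather than
# optimising every span's compensation independently, a lumped map is described
# by two scalars (pre-compensation and residual dispersion per span), and the
# candidates are screened for an acceptable net residual at the receiver.
# Parameterisation, values and tolerance are illustrative assumptions.

from itertools import product

N_SPANS = 20
TARGET_NET = 0.0          # ps/nm net accumulated dispersion wanted at the receiver
TOLERANCE = 100.0         # ps/nm acceptance window

def net_residual(pre_comp, residual_per_span):
    """Accumulated dispersion at the receiver for a lumped map."""
    return pre_comp + N_SPANS * residual_per_span

pre_compensation = range(-1000, 1, 100)          # ps/nm
residual_per_span = range(-100, 101, 10)         # ps/nm

shortlist = [(pre, res) for pre, res in product(pre_compensation, residual_per_span)
             if abs(net_residual(pre, res) - TARGET_NET) <= TOLERANCE]

print(f"{len(shortlist)} candidate maps out of "
      f"{len(pre_compensation) * len(residual_per_span)} pass the screening")
```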

Relevance:

100.00%

Publisher:

Abstract:

Pervasive environments are characterised by highly heterogeneous services and mobile devices with dynamic availability. Approaches such as that proposed by the Connect project provide means to enable such systems to be discovered and composed, through mediation where necessary. As services appear and disappear, the set of feasible compositions changes. In such a pervasive environment, a designer encounters two related challenges: which goals it is reasonable to pursue in the current context, and how to use the services presently available to achieve those goals. This paper proposes an approach to designing service compositions, facilitating an interactive process to find the trade-off between the possible and the desirable. Following our approach, the system finds at runtime, where possible, compositions related to the developer's requirements. This process can realise the intent the developer specifies at design time, taking into account the services available at runtime, without a prohibitive level of pre-specification, which would be inappropriate for such dynamic environments. © 2012 ACM.
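A toy sketch of the runtime matching step described above, under the assumption that a goal can be modelled as a set of required capabilities and each available service as a set of provided capabilities; the capability-set model and the registry below are hypothetical, not the paper's composition formalism.

```python
# Toy sketch: enumerate combinations of currently available services whose
# combined capabilities cover the designer's goal.  The capability-set model
# is an illustrative assumption.

from itertools import combinations

def feasible_compositions(goal: set, services: dict):
    """Yield combinations of service names (smallest first) whose combined
    capabilities cover the goal."""
    names = list(services)
    for size in range(1, len(names) + 1):
        for combo in combinations(names, size):
            provided = set().union(*(services[n] for n in combo))
            if goal <= provided:
                yield combo

available = {                       # hypothetical runtime registry
    "gps":      {"location"},
    "weather":  {"forecast"},
    "display":  {"render"},
}
goal = {"location", "render"}
# ('gps', 'display') is yielded first; the redundant 3-service superset follows
print(list(feasible_compositions(goal, available)))
```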

Relevance:

100.00%

Publisher:

Abstract:

It is desirable that energy performance improvement is not achieved at the expense of other network performance parameters. This paper investigates the trade-off between energy efficiency, spectral efficiency and user QoS performance for a multi-cell, multi-user radio access network. Specifically, the energy consumption ratio (ECR) and the spectral efficiency of several common frequency-domain packet schedulers in a cellular E-UTRAN downlink are compared for both the SISO transmission mode and the 2x2 Alamouti Space Frequency Block Code (SFBC) MIMO transmission mode. It is well known that the 2x2 SFBC MIMO transmission mode is more spectrally efficient than the SISO transmission mode; however, the relationship between energy efficiency and spectral efficiency has not been established. It is shown that, for the E-UTRAN downlink with fixed transmission power, spectral efficiency improvement results in energy efficiency improvement. The effect of SFBC MIMO versus SISO on user QoS performance is also studied. © 2011 IEEE.
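For concreteness, the following back-of-the-envelope sketch shows how the two quantities compared above are related under a fixed transmission power: the energy consumption ratio (taken here as energy per delivered bit) falls as spectral efficiency rises. The throughput figures are hypothetical, not results from the paper.

```python
# Fixed transmission power: any spectral-efficiency gain translates directly
# into a lower energy per delivered bit.  Figures are hypothetical.

def spectral_efficiency(throughput_bps: float, bandwidth_hz: float) -> float:
    return throughput_bps / bandwidth_hz                    # bit/s/Hz

def energy_consumption_ratio(power_w: float, throughput_bps: float) -> float:
    return power_w / throughput_bps                         # joules per bit

BANDWIDTH = 10e6       # 10 MHz E-UTRAN carrier (illustrative)
TX_POWER = 40.0        # fixed downlink power, watts (illustrative)

for mode, throughput in {"SISO": 25e6, "2x2 SFBC": 32e6}.items():
    se = spectral_efficiency(throughput, BANDWIDTH)
    ecr = energy_consumption_ratio(TX_POWER, throughput)
    print(f"{mode:9s}  SE = {se:.2f} bit/s/Hz   ECR = {ecr * 1e9:.2f} nJ/bit")
```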

Relevance:

100.00%

Publisher:

Abstract:

Through direct modeling, a reduction of pattern-dependent errors in a standard fiber-based transmission link at a 40 Gbit/s data rate is demonstrated by application of a skewed data pre-encoding. The trade-off between the improvement in the bit error rate and the loss in the data rate is examined. © 2007 Optical Society of America.

Relevance:

100.00%

Publisher:

Abstract:

Through modelling based on direct error computation, a reduction of pattern-dependent errors in a standard fiber-based transmission link at a 40 Gb/s data rate is demonstrated by application of a skewed data pre-encoding. The trade-off between the bit-error-rate improvement and the data-rate loss is examined.
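One way to quantify the data-rate side of this trade-off, assuming that the skewed pre-encoding amounts to biasing the probability of marks in the transmitted stream (an assumption made here for illustration only), is through the binary entropy of the transmitted symbols:

```python
# If the pre-encoder skews the transmitted bit statistics so that marks appear
# with probability p < 0.5, the information carried per transmitted bit drops
# to the binary entropy H(p), which bounds the loss in data rate.  Treating the
# skew as a simple mark-probability bias is an assumption for illustration;
# the papers' actual encoding scheme may differ.

from math import log2

def binary_entropy(p: float) -> float:
    """H(p) in information bits per transmitted symbol."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

LINE_RATE_GBPS = 40.0
for p_mark in (0.5, 0.4, 0.3, 0.2):
    net_rate = LINE_RATE_GBPS * binary_entropy(p_mark)
    print(f"mark probability {p_mark:.1f}: net information rate {net_rate:5.1f} Gb/s")
```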

Relevance:

100.00%

Publisher:

Abstract:

Markets are useful mechanisms for performing resource allocation in fully decentralised computational and other systems, since they can possess a range of desirable properties, such as efficiency, decentralisation, robustness and scalability. In this paper we investigate the behaviour of co-evolving evolutionary market agents as adaptive offer generators for sellers in a multi-attribute posted-offer market. We demonstrate that the evolutionary approach enables sellers to automatically position themselves in market niches, created by heterogeneous buyers. We find that a trade-off exists for the evolutionary sellers between maintaining high population diversity to facilitate movement between niches and low diversity to exploit the current niche and maximise cumulative payoff. We characterise the trade-off from the perspective of the system as a whole, and subsequently from that of an individual seller. Our results highlight a decision on risk aversion for resource providers, but crucially we show that rational self-interested sellers would not adopt the behaviour likely to lead to the ideal result from the system point of view.
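The following toy sketch illustrates, for a single seller with a two-attribute offer (price, quality), how an evolutionary offer generator with a small mutation step exploits the current niche while a larger step maintains the diversity needed to move between niches. The buyer model, fitness function and mutation scheme are illustrative assumptions; the paper's co-evolutionary multi-seller setup is richer.

```python
# Toy evolutionary offer generator for one seller in a two-attribute
# posted-offer market.  Buyer model, fitness and mutation are illustrative
# assumptions.  A larger sigma keeps diversity up (exploration between niches),
# a smaller sigma exploits the current niche.

import random

def buyer_payoff(price, quality):
    """Hypothetical buyer niche: willing to pay up to 1.0 for quality near 0.8."""
    demand = max(0.0, 1.0 - abs(quality - 0.8)) * (1.0 if price <= 1.0 else 0.0)
    cost = 0.5 * quality
    return demand * (price - cost)

def evolve(generations=200, pop_size=20, sigma=0.05):
    pop = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda offer: buyer_payoff(*offer), reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [
            (max(0.0, p + random.gauss(0, sigma)),
             min(1.0, max(0.0, q + random.gauss(0, sigma))))
            for p, q in parents
        ]
    return max(pop, key=lambda offer: buyer_payoff(*offer))

random.seed(1)
print("exploitative seller (sigma=0.01):", evolve(sigma=0.01))
print("exploratory seller  (sigma=0.20):", evolve(sigma=0.20))
```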

Relevance:

100.00%

Publisher:

Abstract:

The concern over the quality of delivering video streaming services in mobile wireless networks is addressed in this work. A framework that enhances the Quality of Experience (QoE) of end users through a quality-driven resource allocation scheme is proposed. An objective no-reference quality metric, Pause Intensity (PI), plays a key role: it is adopted to derive a resource allocation algorithm for video streaming. The framework is examined in the context of 3GPP Long Term Evolution (LTE) systems. The requirements and structure of the proposed PI-based framework are discussed, and results are compared with existing scheduling methods in terms of fairness, efficiency and correlation (between the required and allocated data rates). Furthermore, it is shown that the proposed framework can produce a trade-off between the three parameters through the QoE-aware resource allocation process.
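The sketch below uses a simplified stand-in for a pause-based continuity metric, combining the fraction of time spent paused with the pause frequency; it is not the exact Pause Intensity definition from the paper, only an illustration of the kind of quantity such a framework consumes.

```python
# Simplified stand-in for a pause-based continuity score (NOT the paper's
# exact Pause Intensity definition): the fraction of session time spent paused
# multiplied by how often pauses occur.  0 means continuous playback; larger
# values mean worse experience.

def pause_score(pause_durations_s, session_length_s):
    paused_fraction = sum(pause_durations_s) / session_length_s          # duration effect
    pauses_per_minute = 60.0 * len(pause_durations_s) / session_length_s  # frequency effect
    return paused_fraction * pauses_per_minute

# Two hypothetical 60 s sessions with the same total pause time: many short
# pauses score worse than one long pause.
print(pause_score([5.0], 60.0))
print(pause_score([1.0, 1.0, 1.0, 1.0, 1.0], 60.0))
```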

Relevance:

100.00%

Publisher:

Abstract:

A real-time adaptive resource allocation algorithm that considers the end user's Quality of Experience (QoE) in the context of video streaming services is presented in this work. An objective no-reference quality metric, namely Pause Intensity (PI), is used to control the priority of resource allocation to users during the scheduling process. An online adjustment has been introduced to adaptively set the scheduler's parameter and maintain a desired trade-off between fairness and efficiency. The correlation between the data rates (i.e. video code rates) demanded by users and the data rates allocated by the scheduler is taken into account as well. The final allocated rates are determined based on the channel status, the distribution of PI values among users, and the scheduling policy adopted. Furthermore, since the user's capability varies as the environmental conditions change, the rate adaptation mechanism for video streaming is considered and its interaction with the scheduling process under the same PI metric is studied. The feasibility of implementing this algorithm is examined, and the results are compared with the most commonly used scheduling methods.
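A hedged sketch of a QoE-weighted scheduling priority of the kind described above: a proportional-fair core metric is scaled by a pause-based QoE penalty raised to a tunable exponent, which acts as the fairness-efficiency knob. The weighting form, parameter names and figures are assumptions, not the paper's exact algorithm.

```python
# QoE-weighted scheduling priority (illustrative form, not the paper's exact
# algorithm): alpha = 0 ignores QoE (pure channel-driven efficiency); larger
# alpha pushes resources towards users with poor playback continuity.

def scheduling_priority(instantaneous_rate, average_rate, qoe_penalty, alpha):
    """Proportional-fair core (r / R_avg) scaled by a QoE penalty term."""
    return (instantaneous_rate / average_rate) * (1.0 + qoe_penalty) ** alpha

users = {                      # hypothetical per-user state
    "u1": dict(instantaneous_rate=12e6, average_rate=8e6, qoe_penalty=0.05),
    "u2": dict(instantaneous_rate=6e6,  average_rate=5e6, qoe_penalty=0.60),
}
for alpha in (0.0, 1.0, 3.0):
    chosen = max(users, key=lambda u: scheduling_priority(alpha=alpha, **users[u]))
    print(f"alpha={alpha}: schedule {chosen}")
```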

Relevance:

100.00%

Publisher:

Abstract:

A hybrid passive-active damping solution with improved system stability margin and enhanced dynamic performance is proposed for high-power grid-interactive converters. In grid-connected active rectifier/inverter applications, a line-side LCL filter improves the high-frequency attenuation and makes the converter compatible with the stringent grid power quality regulations. Passive damping offers a simple and reliable solution but reduces overall converter efficiency. Active damping solutions do not increase the system losses but can only guarantee stable operation up to a certain speed of dynamic response, which is limited by the maximum bandwidth of the current controller. This paper examines this limit and introduces the concept of a hybrid passive-active damping solution with improved stability margin and high dynamic performance for line-side LCL filter based active rectifier/inverter applications. A detailed design and analysis of the hybrid approach, and the trade-off between system losses and dynamic performance in grid-connected applications, are reported. Simulation and experimental results from a 10 kVA prototype demonstrate the effectiveness of the proposed solution. An analytical study of system stability and dynamic response under variations of the controller and passive filter parameters is also presented.
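For reference, the standard textbook relations for the line-side LCL filter mentioned above are its resonance frequency and a common rule-of-thumb passive damping resistor (about one third of the filter-capacitor impedance at resonance); the component values in the sketch are hypothetical, not those of the 10 kVA prototype.

```python
# Standard LCL filter relations: resonance frequency and a rule-of-thumb
# passive damping resistor.  Component values are illustrative only.

from math import pi, sqrt

def lcl_resonance_hz(l_conv, l_grid, c_filt):
    """f_res = (1 / 2*pi) * sqrt((L1 + L2) / (L1 * L2 * Cf))"""
    return sqrt((l_conv + l_grid) / (l_conv * l_grid * c_filt)) / (2 * pi)

def passive_damping_resistor(f_res_hz, c_filt):
    """Rd ~ 1 / (3 * omega_res * Cf): a common passive-damping starting point."""
    return 1.0 / (3 * 2 * pi * f_res_hz * c_filt)

L_CONV, L_GRID, C_FILT = 2.3e-3, 0.9e-3, 10e-6     # H, H, F (hypothetical)
f_res = lcl_resonance_hz(L_CONV, L_GRID, C_FILT)
print(f"LCL resonance: {f_res:.0f} Hz, "
      f"damping resistor ~ {passive_damping_resistor(f_res, C_FILT):.1f} ohm")
```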

Relevance:

100.00%

Publisher:

Abstract:

A novel kind of Airy-based pulse with invariant propagation in lossy dispersive media is proposed. The principle is based on an optical energy trade-off between different parts of the pulse, driven by chromatic dispersion, which is used to compensate for the attenuation losses of the propagation medium. Although the ideal version of the proposed pulses implies infinite pulse energy, numerical simulations show that practical finite-energy pulses can be designed to obtain partially invariant propagation over a finite propagation distance.
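A minimal numerical sketch of the kind of simulation mentioned above: a finite-energy Airy input, Ai(t/T0) exp(a t/T0), propagated through a lossy, purely dispersive fibre by applying the linear dispersion and loss operators in the frequency domain. Parameter values and the truncation factor a are illustrative assumptions; the construction of the proposed loss-compensating pulses themselves is not reproduced here.

```python
# Linear propagation of a finite-energy Airy pulse in a lossy dispersive fibre,
# solved in the frequency domain.  Values are illustrative; the conventional
# engineering sign convention for the group-velocity dispersion is used.

import numpy as np
from scipy.special import airy

T0 = 1.0e-12                 # characteristic time, 1 ps
a = 0.05                     # exponential truncation -> finite energy
beta2 = -21e-27              # GVD, s^2/m (anomalous dispersion)
alpha = 0.046e-3             # power attenuation, 1/m (~0.2 dB/km)

t = np.linspace(-100, 60, 8192) * T0                  # time grid
A0 = airy(t / T0)[0] * np.exp(a * t / T0)             # Ai(t/T0) * exp(a t/T0)

def propagate(A, z):
    """Apply dispersion and loss to the envelope spectrum over distance z."""
    w = 2 * np.pi * np.fft.fftfreq(t.size, d=t[1] - t[0])
    H = np.exp(0.5j * beta2 * w**2 * z - 0.5 * alpha * z)
    return np.fft.ifft(np.fft.fft(A) * H)

for z_km in (0, 10, 20):
    A = propagate(A0, z_km * 1e3)
    print(f"z = {z_km:2d} km: peak |A|^2 = {np.max(np.abs(A))**2:.3f}")
```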

Relevance:

100.00%

Publisher:

Abstract:

This research is focused on the optimisation of resource utilisation in wireless mobile networks, taking into consideration the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated. These include video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment have been used in the validation tests. It has been shown that Pause Intensity is closely correlated with the subjective quality measurement in terms of the Mean Opinion Score, and that this correlation is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment where the proposed research framework is examined, and the results are compared with existing scheduling methods in terms of the achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users by considering their perceived quality of the services received. Meanwhile, a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE. Finally, work on the interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading of user data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum.
The associated offloading mechanism can control the number of users within the coverage areas of the macro-cell base station and of each of the WiFi access points involved. The performance of non-seamless, user-controlled mobile traffic offloading (through mobile WiFi devices) has been evaluated and compared with that of standard operator-controlled WiFi hotspots.
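As a hedged illustration of the "fair-efficient spectrum" idea referred to above, the following sketch uses the standard alpha-fair allocation family, in which a single exponent sweeps from pure throughput maximisation (alpha = 0) through proportional fairness (alpha = 1) towards max-min fairness (large alpha). The use of alpha-fairness and the figures below are assumptions made for illustration, not the thesis's exact scheduler.

```python
# Closed-form alpha-fair split of a shared resource budget B among users whose
# rate per unit of resource is their spectral efficiency e_i (no per-user caps):
#     r_i = e_i**(1/alpha) * B / sum_j e_j**((1-alpha)/alpha)
# alpha = 1 gives each user an equal resource share (proportional fairness);
# large alpha equalises the rates (max-min); alpha = 0 maximises throughput.

def alpha_fair_rates(efficiencies, resource_budget, alpha):
    if alpha == 0.0:                          # pure efficiency: best user takes all
        best = max(range(len(efficiencies)), key=lambda i: efficiencies[i])
        return [efficiencies[i] * resource_budget if i == best else 0.0
                for i in range(len(efficiencies))]
    denom = sum(e ** ((1 - alpha) / alpha) for e in efficiencies)
    return [e ** (1 / alpha) * resource_budget / denom for e in efficiencies]

eff = [6.0, 3.0, 1.0]                         # bit/s/Hz per user (hypothetical)
for alpha in (0.0, 1.0, 2.0, 8.0):
    print(alpha, [round(r, 2) for r in alpha_fair_rates(eff, 10.0, alpha)])
```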

Relevance:

100.00%

Publisher:

Abstract:

When visual sensor networks are composed of cameras that can adjust the zoom factor of their own lens, one must determine the optimal zoom levels of the cameras for a given task. This gives rise to an important trade-off between the overlap of the different cameras' fields of view, which provides redundancy, and image quality. In an object tracking task, having multiple cameras observe the same area allows for quicker recovery when a camera fails; in contrast, narrow zooms allow for a higher pixel count on regions of interest, leading to increased tracking confidence. In this paper we propose an approach for the self-organisation of redundancy in a distributed visual sensor network, based on decentralised multi-objective online learning using only local information to approximate the global state. We explore the impact of different zoom levels on these trade-offs when omnidirectional cameras with a full 360-degree view are tasked with keeping track of a varying number of moving objects. We further show how decentralised reinforcement learning enables zoom configurations to be achieved dynamically at runtime according to an operator's preference for maximising either the proportion of objects tracked, the confidence associated with tracking, or redundancy in expectation of camera failure. We show that explicitly taking account of the level of overlap, even based only on local knowledge, improves resilience when cameras fail. Our results illustrate the trade-off between maintaining high confidence and object coverage on the one hand, and maintaining redundancy in anticipation of future failure on the other. Our approach provides a fully tunable, decentralised method for the self-organisation of redundancy in a changing environment, according to an operator's preferences.
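A toy sketch of the decentralised learning idea: each camera runs its own Q-learner over discrete zoom levels, with a reward that scalarises the operator's preference weights over tracking confidence, objects covered and overlap (redundancy) with neighbours. The state/action model, reward terms and weights are illustrative assumptions, not the paper's exact formulation.

```python
# Independent per-camera Q-learning over zoom levels with a scalarised
# multi-objective reward.  All modelling choices here are illustrative.

import random
from collections import defaultdict

ZOOMS = ["wide", "medium", "narrow"]

def scalarised_reward(confidence, coverage, overlap, weights):
    """Operator preference weights over the three objectives."""
    w_conf, w_cov, w_red = weights
    return w_conf * confidence + w_cov * coverage + w_red * overlap

class CameraAgent:
    """Q-learner run locally by each camera using only local observations."""
    def __init__(self, epsilon=0.1, lr=0.2, gamma=0.9):
        self.q = defaultdict(float)
        self.epsilon, self.lr, self.gamma = epsilon, lr, gamma

    def act(self, state):
        if random.random() < self.epsilon:
            return random.choice(ZOOMS)                      # explore
        return max(ZOOMS, key=lambda a: self.q[(state, a)])  # exploit

    def learn(self, state, action, reward, next_state):
        best_next = max(self.q[(next_state, a)] for a in ZOOMS)
        td_error = reward + self.gamma * best_next - self.q[(state, action)]
        self.q[(state, action)] += self.lr * td_error

# One hypothetical update: two overlapping neighbours, three objects at 0.6 confidence.
agent = CameraAgent()
r = scalarised_reward(confidence=0.6, coverage=3, overlap=2, weights=(1.0, 0.5, 0.2))
agent.learn(state="2-neighbours", action="medium", reward=r, next_state="2-neighbours")
```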

Relevance:

100.00%

Publisher:

Abstract:

A framework that aims to make the best use of mobile network resources for video applications is presented in this paper. The main contribution of the proposed work is a QoE-driven optimization method that can maintain a desired trade-off between fairness and efficiency when allocating resources, in terms of data rates, to video streaming users in LTE networks. This method is concerned with the control of the user satisfaction level from the point of view of service continuity, and applies appropriate QoE metrics (Pause Intensity and variations thereof) to determine the scheduling strategies in combination with the mechanisms used for adaptive video streaming, such as 3GP/MPEG-DASH. The superiority of the proposed algorithms is demonstrated, showing how the resources of a mobile network can be optimally utilized by using quantifiable QoE measurements. This approach can also find the best match between demand and supply in the process of network resource distribution.
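A hedged sketch of the "match between demand and supply" idea: each user requests the highest adaptive-streaming representation that fits the rate the scheduler allocated, and the match is summarised by the correlation between demanded and allocated rates. The representation ladder and the allocations are hypothetical.

```python
# Match demanded (DASH representation) rates to allocated rates and summarise
# the match by their correlation.  Ladder and allocations are hypothetical.

from statistics import correlation      # Pearson's r, Python 3.10+

DASH_LADDER = [0.7, 1.5, 3.0, 6.0]      # Mb/s representations (hypothetical)

def select_representation(allocated_mbps):
    fitting = [r for r in DASH_LADDER if r <= allocated_mbps]
    return max(fitting) if fitting else DASH_LADDER[0]

allocated = [0.9, 2.1, 3.4, 7.2]                     # scheduler output per user
demanded = [select_representation(a) for a in allocated]
print("demanded:", demanded)
print("demand/supply correlation:", round(correlation(demanded, allocated), 3))
```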

Relevance:

100.00%

Publisher:

Abstract:

Big data comes in various types, shapes, forms and sizes. Indeed, almost all areas of science, technology, medicine, public health, economics, business, linguistics and social science are bombarded by ever increasing flows of data begging to be analyzed efficiently and effectively. In this paper, we propose a rough outline of a possible taxonomy of big data, along with some of the most commonly used tools for handling each particular category of bigness. The dimensionality p of the input space and the sample size n are usually the main ingredients in the characterization of data bigness. The specific statistical machine learning technique used to handle a particular big data set depends on which category it falls into within the bigness taxonomy. Large p, small n data sets, for instance, require a different set of tools from the large n, small p variety. Among other tools, we discuss Preprocessing, Standardization, Imputation, Projection, Regularization, Penalization, Compression, Reduction, Selection, Kernelization, Hybridization, Parallelization, Aggregation, Randomization, Replication and Sequentialization. It is important to emphasize right away that the so-called no free lunch theorem applies here, in the sense that there is no universally superior method that outperforms all other methods on all categories of bigness. It is also important to stress that simplicity, in the sense of Ockham's razor and its non-plurality principle of parsimony, tends to reign supreme when it comes to massive data. We conclude with a comparison of the predictive performance of some of the most commonly used methods on a few data sets.
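A toy dispatcher illustrating the taxonomy idea: a family of techniques is suggested from the sample size n and the dimensionality p alone. The thresholds and the mapping below are illustrative choices, not the paper's definitive taxonomy.

```python
# Suggest tool families from (n, p).  Thresholds and mapping are illustrative.

def suggest_tools(n: int, p: int) -> list[str]:
    suggestions = ["preprocessing", "standardization", "imputation"]   # almost always useful
    if p > n:                                  # "large p, small n"
        suggestions += ["regularization/penalization", "selection", "projection"]
    if n > 1_000_000:                          # "large n"
        suggestions += ["parallelization", "aggregation", "sequentialization (online/streaming)"]
    if n * p > 1e9:                            # both large: shrink before modelling
        suggestions += ["compression", "randomization (sketching/subsampling)"]
    return suggestions

print(suggest_tools(n=200, p=20_000))          # e.g. genomics-style data
print(suggest_tools(n=50_000_000, p=40))       # e.g. transaction logs
```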