30 results for Bitrate overhead

in Aston University Research Archive


Relevance:

20.00%

Abstract:

This paper describes the potential of pre-setting 11 kV overhead line ratings over a time period of sufficient length to be useful to the real-time management of overhead lines. The rating forecast is based on freely available short- and long-term weather forecasts and is used to help investigate the potential for realising dynamic rating benefits on the electricity network. A comparison of the realisable rating benefits using this forecast data over the period of a year has been undertaken.
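
The abstract does not give the rating model itself; dynamic line rating studies of this kind commonly start from a steady-state conductor heat balance in the style of IEEE 738, where the ampacity follows from equating Joule and solar heating with convective and radiative cooling at the maximum allowed conductor temperature. The Python sketch below illustrates that idea only; the coefficients and the simplified convection term are assumptions for illustration, not values from the paper.

```python
import math

def ampacity(wind_speed, ambient_c, solar_wm2,
             conductor_diam_m=0.02, max_temp_c=75.0,
             r_ac_ohm_per_m=1.8e-4, emissivity=0.5, absorptivity=0.5):
    """Very simplified steady-state heat-balance rating (illustrative only).

    Balances I^2*R heating plus solar gain against convective and radiative
    cooling at the maximum allowed conductor temperature, then solves for I.
    """
    # Forced convection: crude stand-in for the IEEE 738 terms (W/m)
    q_conv = 3.0 * math.sqrt(max(wind_speed, 0.1)) * (max_temp_c - ambient_c)
    # Radiative cooling via Stefan-Boltzmann (W/m)
    sigma = 5.67e-8
    surface_per_m = math.pi * conductor_diam_m
    q_rad = emissivity * sigma * surface_per_m * (
        (max_temp_c + 273.15) ** 4 - (ambient_c + 273.15) ** 4)
    # Solar heating on the projected area (W/m)
    q_sun = absorptivity * solar_wm2 * conductor_diam_m
    cooling_margin = q_conv + q_rad - q_sun
    return math.sqrt(max(cooling_margin, 0.0) / r_ac_ohm_per_m)

# Hot, still, sunny hour vs. a cool, windy one from a weather forecast:
print(f"{ampacity(0.5, 20, 900):.0f} A")
print(f"{ampacity(6.0, 5, 100):.0f} A")
```

The point of pre-setting ratings from forecasts is visible here: the same conductor supports a markedly higher current when the forecast weather provides more cooling.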

Relevance:

10.00%

Abstract:

People readily perceive smooth luminance variations as being due to the shading produced by undulations of a 3-D surface (shape-from-shading). In doing so, the visual system must simultaneously estimate the shape of the surface and the nature of the illumination. Remarkably, shape-from-shading operates even when both these properties are unknown and neither can be estimated directly from the image. In such circumstances humans are thought to adopt a default illumination model. A widely held view is that the default illuminant is a point source located above the observer's head. However, some have argued instead that the default illuminant is a diffuse source. We now present evidence that humans may adopt a flexible illumination model that includes both diffuse and point source elements. Our model estimates a direction for the point source and then weights the contribution of this source according to a bias function. For most people the preferred illuminant direction is overhead with a strong diffuse component.
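
The abstract stops short of giving the model's equations; read literally, it suggests a luminance model that mixes a Lambertian point-source term with a diffuse term, with the point source weighted by a bias that favours overhead directions. The sketch below is a minimal numerical reading of that description; the bias function, its strength parameter and all vectors are hypothetical, not the authors' fitted model.

```python
import numpy as np

def shading(normal, light_dir, w_point):
    """Mixed illuminant: w_point weights a Lambertian point-source term,
    (1 - w_point) a uniform diffuse term. Both inputs are unit vectors."""
    lambertian = max(float(np.dot(normal, light_dir)), 0.0)
    diffuse = 1.0  # uniform ambient illumination
    return w_point * lambertian + (1.0 - w_point) * diffuse

def overhead_bias(light_dir, strength=0.6):
    """Hypothetical bias function: weight the point source more when its
    estimated direction is close to overhead (+y)."""
    up = np.array([0.0, 1.0, 0.0])
    return strength * max(float(np.dot(light_dir, up)), 0.0)

light = np.array([0.0, 0.8, 0.6])
light = light / np.linalg.norm(light)
n = np.array([0.3, 0.3, 0.9])
n = n / np.linalg.norm(n)
print(f"luminance = {shading(n, light, overhead_bias(light)):.3f}")
```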

Relevance:

10.00%

Abstract:

Compared to packings, trays are more cost-effective column internals because they create a large interfacial area for mass transfer through the interaction of the vapour with the liquid. The tray supports a mass of froth or spray which on most trays (including the most widely used sieve trays) is not in any way controlled. The two important results of the gas/liquid interaction are the tray efficiency and the tray throughput or capacity. After many years of practical experience, both may be predicted by empirical correlations, despite the lack of fundamental understanding. It is known that the tray efficiency is in part determined by the liquid flow pattern, and the throughput by the liquid froth height, which in turn depends on the liquid hold-up and vapour velocity. This thesis describes experimental work on sieve trays in an air-water simulator, 2.44 m in diameter. The liquid flow pattern, for flow rates similar to those used in commercial-scale distillation, was observed experimentally: by direct observation; by water-cooling, to simulate mass transfer; by use of potassium permanganate dye to observe areas of longer residence time; and by height of clear liquid measurements across the tray and in the downcomer using manometers. This work presents experiments designed to evaluate flow control devices proposed to improve the gas-liquid interaction and hence improve the tray efficiency and throughput. These are (a) the use of intermediate weirs to redirect liquid to the sides of the tray so as to remove slow-moving/stagnant liquid, and (b) the use of vapour-directing slots designed to use the vapour to direct liquid towards the outlet weir, thus reducing the liquid hold-up at a given rate, i.e. increased throughput. This method also has the advantage of removing slow-moving/stagnant liquid. In the experiments using intermediate weirs, which were placed in the centre of the tray, it was found that in general the effect of an intermediate weir depends on the depth of liquid downstream of the weir. If the weir is deeper than the downstream depth, it will cause the upstream liquid to be deeper than the downstream liquid. If the weir is not as deep as the downstream depth, it may have little or no effect on the upstream depth. An intermediate weir placed at an angle to the direction of liquid flow directs liquid towards the sides of the tray without causing an increase in liquid hold-up/froth height. The maximum proportion of liquid caused to flow sideways by the weir is between 5% and 10%. Experimental work using vapour-directing slots on a rectangular sieve tray has shown that the horizontal momentum imparted to the liquid depends on the size of the slot. If too much momentum is transferred to the liquid, hydraulic jumps occur at the mouth of the slot, coupled with liquid entrainment. The use of slots also helps to eliminate the hydraulic gradient across sieve trays and provides a more uniform froth height on the tray. Comparison of the tray and point efficiencies shows that a slotted tray reduces both values by approximately 10%. This reduction is due to the fact that with a slotted tray the liquid has a reduced residence time on the tray, coupled with the fact that large bubbles pass through the slots. The effectiveness of using vapour-directing slots on a full circular tray was investigated by using dye to completely colour the biphase. The removal of the dye by clear liquid entering the tray was monitored using an overhead camera. The results show that the slots succeed in their aim of removing slow-moving liquid from the sides of the tray; the net effect is an increase in tray efficiency. Measurements of slot vapour velocity found it to be approximately equal to the hole velocity.

Relevance:

10.00%

Abstract:

The introduction of agent technology raises several security issues that are beyond the capability and considerations of conventional security mechanisms, but research in protecting the agent from malicious host attack is evolving. This research proposes two approaches to protecting an agent from being attacked by a malicious host. The first approach consists of an obfuscation algorithm that is able to protect the confidentiality of an agent and make it more difficult for a malicious host to spy on the agent. The algorithm uses multiple polynomial functions with multiple random inputs to convert an agent's critical data to a value that is meaningless to the malicious host. The effectiveness of the obfuscation algorithm is enhanced by the addition of noise code. The second approach consists of a mechanism that protects the integrity of the agent, using state information recorded during the agent execution process in a remote host environment to detect a manipulation attack by a malicious host. Both approaches are implemented using a master-slave agent architecture that operates on a distributed migration pattern. Two sets of experimental tests were conducted. The first set measures the migration and migration-plus-computation overheads of the itinerary and distributed migration patterns. The second set measures the security overhead of the proposed approaches. The protection of the agent is assessed by analysing its effectiveness under known attacks. Finally, an agent-based application, known as the Secure Flight Finder Agent-based System (SecureFAS), was developed in order to demonstrate the function of the proposed approaches.
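
The abstract outlines the obfuscation idea without its details; a minimal sketch of "multiple polynomial functions with multiple random inputs" might look like the following, in which the agent carries only the masked value while the originating master keeps the polynomials and inputs needed to interpret it. The additive masking and recovery scheme here, and all names, are hypothetical illustrations, not the thesis's algorithm.

```python
import random

def make_obfuscator(num_polys=3, degree=4, seed=None):
    """Generate random polynomials p_i and random inputs r_i, known only
    to the agent's originator (the master)."""
    rng = random.Random(seed)
    polys = [[rng.randint(1, 1000) for _ in range(degree + 1)]
             for _ in range(num_polys)]
    inputs = [rng.randint(1, 1000) for _ in range(num_polys)]
    return polys, inputs

def evaluate(coeffs, x):
    """Evaluate a polynomial given highest-order-first coefficients."""
    acc = 0
    for c in coeffs:            # Horner's rule
        acc = acc * x + c
    return acc

def obfuscate(secret, polys, inputs):
    """Mask a critical integer (e.g. a maximum bid) by adding the sum of
    the polynomials at the random inputs; a host sees only the masked value."""
    return secret + sum(evaluate(p, r) for p, r in zip(polys, inputs))

polys, inputs = make_obfuscator(seed=42)
token = obfuscate(secret=250, polys=polys, inputs=inputs)
print(token)         # meaningless without the polynomials and inputs
mask = sum(evaluate(p, r) for p, r in zip(polys, inputs))
print(token - mask)  # the master recovers 250
```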

Relevance:

10.00%

Abstract:

Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. CPM/PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting up targets and defining objectives, but it failed to satisfy the requirement of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of CPM to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work introduces an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines parts of the logic of both VERT and CYCLONE. Several new node types were designed to model the availability and flow of resources and the overhead and operating costs, together with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model. This package is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.
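
The thesis abstract names resource, cost and time/cost-evaluation node types without defining their interfaces; as a rough flavour of how such simulation nodes might be composed, the sketch below pairs a resource-availability node with a cost-accumulation node. All class and field names are illustrative assumptions, not the package's actual design.

```python
from dataclasses import dataclass

@dataclass
class ResourceNode:
    """Models availability and flow of a resource (e.g. a crane)."""
    name: str
    capacity: int
    in_use: int = 0

    def acquire(self) -> bool:
        if self.in_use < self.capacity:
            self.in_use += 1
            return True
        return False

    def release(self) -> None:
        self.in_use = max(0, self.in_use - 1)

@dataclass
class CostNode:
    """Accumulates overhead and operating cost as simulated time advances."""
    overhead_per_day: float
    operating_per_day: float
    total: float = 0.0

    def advance(self, days: float, operating: bool) -> None:
        self.total += self.overhead_per_day * days
        if operating:
            self.total += self.operating_per_day * days

crane = ResourceNode("crane", capacity=1)
cost = CostNode(overhead_per_day=200.0, operating_per_day=450.0)
if crane.acquire():            # the activity can start: the crane is free
    cost.advance(days=2.5, operating=True)
    crane.release()
print(cost.total)              # 1625.0
```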

Relevance:

10.00%

Abstract:

Transmission of a 73.7 Tb/s (96 x 3 x 256-Gb/s) DP-16QAM mode-division-multiplexed signal over 119 km of few-mode fiber transmission line incorporating an inline multi-mode EDFA and a phase-plate-based mode (de-)multiplexer is demonstrated. Data-aided 6x6 MIMO digital signal processing was used to demodulate the signal. The total demonstrated net capacity, taking into account 20% FEC overhead and 7.5% additional overhead (Ethernet and training sequences), is 57.6 Tb/s, corresponding to a spectral efficiency of 12 bits/s/Hz.
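
The net-capacity figure can be checked by stripping the stated overheads from the gross line rate. The abstract does not say exactly how the two overheads combine, so the sketch below shows two plausible accountings, both of which land near the reported 57.6 Tb/s.

```python
# Gross line rate: 96 wavelengths x 3 spatial modes x 256 Gb/s each.
gross_tbps = 96 * 3 * 256 / 1000          # 73.728 Tb/s

fec, extra = 0.20, 0.075                  # FEC; Ethernet + training sequences

net_multiplicative = gross_tbps / ((1 + fec) * (1 + extra))   # ~57.2 Tb/s
net_additive = gross_tbps / (1 + fec + extra)                 # ~57.8 Tb/s
print(round(net_multiplicative, 1), round(net_additive, 1))

# Consistency check on the spectral efficiency: 57.6 Tb/s over 96 WDM
# channels is 600 Gb/s net per channel, and 600 / 12 (bits/s/Hz) implies
# a 50-GHz channel spacing.
```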

Relevance:

10.00%

Abstract:

The concept of soft state (i.e., state that expires unless refreshed) has been widely used in the design of network signaling protocols. Approaches to refreshing state in multi-hop networks can be classified into end-to-end (E2E) and hop-by-hop (HbH) refreshes. In this article we propose an effective Markov chain based analytical model for both the E2E and HbH refresh approaches. Simulations verify the analytical models, which can be used to study the impact of link characteristics on performance (e.g., state synchronization and message overhead) and as a guide for the configuration and optimization of soft state signaling protocols. © 2009 IEEE.
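
The analytical model itself is not reproduced in the abstract, but the quantities it relates are easy to exhibit with a toy Monte Carlo version of a single soft-state link: the refresh interval, the state lifetime and the message loss probability jointly determine how often the installed state is stale (synchronization) and how many messages are sent (overhead). All parameter values below are illustrative assumptions.

```python
import random

def simulate(refresh_interval, lifetime, loss_prob, horizon=100_000, seed=1):
    """Discrete-time soft-state link: a refresh message is sent every
    refresh_interval ticks and lost with probability loss_prob; the
    installed state expires `lifetime` ticks after the last refresh heard."""
    rng = random.Random(seed)
    expires_at, messages, stale_ticks = -1, 0, 0
    for t in range(horizon):
        if t % refresh_interval == 0:
            messages += 1
            if rng.random() >= loss_prob:  # refresh delivered
                expires_at = t + lifetime
        if t > expires_at:                 # downstream state has timed out
            stale_ticks += 1
    return stale_ticks / horizon, messages

# Tighter refresh timers improve state synchronization but raise overhead:
for interval in (5, 10, 20):
    stale, msgs = simulate(interval, lifetime=30, loss_prob=0.2)
    print(f"interval={interval:2d}  stale fraction={stale:.4f}  messages={msgs}")
```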

Relevance:

10.00%

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. Little data exists on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency and how to measure it does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
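
Capture latency, as defined here, is the gap between a change being committed on the OLTP side and the moment the CDC mechanism captures it. The sketch below shows a minimal way to take that measurement for a polling pull mechanism; the table, the toy workload and the in-memory database are illustrative stand-ins for the thesis's instrumented TPC-C setup.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, committed_at REAL)")

def oltp_commit():
    """OLTP side: store the commit timestamp alongside the change."""
    conn.execute("INSERT INTO orders (committed_at) VALUES (?)",
                 (time.monotonic(),))
    conn.commit()

def pull_capture(last_seen_id):
    """Pull-CDC side: poll for rows not yet captured; capture latency is
    capture time minus commit time for each captured change."""
    rows = conn.execute(
        "SELECT id, committed_at FROM orders WHERE id > ? ORDER BY id",
        (last_seen_id,)).fetchall()
    now = time.monotonic()
    latencies = [now - committed_at for _, committed_at in rows]
    last = rows[-1][0] if rows else last_seen_id
    return last, latencies

oltp_commit()
time.sleep(0.05)            # the polling gap is part of capture latency
_, latencies = pull_capture(0)
print(f"capture latency ~ {latencies[0] * 1000:.1f} ms")
```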

Relevance:

10.00%

Abstract:

The management and sharing of complex data, information and knowledge is a fundamental and growing concern in the water and other industries, for a variety of reasons. For example, the risks and uncertainties associated with climate and other changes require knowledge to prepare for a range of future scenarios and potential extreme events. Formal ways in which knowledge can be established and managed can help deliver efficiencies in acquisition, structuring and filtering, to provide only the essential aspects of the knowledge really needed. Ontologies are a key technology for this knowledge management. The construction of ontologies is a considerable overhead on any knowledge management programme, hence current computer science research is investigating generating ontologies automatically from documents using text mining and natural language techniques. As an example of this, results from application of the Text2Onto tool to stakeholder documents for a project on sustainable water cycle management in new developments are presented. It is concluded that by adopting ontological representations sooner, rather than later, in an analytical process, decision makers will be able to make better use of highly knowledgeable systems containing automated services to ensure that sustainability considerations are included.
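
Text2Onto's own pipeline is not described in the abstract; purely as a flavour of automatic ontology generation from text, the sketch below extracts candidate concepts with a crude phrase pattern and proposes is-a links with a head-word heuristic. Both heuristics and the sample sentences are assumptions for illustration, far simpler than what the tool actually does.

```python
import re
from collections import Counter

docs = [
    "Surface water drainage and foul water drainage must be separated.",
    "Rainwater harvesting reduces demand on potable water supply.",
    "Grey water reuse and rainwater harvesting support sustainable drainage.",
]

# Candidate concepts: frequent domain phrases (a crude extraction heuristic).
phrases = Counter()
for doc in docs:
    for m in re.finditer(r"\b([a-z]+ (?:water|drainage|harvesting))\b",
                         doc.lower()):
        phrases[m.group(1)] += 1

# Subsumption heuristic: the head word of a phrase names the broader concept,
# e.g. 'surface water' is-a 'water'.
for phrase, freq in phrases.most_common():
    head = phrase.split()[-1]
    print(f"{phrase!r} (freq {freq}) is-a {head!r}")
```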

Relevance:

10.00%

Abstract:

In this paper, we propose to increase residual carrier frequency offset tolerance based on short perfect reconstruction pulse shaping for coherent optical orthogonal frequency division multiplexing (CO-OFDM). The proposed method suppresses the residual carrier frequency offset induced penalty at the receiver without requiring any additional overhead or exhaustive signal processing. The Q-factor improvement contributed by the proposed method is 1.6 dB and 1.8 dB for time-frequency localization maximization and out-of-band energy minimization pulse shapes, respectively. Finally, the transmission span gain under the influence of residual carrier frequency offset is ~62% with the out-of-band energy minimization pulse shape. © 2014 Optical Society of America.

Relevance:

10.00%

Abstract:

We demonstrate a novel phase noise estimation scheme for CO-OFDM, in which pilot subcarriers are deliberately correlated to the data subcarriers. This technique reduces the overhead by a factor of 2. © OSA 2014.

Relevance:

10.00%

Abstract:

We experimentally demonstrate a novel fibre nonlinearity compensation technique for CO-OFDM based on phase-conjugated pilots (PCPs), showing that, by varying the PCP overhead, a performance improvement of up to 4 dB can be achieved, allowing highly flexible adaptation to link characteristics.
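
At a high level, a phase-conjugated pilot carries the conjugate of a data symbol so that both experience (to first order) the same nonlinear phase rotation, which can then be estimated and removed. The 4 dB figure is the paper's measurement; the sketch below is a stripped-down, single-symbol illustration of the cancellation principle, assuming a common phase rotation and no fibre model.

```python
import numpy as np

rng = np.random.default_rng(0)

# One QPSK data symbol and its phase-conjugated pilot on a paired subcarrier.
d = np.exp(1j * np.pi / 4)   # data symbol
p = np.conj(d)               # phase-conjugated pilot

phi = 0.3                    # common (first-order) nonlinear phase rotation
noise = 0.02 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
r_d = d * np.exp(1j * phi) + noise[0]
r_p = p * np.exp(1j * phi) + noise[1]

# r_d * r_p ~ d * conj(d) * e^{2j phi} = |d|^2 * e^{2j phi}: the data phase
# cancels, leaving twice the common rotation, which we halve and remove.
phi_hat = np.angle(r_d * r_p) / 2
d_hat = r_d * np.exp(-1j * phi_hat)

print(f"true phi = {phi:.3f}, estimated phi = {phi_hat:.3f}")
print(f"recovered symbol = {complex(d_hat):.3f} (ideal {complex(d):.3f})")
```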

Relevance:

10.00%

Abstract:

Smart cameras allow video data to be pre-processed on the camera instead of being sent to a remote server for further analysis. A network of smart cameras allows various vision tasks to be processed in a distributed fashion. While cameras may have different tasks, we concentrate on distributed tracking in smart camera networks. This application introduces several highly interesting problems. Firstly, how can conflicting goals be satisfied, such as cameras in the network trying to track objects while also keeping communication overhead low? Secondly, how can cameras in the network self-adapt in response to the behavior of objects and changes in scenarios, to ensure continued efficient performance? Thirdly, how can cameras organise themselves to improve the overall network's performance and efficiency? This paper presents a simulation environment, called CamSim, that allows distributed self-adaptation and self-organisation algorithms to be tested without setting up a physical smart camera network. The simulation tool is written in Java and hence is highly portable between operating systems. Relaxing various problems of computer vision and network communication enables a focus on implementing and testing new self-adaptation and self-organisation algorithms for cameras to use.
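
The tension between tracking performance and communication overhead can be made concrete with a toy handover policy of the kind such simulations explore: a camera broadcasts a handover request only when its own tracking confidence drops below a threshold. This is an illustrative policy with made-up dynamics, not CamSim's algorithm.

```python
import random

def run(threshold, steps=10_000, seed=3):
    """Toy single-object network: tracking confidence of the owning camera
    decays each step; below `threshold` it broadcasts a handover request
    (one message) and, usually, a neighbour takes over."""
    rng = random.Random(seed)
    confidence, messages, lost_steps = 1.0, 0, 0
    for _ in range(steps):
        confidence -= rng.uniform(0.0, 0.05)       # object drifts away
        if confidence < threshold:
            messages += 1                          # handover broadcast
            if rng.random() < 0.9:                 # a neighbour responds
                confidence = rng.uniform(0.6, 1.0)
        if confidence < 0.2:
            lost_steps += 1                        # effectively untracked
    return messages, lost_steps

# Higher thresholds track more reliably but cost more communication:
for th in (0.3, 0.5, 0.7):
    msgs, lost = run(th)
    print(f"threshold={th}  messages={msgs}  lost_steps={lost}")
```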

Relevance:

10.00%

Abstract:

Dynamic asset rating (DAR) is one of a number of techniques that could be used to facilitate low-carbon electricity network operation. Previous work has looked at this technique from an asset perspective. This paper instead takes a network perspective, proposing a dynamic network rating (DNR) approach. The models available for use with DAR are discussed and compared using measured load and weather data from a trial network area within Milton Keynes in the central area of the U.K. The most appropriate model is then used to investigate, through a network case study, the potential gains in dynamic rating compared to static rating for the different network assets - transformers, overhead lines, and cables. This informs the network operator of the potential DNR gains on an 11-kV network with all assets present and highlights the limiting asset within each season.
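
Since a feeder's rating is set by whichever series asset is most constrained, the limiting asset per season falls out of a simple minimum over per-asset dynamic ratings. The sketch below shows that bookkeeping; the rating values are made up for illustration and are not the Milton Keynes trial data.

```python
# Illustrative seasonal dynamic ratings (amps) for series assets on a feeder.
dynamic_ratings = {
    "winter": {"transformer": 620, "overhead line": 780, "cable": 540},
    "spring": {"transformer": 560, "overhead line": 640, "cable": 500},
    "summer": {"transformer": 510, "overhead line": 520, "cable": 470},
    "autumn": {"transformer": 580, "overhead line": 700, "cable": 520},
}
static_rating = 450  # common static (nameplate-style) feeder limit, amps

for season, ratings in dynamic_ratings.items():
    limiting_asset = min(ratings, key=ratings.get)   # most constrained asset
    network_rating = ratings[limiting_asset]
    headroom = 100 * (network_rating - static_rating) / static_rating
    print(f"{season:6s}: limited by {limiting_asset:13s} "
          f"at {network_rating} A ({headroom:.0f}% above static)")
```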