11 results for Threshold cryptographic schemes and algorithms

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The performance of unrepeatered transmission of a seven Nyquist-spaced 10 GBd PDM-16QAM superchannel using full signal band coherent detection and multi-channel digital back propagation (MC-DBP) to mitigate nonlinear effects is analysed. For the first time in unrepeatered transmission, the performance of two amplification systems is investigated and directly compared in terms of achievable information rates (AIRs): 1) erbium-doped fibre amplifier (EDFA) and 2) second-order bidirectional Raman pumped amplification. The experiment is performed over different span lengths, demonstrating that, for an AIR of 6.8 bit/s/Hz, the Raman system enables an increase of 93 km (36%) in span length. Further, at these distances, MC-DBP gives an improvement in AIR of 1 bit/s/Hz (to 7.8 bit/s/Hz) for both amplification schemes. The theoretical AIR gains for Raman and MC-DBP are shown to be preserved when considering low-density parity-check codes. Additionally, MC-DBP algorithms for both amplification schemes are compared in terms of performance and computational complexity. It is shown that to achieve the maximum MC-DBP gain, the Raman system requires approximately four times the computational complexity due to the distributed impact of fibre nonlinearity.
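As a rough sanity check on the AIR figures above, a dual-polarization channel can be idealised as two parallel AWGN channels with AIR = 2·log2(1 + SNR). The sketch below (plain Python) inverts this textbook bound to see what effective SNR a given spectral efficiency would imply; it is not the mutual-information estimator used in the experiment.

```python
import math

def gaussian_air(snr_db):
    """Idealized achievable information rate (bit/s/Hz) for a dual-
    polarization (PDM) channel modelled as two parallel AWGN channels:
    AIR = 2 * log2(1 + SNR). A textbook upper-bound sketch only."""
    snr = 10 ** (snr_db / 10)
    return 2 * math.log2(1 + snr)

def snr_for_air(air):
    """Invert the bound: effective SNR (dB) needed for a target AIR."""
    return 10 * math.log10(2 ** (air / 2) - 1)
```

Under this idealisation, raising the AIR from 6.8 to 7.8 bit/s/Hz corresponds to roughly 1.6 dB of effective SNR improvement; the experimental AIRs are computed from measured mutual information rather than this bound.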

Relevance:

100.00%

Publisher:

Abstract:

The current rate of global biodiversity loss led many governments to sign the international agreement ‘Halting Biodiversity Loss by 2010 and beyond’ in 2001. The UK government was one of these and has a number of methods to tackle the problem, such as commissioning specific technical guidance and supporting the UK Biodiversity Action Plan (BAP) targets. However, by far the most effective influence the government has upon current biodiversity levels is through the town planning system, owing to the control it has over all phases of a new development scheme’s lifecycle. There is a growing array of regulations, policies and legislation dealing with biodiversity protection and enhancement across the hierarchical spectrum: from the global and European level down to regional and local levels. With these drivers in place, coupled with the promotion of benefits and incentives, increasing biodiversity value ought to be an achievable goal on most, if not all, development sites. In professional practice, however, this is not the case, owing to a number of obstructions. Many of these tend to be ‘process’ barriers, which are particularly prevalent in ‘urban’ and ‘major’ development schemes, and it is here that the focus of this research paper lies. The paper summarises and discusses the results of a questionnaire survey, completed by Local Government Ecologists in England, regarding obstacles to maximising biodiversity enhancements on major urban development schemes. It additionally draws on insights from previous action research, specialist interviews and case studies to reveal the key process obstacles. Solutions to these obstacles are then outlined and recommendations are made within the discussion.

Relevance:

100.00%

Publisher:

Abstract:

Fatigue crack growth rate tests have been performed on Nimonic AP1, a powder formed Ni-base superalloy, in air and vacuum at room temperature. These show that threshold values are higher, and near-threshold (faceted) crack growth rates are lower, in vacuum than in air, although at high growth rates, in the “structure-insensitive” regime, R-ratio and a dilute environment have little effect. Changing the R-ratio from 0.1 to 0.5 in vacuum does not alter near-threshold crack growth rates very much, despite more extensive secondary cracking being noticeable at R = 0.5. In vacuum, rewelding occurs at contact points across the crack as ΔK falls. This leads to the production of extensive fracture surface damage and bulky fretting debris, and is thought to be a significant contributory factor to the observed increase in threshold values.

Relevance:

100.00%

Publisher:

Abstract:

The thesis presents new methodology and algorithms that can be used to analyse and measure the hand tremor and fatigue of surgeons while performing surgery. This will assist them in deriving useful information about their fatigue levels and make them aware of changes in their tool point accuracy. The thesis proposes that the muscular changes surgeons undergo through a day of operating can be monitored using Electromyography (EMG) signals. Multi-channel EMG signals are measured at different muscles in the upper arm of surgeons. The dependence between EMG signals has been examined to test the hypothesis that the signals are coupled with and dependent on each other. The results demonstrated that EMG signals collected from different channels while mimicking an operating posture are independent. Consequently, single-channel fatigue analysis has been performed. In measuring hand tremor, a new method for determining the maximum tremor amplitude using Principal Component Analysis (PCA) and a new technique to detrend acceleration signals using the Empirical Mode Decomposition (EMD) algorithm were introduced. This tremor determination method is more representative of surgeons’ tremor and is suggested as an alternative fatigue measure. It was combined with the complexity analysis method and applied to surgically captured data to determine whether operating has an effect on a surgeon’s fatigue and tremor levels. It was found that surgical tremor and fatigue develop throughout a day of operating and that this could be determined based solely on their initial values. Finally, several Nonlinear AutoRegressive with eXogenous inputs (NARX) neural networks were evaluated. The results suggest that it is possible to monitor surgeon tremor variations during surgery from their EMG fatigue measurements.
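As an illustration of the PCA step, the sketch below finds the principal axis of two-channel acceleration data and takes the peak-to-peak excursion along it as the maximum tremor amplitude. This is a simplified, hypothetical reading of the method: real accelerometer data would be three-axis and detrended with EMD first.

```python
import math

def max_tremor_amplitude(xs, ys):
    """Illustrative 2-D PCA: project acceleration samples onto the
    principal axis and take the peak-to-peak excursion as the maximum
    tremor amplitude (a simplified sketch, not the thesis's pipeline)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    dx = [x - mx for x in xs]
    dy = [y - my for y in ys]
    # Entries of the 2x2 covariance matrix
    sxx = sum(a * a for a in dx) / n
    syy = sum(b * b for b in dy) / n
    sxy = sum(a * b for a, b in zip(dx, dy)) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] in closed form
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))
    # Corresponding eigenvector (handle the axis-aligned case separately)
    if abs(sxy) > 1e-12:
        vx, vy = lam - syy, sxy
    else:
        vx, vy = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    vx, vy = vx / norm, vy / norm
    # Peak-to-peak excursion of the projection onto the principal axis
    proj = [a * vx + b * vy for a, b in zip(dx, dy)]
    return max(proj) - min(proj)
```

For a signal oscillating along a diagonal, the principal axis recovers that diagonal and the amplitude is the full excursion along it, rather than along either raw channel.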

Relevance:

100.00%

Publisher:

Abstract:

Predicting future need for water resources has traditionally been, at best, a crude mixture of art and science. This has prevented the evaluation of water need from being carried out in either a consistent or comprehensive manner. This inconsistent and somewhat arbitrary approach to water resources planning led to well-publicised premature developments in the 1970s and 1980s, but privatisation of the Water Industry, including the creation of the Office of Water Services and the National Rivers Authority in 1989, turned the tide of resource planning to the point where funding of schemes and their justification by the Regulators could no longer be assumed. Furthermore, considerable areas of uncertainty were beginning to enter the debate and complicate the assessment. It was also no longer appropriate to consider that contingencies would continue to lie solely on the demand side of the equation. An inability to calculate the balance between supply and demand may mean an inability to meet standards of service or, arguably worse, an excessive provision of water resources and excessive costs to customers. The United Kingdom Water Industry Research Limited (UKWIR) Headroom project in 1998 provided a simple methodology for the calculation of planning margins. This methodology, although well received, was not, however, accepted by the Regulators as a tool sufficient to promote resource development. This thesis begins by considering the history of water resource planning in the UK, moving on to discuss events following privatisation of the water industry post-1985. The mid-section of the research forms the bulk of the original work and provides a scoping exercise which reveals a catalogue of uncertainties prevalent within the supply-demand balance. Each of these uncertainties is considered in terms of materiality, scope, and whether it can be quantified within a risk analysis package. Many of the areas of uncertainty identified would merit further research.
A workable, yet robust, methodology for evaluating the balance between water resources and water demands using a spreadsheet-based risk analysis package is presented. The technique involves statistical sampling and simulation, such that samples are taken from input distributions on both the supply and demand sides of the equation and the imbalance between supply and demand is calculated in the form of an output distribution. The percentiles of the output distribution represent different standards of service to the customer. The model allows dependencies between distributions to be considered, improved uncertainties to be assessed, and the impact of uncertain solutions to any imbalance to be calculated directly. The method is considered a significant leap forward in the field of water resource planning.
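The sampling-and-percentiles idea can be sketched in a few lines: draw supply- and demand-side values from assumed input distributions, form the distribution of the balance, and read percentiles off it as levels of service. All distributions and figures below are illustrative assumptions, not values from the thesis.

```python
import random

def headroom_simulation(n=100_000, seed=42):
    """Hypothetical sketch of the spreadsheet-style risk analysis: sample
    supply and demand from assumed input distributions and build an output
    distribution of the supply-demand balance (figures illustrative, Ml/d)."""
    rng = random.Random(seed)
    balance = []
    for _ in range(n):
        # Assumed input distributions -- purely illustrative
        supply = rng.normalvariate(520, 15)   # deployable output
        demand = rng.normalvariate(480, 20)   # forecast demand
        outage = rng.uniform(0, 10)           # uncertain outage allowance
        balance.append(supply - outage - demand)
    balance.sort()

    def percentile(p):
        # Percentiles of the output distribution map to standards of service
        return balance[int(p / 100 * (n - 1))]
    return percentile

pct = headroom_simulation()
# e.g. pct(5) is the margin exceeded in 95% of simulated outcomes
```

A spreadsheet risk package adds what this sketch omits: correlated (dependent) input distributions and direct testing of candidate solutions against the imbalance.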

Relevance:

100.00%

Publisher:

Abstract:

A number of professional sectors have recently moved away from their longstanding career model of up-or-out promotion and embraced innovative alternatives. Professional labor is a critical resource in professional service firms. Therefore, changes to these internal labor markets are likely to trigger other innovations, for example in knowledge management, incentive schemes and team composition. In this chapter we look at how new career models affect the core organizing model of professional firms and, in turn, their capacity for and processes of innovation. We consider how professional firms link the development of human capital and the division of professional labor to distinctive demands for innovation and how novel career systems help them respond to these demands.

Relevance:

100.00%

Publisher:

Abstract:

Linear programming (LP) is the most widely used optimization technique for solving real-life problems because of its simplicity and efficiency. Although conventional LP models require precise data, managers and decision makers dealing with real-world optimization problems often do not have access to exact values. Fuzzy sets have been used in the fuzzy LP (FLP) problems to deal with the imprecise data in the decision variables, objective function and/or the constraints. The imprecisions in the FLP problems could be related to (1) the decision variables; (2) the coefficients of the decision variables in the objective function; (3) the coefficients of the decision variables in the constraints; (4) the right-hand side of the constraints; or (5) all of these parameters. In this paper, we develop a new stepwise FLP model where fuzzy numbers are considered for the coefficients of the decision variables in the objective function, the coefficients of the decision variables in the constraints and the right-hand side of the constraints. In the first step, we use the possibility and necessity relations for fuzzy constraints without considering the fuzzy objective function. In the subsequent step, we extend our method to the fuzzy objective function. We use two numerical examples from the FLP literature for comparison purposes and to demonstrate the applicability of the proposed method and the computational efficiency of the procedures and algorithms. © 2013-IOS Press and the authors. All rights reserved.
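To make the fuzzy-coefficient idea concrete, the toy sketch below replaces each triangular fuzzy number with its centroid and solves the resulting crisp two-variable LP by vertex enumeration. This is a deliberate simplification for illustration only: the paper's stepwise method works with possibility and necessity relations rather than a single centroid, and every number here is hypothetical.

```python
import itertools

def centroid(tri):
    """Centroid defuzzification of a triangular fuzzy number (a, b, c)."""
    return sum(tri) / 3.0

def solve_2var_lp(c, A, b):
    """Maximise c.x subject to A x <= b, x >= 0 for two variables by
    enumerating feasible vertices (adequate for a toy problem)."""
    lines = [(row[0], row[1], rhs) for row, rhs in zip(A, b)]
    lines += [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # axes x1 = 0 and x2 = 0
    best, best_val = None, float("-inf")
    for (a1, b1, r1), (a2, b2, r2) in itertools.combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel boundaries, no vertex
        x = (r1 * b2 - r2 * b1) / det
        y = (a1 * r2 - a2 * r1) / det
        feasible = x >= -1e-9 and y >= -1e-9 and all(
            row[0] * x + row[1] * y <= rhs + 1e-9 for row, rhs in zip(A, b))
        if feasible and c[0] * x + c[1] * y > best_val:
            best, best_val = (x, y), c[0] * x + c[1] * y
    return best

# Hypothetical fuzzy instance (numbers illustrative, not from the paper):
c_f = [(2, 3, 4), (1, 2, 3)]                      # fuzzy objective coefficients
A_f = [[(0.5, 1, 1.5), (0.5, 1, 1.5)],
       [(1.5, 2, 2.5), (0.5, 1, 1.5)]]            # fuzzy constraint coefficients
b_f = [(7, 8, 9), (9, 10, 11)]                    # fuzzy right-hand sides
c = [centroid(t) for t in c_f]
A = [[centroid(t) for t in row] for row in A_f]
b = [centroid(t) for t in b_f]
x_opt = solve_2var_lp(c, A, b)
```

Replacing centroid defuzzification with possibility/necessity cuts at a chosen confidence level yields a family of crisp LPs instead of one, which is the direction the stepwise method takes.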

Relevance:

100.00%

Publisher:

Abstract:

Although crisp data are fundamentally indispensable for determining the profit Malmquist productivity index (MPI), the observed values in real-world problems are often imprecise or vague. These imprecise or vague data can be suitably characterized with fuzzy and interval methods. In this paper, we reformulate the conventional profit MPI problem as an imprecise data envelopment analysis (DEA) problem, and propose two novel methods for measuring the overall profit MPI when the inputs, outputs, and price vectors are fuzzy or vary in intervals. We develop a fuzzy version of the conventional MPI model by using a ranking method, and solve the model with a commercial off-the-shelf DEA software package. In addition, we define an interval for the overall profit MPI of each decision-making unit (DMU) and divide the DMUs into six groups according to the intervals obtained for their overall profit efficiency and MPIs. We also present two numerical examples to demonstrate the applicability of the two proposed models and exhibit the efficacy of the procedures and algorithms. © 2011 Elsevier Ltd.
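A minimal illustration of the interval case: when a DMU's efficiency in each period is only known to lie in an interval, the bounds of a Malmquist-style ratio follow from interval division, and DMUs can then be grouped by where the interval sits relative to 1. The three-way grouping below is a crude stand-in for the paper's six groups, and all numbers are hypothetical.

```python
def interval_mpi(eff_t, eff_t1):
    """Interval bounds for a Malmquist-style productivity index when
    period efficiencies are only known to lie in (low, high) intervals,
    all positive. Tightest ratio: smallest numerator over largest
    denominator for the lower bound, and vice versa for the upper."""
    lo_t, hi_t = eff_t
    lo_t1, hi_t1 = eff_t1
    return (lo_t1 / hi_t, hi_t1 / lo_t)

def classify(mpi):
    """Crude grouping in the spirit of the paper's six groups: progress
    if the whole interval exceeds 1, regress if it lies below 1, and
    indeterminate otherwise (the direction depends on the realised data)."""
    lo, hi = mpi
    if lo > 1:
        return "progress"
    if hi < 1:
        return "regress"
    return "indeterminate"
```

For example, efficiency intervals (0.8, 0.9) and (0.9, 1.0) in consecutive periods give an MPI interval straddling 1, so the productivity change is indeterminate without further information.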

Relevance:

100.00%

Publisher:

Abstract:

Health care organizations must continuously improve their productivity to sustain long-term growth and profitability. Sustainable productivity performance is mostly assumed to be a natural outcome of successful health care management. Data envelopment analysis (DEA) is a popular mathematical programming method for comparing the inputs and outputs of a set of homogeneous decision-making units (DMUs) by evaluating their relative efficiency. The Malmquist productivity index (MPI) is widely used for productivity analysis by relying on constructing a best practice frontier and calculating the relative performance of a DMU for different time periods. The conventional DEA requires accurate and crisp data to calculate the MPI. However, the real-world data are often imprecise and vague. In this study, the authors propose a novel productivity measurement approach in fuzzy environments with MPI. An application of the proposed approach in health care is presented to demonstrate the simplicity and efficacy of the procedures and algorithms in a hospital efficiency study conducted for a State Office of Inspector General in the United States. © 2012, IGI Global.

Relevance:

100.00%

Publisher:

Abstract:

Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high-value manufacturing, the role of machine tools in producing high-accuracy products is essential. High-accuracy manufacturing requires producing parts in a repeatable manner and with precision in compliance with the defined design specifications. The performance of machine tools is often affected by geometrical errors arising from a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to ensure that machine tools are verified in terms of their geometric and positioning accuracy. When machine tools are verified in this way, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no after-process measurement. In this paper the benefits of machine tool verification are listed, and a case study is used to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
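To give a flavour of what ballbar verification measures, the sketch below reduces a set of bar-length readings sampled around a programmed circle to two headline numbers: circularity (peak-to-peak radial deviation) and mean radius error. A commercial ballbar package diagnoses many more error sources (backlash, squareness, servo mismatch); this is only a hypothetical simplification.

```python
def ballbar_deviation(readings, nominal_radius):
    """Toy analysis of ballbar data: 'readings' are measured bar lengths
    (mm) sampled around a programmed circle of 'nominal_radius'. Returns
    (circularity, radius_error): circularity is the peak-to-peak radial
    deviation; radius_error is the mean departure from nominal. A
    simplified sketch, not a commercial diagnostic suite."""
    circularity = max(readings) - min(readings)
    radius_error = sum(readings) / len(readings) - nominal_radius
    return circularity, radius_error
```

A nonzero radius error with small circularity typically points to a scale or tool-offset error, while large circularity indicates geometry or servo problems; the full diagnosis uses the shape of the deviation trace, not just these two scalars.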

Relevance:

100.00%

Publisher:

Abstract:

Measurement assisted assembly (MAA) has the potential to facilitate a step change in assembly efficiency for large structures such as airframes through the reduction of rework, manually intensive processes and expensive monolithic assembly tooling. It is shown how MAA can enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved aerodynamic tolerances. These advances will require the development of automated networks of measurement instruments, model-based thermal compensation, the automatic integration of 'live' measurement data into variation simulation, and algorithms to generate cutting paths for predictive shimming and drilling processes. This paper sets out an architecture for the digital systems which will enable this integrated approach to variation management. © 2013 The Authors.