20 results for performance data

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Over recent years much has been learned about the way in which depth cues are combined (e.g. Landy et al., 1995). The majority of this work has used subjective measures, a rating scale or a point of subjective equality, to deduce the relative contributions of different cues to perception. We have adopted a very different approach by using two interval forced-choice (2IFC) performance measures and a signal processing framework. We performed summation experiments for depth cue increment thresholds between pairs of pictorial depth cues in displays depicting slanted planar surfaces made from arrays of circular 'contrast' elements. Summation was found to be ideal when size-gradient was paired with contrast-gradient for a wide range of depth-gradient magnitudes in the null stimulus. For a pairing of size-gradient and linear perspective, substantial summation (> 1.5 dB) was found only when the null stimulus had intermediate depth gradients; when flat or steeply inclined surfaces were depicted, summation was diminished or abolished. Summation was also abolished when one of the target cues was (i) not a depth cue, or (ii) added in conflict. We conclude that vision has a depth mechanism for the constructive combination of pictorial depth cues and suggest two generic models of summation to describe the results. Using similar psychophysical methods, Bradshaw and Rogers (1996) revealed a mechanism for the depth cues of motion parallax and binocular disparity. Whether this is the same or a different mechanism from the one reported here awaits elaboration.
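To make the summation measure concrete, the following is a minimal sketch of how a summation ratio between single-cue and paired-cue increment thresholds could be computed in decibels, assuming the common psychophysical convention of 20·log10 of a threshold ratio; the threshold values are hypothetical and are not taken from the study.

import math

# Hypothetical increment thresholds (arbitrary depth-gradient units) measured
# with 2IFC for the same null stimulus.
threshold_size_only = 0.20      # size-gradient cue alone
threshold_contrast_only = 0.22  # contrast-gradient cue alone
threshold_both = 0.14           # both cues added together

# Summation: improvement of the paired condition over the better single cue,
# expressed in dB as 20*log10 of the threshold ratio.
best_single = min(threshold_size_only, threshold_contrast_only)
summation_db = 20 * math.log10(best_single / threshold_both)

print(f"summation = {summation_db:.2f} dB")
# On this convention, > 1.5 dB would count as substantial summation in the
# abstract's terms, while ~6 dB would correspond to ideal linear summation.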

Relevance: 70.00%

Abstract:

This paper describes the organizational processes of knowledge acquisition, sharing, retention and utilisation as they affected the internal and external communication of knowledge about performance in an English police force. The research data were gathered in three workshops for internal personnel, external stakeholders and chief officers, using Journey Making, a computer-assisted method of developing shared understanding. The research concluded that there are multiple audiences for the communication of knowledge about police performance, and that this communication is impeded by the requirement to publish performance data. However, the intelligence-led policing model could lead to a more focused means of communication with various stakeholder groups. Although technology investment was a preferred means of communicating knowledge about performance, such an investment may not yield the appropriate changes in behaviour unless cultural barriers are also addressed. Consequently, technology needs to be integrated with working practices in order to reduce organizational reliance on informal methods of communication.

Relevance: 70.00%

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. There is little data on the performance of CDC architectures in a real-time environment, yet such performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
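As an illustration of the capture-latency definition used in the evaluation, here is a minimal sketch of how the interval between a committed OLTP change and its capture by a CDC mechanism might be measured; the harness hooks (apply_and_commit_change, wait_until_captured) are hypothetical stand-ins for the extended TPC-C benchmark instrumentation.

import time

def measure_capture_latency(apply_and_commit_change, wait_until_captured):
    """Capture latency: the time from a data change being committed on the
    OLTP database to that change being captured by the CDC mechanism."""
    apply_and_commit_change()        # e.g. a TPC-C NEW-ORDER transaction commits
    committed_at = time.monotonic()  # commit acknowledged
    wait_until_captured()            # block until the CDC consumer sees the change
    captured_at = time.monotonic()
    return captured_at - committed_at

# Dummy hooks (sleeps stand in for real database and CDC work):
latency = measure_capture_latency(lambda: time.sleep(0.01),
                                  lambda: time.sleep(0.05))
print(f"capture latency: {latency * 1000:.1f} ms")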

Relevance: 60.00%

Abstract:

Purpose – The purpose of the paper is to present the findings of a study of factory closure management. It details the sequence and the results of the key strategic manufacturing management decisions made from the time of the announcement of the plant closure to the cessation of operations. The paper also includes an analysis of the human resource management (HRM) actions taken during this same time period and their consequences upon all those involved in the closure management process. Design/methodology/approach – The case study methodology consisted of two initial site visits to monitor closure management effectiveness (adherence to plan and the types and frequency of closure management communications). During these visits, documentary evidence of the impact of the closure decision upon production performance was also collected (manufacturing output and quality performance data). Following plant closure, interviews were held with senior business, production and HRM managers and production personnel. A total of 12 interviews were carried out. Findings – The case study findings have informed the development of a conceptual model of facility closure management. Information obtained from the interviews suggests that the facility closure management process consists of five key management activities. The unexpected announcement of a factory closure can cause behavioural changes similar to those of bereavement, particularly by those employees who are its survivors. In addition, similar reactions to the closure announcement may be displayed by those who choose to remain employed by the factory owner throughout the phased closure of the plant. Originality/value – Facility closure management is an insufficiently researched strategic operations management activity. This paper details a recommended procedure for its management. A conceptual model has also been developed to illustrate the links between the key facility closure management tasks and the range of employee changes of behaviour that can be induced by their execution.

Relevance: 60.00%

Abstract:

The purpose of the work described here has been to seek methods of narrowing the present gap between currently realised heat pump performance and the theoretical limit. The single most important prerequisite to this objective is the identification and quantitative assessment of the various non-idealities and degradative phenomena responsible for the present shortfall. The use of availability analysis has been introduced as a diagnostic tool, and applied to a few very simple, highly idealised Rankine cycle optimisation problems. From this work, it has been demonstrated that the scope for improvement through optimisation is small in comparison with the extensive potential for improvement by reducing the compressor's losses. A fully instrumented heat pump was assembled and extensively tested. This furnished performance data and led to an improved understanding of the system's behaviour. From a very simple analysis of the resulting compressor performance data, confirmation of the compressor's low efficiency was obtained. In addition, in order to obtain experimental data concerning specific details of the heat pump's operation, several novel experiments were performed. The experimental work was concluded with a set of tests which attempted to obtain definitive performance data for a small set of discrete operating conditions. These tests included an investigation of the effect of two compressor modifications. The resulting performance data were analysed by a sophisticated calculation which used the measurements to quantify each degradative phenomenon occurring in the compressor, and so indicate where the greatest potential for improvement lies. Finally, in the light of everything that was learnt, specific technical suggestions have been made to reduce the losses associated with both the refrigerant circuit and the compressor.
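As a simple illustration of comparing realised heat pump performance with the theoretical limit, the following worked example computes the Carnot coefficient of performance and a second-law efficiency for a made-up operating point; the temperatures and measured COP are purely illustrative and are not the thesis's data.

# Hypothetical operating point.
T_cold = 273.15 + 2.0    # source (evaporating) temperature, K
T_hot  = 273.15 + 50.0   # sink (condensing) temperature, K
cop_measured = 2.6       # measured heating coefficient of performance

# Theoretical (Carnot) heating COP between the same two temperatures.
cop_carnot = T_hot / (T_hot - T_cold)

# Second-law efficiency: how close the real cycle comes to the limit.
second_law_efficiency = cop_measured / cop_carnot

print(f"Carnot COP:            {cop_carnot:.2f}")
print(f"Second-law efficiency: {second_law_efficiency:.1%}")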

Relevance: 60.00%

Abstract:

Vehicle dynamics modelling can provide vehicle designers with vehicle performance data that can assist with the efficient development of more refined cars. However, such models are notoriously complicated, requiring the user to have a considerable understanding of vehicle dynamics.

Relevance: 60.00%

Abstract:

This thesis describes the procedure and results from four years' research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and factors for location, selection of contractor, contract sum, height and site conditions. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The program generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From this data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
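The following is a minimal Monte Carlo sketch in the spirit of the VERT analysis described above: element costs are drawn from constant, uniform, normal and beta distributions, summed per iteration, and summarised as points on a cumulative frequency distribution. The elements and parameter values are illustrative and are not taken from the thesis.

import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo iterations

# Illustrative pre-tender cost ranges (GBP) for a few services elements,
# each described by one of the distribution types mentioned above.
lifts      = np.full(n, 120_000.0)                    # constant
electrical = rng.uniform(200_000, 260_000, n)         # uniform
hvac       = rng.normal(450_000, 40_000, n)           # normal
fire       = 30_000 + 20_000 * rng.beta(2.0, 5.0, n)  # scaled beta

total = lifts + electrical + hvac + fire

# Points on the cumulative frequency distribution of total services cost.
p10, p50, p90 = np.percentile(total, [10, 50, 90])
print(f"P10: £{p10:,.0f}   P50: £{p50:,.0f}   P90: £{p90:,.0f}")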

Relevance: 60.00%

Abstract:

In 1974 Dr D. M. Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive, on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation and is used to determine the required resources and construction method of the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line, command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared to site records collected from current contracts. This showed that the approach will give comparable answers, but that these are greatly affected by the site performance parameters.
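A minimal sketch of the second strand, the stochastic simulation of site efficiency, is shown below: an operation duration synthesised from elemental standard times is divided by an efficiency factor sampled from an assumed distribution. The elemental times and the efficiency distribution are invented for illustration and do not come from the research.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical elemental standard times (hours) for one drainage operation.
excavation, shoring, pipelaying, backfill = 6.0, 2.5, 4.0, 3.0
standard_time = excavation + shoring + pipelaying + backfill

# Site efficiency varies about its mean; sample from an assumed distribution
# to evaluate the effect of varying performance levels.
efficiency = rng.normal(loc=0.85, scale=0.15, size=10_000).clip(0.4, 1.3)

simulated_duration = standard_time / efficiency
print(f"standard time:         {standard_time:.1f} h")
print(f"median simulated time: {np.median(simulated_duration):.1f} h")
print(f"90th percentile time:  {np.percentile(simulated_duration, 90):.1f} h")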

Relevance: 60.00%

Abstract:

This thesis investigates the cost of electricity generation using bio-oil produced by the fast pyrolysis of UK energy crops. The study covers costs from the farm to the generator's terminals. The use of short rotation coppice willow and miscanthus as feedstocks was investigated. All costs and performance data have been taken from published papers, reports or web sites. Generation technologies are compared at scales where they have proved economic when burning other fuels, rather than at a given size. A pyrolysis yield model was developed for a bubbling fluidised bed fast pyrolysis reactor from published data to predict bio-oil yields and pyrolysis plant energy demands. Generation using diesel engines, gas turbines in open and combined cycle (CCGT) operation and steam cycle plants was considered. The use of bio-oil storage to allow the pyrolysis and generation plants to operate independently of each other was investigated. The option of using diesel generators and open cycle gas turbines for combined heat and power was examined. The possible cost reductions that could be expected through learning, if the technology is widely implemented, were considered. It was found that none of the systems analysed would be viable without subsidy, but that with the current Renewable Obligation Scheme, CCGT plants in the 200 to 350 MWe range, super-critical coal-fired boilers co-fired with bio-oil, and groups of diesel engine based CHP schemes supplied by a central pyrolysis plant would be viable. It was found that costs would fall with implementation and the planting of more energy crops, but some subsidy would still be needed to make the plants viable.
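To illustrate the kind of learning effect referred to above, the sketch below applies a classic learning curve in which capital cost falls to a fixed fraction of its previous value with every doubling of the number of plants built; the progress ratio and first-plant cost are placeholders, not figures from the thesis.

import math

def learned_cost(first_plant_cost, plant_number, progress_ratio=0.9):
    """Capital cost of the n-th plant when cost falls to `progress_ratio`
    of its value for every doubling of cumulative build."""
    return first_plant_cost * plant_number ** math.log(progress_ratio, 2)

first_cost = 10_000_000  # hypothetical first-of-a-kind pyrolysis plant, GBP
for n in (1, 2, 5, 10):
    print(f"plant {n:>2}: £{learned_cost(first_cost, n):,.0f}")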

Relevance: 60.00%

Abstract:

This paper presents an assessment of the technical and economic performance of thermal processes to generate electricity from a wood chip feedstock by combustion, gasification and fast pyrolysis. The scope of the work begins with the delivery of a wood chip feedstock at a conversion plant and ends with the supply of electricity to the grid, incorporating wood chip preparation, thermal conversion, and electricity generation in dual fuel diesel engines. Net generating capacities of 1–20 MWe are evaluated. The techno-economic assessment is achieved through the development of a suite of models that are combined to give cost and performance data for the integrated system. The models include feed pretreatment, combustion, atmospheric and pressure gasification, fast pyrolysis with pyrolysis liquid storage and transport (an optional step in de-coupled systems) and diesel engine or turbine power generation. The models calculate system efficiencies, capital costs and production costs. An identical methodology is applied in the development of all the models so that all of the results are directly comparable. The electricity production costs have been calculated for 10th plant systems, indicating the costs that are achievable in the medium term after the high initial costs associated with novel technologies have reduced. The costs converge at the larger scale with the mean electricity price paid in the EU by a large consumer, and there is therefore potential for fast pyrolysis and diesel engine systems to sell electricity directly to large consumers or for on-site generation. However, competition will be fierce at all capacities since electricity production costs vary only slightly between the four biomass to electricity systems that are evaluated. Systems de-coupling is one way that the fast pyrolysis and diesel engine system can distinguish itself from the other conversion technologies. Evaluations in this work show that situations requiring several remote generators are much better served by a large fast pyrolysis plant that supplies fuel to de-coupled diesel engines than by constructing an entire close-coupled system at each generating site. Another advantage of de-coupling is that the fast pyrolysis conversion step and the diesel engine generation step can operate independently, with intermediate storage of the fast pyrolysis liquid fuel, increasing overall reliability. Peak load or seasonal power requirements would also benefit from de-coupling since a small fast pyrolysis plant could operate continuously to produce fuel that is stored for use in the engine on demand. Current electricity production costs for a fast pyrolysis and diesel engine system are 0.091/kWh at 1 MWe when learning effects are included. These systems are handicapped by the typical characteristics of a novel technology: high capital cost, high labour, and low reliability. As such the more established combustion and steam cycle produces lower cost electricity under current conditions. The fast pyrolysis and diesel engine system is a low capital cost option but it also suffers from relatively low system efficiency particularly at high capacities. This low efficiency is the result of a low conversion efficiency of feed energy into the pyrolysis liquid, because of the energy in the char by-product. A sensitivity analysis has highlighted the high impact on electricity production costs of the fast pyrolysis liquids yield. 
The liquids yield should be set realistically during design, and it should be maintained in practice by careful attention to plant operation and feed quality. Another problem is the high power consumption during feedstock grinding. Efficiencies may be enhanced in ablative fast pyrolysis which can tolerate a chipped feedstock. This has yet to be demonstrated at commercial scale. In summary, the fast pyrolysis and diesel engine system has great potential to generate electricity at a profit in the long term, and at a lower cost than any other biomass to electricity system at small scale. This future viability can only be achieved through the construction of early plant that could, in the short term, be more expensive than the combustion alternative. Profitability in the short term can best be achieved by exploiting niches in the market place and specific features of fast pyrolysis. These include:
• countries or regions with fiscal incentives for renewable energy such as premium electricity prices or capital grants;
• locations with high electricity prices so that electricity can be sold direct to large consumers or generated on-site by companies who wish to reduce their consumption from the grid;
• waste disposal opportunities where feedstocks can attract a gate fee rather than incur a cost;
• the ability to store fast pyrolysis liquids as a buffer against shutdowns or as a fuel for peak-load generating plant;
• de-coupling opportunities where a large, single pyrolysis plant supplies fuel to several small and remote generators;
• small-scale combined heat and power opportunities;
• sales of the excess char, although a market has yet to be established for this by-product; and
• potential co-production of speciality chemicals and fuel for power generation in fast pyrolysis systems.
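For orientation, here is a minimal sketch of a levelised electricity production cost calculation of the general kind produced by such techno-economic models: annualised capital plus annual operating and feed costs, divided by annual generation. All figures are placeholders and are not results from the paper.

def electricity_production_cost(capital_cost, capacity_mwe, capacity_factor,
                                feed_cost_per_year, labour_cost_per_year,
                                discount_rate=0.10, lifetime_years=20):
    """Levelised production cost per kWh: annualised capital plus annual
    operating costs, divided by annual electricity sent out."""
    # Capital recovery factor annualises the up-front investment.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years
           / ((1 + discount_rate) ** lifetime_years - 1))
    annual_cost = capital_cost * crf + feed_cost_per_year + labour_cost_per_year
    annual_kwh = capacity_mwe * 1000 * 8760 * capacity_factor
    return annual_cost / annual_kwh

# Hypothetical 1 MWe fast pyrolysis and diesel engine system.
cost = electricity_production_cost(capital_cost=3_000_000, capacity_mwe=1.0,
                                   capacity_factor=0.8,
                                   feed_cost_per_year=250_000,
                                   labour_cost_per_year=150_000)
print(f"production cost: {cost:.3f} per kWh")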

Relevance: 60.00%

Abstract:

We develop a framework for estimating the quality of transmission (QoT) of a new lightpath before it is established, as well as for calculating the expected degradation it will cause to existing lightpaths. The framework correlates the QoT metrics of established lightpaths, which are readily available from coherent optical receivers that can be extended to serve as optical performance monitors. Past similar studies used only space (routing) information and thus neglected spectrum, while they focused on old-generation, non-coherent networks. The proposed framework accounts for correlation in both the space and spectrum domains and can be applied to both fixed-grid wavelength division multiplexing (WDM) and elastic optical networks. It is based on a graph transformation that exposes and models the interference between spectrum-neighboring channels. Our results indicate that our QoT estimates are very close to the actual performance data, that is, to having perfect knowledge of the physical layer. The proposed estimation framework is shown to provide up to 4 × 10⁻² lower pre-forward error correction bit error ratio (BER) compared to the worst-case interference scenario, which overestimates the BER. The higher accuracy can be harvested when lightpaths are provisioned with low margins; our results showed up to a 47% reduction in required regenerators, a substantial saving in equipment cost.
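The sketch below is not the paper's actual framework, but a deliberately simplified illustration of the underlying idea: estimating the pre-FEC BER of a candidate lightpath from the monitored BER of established lightpaths that share fibre links with it, with spectrum-adjacent neighbours weighted more heavily. All structures, weights and the baseline BER are invented for illustration.

def estimate_qot(candidate_links, candidate_slot, monitored):
    """monitored: list of (links, centre_slot, pre_fec_ber) tuples for
    established lightpaths whose QoT is read from coherent receivers."""
    baseline_ber = 1e-5  # assumed BER with no neighbouring interference
    penalty = 0.0
    for links, slot, ber in monitored:
        shared_links = len(candidate_links & links)
        if shared_links == 0:
            continue  # no spatial overlap, no interference contribution
        spectral_gap = abs(candidate_slot - slot)
        weight = shared_links / (1 + spectral_gap)  # closer in spectrum -> stronger
        penalty += weight * ber
    return baseline_ber + penalty

established = [({"A-B", "B-C"}, 3, 2e-4), ({"B-C", "C-D"}, 5, 5e-5)]
ber = estimate_qot({"A-B", "B-C"}, 4, established)
print(f"estimated pre-FEC BER: {ber:.2e}")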

Relevance: 40.00%

Abstract:

4th International Symposium of DEA, 5th-6th September 2004, Birmingham (UK)

Relevance: 40.00%

Abstract:

This paper analyses the relationship between production subsidies and firms’ export performance using a very comprehensive and recent firm-level database and controlling for the endogeneity of subsidies. It documents robust evidence that production subsidies stimulate export activity at the intensive margin, although this effect is conditional on firm characteristics. In particular, the positive relationship between subsidies and the intensive margin of exports is strongest among profit-making firms, firms in capital-intensive industries, and those located in non-coastal regions. Compared to firm characteristics, the extent of heterogeneity across ownership structure (SOEs, collectives, and privately owned firms) proves to be relatively less important.

Relevance: 40.00%

Abstract:

The appraisal and relative performance evaluation of nurses are very important and beneficial for both nurses and employers in an era of clinical governance, increased accountability and high standards of health care services. They enhance and consolidate the knowledge and practical skills of nurses through the identification of training and career development plans, as well as improving the quality of health care services, increasing job satisfaction and making cost-effective use of resources. In this paper, a data envelopment analysis (DEA) model is proposed for the appraisal and relative performance evaluation of nurses. The model is validated on thirty-two nurses working at an Intensive Care Unit (ICU) at one of the most recognized hospitals in Lebanon. The DEA model was able to classify nurses into efficient and inefficient ones. The set of efficient nurses was used to establish an internal best-practice benchmark to project career development plans for improving the performance of the other, inefficient nurses. The DEA results confirmed the ranking of some nurses and highlighted injustices in other cases produced by the currently practiced appraisal system. Further, the DEA model is shown to be an effective talent management and motivational tool, as it can provide clear managerial plans related to promotion, training and development activities from the perspective of nurses, hence increasing their satisfaction, motivation and acceptance of appraisal results. Due to such features, the model is currently being considered for implementation at the ICU. Finally, the ratio of the number of DEA units to the number of input/output measures is revisited, with new suggested values for its upper and lower limits depending on the type of DEA model and the desired number of efficient units from a managerial perspective.
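As an illustration of the general approach (not the paper's exact formulation), the sketch below solves an input-oriented CCR DEA model in envelopment form for a handful of hypothetical nurses with two inputs and two outputs; the data, input/output choices and scale assumptions are made up.

import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 5 nurses, 2 inputs (hours worked, training cost in $k)
# and 2 outputs (patients cared for, audited quality score).
X = np.array([[40.0, 38.0, 42.0, 45.0, 36.0],
              [ 5.0,  6.0,  4.0,  7.0,  5.0]])   # inputs  (rows) x nurses (cols)
Y = np.array([[30.0, 28.0, 35.0, 25.0, 32.0],
              [ 8.0,  7.0,  9.0,  6.0,  9.0]])   # outputs (rows) x nurses (cols)
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(j0):
    """Input-oriented CCR score of unit j0: minimise theta such that a
    lambda-weighted composite of peers uses at most theta times j0's inputs
    while producing at least j0's outputs."""
    c = np.zeros(1 + n)
    c[0] = 1.0                                 # objective: minimise theta
    A_in = np.hstack([-X[:, [j0]], X])         # sum_j l_j x_ij <= theta * x_i,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # sum_j l_j y_rj >= y_r,j0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, j0]]),
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.fun

for j in range(n):
    print(f"nurse {j + 1}: CCR efficiency = {ccr_efficiency(j):.3f}")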

Relevance: 40.00%

Abstract:

The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. All the data in conventional DEA models with input and/or output ratios are assumed to be crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems not suitable or difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios to demonstrate the applicability and efficacy of the proposed models where the traditional indicators are mostly financial ratios. © 2011 Elsevier Inc.