905 results for Performance evaluation
Abstract:
For remote, semi-arid areas, brackish groundwater (BW) desalination powered by solar energy may be the most technically and economically viable means of alleviating water stress. For such systems, a high recovery ratio is desired because of the technical and economic difficulties of concentrate management. It has been demonstrated that current, conventional solar reverse osmosis (RO) desalination can be improved by a factor of 40–200 by eliminating unnecessary energy losses. In this work, a batch-RO system that can be powered by a thermal Rankine cycle has been developed. By directly recycling high-pressure concentrate and by using a linkage connection to provide increasing feed pressure, the batch-RO system has been shown to achieve a 70% saving in energy consumption compared to a continuous single-stage RO system. Theoretical investigations of the mass transfer phenomena, including dispersion and concentration polarization, have been carried out to complement and guide the experimental efforts. The performance evaluation of the batch-RO system, named DesaLink, is based on extensive experimental tests. Operating DesaLink with compressed air as the power supply under laboratory conditions, a freshwater production of approximately 300 litres per day was recorded at a product concentration of around 350 ppm for feed water concentrations of 2500–4500 ppm; the corresponding linkage efficiency was around 40%. On the computational side, simulation models have been developed and validated for each of the subsystems of DesaLink, and an integrated model of the whole system has been built upon them. The models, both the subsystem ones and the integrated one, have been demonstrated to predict the system performance accurately under the operational conditions tested. A simulation case study has been performed using the developed model. The results indicate that the system can be expected to produce 200 m³ of water per year using a widely available evacuated-tube solar collector with an area of only 2 m². This freshwater production would satisfy the drinking water needs of 163 inhabitants in Rajasthan, the region for which the case study was performed.
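A rough plausibility check of the reported ~70% energy saving can be made from ideal RO thermodynamics. The sketch below (Python; the feed osmotic pressure and recovery ratio are illustrative assumptions, not values from the thesis) compares the minimum specific energy consumption of a continuous single-stage RO without energy recovery, π₀/(r(1−r)), with the ideal batch-RO limit, (π₀/r)·ln(1/(1−r)).

```python
import math

BAR_M3_TO_KWH = 1e5 / 3.6e6   # 1 bar·m^3 = 100 kJ ≈ 0.0278 kWh

def sec_continuous_kwh_m3(pi0_bar, r):
    # Whole feed pressurised to the final brine osmotic pressure,
    # brine pressure energy discarded (no energy-recovery device).
    return (pi0_bar / (r * (1.0 - r))) * BAR_M3_TO_KWH

def sec_batch_kwh_m3(pi0_bar, r):
    # Applied pressure tracks the rising osmotic pressure of the
    # recirculated concentrate: the thermodynamic minimum for batch RO.
    return (pi0_bar / r) * math.log(1.0 / (1.0 - r)) * BAR_M3_TO_KWH

pi0 = 2.8   # bar, roughly a 3,500 ppm NaCl feed (illustrative assumption)
r = 0.8     # recovery ratio desirable for brackish-water systems (assumption)

cont, batch = sec_continuous_kwh_m3(pi0, r), sec_batch_kwh_m3(pi0, r)
print(f"continuous: {cont:.3f} kWh/m^3, batch: {batch:.3f} kWh/m^3, "
      f"saving: {1 - batch / cont:.0%}")   # ~68%, close to the reported 70%
```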
Abstract:
In the teletraffic engineering of all telecommunication networks, parameters characterizing the terminal traffic are used. One of the most important of these is the probability of finding the called (B-)terminal busy. This parameter has been studied from some of the earliest papers in teletraffic theory to the most recent ones. We propose a solution to this problem for (virtual) channel systems, such as PSTN and GSM. We propose a detailed conceptual traffic model and, based on it, an analytical macro-state model of the system in the stationary state, with: Bernoulli–Poisson–Pascal (BPP) input flow; repeated calls; a limited number of homogeneous terminals; and losses due to abandoned and interrupted dialling, blocked and interrupted switching, an unavailable intended terminal, blocked and abandoned ringing, and abandoned conversation. The approach proposed in this paper may help determine many network traffic characteristics at the session level and support performance evaluation of next-generation mobile networks.
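As context for the finite-terminal, stationary-state modelling the abstract describes, the Engset loss formula gives the blocking probability for a finite population of homogeneous sources sharing c channels. The sketch below (Python) computes the call congestion; it is a textbook baseline only, since the paper's BPP macro-state model with repeated calls and the multiple loss stages listed above is far richer.

```python
from math import comb

def engset_call_congestion(n_terminals, n_channels, a):
    """Probability that a call from one of n_terminals finite sources
    finds all n_channels (virtual) channels busy; a is the offered
    traffic per idle source (call rate / service rate)."""
    den = sum(comb(n_terminals - 1, k) * a**k for k in range(n_channels + 1))
    return comb(n_terminals - 1, n_channels) * a**n_channels / den

# 100 homogeneous terminals sharing 10 channels, 0.08 Erlang per idle source
print(f"blocking: {engset_call_congestion(100, 10, 0.08):.4f}")
```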
Abstract:
Coherent optical orthogonal frequency division multiplexing (CO-OFDM) is an attractive transmission technique that virtually eliminates intersymbol interference caused by chromatic dispersion and polarization-mode dispersion. The design, development, and operation of CO-OFDM systems require simple, efficient, and reliable methods for their performance evaluation. In this paper, we demonstrate an accurate bit error rate (BER) estimation method for QPSK CO-OFDM transmission based on the probability density function of the received QPSK symbols. By comparison with other known approaches, including data-aided and non-data-aided error vector magnitude (EVM), we show that the proposed method offers the most accurate estimate of system performance for both single-channel and wavelength division multiplexed QPSK CO-OFDM transmission systems. © 2014 IEEE.
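For Gray-coded QPSK, the familiar data-aided EVM-based estimate is BER ≈ Q(1/EVM). The sketch below (Python/NumPy/SciPy, an AWGN toy rather than a CO-OFDM simulation) contrasts that estimate with a simple PDF-style approach, fitting a Gaussian to the received quadrature components and integrating the tail past the decision boundary, and with directly counted errors; the paper's actual PDF-based method is more elaborate.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, snr_db = 200_000, 9.0
snr = 10 ** (snr_db / 10)                        # Es/N0, linear

bits = rng.integers(0, 2, (n, 2))
sym = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)  # Es = 1
rx = sym + np.sqrt(1 / (2 * snr)) * (rng.normal(size=n) + 1j * rng.normal(size=n))

# (1) Data-aided EVM estimate: BER ~ Q(1/EVM) for Gray-coded QPSK
evm = np.sqrt(np.mean(np.abs(rx - sym) ** 2))    # reference power is 1
ber_evm = norm.sf(1 / evm)

# (2) PDF-style estimate: project each symbol onto its transmitted
# quadrature signs, fit a Gaussian, integrate the tail past the boundary
x = np.concatenate([rx.real * np.sign(sym.real), rx.imag * np.sign(sym.imag)])
ber_pdf = norm.cdf(-x.mean() / x.std())

# (3) Reference: direct error counting
errors = (np.sign(rx.real) != np.sign(sym.real)).sum() \
       + (np.sign(rx.imag) != np.sign(sym.imag)).sum()
print(f"EVM: {ber_evm:.2e}  PDF fit: {ber_pdf:.2e}  counted: {errors/(2*n):.2e}")
```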
Abstract:
Location estimation is important for wireless sensor network (WSN) applications. In this paper we propose a Cramér–Rao bound (CRB) based analytical approach for two centralized multi-hop localization algorithms to gain insight into their error performance and its sensitivity to distance measurement error, anchor node density, and anchor placement. The location estimation performance is compared with that of four distributed multi-hop localization algorithms by simulation to evaluate the efficiency of the proposed analytical approach. The numerical results demonstrate the complex tradeoff between centralized and distributed localization algorithms in terms of accuracy, complexity, and communication overhead. Based on this analysis, an efficient and scalable performance evaluation tool can be designed for localization algorithms in large-scale WSNs, where simulation-based evaluation approaches are impractical. © 2013 IEEE.
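For the single-hop, range-based case, the CRB follows from the Fisher information matrix assembled from the unit vectors toward the anchors under i.i.d. Gaussian distance-measurement noise. The sketch below (Python/NumPy) evaluates this textbook bound for one node and one anchor placement; the paper's contribution is extending such analysis to centralized multi-hop algorithms.

```python
import numpy as np

def range_crb_rmse(node, anchors, sigma):
    """Lower bound on RMS position error for range-based localization.

    node:    (2,) true position
    anchors: (k, 2) anchor positions
    sigma:   std of the Gaussian distance-measurement noise
    """
    diff = node - anchors
    u = diff / np.linalg.norm(diff, axis=1, keepdims=True)  # unit vectors
    fim = (u.T @ u) / sigma**2                              # 2x2 Fisher information
    return np.sqrt(np.trace(np.linalg.inv(fim)))

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
node = np.array([30.0, 40.0])
print(f"CRB on RMSE: {range_crb_rmse(node, anchors, sigma=1.0):.2f} m")
```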
Abstract:
In a Ubiquitous Consumer Wireless World (UCWW) environment, the provision, administration and management of authentication, authorization and accounting (AAA) policies and business services are handled by third-party AAA service providers (3P-AAA-SPs), which are independent of the wireless access network providers (ANPs). In this environment the consumer can freely choose any suitable ANP based on his/her own preferences. This new AAA infrastructural arrangement necessitates assessing the impact on, and re-thinking the design, structure and location of, 'charging and billing' (C&B) functions and services. This paper addresses C&B issues in UCWW, proposing potential architectural solutions for C&B realization. Implementation approaches for these novel solutions, together with a software testbed for their validation and performance evaluation, are also addressed.
Abstract:
We present a performance evaluation of a non-conventional approach to implementing phase-noise-tolerant optical systems with multilevel modulation formats. The performance of normalized Viterbi–Viterbi carrier phase estimation (V-V CPE) is investigated in detail for circular m-level quadrature amplitude modulation (C-mQAM) signals. The intrinsic property of C-mQAM constellation points, their uniform phase separation, allows a straightforward application of V-V CPE without the need for constellation adaptation. Compared with conventional feed-forward CPE for square QAM signals, simulation results show an enhanced tolerance to the linewidth-symbol-duration product (ΔνTs) at a low sensitivity penalty when using a feed-forward CPE structure with C-mQAM. This scheme can easily be extended to higher-order modulation formats without considerable added complexity.
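The V-V principle is to raise the received symbols to the M-th power so that M uniformly spaced modulation phases cancel, average over a sliding window, and divide the resulting angle by M. The sketch below (Python/NumPy) shows the classic fourth-power version for QPSK; C-mQAM admits the same structure precisely because all of its rings share one uniform phase grid, which is the property the paper exploits.

```python
import numpy as np

def viterbi_viterbi(rx, m=4, window=64):
    """Feed-forward M-th-power carrier phase estimation.

    rx:     received symbols corrupted by laser phase noise
    m:      number of uniformly spaced modulation phases (4 for QPSK)
    window: sliding-average length used to suppress additive noise
    """
    raised = rx ** m                                 # cancel the modulation phases
    avg = np.convolve(raised, np.ones(window) / window, mode="same")
    # Divide the angle by m; the constant pi/m offset and the residual
    # m-fold phase ambiguity are inherent to the M-th-power method.
    phase = np.unwrap(np.angle(avg)) / m - np.pi / m
    return rx * np.exp(-1j * phase), phase

# Toy run: QPSK under Wiener (random-walk) laser phase noise
rng = np.random.default_rng(0)
sym = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 10_000)))
pn = np.cumsum(rng.normal(0.0, 0.01, sym.size))
noise = 0.05 * (rng.normal(size=sym.size) + 1j * rng.normal(size=sym.size))
corrected, est = viterbi_viterbi(sym * np.exp(1j * pn) + noise)
```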
Abstract:
The concept of measurement-enabled production is based on integrating metrology systems into production processes; it has generated significant interest in industry due to its potential to increase process capability and accuracy, which in turn reduces production times and eliminates defective parts. One of the most promising methods of integrating metrology into production is the use of external metrology systems to compensate machine tool errors in real time. This paper describes the development and experimental performance evaluation of a low-cost prototype three-axis machine tool assisted by a laser tracker. Real-time correction of the machine tool's absolute volumetric error has been achieved. As a result, significant increases in static repeatability and accuracy have been demonstrated, allowing the low-cost three-axis machine tool to reliably reach static positioning accuracies below 35 μm throughout its working volume without any prior calibration or error mapping. This is a significant technical development that demonstrates the feasibility of the proposed methods and can have wide-scale industrial application by enabling low-cost machine tools of modest structural rigidity, deployable flexibly as end-effectors of robotic automation, to achieve positional accuracies that were previously the preserve of large, high-precision machine tools.
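The compensation scheme amounts to an outer feedback loop: command a position, measure the actual tool point with the external tracker, and trim the commanded setpoint by the observed volumetric error. The sketch below (Python/NumPy) illustrates the loop; machine_error and move_and_measure are hypothetical stand-ins for the machine's error field and the tracker interface, not the paper's implementation.

```python
import numpy as np

def machine_error(xyz):
    """Hypothetical volumetric error field (mm) of the uncalibrated machine;
    a real system observes this only through the laser tracker."""
    return np.array([0.05 * xyz[0] / 1000 + 0.02, -0.03 * xyz[1] / 1000, 0.01])

def move_and_measure(setpoint):
    """Hypothetical 'command axes, then read tracker' step: the machine
    reaches the setpoint plus its volumetric error."""
    return setpoint + machine_error(setpoint)

def compensate(nominal, tol_mm=0.035, max_iter=10, gain=1.0):
    """Trim the commanded setpoint until the externally measured
    volumetric error at the nominal target falls below tol_mm."""
    nominal = np.asarray(nominal, dtype=float)
    setpoint = nominal.copy()
    for _ in range(max_iter):
        error = move_and_measure(setpoint) - nominal   # tracker-based error
        if np.linalg.norm(error) < tol_mm:
            break
        setpoint -= gain * error                       # counter-steer the axes
    return setpoint, np.linalg.norm(error)

setpoint, residual = compensate([500.0, 300.0, 100.0])  # target in mm
print(f"corrected setpoint: {setpoint}, residual error: {residual:.4f} mm")
```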
Abstract:
Measuring and allocating risk properly are crucial for performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We show that when using a coherent measure of risk it is impossible to allocate risk in a way that satisfies the natural requirements of Core Compatibility, Equal Treatment Property and Strong Monotonicity. To obtain the result we characterize the Shapley value on the class of totally balanced games and also on the class of exact games.
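The Shapley value referenced in the characterization result assigns each division its marginal risk contribution averaged over all arrival orders: φᵢ = Σ_{S⊆N∖{i}} |S|!(n−|S|−1)!/n! · (v(S∪{i}) − v(S)). The sketch below (Python) computes it directly from this definition on a made-up three-division risk game; the game values are purely illustrative.

```python
from itertools import combinations
from math import factorial

def shapley(players, v):
    """Shapley value of a coalitional game v: frozenset -> float."""
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(n):
            for s in combinations(others, k):
                s = frozenset(s)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (v(s | {i}) - v(s))  # marginal contribution
        phi[i] = total
    return phi

# Made-up risk (cost) game for three divisions: coalitions diversify,
# so the coalitional risk is subadditive.
risk = {frozenset(): 0, frozenset("A"): 10, frozenset("B"): 8, frozenset("C"): 6,
        frozenset("AB"): 15, frozenset("AC"): 13, frozenset("BC"): 11,
        frozenset("ABC"): 18}
print(shapley("ABC", risk.__getitem__))   # allocations sum to v(ABC) = 18
```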
Abstract:
Measuring and allocating risk well are essential for the internal capital allocation and performance evaluation of banks, insurers, investment funds and other financial firms. In this article we show that the axioms of coherent risk measures can also be required of illiquid portfolios. Measuring risk this way, we review two cooperative game theory papers on risk allocation. The first is optimistic: there always exists a stable, general method for allocating risk (capital) that is acceptable to every coalition of the subunits. The second is pessimistic, because it states that if, besides stability, we also want to be fair, we run into an impossibility theorem. / === / Measuring and allocating risk properly are crucial for performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We argue that the axioms of coherent measures of risk are valid for illiquid portfolios as well. Then we present the results of two papers on allocating risk measured by a coherent measure of risk. Assume a bank has several divisions. According to the first paper, there is always a stable allocation of risk capital that is not blocked by any coalition of the divisions; that is, there is a core-compatible allocation rule (we present some examples of risk allocation rules). The second paper considers two further natural requirements, Equal Treatment Property and Strong Monotonicity. Equal Treatment Property ensures that similar divisions are treated symmetrically: if two divisions make the same marginal risk contribution to every coalition of divisions not containing them, then the rule should allocate them the very same risk capital. Strong Monotonicity requires that if the risk environment changes in such a way that the marginal contribution of a division does not decrease, then its allocated risk capital should not decrease either. However, if risk is evaluated by any coherent measure of risk, then there is no risk allocation rule satisfying Core Compatibility, Equal Treatment Property and Strong Monotonicity: we encounter an impossibility result.
Abstract:
Measuring and allocating risk properly are crucial for performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We show that when using coherent measures of risk it is impossible to allocate risk in a way that simultaneously satisfies the natural requirements of Core Compatibility, Equal Treatment Property and Strong Monotonicity. To obtain the result we characterize the Shapley value on the class of totally balanced games and also on the class of exact games.
Abstract:
This study explored the potential for implementing a merit-based public personnel system in The Bahamas, a former British colony in the Commonwealth Caribbean. Specifically, the study evaluated the use of merit-based public personnel management practices in the areas of recruitment, selection, promotion, training and employee development, and performance evaluation. Driving forces and barriers that shape merit system successes and failures, as well as strategies for institutionalizing merit system practices, are identified. Finally, the study applied the developmental model created by Klingner (1996) to describe the stage of public personnel management in The Bahamas. The data for the study were collected through in-depth interviews with expert observers.
Abstract:
An assessment tool designed to measure customer service orientation among RNs and LPNs was developed using a content-oriented approach. Critical incidents were first developed by asking two samples of healthcare managers (n = 52 and n = 25) to identify various customer-contact situations. The critical incidents were then used to formulate a 121-item instrument. Patient-contact workers from three hospitals (n = 102) completed the instrument along with the NEO-FFI, a measure of the Big Five personality factors. Concurrently, managers completed a performance evaluation scale for the employees participating in the study in order to determine the criterion-related validity of the instrument. Through a criterion-keying approach, the instrument was scaled down to 38 items. The correlation between HealthServe scores and supervisory performance ratings supported the instrument's criterion-related validity (r = .66, p < .0001). Incremental validity of HealthServe over the Big Five was found, with HealthServe accounting for 46% of the variance. The NEO-FFI was used to assess the correlation between personality traits and HealthServe scores. A factor analysis of HealthServe suggested four factors, which were correlated with the NEO-FFI scores. Results indicated that HealthServe was positively related to Extraversion, Openness to Experience, Agreeableness and Conscientiousness, and negatively related to Neuroticism. The benefits of the test construction procedure used here over broad-based measures of personality are discussed, as well as the limitations of using a concurrent validation strategy. Recommendations for future studies are provided.
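Incremental validity of the sort reported (HealthServe adding explained variance beyond the Big Five) is conventionally assessed by hierarchical regression, comparing R² with and without the new predictor. The sketch below (Python/NumPy) shows the ΔR² computation on synthetic data; the variable names and effect sizes are invented, not the study's.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(42)
n = 102                                   # sample size matching the study
big5 = rng.normal(size=(n, 5))            # NEO-FFI factor scores (synthetic)
healthserve = rng.normal(size=n)          # HealthServe total score (synthetic)
perf = 0.3 * big5[:, 0] + 0.8 * healthserve + rng.normal(size=n)

r2_base = r_squared(big5, perf)
r2_full = r_squared(np.column_stack([big5, healthserve]), perf)
print(f"R2 Big Five: {r2_base:.2f}, +HealthServe: {r2_full:.2f}, "
      f"incremental dR2: {r2_full - r2_base:.2f}")
```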
Abstract:
The nation's freeway systems are becoming increasingly congested. A major contributor to freeway congestion is traffic incidents: non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity and can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic away from incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs and real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict and is affected by many contributing factors. Determining and understanding these factors can help in identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success. This dissertation attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid decision making about traffic diversion during an ongoing incident. Multiple data mining techniques were applied and evaluated. Multiple linear regression and decision-tree-based methods were used to develop the offline models, while a rule-based method and the M5P model-tree algorithm were used to develop the online models. The results show that the models can generally achieve high prediction accuracy, within acceptable intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software was developed to implement the models in the existing software system of FDOT District 4 for operational use.
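An offline duration model of the decision-tree family used in the dissertation can be prototyped in a few lines. The sketch below (Python/scikit-learn) fits a regression tree to synthetic incidents; the feature set and data are invented, and DecisionTreeRegressor stands in for the dissertation's actual methods (the M5P model-tree algorithm is not available in scikit-learn).

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(7)
n = 2000
# Invented predictors: lanes blocked, heavy-vehicle involvement,
# peak-hour flag, number of responding agencies.
X = np.column_stack([rng.integers(0, 4, n), rng.integers(0, 2, n),
                     rng.integers(0, 2, n), rng.integers(1, 5, n)])
duration = (15 + 20 * X[:, 0] + 30 * X[:, 1] + 10 * X[:, 2]
            + 5 * X[:, 3] + rng.normal(0, 10, n)).clip(5)   # minutes

X_tr, X_te, y_tr, y_te = train_test_split(X, duration, random_state=0)
tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=20).fit(X_tr, y_tr)
print(f"MAE: {mean_absolute_error(y_te, tree.predict(X_te)):.1f} min")
```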
Abstract:
In "Managing Quality in the Hospitality Industry," an observation by W. Gerald Glover, Associate Professor, Hospitality Management Program, Appalachian State University, Glover initially establishes: "Quality is a primary concern in the hospitality industry. The author sees problems in the nature of the way businesses are managed and discusses approaches to ensuring quality in corporate cultures." As the title suggests, the author wants to point out certain discrepancies in hospitality quality control, as well as explain how to address some of these concerns. "A discussion of quality presents some interesting dilemmas. Quality is something that almost everyone wants," Glover notes. "Service businesses will never admit that they don't provide it to their customers, and few people actually understand what it takes to make it happen," he further maintains. Glover wants you to know that in a dynamic industry such as hospitality, quality is the common denominator. Whether it be a hotel, restaurant, or airline, quality is the raison d'être of the industry. "Quality involves the consistent delivery of a product or service according to the expected standards," Glover offers. Many, if not all, quality deficiencies can be traced back to management, Glover declares, and he lists some of the operational and guest-service problems managers face on a daily basis. One important point of note is the measuring and managing of quality. "Standards management is another critical area in people and product management that is seldom effective in corporations," says Glover. "Typically, this area involves performance documentation, performance evaluation and appraisal, coaching, discipline, and team-building." "To be effective at managing standards, an organization must establish communication in realms where it is currently non-existent or ineffective," Glover goes on to say. "Coaching, training, and performance appraisal are methods to manage individuals who are expected to do what's expected." He also alludes to the benefit that quality circles supply. In addressing American organizational behavior, Glover postures, "…a realization must develop that people and product management are the primary influences on generating revenues and eventually influencing the bottom line in all American organizations." Glover introduces the concept of proactivity: "Most recently, quality assurance and quality management have become the means used to develop and maintain proactive corporate cultures. When prevention is the focus, quality is most consistent and expectations are usually met," he offers. Much of the article is dedicated to Appendix A, Table 1, "Characteristics of Corporate Cultures (Reactive and Proactive)," in which Glover contrasts the impact of proactive management with the reactive management intrinsic to many elements of corporate culture.
Abstract:
Today, smartphones have pushed the wireless communication industry into an era of mobile data. To cater for the ever-increasing data traffic demand, it is of utmost importance to have more spectrum resources, and sharing under-utilized spectrum bands is an effective solution. In particular, the 4G broadband Long Term Evolution (LTE) technology and its foreseen 5G successor will benefit immensely if their operation can be extended to the under-utilized unlicensed spectrum. In this thesis, we first analyze WiFi 802.11n and LTE coexistence performance in the unlicensed spectrum, considering multi-layer cell layouts, through system-level simulations. We consider a time division duplexing (TDD) LTE system with an FTP traffic model for performance evaluation. Simulation results show that WiFi performance is highly vulnerable to LTE interference, while LTE performance is degraded only slightly. Based on these initial findings, we propose a Q-learning-based dynamic duty-cycle selection technique for configuring LTE transmission gaps so that satisfactory throughput is maintained for both the LTE and WiFi systems. Simulation results show that the proposed approach can enhance overall capacity by 19% and WiFi capacity by 77%, hence enabling effective coexistence of LTE and WiFi systems in the unlicensed band.
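The tabular Q-learning update behind such a duty-cycle controller is the standard Q(s,a) ← Q(s,a) + α[r + γ·max_a′ Q(s′,a′) − Q(s,a)]. The sketch below (Python/NumPy) selects an LTE ON-fraction per epoch with ε-greedy exploration; the state definition, toy throughput model, and min-throughput reward are invented stand-ins for the thesis's system-level simulator.

```python
import numpy as np

duty_cycles = [0.2, 0.4, 0.6, 0.8]        # candidate LTE ON-fractions
n_states = 4                              # e.g. quantized WiFi load (invented)
Q = np.zeros((n_states, len(duty_cycles)))
alpha, gamma, eps = 0.1, 0.9, 0.1
rng = np.random.default_rng(3)

def step(state, duty):
    """Hypothetical environment: returns (reward, next_state). A real
    evaluation would come from the system-level simulator."""
    lte_tput = duty * (1 - 0.1 * state)            # toy throughput models
    wifi_tput = (1 - duty) * (0.5 + 0.1 * state)
    reward = min(lte_tput, wifi_tput)              # keep both satisfactory
    return reward, rng.integers(n_states)

state = 0
for _ in range(5000):
    a = (rng.integers(len(duty_cycles)) if rng.random() < eps
         else int(Q[state].argmax()))              # epsilon-greedy action
    r, nxt = step(state, duty_cycles[a])
    Q[state, a] += alpha * (r + gamma * Q[nxt].max() - Q[state, a])
    state = nxt

print("learned duty cycle per state:",
      [duty_cycles[int(a)] for a in Q.argmax(axis=1)])
```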