854 results for software performance evaluation


Relevance: 80.00%

Abstract:

Coherent optical orthogonal frequency division multiplexing (CO-OFDM) is an attractive transmission technique that virtually eliminates intersymbol interference caused by chromatic dispersion and polarization-mode dispersion. The design, development, and operation of CO-OFDM systems require simple, efficient, and reliable methods for evaluating their performance. In this paper, we demonstrate an accurate bit error rate estimation method for QPSK CO-OFDM transmission based on the probability density function of the received QPSK symbols. By comparison with other known approaches, including data-aided and non-data-aided error vector magnitude, we show that the proposed method offers the most accurate estimate of system performance for both single-channel and wavelength division multiplexing QPSK CO-OFDM transmission systems. © 2014 IEEE.
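As an illustration of the idea (a minimal sketch, not the paper's exact estimator), fitting a Gaussian probability density function to the received symbol clusters already yields a closed-form BER estimate for QPSK, assuming unit-energy Gray-mapped symbols and additive Gaussian noise:

```python
import math
import statistics

def estimate_ber_from_pdf(received, transmitted):
    """Estimate QPSK BER by fitting a Gaussian PDF to the received symbols.

    Illustrative sketch only: one Gaussian width is fitted to the
    in-phase/quadrature errors, and the per-bit error probability is the
    tail mass beyond the decision boundary.
    """
    errors = []
    for r, t in zip(received, transmitted):
        errors.append(r.real - t.real)
        errors.append(r.imag - t.imag)
    sigma = statistics.pstdev(errors)      # fitted Gaussian width per dimension
    d = abs(transmitted[0].real)           # distance to the decision boundary
    return 0.5 * math.erfc(d / (sigma * math.sqrt(2)))
```

For unit-energy QPSK the boundary distance is 1/√2, so the estimate reduces to 0.5·erfc(1/(2σ)).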

Relevance: 80.00%

Abstract:

Location estimation is important for wireless sensor network (WSN) applications. In this paper we propose a Cramer-Rao Bound (CRB) based analytical approach for two centralized multi-hop localization algorithms to get insights into the error performance and its sensitivity to the distance measurement error, anchor node density and placement. The location estimation performance is compared with four distributed multi-hop localization algorithms by simulation to evaluate the efficiency of the proposed analytical approach. The numerical results demonstrate the complex tradeoff between the centralized and distributed localization algorithms on accuracy, complexity and communication overhead. Based on this analysis, an efficient and scalable performance evaluation tool can be designed for localization algorithms in large scale WSNs, where simulation-based evaluation approaches are impractical. © 2013 IEEE.
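As a sketch of how such a bound is computed, consider the simple single-hop, range-based case (an idealization, not the paper's multi-hop derivation): the Fisher information contributed by each anchor's bearing gives a lower bound on the RMS position error.

```python
import math

def crb_position_error(node, anchors, sigma):
    """Cramer-Rao lower bound on RMS position error for range-based
    localization with Gaussian distance-measurement noise (std dev sigma).

    Illustrative single-hop sketch only.
    """
    fxx = fxy = fyy = 0.0
    for ax, ay in anchors:
        dx, dy = node[0] - ax, node[1] - ay
        d = math.hypot(dx, dy)
        fxx += (dx / d) ** 2 / sigma ** 2
        fxy += (dx / d) * (dy / d) / sigma ** 2
        fyy += (dy / d) ** 2 / sigma ** 2
    det = fxx * fyy - fxy * fxy          # invert the 2x2 Fisher matrix
    return math.sqrt((fxx + fyy) / det)  # sqrt of the trace of the inverse
```

With four anchors at the corners of a unit square and the node at the center, the bound equals the per-link measurement noise, which illustrates the sensitivity to anchor placement the abstract mentions.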

Relevance: 80.00%

Abstract:

We present a performance evaluation of a non-conventional approach to implementing phase-noise-tolerant optical systems with multilevel modulation formats. The performance of normalized Viterbi-Viterbi carrier phase estimation (V-V CPE) is investigated in detail for circular m-level quadrature amplitude modulation (C-mQAM) signals. The intrinsic property of C-mQAM constellation points, namely their uniform phase separation, allows a straightforward application of V-V CPE without the need for constellation adaptation. Compared with conventional feed-forward CPE for square QAM signals, simulation results show an enhanced tolerance of the linewidth-symbol-duration product (ΔvTs) at a low sensitivity penalty when using a feed-forward CPE structure with C-mQAM. This scheme can easily be extended to higher-order modulation formats without introducing considerable complexity.
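The core of the V-V estimator is easy to sketch. Assuming an idealized circular constellation whose points sit at exact phase multiples of 2π/m (a simplification of C-mQAM), normalizing the amplitude and raising each symbol to the m-th power strips the data phase, leaving only the common carrier phase error:

```python
import cmath

def vv_phase_estimate(block, m):
    """Normalized Viterbi-Viterbi carrier phase estimate over one block.

    Sketch only: assumes constellation phases are exact multiples of
    2*pi/m, so s**m removes the modulation; the averaged residual,
    divided by m, is the common phase error (modulo a 2*pi/m ambiguity).
    """
    acc = sum((s / abs(s)) ** m for s in block)  # normalize amplitude, strip data phase
    return cmath.phase(acc) / m                  # average residual, fold back by m
```

In a real receiver the estimate would be applied per block as `s * cmath.exp(-1j * est)`, with phase unwrapping across blocks to resolve the 2π/m ambiguity.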

Relevance: 80.00%

Abstract:

The concept of measurement-enabled production is based on integrating metrology systems into production processes and has generated significant interest in industry, due to its potential to increase process capability and accuracy, which in turn reduces production times and eliminates defective parts. One of the most promising methods of integrating metrology into production is the use of external metrology systems to compensate machine tool errors in real time. This paper describes the development and experimental performance evaluation of a low-cost, laser-tracker-assisted prototype three-axis machine tool. Real-time correction of the machine tool's absolute volumetric error has been achieved. As a result, significant increases in static repeatability and accuracy have been demonstrated, allowing the low-cost three-axis machine tool to reliably reach static positioning accuracies below 35 μm throughout its working volume without any prior calibration or error mapping. This is a significant technical development that demonstrates the feasibility of the proposed methods and can have wide-scale industrial applications by enabling low-cost machine tools of modest structural rigidity, deployed flexibly as end-effectors of robotic automation, to achieve positional accuracies that were previously the preserve of large, high-precision machine tools.
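The compensation principle can be sketched in a few lines (hypothetical names; the paper's real-time controller is necessarily more involved): the tracker measures where the tool actually is, and the next commanded target is biased against the observed volumetric error.

```python
def corrected_target(nominal, measured, gain=1.0):
    """One correction step for external-metrology error compensation.

    Illustrative sketch only: the machine is re-commanded to the nominal
    target plus a term opposing the volumetric error observed by the
    laser tracker. A gain below 1 trades convergence speed for stability.
    """
    return tuple(n + gain * (n - m) for n, m in zip(nominal, measured))
```

Iterating this step as fresh tracker measurements arrive drives the residual error toward the metrology system's own uncertainty rather than the machine's open-loop accuracy.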

Relevance: 80.00%

Abstract:

Measuring and allocating risk properly are crucial for the performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We show that, using a coherent measure of risk, it is impossible to allocate risk in a way that satisfies the natural requirements of (Solution) Core Compatibility, Equal Treatment Property and Strong Monotonicity. To obtain the result we characterize the Shapley value on the class of totally balanced games and also on the class of exact games.
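The Shapley value at the heart of the characterization can be sketched directly from its textbook definition over orderings (exponential time, for illustration only):

```python
import math
from itertools import permutations

def shapley(players, cost):
    """Shapley value of a cooperative game given by a coalition cost
    function (taking a frozenset of players).

    Textbook average-of-marginal-contributions formula; exponential in
    the number of players, so for illustration only.
    """
    value = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            value[p] += cost(coalition | {p}) - cost(coalition)
            coalition = coalition | {p}
    n_fact = math.factorial(len(players))
    return {p: v / n_fact for p, v in value.items()}
```

For an additive game the Shapley value simply returns each player's stand-alone cost; the impossibility result says that under a coherent risk measure no allocation rule, Shapley-like or otherwise, can combine Core Compatibility, Equal Treatment Property and Strong Monotonicity.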

Relevance: 80.00%

Abstract:

Measuring and allocating risk properly are crucial for the performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We argue that the axioms of coherent measures of risk are valid for illiquid portfolios as well. Then, we present the results of two papers on allocating risk measured by a coherent measure of risk. Assume a bank has some divisions. According to the first paper there is always a stable allocation of risk capital which is not blocked by any coalition of the divisions, that is, there is a core compatible allocation rule (we present some examples of risk allocation rules). The second paper considers two more natural requirements, Equal Treatment Property and Strong Monotonicity. Equal Treatment Property makes sure that similar divisions are treated symmetrically: if two divisions make the same marginal risk contribution to every coalition of divisions not containing them, then the rule should allocate them the very same risk capital. Strong Monotonicity requires that if the risk environment changes in such a way that the marginal contribution of a division does not decrease, then its allocated risk capital should not decrease either. However, if risk is evaluated by any coherent measure of risk, then there is no risk allocation rule satisfying Core Compatibility, Equal Treatment Property and Strong Monotonicity: we encounter an impossibility result.

Relevance: 80.00%

Abstract:

In this paper, a research programme is outlined. First, the concept of responsive information systems is defined, and the notions of capacity planning and software performance engineering are clarified. Second, the purpose of the proposed capacity planning methodology, its interface to information systems analysis and development methodologies (SSADM), and the advantages of a knowledge-based approach are discussed. The interfaces to CASE tools, more precisely to data dictionaries or repositories (IRDS), are examined in the context of a particular systems analysis and design methodology (e.g. SSADM).
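As a flavour of the analytic models typically used in software performance engineering and capacity planning (illustrative, not taken from the paper), the M/M/1 queue gives the mean response time of a single-server system in one line:

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue, a basic analytic model in
    software performance engineering / capacity planning.

    Illustrative only; requires a stable system (arrival < service rate).
    """
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: utilization >= 1")
    return 1.0 / (service_rate - arrival_rate)
```

For example, a server handling 100 requests/s with 50 requests/s arriving responds in 20 ms on average; as the arrival rate approaches capacity, response time grows without bound, which is exactly what capacity planning must anticipate.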

Relevance: 80.00%

Abstract:

Measuring and allocating risk properly are crucial for the performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We show that, using coherent measures of risk, it is impossible to allocate risk in a way that simultaneously satisfies the natural requirements of Core Compatibility, Equal Treatment Property and Strong Monotonicity. To obtain the result we characterize the Shapley value on the class of totally balanced games and also on the class of exact games.

Relevance: 80.00%

Abstract:

This study explored the potential for implementing a merit-based public personnel system in The Bahamas, a former British colony in the Commonwealth Caribbean. Specifically, the study evaluated the use of merit-based public personnel management practices in the areas of recruitment, selection, promotion, training and employee development, and performance evaluation. Driving forces and barriers that affect merit system successes and failures, as well as strategies for institutionalizing merit system practices, are identified. Finally, the study attempted to apply the developmental model created by Klingner (1996) to describe the stage of public personnel management in The Bahamas. The data for the study were collected through in-depth interviews with expert observers.

Relevance: 80.00%

Abstract:

An assessment tool designed to measure customer service orientation among RNs and LPNs was developed using a content-oriented approach. Critical incidents were first developed by asking two samples of healthcare managers (n = 52 and 25) to identify various customer-contact situations. The critical incidents were then used to formulate a 121-item instrument. Patient-contact workers from 3 hospitals (n = 102) completed the instrument along with the NEO-FFI, a measure of the Big Five personality factors. Concurrently, managers completed a performance evaluation scale on the employees participating in the study in order to determine the predictive validity of the instrument. Through a criterion-keying approach, the instrument was scaled down to 38 items. The correlation between HealthServe and the supervisory performance evaluation ratings supported the instrument's criterion-related validity (r = .66, p < .0001). Incremental validity of HealthServe over the Big Five was found, with HealthServe accounting for 46% of the variance. The NEO-FFI was used to assess the correlation between personality traits and HealthServe. A factor analysis of HealthServe suggested 4 factors, which were correlated with the NEO-FFI scores. Results indicated that HealthServe was related to Extraversion, Openness to Experience, Agreeableness, and Conscientiousness, and negatively related to Neuroticism. The benefits of the test construction procedure used here over broad-based measures of personality were discussed, as well as the limitations of using a concurrent validation strategy. Recommendations for future studies were provided.
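The criterion-related validity coefficient reported above is an ordinary Pearson correlation between instrument scores and supervisor ratings; a minimal sketch of that computation:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, as used for criterion-related
    validity (e.g. r between test scores and supervisor ratings).
    Illustrative stdlib-only sketch."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```

Squaring r gives the proportion of variance explained, which is how the "46% of the variance" figure relates to a correlation of about .66 plus the incremental contribution over the Big Five.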

Relevance: 80.00%

Abstract:

In Managing Quality in the Hospitality Industry, W. Gerald Glover, Associate Professor, Hospitality Management Program, Appalachian State University, begins: “Quality is a primary concern in the hospitality industry. The author sees problems in the nature of the way businesses are managed and discusses approaches to ensuring quality in corporate cultures.” As the title suggests, the author wants to point out certain discrepancies in hospitality quality control, as well as explain how to address some of these concerns. “A discussion of quality presents some interesting dilemmas. Quality is something that almost everyone wants,” Glover notes. “Service businesses will never admit that they don't provide it to their customers, and few people actually understand what it takes to make it happen,” he maintains. Glover wants you to know that in a dynamic industry such as hospitality, quality is the common denominator: whether hotel, restaurant, or airline, quality is the raison d'être of the industry. “Quality involves the consistent delivery of a product or service according to the expected standards,” Glover writes. Many, if not all, quality deficiencies can be traced back to management, Glover declares, and he lists some of the operational and guest-service problems managers face daily. One important point is the measuring and managing of quality. “Standards management is another critical area in people and product management that is seldom effective in corporations,” says Glover. “Typically, this area involves performance documentation, performance evaluation and appraisal, coaching, discipline, and team-building.” “To be effective at managing standards, an organization must establish communication in realms where it is currently non-existent or ineffective,” Glover continues. “Coaching, training, and performance appraisal are methods to manage individuals who are expected to do what's expected.” He also alludes to the benefit quality circles supply. In addressing American organizational behavior, Glover postures, “…a realization must develop that people and product management are the primary influences on generating revenues and eventually influencing the bottom line in all American organizations.” Glover introduces the concept of proactivity: “Most recently, quality assurance and quality management have become the means used to develop and maintain proactive corporate cultures. When prevention is the focus, quality is most consistent and expectations are usually met,” he offers. Much of the article is dedicated to Appendix A, Table 1, “Characteristics of Corporate Cultures (Reactive and Proactive)”, in which Glover contrasts the impact of proactive management with the reactive management intrinsic to many elements of corporate culture.

Relevance: 80.00%

Abstract:

Today, smart-phones have revolutionized the wireless communication industry, ushering in an era of mobile data. To cater for the ever-increasing data traffic demand, it is of utmost importance to have more spectrum resources, and sharing under-utilized spectrum bands is an effective solution. In particular, the 4G broadband Long Term Evolution (LTE) technology and its foreseen 5G successor will benefit immensely if their operation can be extended to the under-utilized unlicensed spectrum. In this thesis, we first analyze WiFi 802.11n and LTE coexistence performance in the unlicensed spectrum, considering multi-layer cell layouts through system-level simulations. We consider a time division duplexing (TDD)-LTE system with an FTP traffic model for performance evaluation. Simulation results show that WiFi performance is more vulnerable to LTE interference, while LTE performance is degraded only slightly. Based on these initial findings, we propose a Q-learning-based dynamic duty cycle selection technique for configuring LTE transmission gaps, so that a satisfactory throughput is maintained for both the LTE and WiFi systems. Simulation results show that the proposed approach can enhance overall capacity by 19% and WiFi capacity by 77%, enabling effective coexistence of LTE and WiFi systems in the unlicensed band.
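The duty-cycle selection idea can be sketched as a stateless epsilon-greedy Q-learner over a discrete set of LTE duty cycles (the names, reward shape, and parameters here are illustrative, not the thesis's exact formulation):

```python
import random

def q_learning_duty_cycle(reward_fn, duty_cycles, episodes=2000,
                          alpha=0.1, epsilon=0.1, seed=1):
    """Stateless epsilon-greedy Q-learning over LTE duty cycles.

    A duty cycle is the fraction of time LTE transmits, the rest being
    left as transmission gaps for WiFi. reward_fn maps a duty cycle to
    an observed reward, e.g. combined LTE+WiFi throughput. Sketch only.
    """
    rng = random.Random(seed)
    q = {d: 0.0 for d in duty_cycles}
    for _ in range(episodes):
        if rng.random() < epsilon:
            d = rng.choice(duty_cycles)        # explore a random duty cycle
        else:
            d = max(q, key=q.get)              # exploit the best known one
        q[d] += alpha * (reward_fn(d) - q[d])  # running-average update
    return max(q, key=q.get), q
```

With a reward that peaks at an intermediate duty cycle, the learner converges to the cycle that balances LTE throughput against the gaps WiFi needs.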

Relevance: 80.00%

Abstract:

Through numerous technological advances in recent years, along with the popularization of computing devices, society is moving towards an "always connected" paradigm. Computer networks are everywhere, and the advent of IPv6 paves the way for the explosion of the Internet of Things, a concept that enables data sharing between computing machines and everyday objects. Vehicular networks are one of the areas encompassed by the Internet of Things. However, the information generated by an individual vehicle is small in volume and, taken in isolation, does not contribute to improving traffic. This proposal presents the Infostructure, a system intended to reduce the effort and cost of developing context-aware applications with high-level semantics for Internet of Things scenarios, by allowing data to be managed, stored and combined in order to generate broader context. To this end, we present a reference architecture showing the major components of the Infostructure. A prototype is then presented and used to validate that the work reaches the desired level of high-level semantic contextualization, together with a performance evaluation of the subsystem responsible for managing contextual information over a large amount of data. A statistical analysis of the evaluation results is then performed. Finally, the conclusions of the work are presented, along with open problems, such as the lack of guarantees on the integrity of the sensory data arriving at the Infostructure, and future work, which includes implementing the remaining modules so that tests can be conducted in real environments.
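The kind of aggregation described, combining isolated per-vehicle readings into broader context, can be sketched in a few lines (hypothetical data model and names, for illustration only):

```python
from collections import defaultdict

def broader_context(readings):
    """Combine isolated per-vehicle readings into broader context:
    the average speed per road segment.

    Hypothetical sketch of the kind of aggregation an Infostructure-like
    system performs; segment names and the schema are illustrative.
    """
    by_segment = defaultdict(list)
    for segment, speed in readings:
        by_segment[segment].append(speed)
    return {seg: sum(v) / len(v) for seg, v in by_segment.items()}
```

A single reading says little, but the aggregate per segment is already actionable context, e.g. for detecting congestion.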

Relevance: 80.00%

Abstract:

This study concerns the valorization of elementary school teachers in Natal/RN (PCCR - Law No. 058/2004), specifically horizontal promotion through performance evaluation. It takes as reference education policy under the hegemony of the legislative field and the managerial model. The analysis of teacher valorization is based on the policy of funds (Fundef and Fundeb) in the Brazilian social and educational agenda, focusing on the Career, Position and Remuneration Plan (PCCR) of teachers in the period 2004-2010. The thesis argues for the need to adopt a direct relationship between career development and horizontal promotion through a further fifteen classes, regardless of any conditioning variables. In addition, performance evaluated over a 25-year interval barely reaches the minimum provided in the law, which determines a salary adjustment of 5% every two years, as set out in the PCCR with respect to teachers' remuneration and qualifications. A bibliographic and documentary review was carried out on education funding, the valorization of educational work, and the concepts of career, promotion and performance evaluation, based on expert authors in the field. The survey articulated quantitative and qualitative information, analyzing data on teachers' salaries (payrolls and paychecks) and applying a questionnaire. It was found that, after the implementation of the PCCR, the wage indices for horizontal promotion during the teaching career are tied to a performance evaluation strategy that undervalues teachers' salaries, yielding a minimum percentage of only 25% (over up to 25 years), and that further elements disturb the promotion strategy.
The national minimum wage for teachers was set at three salaries by the PSPN (Lei nº 11.738/2008), but the three salaries are never reached in the Natal/RN educational system. Moreover, the elements that structure horizontal promotion into fifteen classes throughout the career flout the minimum length of teaching service, long established as 25 years. In addition, salary increases depend on individual efforts at professional development through qualification. Concerning the career, despite the category having approved its PCCR, neither this instrument nor the funds policy managed to establish effective rules for valuing teachers in the municipal educational system. It is necessary to ensure, in percentage and financial terms, the real remuneration of teachers upon attainment of horizontal promotion, reviewing the elements that structure the career and the determinants of performance evaluation.
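The compounding implied by the 5%-every-two-years rule is simple arithmetic; a sketch (illustrative only, ignoring the plan's further conditions on each step):

```python
def horizontal_promotion_factor(years, step_pct=5.0, step_years=2):
    """Cumulative salary factor from a step_pct raise every step_years,
    as in the PCCR rule of a 5% adjustment every two years.

    Illustrative arithmetic only; the actual plan attaches further
    conditions (performance evaluation, qualification) to each step.
    """
    steps = years // step_years
    return (1.0 + step_pct / 100.0) ** steps
```

Over 25 years this allows 12 steps, a cumulative factor of about 1.80, which puts the thesis's discussion of minimum percentages in concrete terms.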

Relevance: 80.00%

Abstract:

In Brazil, the National Agency of Electric Energy (ANEEL) is the energy regulator. Tariff reviews have been one of its main tasks; they establish prices at a level that covers efficient operating costs and provides an appropriate return on the distributors' investments. The changes in the procedures for defining efficient costs, and the several studies on the methodologies employed to regulate this segment, show the challenge regulators face in choosing the best methodological strategy. In this context, this research proposes a benchmarking evaluation applied to the national regulatory system for establishing the efficient operating costs of electricity distribution utilities. The model is formulated to promote the development of the electricity market, in line with government policies and to the benefit of society. To conduct this research, Data Envelopment Analysis (DEA) is integrated with Stochastic Frontier Analysis (SFA) in a three-stage procedure that corrects efficiency for environmental effects: (i) a DEA evaluation to measure the operating-cost slacks of the utilities, with environmental variables omitted; (ii) regression of the slacks calculated in the first stage on a set of environmental variables by means of SFA, adjusting operating costs to account for environmental impact and statistical noise; and (iii) a DEA reassessment of the performance of the electric power distribution utilities. With this methodology it is possible to obtain a performance evaluation expressed exclusively in terms of management efficiency, in which operating-environment and statistical-noise effects are controlled.
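Stage (i) can be illustrated in the degenerate single-input/single-output case, where CCR-DEA efficiency reduces to each utility's output/cost ratio scaled by the best observed ratio (the real evaluation solves a linear program per utility over multiple inputs and outputs; names here are illustrative):

```python
def dea_efficiency(units):
    """CCR-DEA efficiency scores for the single-input/single-output case.

    units maps a utility name to (operating_cost, output). With one input
    and one output, DEA reduces to each unit's output/cost ratio divided
    by the best ratio in the sample. Sketch of stage (i) only; the full
    procedure uses a linear program per utility.
    """
    ratios = {name: out / cost for name, (cost, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}
```

A utility scoring 1.0 lies on the efficient frontier; a score of 0.5 means it delivers the same output at twice the frontier cost, before any stage-(ii) adjustment for environmental effects.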