781 results for Portfolio Performance Evaluation
Abstract:
Measuring and allocating risk properly are crucial for performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We show that by using a coherent measure of risk it is impossible to allocate risk satisfying the natural requirements of (Solution) Core Compatibility, Equal Treatment Property and Strong Monotonicity. To obtain the result we characterize the Shapley value on the class of totally balanced games and also on the class of exact games.
Abstract:
Properly measuring and allocating risk is essential to the internal capital allocation or performance evaluation of banks, insurers, investment funds and other financial firms. In this paper we show that the axioms of coherent risk measures can also be required in the case of illiquid portfolios. Measuring risk this way, we review two cooperative game-theoretic papers on risk allocation. The first is optimistic: according to it, there always exists a stable, general method for allocating risk (capital) that is acceptable to every coalition of the subunits. The second paper is pessimistic, because it states that if, besides stability, we also want to be fair, then we run into an impossibility theorem. / === / Measuring and allocating risk properly are crucial for performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We argue that the axioms of coherent measures of risk are valid for illiquid portfolios as well. Then, we present the results of two papers on allocating risk measured by a coherent measure of risk. Assume a bank has some divisions. According to the first paper there is always a stable allocation of risk capital, which is not blocked by any coalition of the divisions; that is, there is a core compatible allocation rule (we present some examples of risk allocation rules). The second paper considers two more natural requirements, Equal Treatment Property and Strong Monotonicity. Equal Treatment Property makes sure that similar divisions are treated symmetrically: if two divisions make the same marginal risk contribution to every coalition of divisions not containing them, then the rule should allocate them the very same risk capital. Strong Monotonicity requires that if the risk environment changes in such a way that the marginal contribution of a division is not decreasing, then its allocated risk capital should not decrease either. However, if risk is evaluated by any coherent measure of risk, then there is no risk allocation rule satisfying Core Compatibility, Equal Treatment Property and Strong Monotonicity; we encounter an impossibility result.
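To make these game-theoretic notions concrete, the following is a minimal sketch (not taken from the papers) that computes the Shapley value, the classic allocation rule satisfying Equal Treatment Property and Strong Monotonicity, for a toy three-division risk allocation game. The coalition risks below are invented for illustration; in practice v(S) would be a coherent risk measure such as expected shortfall applied to the pooled portfolio of the divisions in S.

```python
from itertools import permutations

# Invented risk capital of each coalition of divisions (frozenset -> risk).
# Subadditivity of a coherent risk measure means pooling never increases risk.
risk = {
    frozenset(): 0.0,
    frozenset({"A"}): 10.0,
    frozenset({"B"}): 8.0,
    frozenset({"C"}): 6.0,
    frozenset({"A", "B"}): 15.0,   # diversification: less than 10 + 8
    frozenset({"A", "C"}): 14.0,
    frozenset({"B", "C"}): 12.0,
    frozenset({"A", "B", "C"}): 20.0,
}

def shapley_value(players, v):
    """Average marginal risk contribution of each player over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]
            coalition = coalition | {p}
    return {p: round(phi[p] / len(orders), 3) for p in players}

print(shapley_value(["A", "B", "C"], risk))   # allocations sum to v(ABC) = 20
```

Read this way, the impossibility result says that any rule satisfying Equal Treatment Property and Strong Monotonicity must coincide with the Shapley value on these game classes, yet for some coherent risk measures the Shapley allocation is blocked by a coalition, so Core Compatibility fails.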
Abstract:
Measuring and allocating risk properly are crucial for performance evaluation and internal capital allocation of portfolios held by banks, insurance companies, investment funds and other entities subject to financial risk. We show that by using coherent measures of risk it is impossible to allocate risk satisfying simultaneously the natural requirements of Core Compatibility, Equal Treatment Property and Strong Monotonicity. To obtain the result we characterize the Shapley value on the class of totally balanced games and also on the class of exact games.
Abstract:
This study has explored the potential for implementing a merit-based public personnel system in The Bahamas, a former British colony in the Commonwealth Caribbean. Specifically, the study evaluated the use of merit-based public personnel management practices in the areas of recruitment, selection, promotion, training and employee development, and performance evaluation. Driving forces and barriers which impact merit system successes and failures, as well as strategies for institutionalizing merit system practices, are identified. Finally, the study attempted to apply the developmental model created by Klingner (1996) to describe the stage of public personnel management in The Bahamas. The data for the study were collected through in-depth interviews with expert observers.
Abstract:
An assessment tool designed to measure a customer service orientation among RNs and LPNs was developed using a content-oriented approach. Critical incidents were first developed by asking two samples of healthcare managers (n = 52 and 25) to identify various customer-contact situations. The critical incidents were then used to formulate a 121-item instrument. Patient-contact workers from three hospitals (n = 102) completed the instrument along with the NEO-FFI, a measure of the Big Five personality factors. Concurrently, managers completed a performance evaluation scale on the employees participating in the study in order to determine the predictive validity of the instrument. Through a criterion-keying approach, the instrument was scaled down to 38 items. The correlation between HealthServe and the supervisory ratings of performance evaluation data supported the instrument's criterion-related validity (r = .66, p < .0001). Incremental validity of HealthServe over the Big Five was found, with HealthServe accounting for 46% of the variance. The NEO-FFI was used to assess the correlation between personality traits and HealthServe. A factor analysis of HealthServe suggested four factors, which were correlated with the NEO-FFI scores. Results indicated that HealthServe was related to Extraversion, Openness to Experience, Agreeableness and Conscientiousness, and negatively related to Neuroticism. The benefits of the test construction procedure used here over the use of broad-based measures of personality were discussed, as well as the limitations of using a concurrent validation strategy. Recommendations for future studies were provided.
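As a loose illustration of the validity statistics reported above (not the study's actual data or code), the sketch below simulates scores and computes a criterion-related validity coefficient and the incremental R² of a scale over the Big Five via hierarchical regression; the variable names and all numbers are invented.

```python
import numpy as np
from numpy.linalg import lstsq

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 102                                  # sample size in the study
big_five = rng.normal(size=(n, 5))       # simulated NEO-FFI factor scores
healthserve = rng.normal(size=n)         # simulated HealthServe total score
rating = 0.6 * healthserve + 0.2 * big_five[:, 0] + rng.normal(scale=0.5, size=n)

r = np.corrcoef(healthserve, rating)[0, 1]                # criterion-related validity
r2_base = r_squared(big_five, rating)                     # Big Five alone
r2_full = r_squared(np.column_stack([big_five, healthserve]), rating)
print(f"r = {r:.2f}, incremental R^2 = {r2_full - r2_base:.2f}")
```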
Abstract:
The nation's freeway systems are becoming increasingly congested. A major contribution to traffic congestion on freeways is due to traffic incidents. Traffic incidents are non-recurring events such as accidents or stranded vehicles that cause a temporary roadway capacity reduction, and they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic to avoid incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. In addition, the duration of an incident is affected by many contributing factors. Determining and understanding these factors can help the process of identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success. This dissertation research attempts to improve on this previous effort by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid in the decision making of traffic diversion in the event of an ongoing incident. Multiple data mining analysis techniques were applied and evaluated in the research. The multiple linear regression analysis and decision tree based method were applied to develop the offline models, and the rule-based method and a tree algorithm called M5P were used to develop the online models. The results show that the models in general can achieve high prediction accuracy within acceptable time intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of District 4 FDOT for actual applications.
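As a rough sketch only (the dissertation's actual models, variables and software are not reproduced here), the snippet below shows the general shape of an "offline" incident duration model: a regression tree fit to simulated incident records with hypothetical features such as lanes blocked and injury involvement.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 500
# Hypothetical incident features: lanes blocked, vehicles involved,
# injury flag, and hour of day.  Real studies use many more factors.
X = np.column_stack([
    rng.integers(1, 4, n),        # lanes_blocked
    rng.integers(1, 5, n),        # vehicles_involved
    rng.integers(0, 2, n),        # injury (0/1)
    rng.integers(0, 24, n),       # hour of day
])
duration = 15 + 20 * X[:, 0] + 10 * X[:, 2] + rng.normal(scale=10, size=n)  # minutes

X_train, X_test, y_train, y_test = train_test_split(X, duration, random_state=0)
model = DecisionTreeRegressor(max_depth=4).fit(X_train, y_train)
print("mean absolute error (min):",
      np.abs(model.predict(X_test) - y_test).mean().round(1))
```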
Abstract:
The most important factor that affects the decision making process in finance is risk, which is usually measured by variance (total risk) or systematic risk (beta). Since an investor's sentiment (whether she is an optimist or a pessimist) plays a very important role in the choice of the beta measure, any decision made for the same asset within the same time horizon will be different for different individuals. In other words, due to behavioral traits, there will be neither homogeneity of beliefs nor the rational expectations prevalent in the market. This dissertation consists of three essays. In the first essay, “Investor Sentiment and Intrinsic Stock Prices”, a new technical trading strategy was developed using a firm-specific individual sentiment measure. This behavioral-based trading strategy forecasts a range within which a stock price moves in a particular period and can be used for stock trading. Results indicate that sample firms trade within a range and give signals as to when to buy or sell. The second essay, “Managerial Sentiment and the Value of the Firm”, examined the effect of managerial sentiment on the project selection process using the net present value criterion, and also the effect of managerial sentiment on the value of the firm. The final analysis reported that high-sentiment and low-sentiment managers obtain different values for the same firm before and after the acceptance of a project. Changes in the cost of capital and the weighted average cost of capital were found to be due to managerial sentiment. The last essay, “Investor Sentiment and Optimal Portfolio Selection”, analyzed how investor sentiment affects the nature and composition of the optimal portfolio as well as the portfolio performance. Results suggested that the choice of investor sentiment completely changes the portfolio composition, i.e., the high-sentiment investor will have a completely different choice of assets in the portfolio in comparison with the low-sentiment investor. The results indicated the practical application of a behavioral-model-based technical indicator for stock trading. Additional insights developed include the valuation of firms with a behavioral component and the importance of distinguishing portfolio performance based on sentiment factors.
Abstract:
The most important factor that affects the decision making process in finance is risk, which is usually measured by variance (total risk) or systematic risk (beta). Since an investor's sentiment (whether she is an optimist or a pessimist) plays a very important role in the choice of the beta measure, any decision made for the same asset within the same time horizon will be different for different individuals. In other words, due to behavioral traits, there will be neither homogeneity of beliefs nor the rational expectations prevalent in the market. This dissertation consists of three essays. In the first essay, Investor Sentiment and Intrinsic Stock Prices, a new technical trading strategy is developed using a firm-specific individual sentiment measure. This behavioral-based trading strategy forecasts a range within which a stock price moves in a particular period and can be used for stock trading. Results show that sample firms trade within a range and give signals as to when to buy or sell. The second essay, Managerial Sentiment and the Value of the Firm, examines the effect of managerial sentiment on the project selection process using the net present value criterion, and also the effect of managerial sentiment on the value of the firm. Findings show that high-sentiment and low-sentiment managers obtain different values for the same firm before and after the acceptance of a project. The last essay, Investor Sentiment and Optimal Portfolio Selection, analyzes how investor sentiment affects the nature and composition of the optimal portfolio as well as the performance measures. Results suggest that the choice of investor sentiment completely changes the portfolio composition, i.e., the high-sentiment investor will have a completely different choice of assets in the portfolio in comparison with the low-sentiment investor. The results indicate the practical application of behavioral-model-based technical indicators for stock trading. Additional insights developed include the valuation of firms with a behavioral component and the importance of distinguishing portfolio performance based on sentiment factors.
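The essays' optimization model is not specified in the abstract, so purely as an illustration of how a sentiment tilt can change portfolio composition, the sketch below solves a plain mean-variance problem twice with differently tilted expected returns; the assets, numbers and the tilt vectors are all assumptions, not the dissertation's method.

```python
import numpy as np

# Invented expected returns and covariance matrix for three assets.
mu = np.array([0.08, 0.05, 0.03])
cov = np.array([[0.10, 0.02, 0.01],
                [0.02, 0.06, 0.01],
                [0.01, 0.01, 0.04]])
risk_aversion = 3.0

def mv_weights(expected_returns):
    """Mean-variance weights w = (1/gamma) * inv(Sigma) @ mu, rescaled to sum to 1."""
    w = np.linalg.solve(cov, expected_returns) / risk_aversion
    return w / w.sum()

# A hypothetical sentiment tilt: the optimist inflates the riskier assets' returns.
tilts = {"high-sentiment investor": np.array([1.4, 1.1, 1.0]),
         "low-sentiment investor":  np.array([0.7, 0.9, 1.0])}
for label, tilt in tilts.items():
    print(label, mv_weights(mu * tilt).round(3))
```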
Abstract:
In “Managing Quality in the Hospitality Industry”, an observation by W. Gerald Glover, Associate Professor, Hospitality Management Program, Appalachian State University, Glover initially establishes: “Quality is a primary concern in the hospitality industry. The author sees problems in the nature of the way businesses are managed and discusses approaches to ensuring quality in corporate cultures.” As the title suggests, the author wants to point out certain discrepancies in hospitality quality control, as well as enlighten you as to how to address some of these concerns. “A discussion of quality presents some interesting dilemmas. Quality is something that almost everyone wants,” Glover notes. “Service businesses will never admit that they don't provide it to their customers, and few people actually understand what it takes to make it happen,” he further maintains. Glover wants you to know that in a dynamic industry such as hospitality, quality is the common denominator. Whether it be hotel, restaurant, airline, et al., quality is the raison d’être of the industry. “Quality involves the consistent delivery of a product or service according to the expected standards,” Glover provides. Many, if not all, quality deficiencies can be traced back to management, Glover declares. He bullet-points some of the operational and guest service problems managers face on a daily basis. One important point of note is the measuring and managing of quality. “Standards management is another critical area in people and product management that is seldom effective in corporations,” says Glover. “Typically, this area involves performance documentation, performance evaluation and appraisal, coaching, discipline, and team-building.” “To be effective at managing standards, an organization must establish communication in realms where it is currently non-existent or ineffective,” Glover goes on to say. “Coaching, training, and performance appraisal are methods to manage individuals who are expected to do what's expected.” He alludes to the benefit quality circles supply as well. In addressing American organizational behavior, Glover postures, “…a realization must develop that people and product management are the primary influences on generating revenues and eventually influencing the bottom line in all American organizations.” Glover introduces the concept of pro-activity. “Most recently, quality assurance and quality management have become the means used to develop and maintain proactive corporate cultures. When prevention is the focus, quality is most consistent and expectations are usually met,” he offers. Much of the article is dedicated to “Appendix A, Table 1: Characteristics of Corporate Cultures (Reactive and Proactive)”. In it, Glover measures the impact of proactive management as opposed to the reactive management intrinsic to many elements of corporate culture mentality.
Abstract:
Today, smartphones have revolutionized the wireless communication industry and ushered in an era of mobile data. To cater for the ever-increasing data traffic demand, more spectrum resources are needed, and sharing under-utilized spectrum bands is an effective solution. In particular, the 4G broadband Long Term Evolution (LTE) technology and its foreseen 5G successor will benefit immensely if their operation can be extended to the under-utilized unlicensed spectrum. In this thesis, we first analyze WiFi 802.11n and LTE coexistence performance in the unlicensed spectrum, considering multi-layer cell layouts through system-level simulations. We consider a time division duplexing (TDD)-LTE system with an FTP traffic model for performance evaluation. Simulation results show that WiFi performance is more vulnerable to LTE interference, while LTE performance is degraded only slightly. Based on these initial findings, we propose a Q-Learning based dynamic duty cycle selection technique for configuring LTE transmission gaps, so that a satisfactory throughput is maintained for both LTE and WiFi systems. Simulation results show that the proposed approach can enhance the overall capacity performance by 19% and the WiFi capacity performance by 77%, hence enabling effective coexistence of LTE and WiFi systems in the unlicensed band.
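The thesis's exact state, action and reward definitions are not given in the abstract, so the following is only a minimal, stateless (bandit-style) Q-learning sketch of the idea: the learner tries candidate LTE duty cycles (the fraction of time LTE transmits, with the remainder left as gaps for WiFi) and reinforces those that keep a toy joint-throughput reward high. The duty-cycle set, reward function and parameters are all assumptions.

```python
import random

duty_cycles = [0.2, 0.4, 0.6, 0.8]            # candidate LTE ON fractions (actions)
q = {a: 0.0 for a in duty_cycles}             # single-state Q-table
alpha, epsilon = 0.1, 0.1                     # learning rate, exploration rate

def observed_reward(duty):
    """Placeholder for measured joint throughput: a toy concave trade-off that
    rewards duty cycles keeping both LTE and WiFi throughput acceptable."""
    lte = duty
    wifi = max(0.0, 1.0 - 1.3 * duty)
    return min(lte, wifi) + 0.1 * random.random()

for step in range(2000):
    # epsilon-greedy selection of the transmission gap configuration
    if random.random() < epsilon:
        a = random.choice(duty_cycles)
    else:
        a = max(q, key=q.get)
    r = observed_reward(a)
    q[a] += alpha * (r - q[a])                # stateless Q-learning update

print("learned preference:", max(q, key=q.get), q)
```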
Abstract:
We investigated controls on the water chemistry of a South Ecuadorian cloud forest catchment which is partly pristine and partly converted to extensive pasture. From April 2007 to May 2008, water samples were taken weekly to biweekly at nine different subcatchments and were screened for differences in electrical conductivity, pH, anion composition, and element composition. A principal component analysis was conducted to reduce the dimensionality of the data set and define major factors explaining variation in the data. Three main factors were isolated by a subset of 10 elements (Ca²⁺, Ce, Gd, K⁺, Mg²⁺, Na⁺, Nd, Rb, Sr, Y), explaining around 90% of the data variation. Land use was the major factor controlling and changing the water chemistry of the subcatchments. A second factor was associated with the concentration of rare earth elements in water, presumably highlighting other anthropogenic influences such as gravel excavation or road construction. Around 12% of the variation was explained by the third component, which was defined by the occurrence of Rb and K and represents the influence of vegetation dynamics on element accumulation and wash-out. Comparison of base- and fast-flow concentrations led to the assumption that a significant portion of soil water from around 30 cm depth contributes to storm flow, as revealed by increased rare earth element concentrations in fast-flow samples. Our findings demonstrate the utility of multi-tracer principal component analysis to study tropical headwater streams, and emphasize the need for effective land management in cloud forest catchments.
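For readers unfamiliar with the technique, here is a minimal sketch of the PCA step described above, run on simulated (not the study's) concentration data for the ten elements; standardization, three retained components and loading inspection mirror the general workflow, while all values are made up.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

elements = ["Ca", "Ce", "Gd", "K", "Mg", "Na", "Nd", "Rb", "Sr", "Y"]
rng = np.random.default_rng(42)
# Simulated weekly concentrations for 9 subcatchments over ~60 sampling dates.
X = rng.lognormal(mean=0.0, sigma=0.5, size=(9 * 60, len(elements)))

# Standardize the concentrations, then extract three components.
pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
for i, load in enumerate(pca.components_, start=1):
    top = sorted(zip(elements, load), key=lambda t: abs(t[1]), reverse=True)[:3]
    print(f"PC{i} dominated by:", top)
```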
Abstract:
Thanks to numerous technological advances in recent years, along with the popularization of computing devices, we are moving towards an "always connected" paradigm. Computer networks are everywhere, and the advent of IPv6 paves the way for the explosion of the Internet of Things. This concept enables the sharing of data between computing machines and everyday objects. One of the areas covered by the Internet of Things is vehicular networks. However, the information generated by an individual vehicle is small in volume and, taken in isolation, does not contribute to improving traffic. This proposal presents the Infostructure, a system intended to facilitate and reduce the cost of developing high-level, semantically context-aware applications for the Internet of Things scenario by allowing data to be managed, stored and combined in order to generate broader context. To this end, we present a reference architecture that shows the major components of the Infostructure. A prototype is then presented and used to validate that our work reaches the desired level of high-level semantic contextualization, along with a performance evaluation that assesses the behavior of the subsystem responsible for managing contextual information over a large amount of data. A statistical analysis is then performed on the results obtained in the evaluation. Finally, we present the conclusions of the work, some open problems, such as the lack of assurance about the integrity of the sensory data reaching the Infostructure, and future work that takes into account the implementation of other modules so that tests can be conducted in real environments.
Abstract:
This study is about the valorization of elementary school teaching in Natal/RN (PCCR, Law No. 058/2004), concerning horizontal promotion through performance evaluation. It takes as its reference the education policy shaped by the hegemony of the legislative field and the managerial model. The analysis of teachers' valorization is based on the policy of funds (Fundef and Fundeb) in the Brazilian social and educational agenda. The study focuses on the Position, Career and Remuneration Plan (PCCR) of teachers in the period 2004-2010. The thesis argues for the necessity of adopting a direct relationship between career development and horizontal promotion across the fifteen further classes, regardless of any conditioning variables. In addition, performance ends up being evaluated over an interval of 25 years merely to reach the minimum provided by the law, which determines a salary adjustment of 5% every two years, as set out in the PCCR concerning teachers' remuneration and qualifications. A bibliographic and documentary review was carried out, based on expert authors in the field, covering education funding aimed at valuing educational work, career concepts, and promotion and performance evaluation. The survey was organized to articulate quantitative and qualitative information, analyzing data on teachers' salaries (payrolls and paychecks) and also applying a questionnaire. It was found that, after the implementation of the PCCR, the wage indices for horizontal promotion during the teaching career are tied to a performance evaluation strategy that devalues teachers' salaries, with a minimum percentage of 25% over up to 25 years, and that there are also elements that disturb the promotion strategy. The PSPN (Lei nº 11.738/2008) set the national salary floor at three salaries, but the three salaries are never reached in the Natal/RN educational system. Moreover, the elements that structure horizontal promotion in fifteen classes throughout the career flout the minimum years of teaching work, long established at 25 years. In addition, changes in salary increases depend on individual efforts at professional development through further qualifications. Concerning the career, despite the category having approved its PCCR, neither this instrument nor the funds policy managed to establish effective rules for valuing teachers in the municipal educational system. It is necessary to ensure, in percentage and financial terms, the real remuneration of teachers upon attainment of horizontal promotion, reviewing the elements that structure the career and the determinants of performance evaluation.
Abstract:
In Brazil, the National Agency of Electric Energy (ANEEL) is the energy regulator. Tariff reviews have been one of its main tasks; they establish prices at a level that covers efficient operating costs and also provides an appropriate return on the distributors' investments. The changes in the procedures used to redefine efficient costs, and the several studies on the methodologies employed to regulate this segment, reflect the challenge regulators face in choosing the best methodological strategy. In this context, this research proposes a benchmarking evaluation applied to the national regulatory system for establishing the efficient operating costs of electricity distribution utilities. The model is formulated to promote the development of the electricity market, in partnership with government policies and to the benefit of society. To conduct this research, an integration of Data Envelopment Analysis (DEA) with Stochastic Frontier Analysis (SFA) is adopted in a three-stage procedure that corrects efficiency for environmental effects: (i) evaluation by means of DEA to measure the operating cost slacks of the utilities, with environmental variables omitted; (ii) regression of the slacks calculated in the first stage on a set of environmental variables by means of SFA, with operating costs adjusted to account for environmental impact and statistical noise effects; and (iii) reassessment of the performance of the electric power distribution utilities by means of DEA. Based on this methodology it is possible to obtain a performance evaluation expressed exclusively in terms of management efficiency, in which the operating environment and statistical noise effects are controlled for.
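To illustrate the DEA building block used in stages (i) and (iii), and not the full three-stage DEA-SFA procedure, the sketch below solves the input-oriented CCR envelopment problem for a handful of hypothetical distributors with invented data, taking operating cost as the single input and two illustrative outputs.

```python
import numpy as np
from scipy.optimize import linprog

# Invented data: operating cost (input) and two outputs for 5 distributors.
cost    = np.array([100., 120., 90., 150., 110.])            # input x
outputs = np.array([[500, 30], [520, 35], [480, 28],
                    [600, 40], [510, 33]], dtype=float)       # energy, customers

def dea_input_efficiency(o):
    """Input-oriented CCR efficiency of unit o (envelopment form LP)."""
    n = len(cost)
    c = np.zeros(n + 1); c[0] = 1.0                   # minimize theta
    A_ub, b_ub = [], []
    A_ub.append(np.concatenate(([-cost[o]], cost)))   # sum(l*x) <= theta * x_o
    b_ub.append(0.0)
    for k in range(outputs.shape[1]):                 # sum(l*y_k) >= y_ok
        A_ub.append(np.concatenate(([0.0], -outputs[:, k])))
        b_ub.append(-outputs[o, k])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

for o in range(len(cost)):
    print(f"utility {o}: efficiency = {dea_input_efficiency(o):.3f}")
```

In the three-stage setup described above, the slacks implied by these scores would then be regressed on environmental variables via SFA, the inputs adjusted, and the DEA run repeated.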
Abstract:
The pericarp of Passiflora edulis var. flavicarpa Degener is now being investigated for medicinal purposes. There are no reports about its toxicity. The aim of the present study was to investigate the subchronic toxicity in male rats, and the reproductive toxicity in pregnant rats and exposed fetuses, of an extract obtained by infusion of the pericarp in water (1:3 m/v; 100 °C, 10 min). The extract composition was evaluated by tube reactions and thin layer chromatography (TLC). Adult male rats (n = 8) were treated with 300 mg/kg of the extract, by gavage, during 30 days, and pregnant rats (n = 7) from gestation day 0 to day 20. Controls received tap water (1 mL). Water and food intakes and body weight gain were recorded. At day 29 of treatment the sexual behavior of the males was analyzed, and then half of the males from each group received cyclophosphamide (50 mg/kg, i.p.) for (anti)genotoxicity assessment in bone marrow. At day 30, males were anesthetized for parameter collection. At day 20 of gestation, the dams were anesthetized for reproductive performance evaluation. The fetal analysis comprised visceral and skeletal examinations. Phytochemical analysis revealed the presence of flavonoids, unspecific alkaloids, phenols and triterpenic compounds. Statistical analysis revealed an absence of significant differences between experimental and control groups. This suggests that the aqueous extract obtained from the pericarp of P. edulis var. flavicarpa Degener was not able to promote toxic effects in rats. Cytotoxicity was evaluated with the PCE/NCE ratio (PCE = polychromatic erythrocytes; NCE = normochromatic erythrocytes). Statistical analysis (mean ± SEM) revealed an absence of changes in the frequency of MNPCE (negative control: 3.26 ± 0.42; positive control: 11.72 ± 1.02; negative experimental: 4.02 ± 0.13; positive experimental: 10.47 ± 0.87) or cytotoxicity (negative control: 0.37 ± 0.08; positive control: 0.23 ± 0.05; negative experimental: 0.37 ± 0.07; positive experimental: 0.23 ± 0.02). This study suggests that the extract showed no (anti)genotoxic and no cytotoxic activities under the experimental conditions.