939 results for Return-based pricing kernel


Relevance:

30.00%

Publisher:

Abstract:

The article focuses on the labour market situation and opportunities of Hungarian vocational students. After briefly placing the topic in an international context, the study introduces the findings of Hungarian empirical research. Differences between national education systems make international comparison difficult; therefore, I chose former socialist countries with characteristics similar to Hungary's. Comparison of the relevant data makes clear that obtaining a diploma provides greater advantages in Hungary. Hungarian research suggests that vocational schools mostly attract students with poor competence test scores at the end of primary school, and a significant proportion of these students are disadvantaged. Vocational students are the most likely to drop out of the system, and their later return to school is sporadic at best. Although completed VET improves their employment conditions and prospects, many graduates leave their profession or do unskilled labour. Their labour income varies greatly with their trade and the experience they have gained.

Relevance:

30.00%

Publisher:

Abstract:

Since the seminal works of Markowitz (1952), Sharpe (1964), and Lintner (1965), numerous studies on portfolio selection and performance measurement have been based on the mean-variance framework. However, several researchers (e.g., Arditti (1967, 1971), Samuelson (1970), and Rubinstein (1973)) argue that higher moments cannot be neglected unless there is reason to believe that (i) asset returns are normally distributed and the investor's utility function is quadratic, or (ii) the empirical evidence demonstrates that higher moments are irrelevant to the investor's decision. Based on this argument, this dissertation investigates the impact of higher moments of return distributions on three issues concerning 14 international stock markets.

First, portfolio selection with skewness is determined using Polynomial Goal Programming, in which investor preferences for skewness can be incorporated. The empirical findings suggest that the return distributions of international stock markets are not normal, and that incorporating skewness into an investor's portfolio decision causes a major change in the construction of the optimal portfolio. The evidence also indicates that an investor will trade expected portfolio return for skewness. Moreover, when short sales are allowed, investors are better off, attaining higher expected return and skewness simultaneously.

Second, the performance of international stock markets is evaluated using two types of performance measures: (i) the two-moment measures of Sharpe (1966) and Treynor (1965), and (ii) the higher-moment measures of Prakash and Bear (1986) and Stephens and Proffitt (1991). The empirical evidence indicates that higher moments of return distributions are significant and relevant to the investor's decision; thus, higher-moment performance measures are more appropriate for evaluating international stock markets. The evidence also indicates that the various measures provide vastly different performance rankings of the markets, albeit in the same direction.

Finally, the inter-temporal stability of the international stock markets is investigated using the Parhizgari and Prakash (1989) algorithm for the Sen and Puri (1968) test, which accounts for non-normality of return distributions. The empirical findings strongly support stability in international stock market movements. However, when the Anderson test, which assumes normality of return distributions, is employed, stability of the correlation structure is rejected. This suggests that non-normality of the return distribution is an important factor that cannot be ignored when investigating the inter-temporal stability of international stock markets.
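The mean-variance-skewness trade-off described above can be sketched numerically. The snippet below is a simplified stand-in for Polynomial Goal Programming: instead of minimizing deviations from separate mean, variance, and skewness goals, it maximizes a single composite objective over long-only weights. The preference parameters `skew_pref` and `var_aversion`, and the function name, are illustrative assumptions, not values from the dissertation.

```python
import numpy as np
from scipy.optimize import minimize

def msv_portfolio(returns, skew_pref=1.0, var_aversion=5.0):
    """Long-only weights trading off portfolio mean, variance, skewness.

    Simplified stand-in for the Polynomial Goal Programming approach:
    a single composite objective replaces the goal-deviation stages.
    """
    r = np.asarray(returns)              # T x N matrix of asset returns
    n = r.shape[1]

    def neg_objective(w):
        port = r @ w                     # portfolio return series
        mean = port.mean()
        var = port.var()
        sd = port.std()
        skew = ((port - mean) ** 3).mean() / sd ** 3 if sd > 0 else 0.0
        return -(mean - var_aversion * var + skew_pref * skew)

    cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * n            # no short sales
    w0 = np.full(n, 1.0 / n)
    res = minimize(neg_objective, w0, bounds=bounds,
                   constraints=cons, method='SLSQP')
    return res.x
```

Relaxing the `(0.0, 1.0)` bounds corresponds to allowing short sales, the case in which the dissertation finds investors attain higher return and skewness simultaneously.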

Relevance:

30.00%

Publisher:

Abstract:

Diet and physical activity patterns have been implicated as major factors in the increasing prevalence of childhood and adolescent obesity. An estimated 16 to 33 percent of children and adolescents in the United States are overweight (CDC, 2000), and the CDC estimates that less than 50% of adolescents are physically active on a regular basis (CDC, 2003). Interventions must be focused on modifying these behaviors. Facilitating adolescents' understanding of proper nutrition and the need for physical activity is the first step in preventing overweight and obesity and in delaying the development of chronic diseases later in life (Dwyer, 2000). The purpose of this study was to compare the outcomes of students receiving one of two forms of education (both emphasizing diet and physical activity) to determine whether a computer-based intervention (CBI) program using an interactive, animated CD-ROM would elicit greater behavior change than a traditional didactic intervention (TDI) program. A convenience sample of 254 high school students aged 14-19 participated in the 6-month program. A pre-test post-test design was used, with follow-up measures taken three months post-intervention.

No change was noted in total fat, saturated fat, fruit/vegetable, or fiber intake for any of the groups, and there was no change in perceived self-efficacy or perceived social support. Results did, however, indicate an increase in nutrition knowledge for both intervention groups (p<0.001). In addition, the CBI group demonstrated more positive and sustained behavior changes throughout the course of the study. These changes included a decrease in BMI (p(pre/post)<0.001, p(post/follow-up)<0.001), meals skipped (p(pre/post)<0.001), and soda consumption (p(pre/post)=0.003, p(post/follow-up)=0.03), and an increase in nutrition knowledge (p(pre/post)<0.001, p(pre/follow-up)<0.001), physical activity (p(pre/post)<0.05, p(pre/follow-up)<0.01), frequency of label reading (p(pre/follow-up)<0.01), and dairy consumption (p(pre/post)=0.03). The TDI group did show positive gains in some areas post-intervention; however, a return to baseline behavior was shown at follow-up. The findings suggest that, compared with traditional didactic teaching, computer-based nutrition and health education has greater potential to elicit change in knowledge and behavior and to promote maintenance of the behavior change over time.

Relevance:

30.00%

Publisher:

Abstract:

Liquidity is an important attribute of an asset that investors take into consideration when making investment decisions. However, previous empirical evidence on whether liquidity is a determinant of stock returns is not unanimous. This dissertation provides a comprehensive study of the role of liquidity in asset pricing, using the Fama-French (1993) three-factor model and the Kraus and Litzenberger (1976) three-moment CAPM for risk adjustment. The relationship between liquidity and well-known determinants of stock returns, such as size and book-to-market, is also investigated, and the liquidity and asset pricing issues are examined for both intertemporal and cross-sectional data.

The results indicate the existence of a liquidity premium: less liquid stocks demand a higher rate of return than more liquid stocks. More specifically, a drop of 1 percent in liquidity is associated with a higher rate of return of about 2 to 3 basis points per month. Further investigation reveals that neither the Fama-French three-factor model nor the three-moment CAPM captures the liquidity premium. Finally, well-known determinants of stock returns such as size and book-to-market do not serve as proxies for liquidity. Overall, this dissertation shows that a liquidity premium exists in the stock market and that liquidity is a distinct effect, not driven by market factors, non-market factors, or other stock characteristics.
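A liquidity premium test of this kind reduces, in its simplest form, to regressing returns on a liquidity measure plus controls. The function below is an illustrative OLS reduction: the dissertation's tests additionally apply Fama-French and three-moment CAPM risk adjustments, which are omitted here, and the function name and argument layout are assumptions of this sketch.

```python
import numpy as np

def liquidity_premium(returns, liquidity, controls=None):
    """Slope on liquidity in an OLS of stock returns on liquidity
    plus an intercept and optional control variables.

    A negative slope means less liquid stocks earn higher returns,
    i.e., a liquidity premium exists.
    """
    n = len(returns)
    cols = [np.ones(n), np.asarray(liquidity)]
    if controls is not None:
        cols.extend(np.asarray(c) for c in controls)
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, np.asarray(returns), rcond=None)
    return beta[1]           # coefficient on liquidity
```

On synthetic data built so that a 1-point drop in liquidity adds 2.5 basis points of monthly return, the estimator recovers a slope of about -0.00025.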

Relevance:

30.00%

Publisher:

Abstract:

Fast-spreading unknown viruses have caused major damage to computer systems upon their initial release, and current detection methods lack the ability to detect unknown viruses quickly enough to avoid mass spreading and damage. This dissertation presents a behavior-based approach to detecting known and unknown viruses by their attempt to replicate. Replication is the qualifying, fundamental characteristic of a virus and is consistently present in all viruses, making this approach applicable to viruses of many classes executing under varied conditions. A form of replication called self-reference replication (SR-replication) is formalized as one main type of replication, in which the virus replicates by modifying or creating other files on a system to include a copy of itself. This replication type is used to detect viruses attempting replication by referencing themselves, a necessary step to successfully replicate into files. The approach requires no a priori knowledge of known viruses; detection is accomplished at runtime by monitoring currently executing processes that attempt to replicate. Two implementation prototypes of the detection approach, called SRRAT, were created and tested on Microsoft Windows operating systems, tracking user-mode Win32 API system calls and kernel-mode system services. The results showed SR-replication capable of distinguishing between file-infecting viruses and benign processes with little or no false positives and false negatives.
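The SR-replication idea can be illustrated in miniature: flag any process whose write to another file embeds a copy of the process's own executable image. This toy sketch stands in for SRRAT's actual runtime hooking of Win32 API calls and kernel system services, which is far more involved; the function name and the `(path, data)` event format are hypothetical.

```python
def detect_sr_replication(process_image, write_events):
    """Flag files into which a process writes a copy of its own image.

    process_image: bytes of the monitored process's executable.
    write_events:  list of (target_path, written_bytes) pairs observed
                   at runtime (here simulated rather than hooked).
    """
    suspicious = []
    for path, data in write_events:
        if process_image in data:    # the write references the process itself
            suspicious.append(path)
    return suspicious
```

A benign process writing ordinary data never trips the check, mirroring the low false-positive rate the dissertation reports for the behavior-based approach.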

Relevance:

30.00%

Publisher:

Abstract:

The increase in the number of financial restatements in recent years has resulted in a significant decrease in market capitalization for restated companies. Prior literature did not differentiate between single and multiple restatement announcements. This research investigated the inter-relationships among multiple financial restatements, corporate governance, market microstructure, and the firm's rate of return in three essays, differentiating between single and multiple restatement announcement companies. The first essay examined the stock performance of companies announcing financial restatements multiple times, the postulation being that prior research overestimates the abnormal return by not separating single restatement companies from multiple restatement companies, and investigated how the market penalizes companies that announce restatements more than once. Differentiating the restatement announcement data by the number of announcements, the results supported the non-persistence hypothesis: the market has no memory, and the negative abnormal returns obtained after each restatement announcement are completely random. The second essay examined multiple restatement announcements and the resulting perceived information asymmetry around the announcement day, in terms of whether the bid-ask spread widens around that day. The empirical analysis supported the hypothesis that the spread widens not only around the first restatement announcement day but around every subsequent announcement day as well. The third essay empirically examined the financial and corporate governance characteristics of single and multiple restatement announcement companies. The analysis showed that corporate governance variables influence the occurrence of multiple restatement announcements and can distinguish multiple restatement announcement companies from single restatement announcement companies.
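Abnormal returns of the kind examined in the first essay are conventionally computed with a market-model event study. A minimal sketch follows, assuming pre-estimated market-model parameters rather than the essay's exact specification; the function name is illustrative.

```python
import numpy as np

def cumulative_abnormal_return(stock, market, alpha, beta, window):
    """Market-model event study: abnormal return on day t is
    stock[t] - (alpha + beta * market[t]); CAR sums abnormal returns
    over the inclusive event window (lo, hi) of day indices.

    alpha and beta would normally be estimated over a pre-event
    window; here they are supplied directly.
    """
    stock = np.asarray(stock)
    market = np.asarray(market)
    ar = stock - (alpha + beta * market)   # daily abnormal returns
    lo, hi = window
    return ar[lo:hi + 1].sum()
```

Computing a CAR around each successive restatement announcement, rather than only the first, is what lets the study separate single from multiple restatement companies.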

Relevance:

30.00%

Publisher:

Abstract:

Road pricing has emerged as an effective means of managing road traffic demand while raising additional revenue for transportation agencies. Research on the factors that govern travel decisions has shown that user preferences may be a function of the demographic characteristics of individuals and the perceived trip attributes. However, it is not clear which trip attributes are actually considered in the travel decision-making process, how these attributes are perceived by travelers, and how the set of trip attributes changes with the time of day or from day to day. In this study, operational Intelligent Transportation Systems (ITS) archives are mined, and the aggregated preferences for a priced system are extracted at a fine time-aggregation level for an extended number of days. The resulting information is related to corresponding time-varying trip attributes such as travel time, travel time reliability, the charged toll, and other parameters. The time-varying user preferences and trip attributes are linked by a binary choice (logit) model with a utility function linear in the trip attributes. The attribute weights in the utility function are then estimated dynamically for each time of day by an adaptive, limited-memory discrete Kalman filter (ALMF). The relationship between traveler choices and travel time is assessed using different rules to capture the logic that best represents traveler perception and the effect of real-time information on the observed preferences, and the impact of travel time reliability on traveler choices is investigated under its multiple definitions. The results indicate that the ALMF algorithm allows robust estimation of time-varying weights in the utility function at fine time-aggregation levels, although high correlations among the trip attributes severely constrain the simultaneous estimation of their weights. Despite the data limitations, the ALMF algorithm provides stable estimates of the choice parameters for some periods of the day. Finally, the daily variation of the user sensitivities for different periods of the day resembles a well-defined normal distribution.
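A single predict/update step of a filter of this kind can be sketched as an extended Kalman filter over logit utility weights. This is a simplified stand-in for the ALMF: the limited-memory adaptation and the study's exact noise specification are omitted, and all parameter values below are assumptions for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ekf_logit_step(w, P, x, y, Q, R):
    """One EKF step tracking time-varying logit utility weights.

    w, P : current weight estimate and its covariance
    x    : trip-attribute vector (e.g. travel time, toll, reliability)
    y    : observed share choosing the priced alternative
    Q, R : process and observation noise (assumed known here)
    """
    # Predict: weights follow a random walk
    w_pred = w
    P_pred = P + Q
    # Update: linearize p = sigmoid(x . w) around the prediction
    p = sigmoid(x @ w_pred)
    H = p * (1 - p) * x                  # Jacobian of p w.r.t. w
    S = H @ P_pred @ H + R               # scalar innovation variance
    K = P_pred @ H / S                   # Kalman gain
    w_new = w_pred + K * (y - p)
    P_new = P_pred - np.outer(K, H) @ P_pred
    return w_new, P_new
```

Iterating this step over time-of-day bins yields the time-varying attribute weights; when attributes in `x` are highly correlated, `S` carries little independent information per attribute, which is the estimation difficulty the study reports.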

Relevance:

30.00%

Publisher:

Abstract:

For the last three decades, the Capital Asset Pricing Model (CAPM) has been the dominant model for calculating expected return. In the early 1990s, Fama and French (1992) developed the Fama-French three-factor model by adding two factors to the CAPM. However, even with these models, estimates of the expected return have been found to be inaccurate (Elton, 1999; Fama & French, 1997). Botosan (1997) introduced a new approach that employs an equity valuation model to calculate the internal rate of return (IRR), often called the "implied cost of equity capital," as a proxy for the expected return. This approach has been gaining popularity among researchers. A critical review of the literature will help inform hospitality researchers regarding the issue and encourage them to implement the new approach in their own studies.
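The implied cost of equity amounts to solving an equity valuation model's pricing equation for its internal rate of return. A minimal sketch using the simplest such model (Gordon growth) follows; Botosan-style implementations instead use residual income models with analyst forecasts, so this function is purely illustrative.

```python
def implied_cost_of_equity(price, dividend, growth, lo=1e-4, hi=1.0):
    """Solve price = value(r) for r by bisection, using the Gordon
    growth model value(r) = dividend / (r - growth) as the valuation.

    The returned r is the internal rate of return that equates model
    value to the observed price: the implied cost of equity capital.
    """
    def value(r):
        return dividend / (r - growth)

    lo = max(lo, growth + 1e-9)      # the model requires r > growth
    for _ in range(200):             # bisection: value(r) decreases in r
        mid = 0.5 * (lo + hi)
        if value(mid) > price:       # value too high -> rate too low
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For the Gordon model the closed form is r = dividend/price + growth, so the bisection result can be checked directly: a $100 stock paying $5 with 2% growth implies r = 7%.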

Relevance:

30.00%

Publisher:

Abstract:

In the discussion "Indirect Cost Factors in Menu Pricing," David V. Pavesic, Associate Professor, Hotel, Restaurant and Travel Administration at Georgia State University, initially states: "Rational pricing methodologies have traditionally employed quantitative factors to mark up food and beverage or food and labor because these costs can be isolated and allocated to specific menu items. There are, however, a number of indirect costs that can influence the price charged because they provide added value to the customer or are affected by supply/demand factors. The author discusses these costs and factors that must be taken into account in pricing decisions." Professor Pavesic offers as a given that menu pricing should cover costs, return a profit, reflect value for the customer, and, in the long run, attract customers and market the establishment. "Prices that are too high will drive customers away, and prices that are too low will sacrifice profit," he puts it succinctly. Dovetailing with this premise, the author notes that although food cost figures markedly into menu pricing, other factors such as equipment utilization, popularity/demand, and marketing must also be considered. "... there is no single method that can be used to mark up every item on any given restaurant menu. One must employ a combination of methodologies and theories," says Professor Pavesic. "Therefore, when properly carried out, prices will reflect food cost percentages, individual and/or weighted contribution margins, price points, and desired check averages, as well as factors driven by intuition, competition, and demand." Additionally, Professor Pavesic holds that value, as opposed to maximizing revenue, should be a primary motivating factor when designing menu pricing, a philosophy that comes with certain caveats, which he explains.

Generically speaking, Professor Pavesic says, "The market ultimately determines the price one can charge." But, fine-tuning that decree, he further offers, "Lower prices do not automatically translate into value and bargain in the minds of the customers. Having the lowest prices in your market may not bring customers or profit." "Too often operators engage in price wars through discount promotions and find that profits fall and their image in the marketplace is lowered," Professor Pavesic warns. Among the intangibles that influence menu pricing, service is at the top of the list; ambience, location, amenities, product (i.e., food) presentation, and price elasticity are discussed as well, and the concept of price-value perception is explained. Professor Pavesic closes with a brief overview of a la carte pricing, its pros and cons.
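The two quantitative markup rules the article names (food cost percentage and contribution margin) reduce to one-line formulas; the sketch below is illustrative, with hypothetical function names.

```python
def food_cost_price(item_cost, target_food_cost_pct):
    """Food-cost-percentage markup: price the item so that its food
    cost equals the target percentage of the menu price."""
    return item_cost / target_food_cost_pct

def contribution_margin_price(item_cost, target_margin):
    """Contribution-margin markup: price = cost plus a desired gross
    contribution per item, whatever cost percentage that implies."""
    return item_cost + target_margin
```

A $3.00 plate at a 30% target food cost prices at $10.00, while the same plate with a $5.50 target margin prices at $8.50, illustrating the article's point that different methods yield different prices and must be combined with judgment.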



Relevance:

30.00%

Publisher:

Abstract:

The purpose of this research is to develop an optimal kernel for use in a real-time engineering and communications system. Since the application is a real-time system, relevant real-time issues are studied in conjunction with kernel-related issues. The emphasis of the research is the development of a kernel that not only adheres to the criteria of a real-time environment, namely determinism and performance, but also provides the flexibility and portability associated with non-real-time environments. The essence of the research is to study how features found in non-real-time systems can be applied to a real-time system in order to produce an optimal kernel that provides flexibility and architecture independence while maintaining the performance needed by most engineering applications. Traditionally, real-time kernels have been developed in assembly language. By utilizing the powerful constructs of the C language, a real-time kernel was developed that addresses the goals of flexibility and portability while still meeting the real-time criteria. The kernel is implemented on Motorola 68010/20/30/40 microprocessor-based systems.

Relevance:

30.00%

Publisher:

Abstract:

In Brazil, the National Agency of Electric Energy (ANEEL) is the energy regulator. Rate reviews have been one of its main tasks: they set prices at a level that covers efficient operating costs and provides an appropriate return on the distributors' investments. The changes in the procedures used to define efficient costs, and the several studies on the methodologies employed to regulate this segment, reflect the challenge regulators face in choosing the best methodological strategy. In this context, this research proposes a benchmarking evaluation, applied to the national regulatory system, for establishing the efficient operating costs of electricity distribution utilities. The model is formulated to promote the development of the electricity market, in partnership with government policies and to the benefit of society. To conduct this research, Data Envelopment Analysis (DEA) is integrated with Stochastic Frontier Analysis (SFA) in a three-stage procedure that corrects efficiency for environmental effects: (i) a DEA evaluation measures the operating cost slacks of the utilities, with environmental variables omitted; (ii) the slacks calculated in the first stage are regressed on a set of environmental variables by means of SFA, and operating costs are adjusted to account for environmental impact and statistical noise; and (iii) the performance of the electric power distribution utilities is reassessed by means of DEA. With this methodology it is possible to obtain a performance evaluation expressed exclusively in terms of management efficiency, in which operating environment and statistical noise effects are controlled.
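The first DEA stage can be sketched as a standard input-oriented CCR linear program; the version below is a generic textbook formulation, not ANEEL's exact model, and omits the second-stage SFA adjustment entirely.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR DEA efficiency of unit j0.

    X: (m, n) inputs and Y: (s, n) outputs for n units. Solves
        min theta  s.t.  X @ lam <= theta * X[:, j0],
                         Y @ lam >= Y[:, j0],  lam >= 0.
    Returns theta in (0, 1]; theta = 1 means unit j0 is efficient.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)                  # decision vector: [theta, lam]
    c[0] = 1.0                           # minimize theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, j0]              # X @ lam - theta * x0 <= 0
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y                    # -Y @ lam <= -y0
    b_ub[m:] = -Y[:, j0]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]
```

Here `1 - theta` plays the role of the radial cost excess; the study's procedure regresses measured slacks on environmental variables via SFA before re-running the DEA on adjusted costs.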

Relevance:

30.00%

Publisher:

Abstract:

Purpose: This paper extends the use of Radio Frequency Identification (RFID) data to the accounting of warehouse costs and services. The Time-Driven Activity-Based Costing (TDABC) methodology is enhanced with RFID data collected in real time about the duration of warehouse activities, allowing warehouse managers to obtain accurate and instant cost calculations. RFID-enhanced TDABC (RFID-TDABC) is proposed as a novel application of RFID technology. Research Approach: RFID-TDABC is implemented on the warehouse processes of a case study company, covering receiving, put-away, order picking, and despatching. Findings and Originality: RFID technology is commonly used for the identification and tracking of items; using RFID-generated information with TDABC successfully extends it to costing. The RFID-TDABC costing model gives warehouse managers accurate and instant cost calculations. Research Impact: There are still unexplored benefits to RFID technology in warehousing and the wider supply chain. A multi-disciplinary research approach led to combining RFID technology and the TDABC accounting method into RFID-TDABC; combining methods and theories from different fields with RFID may lead researchers to develop further techniques such as the RFID-TDABC presented in this paper. Practical Impact: RFID-TDABC will be of value to practitioners by showing how warehouse costs can be measured accurately using this approach. A better understanding of incurred costs may result in further optimisation of warehousing operations, lower activity costs, and thus competitive pricing for customers. RFID-TDABC can also be applied in the wider supply chain.
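The TDABC core calculation is a capacity cost rate multiplied by an activity's duration, with RFID supplying the duration from observed timestamps instead of interview-based time estimates. A minimal sketch with illustrative figures (the function names and numbers are not from the paper):

```python
def capacity_cost_rate(total_cost, practical_capacity_minutes):
    """TDABC: cost per minute of supplied warehouse capacity."""
    return total_cost / practical_capacity_minutes

def activity_cost(rate, observed_minutes):
    """Cost of one activity instance (receiving, put-away, picking,
    despatching), with the duration read from the RFID event log."""
    return rate * observed_minutes
```

For example, a warehouse supplying 700,000 minutes of capacity at a total cost of 560,000 has a rate of 0.80 per minute, so an order-picking task observed via RFID to take 12.5 minutes costs 10.00.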

Relevance:

30.00%

Publisher:

Abstract:

This dissertation contributes to the rapidly growing empirical research area in the field of operations management. It contains two essays, tackling two different sets of operations management questions which are motivated by and built on field data sets from two very different industries --- air cargo logistics and retailing.

The first essay, based on the data set obtained from a world-leading third-party logistics company, develops a novel and general Bayesian hierarchical learning framework for estimating customers' spillover learning, that is, customers' learning about the quality of a service (or product) from their previous experiences with similar yet not identical services. We then apply our model to the data set to study how customers' experiences from shipping on a particular route affect their future decisions about shipping not only on that route, but also on other routes serviced by the same logistics company. We find that customers indeed borrow experiences from similar but different services to update the quality beliefs that determine their future purchase decisions, and that service quality beliefs have a significant impact on those decisions. Moreover, customers are risk averse: they are averse not only to experience variability but also to belief uncertainty (i.e., customers' uncertainty about their own beliefs). Finally, belief uncertainty affects customers' utilities more than experience variability does.
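Spillover belief updating of this kind can be sketched with a conjugate normal model in which experiences on similar routes arrive as noisier signals. The similarity-discounting rule below is an assumption for illustration, not the essay's actual hierarchical model, and all names are hypothetical.

```python
def update_quality_belief(mu, var, obs, obs_var, similarity=1.0):
    """Normal-normal update of a quality belief (mean mu, variance var)
    from one observed experience obs with noise variance obs_var.

    Experiences on less similar routes are treated as noisier signals:
    the observation variance is inflated as similarity (in (0, 1])
    falls, so they move the belief less.
    """
    eff_var = obs_var / similarity       # less similar -> noisier signal
    k = var / (var + eff_var)            # Bayesian gain on this signal
    mu_new = mu + k * (obs - mu)
    var_new = (1 - k) * var              # posterior (belief) uncertainty
    return mu_new, var_new
```

The posterior variance `var_new` is the "belief uncertainty" the essay finds customers are averse to, distinct from the experience variability captured by `obs_var`.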

The second essay is based on a data set obtained from a large Chinese supermarket chain, which contains sales as well as both wholesale and retail prices of un-packaged perishable vegetables. Recognizing the special characteristics of this particular product category, we develop a structural estimation model in a discrete-continuous choice framework. Building on this framework, we then study an optimization model for joint pricing and inventory management of multiple products, aiming to improve the company's profit from direct sales while reducing food waste and thus improving social welfare.

Collectively, the studies in this dissertation provide useful modeling ideas, decision tools, insights, and guidance for firms to utilize vast sales and operations data to devise more effective business strategies.