815 results for economic value analysis


Relevance:

90.00%

Publisher:

Abstract:

Extreme stock price movements are of great concern to both investors and the entire economy. For investors, a single large negative return, or a combination of several smaller ones, can possibly wipe out so much capital that the firm or portfolio becomes illiquid or insolvent. If enough investors experience this loss, it could shock the entire economy; the stock market crash of 1987 is an example of such a case. Furthermore, there has been considerable recent interest in the increasing volatility of stock prices. This study presents an analysis of extreme stock price movements. The data utilized were the daily returns of the Standard and Poor's 500 index from January 3, 1978 to May 31, 2001. Research questions were analyzed using the statistical models provided by extreme value theory. One of the difficulties in examining stock price data is that there is no consensus regarding the correct shape of the distribution function generating the data. An advantage of extreme value theory is that no detailed knowledge of this distribution function is required to apply the asymptotic theory; we focus on the tail of the distribution. Extreme value theory allows us to estimate a tail index, which we use to derive bounds on returns corresponding to very low exceedance probabilities. Such information is useful in evaluating the volatility of stock prices. There are three possible limit laws for the maximum: Gumbel (light-tailed), Fréchet (heavy-tailed) or Weibull (bounded tail). Results indicated that extreme returns during the time period studied follow a Fréchet distribution. Thus, this study finds that extreme value analysis is a valuable tool for examining stock price movements and can be more effective than the usual variance in measuring risk.
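
The abstract itself contains no computations, but the tail-index idea it relies on can be sketched with a Hill estimator applied to the left tail of a return series. The series below is simulated and the choice of k (the number of upper order statistics) is arbitrary; this is an illustration of the general technique, not the study's estimation procedure.

```python
import numpy as np

def hill_tail_index(returns, k=100):
    """Hill estimator of the tail index, applied to the loss (left) tail.

    Negative returns are treated as positive losses; k is the number of
    upper order statistics used and must be tuned in practice.
    """
    returns = np.asarray(returns, dtype=float)
    losses = -returns[returns < 0]                 # losses as positive values
    order = np.sort(losses)[::-1]                  # descending order statistics
    if k >= len(order):
        raise ValueError("k must be smaller than the number of losses")
    gamma = np.mean(np.log(order[:k]) - np.log(order[k]))  # estimate of 1/alpha
    return 1.0 / gamma                             # tail index alpha

# Illustration with simulated heavy-tailed daily returns; an actual analysis
# would use the S&P 500 series from January 1978 to May 2001 described above.
rng = np.random.default_rng(0)
simulated_returns = 0.01 * rng.standard_t(df=3, size=5900)
print(f"Estimated tail index: {hill_tail_index(simulated_returns, k=150):.2f}")
```

A finite, positive tail index corresponds to the Fréchet (heavy-tailed) limit the study reports; the smaller the index, the heavier the tail.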

Relevance:

90.00%

Publisher:

Abstract:

In the article “Menu Analysis: Review and Evaluation,” by Lendal H. Kotschevar, Distinguished Professor, School of Hospitality Management, Florida International University, Kotschevar’s initial statement reads: “Various methods are used to evaluate menus. Some have quite different approaches and give different information. Even those using quite similar methods vary in the information they give. The author attempts to describe the most frequently used methods and to indicate their value. A correlation calculation is made to see how well certain of these methods agree in the information they give.” There is more than one way to look at the word menu. One is the set of culinary selections decided upon by the head chef or owner, which ultimately defines the type of restaurant; another is the physical listing of those offerings that a patron actually holds in his or her hand. These are the most common senses of the word. The author concentrates primarily on the latter, and uses counts of the number of items sold from a menu to measure the popularity of any particular item. This, along with a formula, allows Kotschevar to arrive at a specific value per item. Menu analysis would appear a difficult subject to broach: how does one qualify and quantify a menu when the exercise seems so subjective? The author offers methods and outlines for approaching menu analysis from an empirical perspective. “Menus are often examined visually through the evaluation of various factors. It is a subjective method but has the advantage of allowing scrutiny of a wide range of factors which other methods do not,” says Distinguished Professor Kotschevar. “The method is also highly flexible. Factors can be given a score value and scores summed to give a total for a menu. This allows comparison between menus. If the one making the evaluations knows menu values, it is a good method of judgment,” he further offers. The author stresses that assigning values is fundamental to a pragmatic menu analysis; it is how the reviewer keeps score, so to speak. Value merit provides reliable criteria against which to gauge a particular menu item, and in the final analysis, menu evaluation provides the mechanism for either keeping or rejecting selected items on a menu. Kotschevar describes at least three different matrix evaluation methods: the Miller method, the Smith and Kasavana method, and the Pavesic method. He illustrates each in table format, which is helpful since explaining the calculations behind the tables in prose alone would be difficult at best. Kotschevar also references analysis methods which aren’t matrix based; the Hayes and Huffman goal value analysis is one such method. The author sees no one method as better than another, and suggests that combining two or more of the methods is beneficial.
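
As a rough illustration of the matrix style of analysis reviewed here, the sketch below applies a Smith and Kasavana style classification, labelling items by popularity (menu mix share against 70% of an equal share) and by contribution margin (against the sales-weighted average). The menu items, prices, costs and sales counts are invented for the example and are not taken from the article.

```python
# Illustrative Smith & Kasavana style menu engineering matrix.
# Item names, prices, food costs and unit sales are hypothetical.
items = [
    {"name": "Grilled salmon",  "price": 18.0, "food_cost": 7.5,  "sold": 120},
    {"name": "House burger",    "price": 11.0, "food_cost": 4.0,  "sold": 260},
    {"name": "Lobster ravioli", "price": 24.0, "food_cost": 12.0, "sold": 40},
    {"name": "Garden salad",    "price": 8.0,  "food_cost": 3.5,  "sold": 60},
]

total_sold = sum(i["sold"] for i in items)
# Popularity benchmark: 70% of an equal share of total sales.
popularity_cutoff = 0.70 * (1.0 / len(items))
# Contribution margin benchmark: sales-weighted average margin per item sold.
avg_margin = sum((i["price"] - i["food_cost"]) * i["sold"] for i in items) / total_sold

for i in items:
    popular = i["sold"] / total_sold >= popularity_cutoff
    high_margin = (i["price"] - i["food_cost"]) >= avg_margin
    label = {(True, True): "Star", (True, False): "Plowhorse",
             (False, True): "Puzzle", (False, False): "Dog"}[(popular, high_margin)]
    print(f'{i["name"]:16s} {label}')
```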

Relevance:

90.00%

Publisher:

Abstract:

Periods of drought and low streamflow can have profound impacts on both human and natural systems. People depend on a reliable source of water for numerous reasons, including potable water supply and the production of economic value through agriculture or energy generation. Aquatic ecosystems also depend on water, and they in turn provide economic benefits to society through ecosystem services. Given that periods of low streamflow may become more extreme and frequent in the future, it is important to study the factors that control water availability during these times. In the absence of precipitation, the slower hydrological response of groundwater systems plays an amplified role in water supply. Understanding the variability of the fraction of streamflow contributed by baseflow, or groundwater, during periods of drought provides insight into what future water availability may look like and how it can best be managed. The Mills River Basin in North Carolina is chosen as a case study to test this understanding. First, a physically meaningful estimate of baseflow is obtained from USGS streamflow data via computerized hydrograph analysis techniques. Time series methods, including wavelet analysis, are then applied to highlight signals of non-stationarity and to evaluate changes in variance, in order to better understand the natural variability of baseflow and low flows. In addition to natural variability, human influence must be taken into account in order to accurately assess how the combined system reacts to periods of low flow; defining a combined demand that consists of both natural and human demand allows a more rigorous assessment of the level of sustainable use of a shared resource, in this case water. The analysis of baseflow variability can differ with regional location and local hydrogeology, but it was found that baseflow varies on scales ranging from multiyear, such as those associated with ENSO (3.5 and 7 years), up to multidecadal, with most of the contributing variance coming from decadal or multiyear scales. It was also found that the behavior of baseflow, and consequently water availability, depends a great deal on overall precipitation, the tracks of hurricanes and tropical storms and their associated climate indices, as well as on physiography and hydrogeology. Using the Duke Combined Hydrology Model (DCHM), reasonably accurate estimates of streamflow during periods of low flow were obtained, in part because of the model’s ability to capture subsurface processes. Being able to accurately simulate streamflow levels and subsurface interactions during periods of drought is valuable to water suppliers and decision makers, and ultimately to the citizens they serve. Knowledge of future droughts and periods of low flow, combined with tracking of customer demand, will allow water suppliers to adopt better management practices, such as knowing when to withdraw more water during a surplus so that stress on the system is minimized when supply is not ample.
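
The abstract mentions computerized hydrograph analysis for estimating baseflow without naming the specific technique; one common approach is a recursive digital filter. The sketch below applies a single-pass Lyne and Hollick filter to a synthetic hydrograph; the filter parameter and the series are illustrative only, and a real analysis would use the USGS record for the Mills River Basin.

```python
import numpy as np

def lyne_hollick_baseflow(q, alpha=0.925):
    """Single-pass Lyne-Hollick recursive digital filter for baseflow separation.

    q     : array of streamflow values (e.g., daily mean discharge)
    alpha : filter parameter, commonly around 0.9-0.95
    Returns an array of baseflow estimates (streamflow minus quickflow).
    """
    q = np.asarray(q, dtype=float)
    quick = np.zeros_like(q)
    for t in range(1, len(q)):
        f = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = min(max(f, 0.0), q[t])   # keep quickflow within [0, q]
    return q - quick

# Illustrative use with a synthetic hydrograph; a real study would use the
# USGS daily streamflow record for the Mills River gauge.
rng = np.random.default_rng(1)
flow = 10 + 5 * np.sin(np.linspace(0, 6 * np.pi, 365)) + rng.gamma(2.0, 1.0, 365)
baseflow = lyne_hollick_baseflow(flow)
print(f"Baseflow index: {baseflow.sum() / flow.sum():.2f}")
```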

Relevance:

90.00%

Publisher:

Abstract:

Since the second half of the 1990s, the economic impact of sports mega-events has concerned researchers, the public and industry professionals. The investment of public funds and the effects on several sectors of the economy motivate economic impact studies. The economic impact of the FIS Nordic World Ski Championship Falun 2015 on the region of Dalarna is the topic of this thesis. This requires the calculation of direct, indirect and induced economic impacts. The analysis uses data from a questionnaire survey conducted on seven different days during the event; the final sample contains 893 observations. A segmentation approach was applied for the calculations, with visitors classified according to their choice of accommodation. The regional economic impact is calculated at 321 M SEK, and the employment effect on the tourism sector is estimated. However, the study is limited by a lack of information; the analysis could be extended with a more thorough investigation of certain issues, and the impact of the event should ideally be estimated from all perspectives. The organization of sports mega-events creates tangible and intangible effects for the host city. The thesis also reviews the literature on economic impact studies of sports mega-events. The results can serve as the basis for a more comprehensive analysis of the case study, and tourism and event professionals may benefit from them.
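
To make the direct/indirect/induced decomposition concrete, the toy calculation below sums visitor spending by accommodation segment and applies assumed multipliers. Every figure (segment sizes, daily spend, multiplier ratios) is hypothetical and unrelated to the 321 M SEK estimate reported above.

```python
# Hypothetical illustration of direct, indirect and induced impact.
# Visitor segments by accommodation type, each with number of visitors,
# length of stay and daily spend (SEK) -- all figures invented.
segments = {
    "hotel":        {"visitors": 20000, "days": 3, "daily_spend": 1500},
    "cabin/rental": {"visitors": 15000, "days": 4, "daily_spend": 900},
    "day visitor":  {"visitors": 50000, "days": 1, "daily_spend": 400},
}

direct = sum(s["visitors"] * s["days"] * s["daily_spend"] for s in segments.values())
indirect = direct * 0.35                 # supplier purchases (assumed ratio)
induced = (direct + indirect) * 0.20     # re-spending of local income (assumed)
total = direct + indirect + induced

print(f"Direct:   {direct / 1e6:8.1f} M SEK")
print(f"Indirect: {indirect / 1e6:8.1f} M SEK")
print(f"Induced:  {induced / 1e6:8.1f} M SEK")
print(f"Total:    {total / 1e6:8.1f} M SEK")
```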

Relevance:

90.00%

Publisher:

Abstract:

In many product categories, unit prices facilitate price comparisons across brands and package sizes; this enables consumers to identify the products that provide the greatest value. In other product categories, however, unit prices may be confusing, because there are two types of unit pricing: measure-based and usage-based. Measure-based unit prices are what the name implies; the price is expressed in cents or dollars per unit of measure (e.g., ounce). Usage-based unit prices, on the other hand, are expressed in cents or dollars per use (e.g., wash load or serving). The results of this study show that in two different product categories (laundry detergent and dry breakfast cereal), measure-based unit prices reduced consumers’ ability to identify higher-value products, whereas providing a usage-based unit price increased their ability to identify product value. When provided with both a measure-based and a usage-based unit price, respondents did not perform as well as when they were provided only a usage-based unit price, additional evidence that the measure-based unit price hindered consumers’ comparisons. Finally, neither of two potential moderators, education about the meaning of the two measures and having to rank order the options in the choice set by value before choosing, eliminated these effects.
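
The contrast between the two unit-price formats can be shown with a small sketch: a concentrated detergent may look expensive per ounce (measure-based) yet be the better value per wash load (usage-based). The two products and their numbers are invented, not the study's stimuli.

```python
# Hypothetical detergents: the concentrated product costs more per ounce
# (measure-based unit price) but less per wash load (usage-based unit price).
products = [
    {"name": "Brand A (concentrated)", "price": 12.99, "ounces": 50,  "loads": 64},
    {"name": "Brand B (regular)",      "price": 9.99,  "ounces": 100, "loads": 40},
]

for p in products:
    per_ounce = p["price"] / p["ounces"]   # measure-based unit price
    per_load = p["price"] / p["loads"]     # usage-based unit price
    print(f'{p["name"]:24s} ${per_ounce:.3f}/oz   ${per_load:.3f}/load')
```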

Relevance:

90.00%

Publisher:

Abstract:

For SMEs operating in today’s complex and globalised economic landscape, engaging with innovation can sustain competitive advantage. Within Design Management, design is increasingly posited as a strategic resource that facilitates the absorption of new design resources and leverages design knowledge in ways that support SMEs through such economic pressures. Evidencing the relationship between design and economic performance is complex, leading to extensive current research and industry efforts to show how design adds economic value. Despite the value of such efforts, it is important to recognise that innovation means different things to different organizations, especially to start-ups and SMEs. Within the rising tide of design-led innovation, there is a gap in how design can effectively capture and evaluate its contribution within the complex and diverse business development situations it engages with. In seeking to address this gap, this paper presents findings from research undertaken within Design in Action (DiA), an AHRC-funded knowledge exchange hub. Presenting DiA as a single case study, the paper offers methodical reflection on five example start-up businesses funded by DiA in order to explore the value that design-led innovation approaches offered in their formation.

Relevance:

90.00%

Publisher:

Abstract:

Master's dissertation in Management and Nature Conservation (Gestão e Conservação da Natureza), Faculdade de Ciências do Mar e do Ambiente, Universidade do Algarve, 2004.

Relevance:

90.00%

Publisher:

Abstract:

With the growth of energy consumption worldwide, conventional reservoirs, those considered "easy" to explore and produce, are no longer able to meet global energy demand. This has led many researchers to develop projects that address these needs, and companies in the oil sector have invested in techniques that help locate and drill wells. One of the techniques employed in the oil exploration process is reverse time migration (RTM), a seismic imaging method that produces excellent images of the subsurface. The algorithm is based on numerical solution of the wave equation, and RTM is considered one of the most advanced seismic imaging techniques. The economic value of the oil reserves that require RTM to be located is very high, which makes the development of these algorithms a competitive differentiator for seismic processing companies. However, RTM demands great computational power, which still limits its practical success. The objective of this work is to explore the implementation of this algorithm on unconventional architectures, specifically GPUs programmed with CUDA, analysing the difficulties of the development as well as the performance of the algorithm in its sequential and parallel versions.
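
RTM itself is too large for a short example, but its computational core is finite-difference time stepping of the acoustic wave equation, which is exactly the workload that maps well onto GPUs. The NumPy sketch below advances a toy 2D constant-velocity wavefield; the grid, velocity and source parameters are illustrative, and a CUDA version would distribute the same stencil update across GPU threads.

```python
import numpy as np

# Toy 2D acoustic finite-difference propagation (the compute kernel at the
# heart of RTM); grid, velocity model and source wavelet are illustrative.
nx, nz, nt = 200, 200, 300
dx, dt = 10.0, 0.001                    # grid spacing (m) and time step (s)
v = np.full((nz, nx), 2000.0)           # constant 2000 m/s velocity model
freq, t0 = 25.0, 0.04                   # Ricker wavelet frequency and delay

u_prev = np.zeros((nz, nx))
u = np.zeros((nz, nx))
src_z, src_x = nz // 2, nx // 2

for it in range(nt):
    # Five-point Laplacian on interior points (second order in space).
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
                       - 4.0 * u[1:-1, 1:-1]) / dx**2
    u_next = 2.0 * u - u_prev + (v * dt) ** 2 * lap
    # Inject a Ricker wavelet at a single source point.
    arg = (np.pi * freq * (it * dt - t0)) ** 2
    u_next[src_z, src_x] += (1.0 - 2.0 * arg) * np.exp(-arg)
    u_prev, u = u, u_next

print("peak wavefield amplitude:", float(np.abs(u).max()))
```

In an RTM workflow the same kernel propagates the source wavefield forward and the recorded receiver wavefield backward in time, and an imaging condition cross-correlates the two.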

Relevance:

90.00%

Publisher:

Abstract:

Uncertainty about a firm’s future has to be measured and incorporated into a company’s valuation, both throughout the explicit analysis period and in the continuing or terminal value used in valuation models. One of the concerns that can influence the continuing value of enterprises, and which is not explicitly considered in traditional valuation models, is a firm’s average life expectancy. Although the literature has studied the life cycle of the firm, there is still a considerable lack of references on this topic. If we ignore the finite period during which a company is able to produce future cash flows, valuations can fall into serious errors, leading to results markedly different from market values. This paper aims to provide a contribution in this area. Its main objective is to construct a mortality table for non-listed Portuguese enterprises, showing that computing the terminal value as a mathematical perpetuity of free cash flows is not adequate. We propose an appropriate coefficient for estimating the number of years during which the company will continue to operate until its theoretical extinction. If properly addressed within valuation models, this issue can reduce or even eliminate one of the main sources of distortion in contemporary enterprise valuation: the premise of an enterprise’s unlimited existence in time. Besides following the companies involved from their creation to their demise, our study intends to push knowledge forward by providing a consistent life expectancy and mortality table for each age of the company, presenting models with an explicit and distinct survival rate for each year. Moreover, we show that, after reaching a certain age, firms can reinvent their business, acquiring maturity and consequently postponing their mortality through an additional life period.
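
The paper's central claim, that a perpetuity overstates terminal value when corporate life is finite, can be illustrated numerically. The sketch below compares a Gordon-style perpetuity with a survival-weighted terminal value; the cash flow, discount rate, growth rate, horizon and flat survival rate are invented for illustration and do not come from the paper's mortality table.

```python
# Compare a perpetuity terminal value with a survival-weighted one.
# All inputs are hypothetical; the paper derives survival rates from a
# mortality table for non-listed Portuguese firms.
fcf = 100.0            # free cash flow in the first post-forecast year
r = 0.10               # discount rate (WACC)
g = 0.02               # long-run growth rate

# Gordon perpetuity assumes unlimited corporate life.
tv_perpetuity = fcf / (r - g)

# Survival-weighted terminal value: cash flows accrue only while the firm
# survives, here with an assumed flat annual survival rate over 60 years.
annual_survival = 0.96
tv_survival = sum(
    fcf * (1 + g) ** (t - 1) * annual_survival ** t / (1 + r) ** t
    for t in range(1, 61)
)

print(f"Perpetuity terminal value:        {tv_perpetuity:8.1f}")
print(f"Survival-adjusted terminal value: {tv_survival:8.1f}")
```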

Relevance:

90.00%

Publisher:

Abstract:

One way for firms to achieve higher income levels and, at the same time, increase their chances of long-term survival is value creation, which gives companies the capacity to adapt to their environment and take advantage of it. However, few tools are available to such firms for effectively determining the value created or destroyed over time. The aim of this project is to use Economic Value Added (EVA) to analyse value creation over a five-year period in the health sector of Bogotá, taking as the research base the level III care hospitals within the capital's hospital network. The analysis will also allow projections of value creation to be made, the results of which will serve as a basis for evaluating strategies that enable firms in this sector to sustain that value over time, or to reverse an unfavourable situation should the results reveal value destruction within the health care institutions. To clarify the scope of these results, the project explains what value is, how it is measured in firms, and the feasibility and application of these measures in the specific case of Colombian companies in the health sector.
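
A minimal sketch of the EVA computation the project applies, with every financial input hypothetical rather than drawn from Bogotá's level III hospitals:

```python
# Economic Value Added (EVA) for a single year; all inputs are hypothetical.
# EVA = NOPAT - WACC * invested capital: positive values indicate value
# creation, negative values indicate value destruction.
nopat = 1_250_000_000              # net operating profit after taxes (COP)
invested_capital = 9_800_000_000   # operating capital employed (COP)
wacc = 0.115                       # weighted average cost of capital

capital_charge = wacc * invested_capital
eva = nopat - capital_charge
print(f"Capital charge: {capital_charge:,.0f} COP")
print(f"EVA:            {eva:,.0f} COP")
```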

Relevance:

80.00%

Publisher:

Abstract:

This article is an abbreviated version of a debate between two economists holding somewhat different perspectives on the nature of non-market production in the space of new digital media. While the ostensible focus here is on the role of markets in the innovation of new technologies to create new economic value, this context also serves to highlight the private and public value of digital literacy.

Relevance:

80.00%

Publisher:

Abstract:

Background: A bundled approach to central venous catheter care is currently being promoted as an effective way of preventing catheter-related bloodstream infection (CR-BSI). The consumables used in the bundled approach are relatively inexpensive, which may lead to the conclusion that the bundle is cost-effective. However, this fails to consider the nontrivial costs of the monitoring and education activities required to implement the bundle, or that alternative strategies are available to prevent CR-BSI. We evaluated the cost-effectiveness of a bundle to prevent CR-BSI in Australian intensive care patients.

Methods and Findings: A Markov decision model was used to evaluate the cost-effectiveness of the bundle relative to remaining with current practice (a non-bundled approach to catheter care and uncoated catheters) or using antimicrobial catheters. We assumed the bundle reduced the relative risk of CR-BSI to 0.34. Given uncertainty about the cost of the bundle, threshold analyses were used to determine the maximum cost at which the bundle remained cost-effective relative to the other approaches to infection control. Sensitivity analyses explored how this threshold changes under different assumptions about the economic value placed on bed-days and the health benefits gained by preventing infection. If clinicians are prepared to use antimicrobial catheters, the bundle is cost-effective if national 18-month implementation costs are below $1.1 million. If antimicrobial catheters are not an option, the bundle must cost less than $4.3 million. If decision makers are only interested in obtaining cash savings for the unit, and place no economic value on either the bed-days or the health benefits gained through preventing infection, these cost thresholds are reduced by two-thirds.

Conclusions: A catheter care bundle has the potential to be cost-effective in the Australian intensive care setting. Rather than anticipating cash savings from this intervention, decision makers must be prepared to invest resources in infection control to see efficiency improvements.
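
The threshold logic described above can be sketched as follows: the bundle remains cost-effective while its implementation cost does not exceed the monetary value of the harm it avoids. The relative risk of 0.34 is taken from the abstract; every other input (patient numbers, baseline risk, cost and health-loss valuations) is a placeholder, not a value from the Markov model.

```python
# Threshold analysis sketch for a CR-BSI prevention bundle.
# Only the relative risk (0.34) comes from the abstract; all other
# inputs are placeholders, not the Markov model's parameters.
patients = 50_000             # ICU patients over the implementation period
baseline_risk = 0.005         # CR-BSI risk under current practice
relative_risk = 0.34          # risk under the bundle, relative to baseline
cost_per_infection = 12_000   # treatment cost of one CR-BSI (AUD, assumed)
health_loss_value = 8_000     # monetary value of health lost per CR-BSI (assumed)

infections_avoided = patients * baseline_risk * (1 - relative_risk)
value_per_infection_avoided = cost_per_infection + health_loss_value

# The bundle stays cost-effective while its cost <= value of avoided harm.
max_bundle_cost = infections_avoided * value_per_infection_avoided
print(f"Infections avoided:            {infections_avoided:,.0f}")
print(f"Maximum cost-effective outlay: ${max_bundle_cost:,.0f}")
```

Setting health_loss_value to zero mimics the abstract's cash-savings-only scenario, in which the threshold falls sharply.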

Relevance:

80.00%

Publisher:

Abstract:

Monetary valuations of the economic cost of health care–associated infections (HAIs) are important for decision making and should be estimated accurately. Erroneously high estimates of costs, designed to jolt decision makers into action, may do more harm than good in the struggle to attract funding for infection control: expectations among policy makers may be raised and then disappointed when the reduction in the number of HAIs does not yield the anticipated cost savings. For this article, we critically review the field and discuss three questions. Why measure the cost of an HAI? What outcome should be used to measure the cost of an HAI? What is the best method for making this measurement? The aim is to encourage researchers to collect and then disseminate information that accurately guides decisions about the economic value of expanding or changing current infection control activities.

Relevance:

80.00%

Publisher:

Abstract:

Lean project management is the comprehensive adaptation of other lean concepts, such as lean construction, lean manufacturing and lean thinking, to the project management context. The execution of many similar industrial projects gave rise to the idea of lean project management in companies, and the approach is growing rapidly across industries. This paper proposes standardization as a method for achieving lean project management in large-scale industrial projects. Standardization refers to all activities that make two projects as similar and unified as possible, such as standardization of design, reduction of output variability, value analysis and strategic management. Although a standard project may show a minor decrease in efficiency compared with a custom-built project, the major advantages of standard projects, such as cost savings, time reduction and quality improvement, justify the standardization methodology. This paper is based on empirical experience in industrial projects and a theoretical analysis of the benefits of project standardization.