989 results for risk simulation


Relevance:

30.00%

Publisher:

Abstract:

Johanson and Mattson's (1988) network theory explains how small firms, also known as SMEs, use business networks to develop their internationalisation processes. Through networks they can overcome their size limitations and gain a degree of fluidity and dynamism in their management, in order to capture the benefits of internationalisation. By developing and strengthening relationships within the network, the organisation can reach an ever stronger competitive position (Jarillo, 1988). According to Forsgren and Johanson (1992), it is important for managers to coordinate the interaction between the different actors in the network, since through these interactions their position within the network improves and the flow of resources increases. The purpose of this work is to analyse, from a cultural perspective, the network-theory internationalisation model of e-Tech Simulation, a North American "born to be global" SME. This firm has minimised its internationalisation risk by developing agreements between the different actors. By improving its position within the network, that is, by further strengthening existing ties and creating new relationships, the firm has obtained greater benefits from the network and has become even more flexible with its clients. On the basis of this analysis, a series of recommendations is put forward to improve negotiation processes within the network in a cultural context. The analysis also showed the importance of the manager's entrepreneurial role in internationalisation processes, as well as the ability to combine resources obtained from different international markets to meet clients' needs.

Relevance:

30.00%

Publisher:

Abstract:

The activated sludge and anaerobic digestion processes are described by widely accepted models. Nevertheless, these models still have limitations when describing operational problems of microbiological origin. The aim of this thesis is to develop a knowledge-based model to simulate the risk of plant-wide operational problems of microbiological origin. For the risk model, heuristic knowledge from experts and the literature was implemented in a rule-based system. Using fuzzy logic, the system can infer a risk index for the main operational problems of microbiological origin (i.e. filamentous bulking, biological foaming, rising sludge and deflocculation). To show the results of the risk model, it was implemented in the Benchmark Simulation Models, which allowed the risk model's response to be studied under different scenarios and control strategies. The risk model has proven useful by providing a third criterion for evaluating control strategies alongside the economic and environmental criteria.
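To make the inference mechanism concrete, here is a minimal Python sketch of a fuzzy rule-based risk index, assuming triangular memberships, two illustrative inputs (dissolved oxygen and F/M ratio) and a single hypothetical rule; the thesis' actual rule base and variables are not reproduced.

    def tri(x, a, b, c):
        """Triangular membership function with feet at a, c and peak at b."""
        return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def bulking_risk(do_mg_l, fm_ratio):
        # Fuzzify the inputs (hypothetical breakpoints).
        do_low = tri(do_mg_l, -0.5, 0.0, 1.5)    # "dissolved oxygen is low"
        fm_low = tri(fm_ratio, -0.1, 0.0, 0.3)   # "food-to-mass ratio is low"
        # One illustrative rule: IF DO is low AND F/M is low
        # THEN filamentous bulking risk is high (AND taken as minimum).
        fire = min(do_low, fm_low)
        # Map the rule activation onto a 0-1 risk index (crude defuzzification).
        return 0.1 + 0.8 * fire

    print(bulking_risk(do_mg_l=0.4, fm_ratio=0.05))  # risky conditions -> ~0.69
    print(bulking_risk(do_mg_l=2.5, fm_ratio=0.50))  # benign conditions -> 0.10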

Relevance:

30.00%

Publisher:

Abstract:

We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure and the linkage between natural vegetation and the horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, giving a 2 x 2 x 2 x 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely because of spread prior to detection.
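The interaction between growth and periodic inspections can be illustrated with a deliberately simplified Python sketch (a toy branching process, not the paper's landscape model); the 80% detection probability, 90-day interval, 10 introductions and 50 replicates follow the abstract, everything else is assumed.

    import random

    def epidemic(days=720, p_spread=0.02, inspect_every=90, p_detect=0.8, seed=0):
        random.seed(seed)
        infected = 10                        # ten initial introductions
        for day in range(1, days + 1):
            # Each infected unit infects one new unit with daily probability p_spread.
            infected += sum(random.random() < p_spread for _ in range(infected))
            # Periodic inspection removes each infected unit with probability p_detect.
            if day % inspect_every == 0:
                infected -= sum(random.random() < p_detect for _ in range(infected))
        return infected

    sizes = [epidemic(seed=s) for s in range(50)]    # fifty replicates
    print(min(sizes), sorted(sizes)[len(sizes) // 2], max(sizes))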

Relevance:

30.00%

Publisher:

Abstract:

We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available: two conventional models, namely a multi-level model and a model based on an approximate likelihood, and a newly developed profile likelihood model, which may be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that the multi-level approach considerably over-estimates the number of clusters or components in the case of baseline heterogeneity. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide more evidence, two simulation studies were carried out. The profile likelihood can be considered a clear alternative to the approximate likelihood model, and in the case of strong baseline heterogeneity it shows superior behaviour compared with the multi-level model.
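As a hedged sketch of the underlying idea, in generic notation not taken from the paper: writing \ell_i(\theta, \alpha_i) for the log-likelihood of trial i with common treatment effect \theta and trial-specific baseline (nuisance) parameter \alpha_i, the profile log-likelihood eliminates each nuisance parameter by maximisation,

    \ell_p(\theta) = \sum_{i=1}^{k} \max_{\alpha_i} \ell_i(\theta, \alpha_i),

and inference on \theta then treats \ell_p(\theta) as an ordinary log-likelihood.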

Relevance:

30.00%

Publisher:

Abstract:

A radionuclide source term model has been developed which simulates the biogeochemical evolution of the Drigg low level waste (LLW) disposal site. The DRINK (DRIgg Near field Kinetic) model provides data on radionuclide concentrations in groundwater over a period of 100,000 years, which are used as input to assessment calculations for a groundwater pathway. The DRINK model also provides input to human intrusion and gaseous assessment calculations through simulation of the solid radionuclide inventory. These calculations are being used to support the Drigg post-closure safety case. The DRINK model considers the coupled interaction of the effects of fluid flow, microbiology, corrosion, chemical reaction, sorption and radioactive decay. It represents the first direct use of a mechanistic reaction-transport model in risk assessment calculations.
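A minimal Python sketch of one ingredient of such a model, 1-D advection with first-order radioactive decay on an upwind finite-difference grid, may help fix ideas; all parameters are illustrative, and the DRINK model itself couples many more processes (microbiology, corrosion, sorption, chemistry).

    import numpy as np

    nx, dx, dt = 100, 1.0, 0.4            # cells, cell size [m], time step [yr]
    v, half_life = 1.0, 30.0              # pore velocity [m/yr], nuclide half-life [yr]
    lam = np.log(2) / half_life           # decay constant [1/yr]
    c = np.zeros(nx)
    c[0] = 1.0                            # unit-concentration source at the inflow

    for _ in range(200):                  # simulate 80 years (CFL = v*dt/dx = 0.4)
        # Upwind advection plus first-order decay, explicit in time.
        c[1:] = c[1:] - dt * (v * (c[1:] - c[:-1]) / dx + lam * c[1:])
        c[0] *= np.exp(-lam * dt)         # the source itself decays
    print(c.max(), int(c.argmax()))       # peak concentration and its cell index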

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a method for simulating multivariate samples that have exact means, covariances, skewness and kurtosis. We introduce a new class of rectangular orthogonal matrix which is fundamental to the methodology, and we call these matrices L matrices. They may be deterministic, parametric or data specific in nature. The target moments determine the L matrix; infinitely many random samples with the same exact moments may then be generated by multiplying the L matrix by arbitrary random orthogonal matrices. This methodology is thus termed "ROM simulation". Considering certain elementary types of random orthogonal matrices, we demonstrate that they generate samples with different characteristics. ROM simulation has applications to many problems that are resolved using standard Monte Carlo methods, but no parametric assumptions are required (unless parametric L matrices are used), so there is no sampling error caused by the discrete approximation of a continuous distribution, which is a major source of error in standard Monte Carlo simulations. For illustration, we apply ROM simulation to determine the value-at-risk of a stock portfolio.
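A hedged Python sketch of the exact-moments idea, matching mean and covariance only (the paper's L matrices additionally fix skewness and kurtosis, which this sketch does not attempt):

    import numpy as np

    def exact_moment_sample(mu, cov, n, rng):
        # Build Z with exactly zero column means and Z'Z = n*I ...
        g = rng.standard_normal((n, len(mu)))
        g -= g.mean(axis=0)                      # centre the columns
        q, _ = np.linalg.qr(g)                   # orthonormal, still mean-zero columns
        z = np.sqrt(n) * q
        # ... then rotate/scale to the target covariance and shift to the mean.
        return mu + z @ np.linalg.cholesky(cov).T

    rng = np.random.default_rng(0)
    mu = np.array([0.1, 0.2])
    cov = np.array([[1.0, 0.3], [0.3, 2.0]])
    x = exact_moment_sample(mu, cov, n=500, rng=rng)
    print(x.mean(axis=0))                        # equals mu to machine precision
    print(np.cov(x, rowvar=False, bias=True))    # equals cov to machine precision
    # Permuting rows (an elementary orthogonal transformation) yields a "new"
    # sample with identical moments, echoing the ROM idea of multiplying by
    # random orthogonal matrices.
    x_new = x[rng.permutation(500)]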

Relevance:

30.00%

Publisher:

Abstract:

The reduction of portfolio risk is important to all investors, but it is particularly important to real estate investors, as most property portfolios are small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market or a target rate of return, and so investors may be exposed to greater risk than necessary. Given the potentially higher risk of underperformance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that increasing portfolio size offers the possibility of a more stable and less volatile return pattern over time, i.e. downside risk diminishes with increasing portfolio size. Nonetheless, increasing portfolio size has the disadvantage of reducing the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the downside risk of a portfolio, it also decreases its upside potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
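A hedged Python sketch of the Monte Carlo design, using a synthetic cross-section in place of the IPD data; the portfolio sizes, 2% thresholds and return parameters are all assumptions. It shows both tails narrowing as portfolio size grows, which is the trade-off the abstract describes.

    import numpy as np

    rng = np.random.default_rng(42)
    returns = rng.normal(0.09, 0.10, size=1728)   # stand-in for the 1,728 IPD returns
    benchmark = returns.mean()

    for size in (1, 5, 10, 50, 200):
        sims = np.array([rng.choice(returns, size, replace=False).mean()
                         for _ in range(5000)])
        under = (sims < benchmark - 0.02).mean()  # P(underperform by >2%)
        over = (sims > benchmark + 0.02).mean()   # P(outperform by >2%)
        print(size, round(under, 3), round(over, 3))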

Relevance:

30.00%

Publisher:

Abstract:

Urban microclimates are greatly affected by urban form and texture and have a significant impact on building energy performance. The impact of urban form on energy consumption in buildings mainly relates to the availability of solar radiation, daylighting and natural ventilation. The urban heat island (UHI) effect increases the risk of overheating in buildings as well as the maximum energy demand for cooling. A need has arisen for a robust first-cut calculation tool that enables planners, architects and environmental assessors to quickly and accurately compare the impact of different urban forms on local climate and UHI mitigation strategies. This paper describes such a tool for the simulation of urban microclimates, developed by integrating image processing with a coupled thermal and airflow model.

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the changes in the length of commercial property leases over the last decade and presents an analysis of the consequent investment and occupational pricing implications for commercial property investments. It is argued that the pricing implications of a short lease to an investor are contingent upon the expected costs of letting termination to the investor, the probability that the letting will be terminated and the volatility of rental values. The paper examines the key factors influencing these variables and presents a framework for incorporating their effects into pricing models. Approaches to their valuation derived from option pricing are critically assessed. It is argued that such models tend to neglect the price effects of specific risk factors such as tenant circumstances and the terms of the break clause. Specific risk factors have a significant bearing on the probability of letting termination and on the level of the resultant financial losses. The merits of a simulation methodology are examined for rental and capital valuations of short leases and properties with break clauses. It is concluded that, in addition to the rigour of its internal logic, the success of any methodology is predicated upon the accuracy of the inputs. The lack of reliable data on patterns in, and the incidence of, lease termination and the lack of reliable time series of historic property performance limit the efficacy of financial models.
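A toy Python sketch of the simulation idea for a letting with a break clause; the cash-flow rules, break-exercise trigger and all parameters are illustrative assumptions, not the paper's specification.

    import numpy as np

    rng = np.random.default_rng(7)
    n_sims, rent0, disc = 10_000, 100.0, 0.07
    pvs = np.zeros(n_sims)
    for i in range(n_sims):
        market_rent, pv = rent0, 0.0
        for year in range(1, 11):                         # ten-year letting
            market_rent *= np.exp(rng.normal(0.0, 0.1))   # rental value volatility
            # Break at year 5: exercised if market rent has fallen >10% and
            # tenant-specific circumstances (modelled as a coin flip) favour leaving.
            if year == 5 and market_rent < 0.9 * rent0 and rng.random() < 0.5:
                pv -= rent0 / (1 + disc) ** year          # void period / reletting cost
                break
            pv += rent0 / (1 + disc) ** year              # contracted rent received
        pvs[i] = pv
    print(pvs.mean(), np.percentile(pvs, 5))              # expected PV and downside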

Relevance:

30.00%

Publisher:

Abstract:

Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data broadly representative of an assumed stationary climate. Recent developments have made available 'morphed' equivalents of these years by shifting and stretching the measured variables using change factors produced by the UKCIP02 climate projections. The release of the latest probabilistic climate projections, UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity for generating new design summer years which can be used in risk-based decision-making. There are many possible methods for the production of design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period that are comparable with those in current use. Four methodologies for the generation of future years are described, and their output is related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will facilitate designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
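Weighted cooling degree hours admit more than one definition; a plausible form, assuming a base temperature T_b and a quadratic weighting that emphasises hotter hours (both assumptions rather than the article's exact specification), is

    \mathrm{WCDH} = \sum_{h=1}^{8760} \max(T_h - T_b,\, 0)^2,

where T_h is the outdoor dry-bulb temperature in hour h of the year. The squared exceedance makes a few very hot hours count for more than many mildly warm ones, which is what makes the measure more robust for ranking near-extreme summers.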

Relevance:

30.00%

Publisher:

Abstract:

Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth’s surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth’s surface area, containing 36% of its population and 60% of its gross domestic product.
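A minimal Python sketch of the classification and aggregation step, on synthetic grid-cell data (the probabilities, cell count and weights are all made up for illustration):

    import numpy as np

    rng = np.random.default_rng(3)
    cells = 1000
    area = np.full(cells, 1.0 / cells)             # equal-area cells in this toy
    pop = rng.dirichlet(np.ones(cells))            # population shares
    gdp = rng.dirichlet(np.ones(cells))            # GDP shares
    p_substantial_co2 = rng.uniform(0, 1, cells)   # P(substantial change), 4xCO2 only
    p_substantial_geo = rng.uniform(0, 1, cells)   # P(substantial change), geoengineered

    increased = p_substantial_geo > p_substantial_co2
    for label, w in (("area", area), ("population", pop), ("GDP", gdp)):
        print(label, round(w[increased].sum(), 3)) # fraction with increased risk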

Relevance:

30.00%

Publisher:

Abstract:

In the UK, urban river basins are particularly vulnerable to flash floods caused by short, intense rainfall. This paper presents potential flood resilience approaches for the highly urbanised Wortley Beck river basin, south west of Leeds city centre. The reach of Wortley Beck is approximately 6 km long, with a contributing catchment area of 30 km² draining into the River Aire. Lower Wortley has experienced regular flooding over the last few years from a range of sources, including Wortley Beck and surface and ground water, affecting properties both upstream and downstream of Farnley Lake as well as Wortley Ring Road. This has serious implications for society, the environment and economic activity in the City of Leeds. The first stage of the study systematically incorporates Wortley Beck's landscape features on an Arc-GIS platform to identify existing green features in the region. This process also enables the exploration of potential blue-green features (green spaces, green roofs, water retention ponds and swales) at appropriate locations, connecting them with existing green corridors to maximise their productivity. The next stage develops a detailed 2D urban flood inundation model for the Wortley Beck region using the CityCat model. CityCat can model the effects of permeable/impermeable ground surfaces and buildings/roofs to generate flood depth and velocity maps at 1 m resolution for design storm events. The final stage simulates a range of rainfall and flood event scenarios through the CityCat model with different blue-green features. The installation of hard-engineering individual property protection measures, such as water butts and flood walls, is also incorporated in the CityCat model. This enables an integrated, sustainable flood resilience strategy for the region.

Relevance:

30.00%

Publisher:

Abstract:

For more than a decade, Value-at-Risk (VaR) has been used by financial institutions and non-financial corporations to control the market risk of investment portfolios. Because parametric methods assume that the returns of the market risk factors are normally distributed, some risk managers prefer historical simulation methods to compute portfolio VaR. The main criticism of traditional historical simulation, however, is that it gives the same weight in the distribution to every return observed in the period. This work tests the historical simulation model with volatility updating proposed by Hull and White (1998) on data from the Brazilian stock market and compares its performance with the traditional model. The results show that the Hull and White model performs better in forecasting portfolio losses and adapts faster to periods of abrupt change in market volatility.
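A hedged Python sketch of the volatility-updating idea from Hull and White (1998): each historical return is rescaled by the ratio of current volatility to the volatility prevailing when it occurred (an EWMA filter with assumed parameters is used here), and VaR is read off the empirical quantile of the adjusted returns.

    import numpy as np

    def hull_white_var(returns, alpha=0.99, lam=0.94):
        # EWMA variance; var[t] is the estimate in force when returns[t] occurred.
        var = np.empty(len(returns))
        var[0] = returns[:20].var()                  # crude warm-up estimate
        for t in range(1, len(returns)):
            var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
        sigma = np.sqrt(var)
        sigma_now = np.sqrt(lam * var[-1] + (1 - lam) * returns[-1] ** 2)
        adjusted = returns * sigma_now / sigma       # volatility updating
        return -np.quantile(adjusted, 1 - alpha)     # one-day VaR as a positive loss

    rng = np.random.default_rng(1)
    r = 0.01 * rng.standard_t(5, size=1000)          # stand-in for equity returns
    print(hull_white_var(r))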

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with evaluating value-at-risk estimates. It is well known that using only binary variables to do this sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. It is shown that the new backtest provides a sufficient condition to assess the performance of a quantile model, whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker & Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with daily S&P 500 time series.
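For contrast, a minimal Python sketch of the binary-variable baseline the paper argues against, Kupiec's unconditional-coverage likelihood-ratio test on the VaR hit sequence (the paper's own quantile-regression backtest is not reproduced here):

    import numpy as np

    def kupiec_lr(hits, alpha=0.01):
        # hits[t] = 1 if the day-t loss exceeded the reported VaR, else 0.
        n, x = len(hits), int(np.sum(hits))      # assumes 0 < x < n
        pihat = x / n
        lr = -2 * (x * np.log(alpha) + (n - x) * np.log(1 - alpha)
                   - x * np.log(pihat) - (n - x) * np.log(1 - pihat))
        return lr

    hits = (np.random.default_rng(0).uniform(size=750) < 0.02).astype(int)
    lr = kupiec_lr(hits)
    print(lr, lr > 3.84)   # compare with the chi-square(1) 5% critical value

Because the test sees only the hit indicators, two very different loss profiles with the same hit count are indistinguishable, which is precisely the information loss the abstract refers to.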

Relevance:

30.00%

Publisher:

Abstract:

Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the prices of the portfolio's assets decline, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical-control, distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main global equity market indexes of the United States, London, Germany, Spain and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as historical simulation, GARCH, EWMA and extreme value theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than the alternative approaches.
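The simplest benchmark named in the abstract can be sketched in a few lines of Python; this is plain rolling-window historical simulation VaR with assumed parameters, not the ePFM model itself.

    import numpy as np

    def historical_var(returns, window=250, alpha=0.99):
        out = np.full(len(returns), np.nan)
        for t in range(window, len(returns)):
            # VaR for day t from the empirical quantile of the previous window.
            out[t] = -np.quantile(returns[t - window:t], 1 - alpha)
        return out

    rng = np.random.default_rng(2)
    r = rng.normal(0.0, 0.012, size=1500)        # stand-in for index returns
    var99 = historical_var(r)
    hits = r[250:] < -var99[250:]                # loss worse than the VaR forecast
    print(hits.mean())                           # hit rate should be near 1%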