962 results for competing risks model


Relevance: 30.00%

Abstract:

The purpose of this paper is to conduct a systematic analysis of methodical drawbacks in a financial supplier risk management approach currently implemented in the automotive industry. Based on the identified methodical flaws, the risk assessment model is developed further by introducing a malus system that incorporates hidden risks into the model and by revising the derivation of the most central risk measure in the current model. Both methodical changes lead to significant improvements in risk assessment accuracy, supplier identification and workload efficiency.
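The abstract describes the malus system only at a high level. As a purely illustrative sketch (the function name, weights and cap are assumptions, not taken from the paper), a malus adjustment that folds hidden-risk indicators into a supplier's base risk score might look like:

```python
def malus_adjusted_score(base_score, hidden_risk_flags,
                         malus_per_flag=0.5, cap=10.0):
    """Illustrative only: add a fixed malus for each hidden-risk
    indicator (e.g. a missing audit) and cap the resulting score."""
    return min(base_score + malus_per_flag * sum(hidden_risk_flags), cap)

# A supplier with base score 3.0 and two hidden-risk flags:
score = malus_adjusted_score(3.0, [True, False, True])
```

The cap ensures that many simultaneous flags saturate the score rather than pushing it off the scale.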

Relevance: 30.00%

Abstract:

Nestlé’s Dynamic Forecasting Process: Anticipating Risks and Opportunities

This Work Project discusses Nestlé’s Dynamic Forecasting Process, implemented within the organization as a way of reengineering its performance management concept and processes to make them more flexible and more capable of reacting to volatile business conditions. Given the importance of demand planning for reallocating resources and enhancing performance, Nescafé Dolce Gusto serves as a case for improving forecast accuracy. Value is brought to the Project by providing a more accurate model of its capsules’ sales and by recommending implementations that contribute positively to the Planning Process.

Relevance: 30.00%

Abstract:

Doctoral Dissertation for PhD degree in Industrial and Systems Engineering

Relevance: 30.00%

Abstract:

Master's dissertation in Molecular Genetics

Relevance: 30.00%

Abstract:

The twin objectives of the work described were to construct nutrient balance models (NBMs) for a range of Irish animal production systems and to evaluate their potential as a means of estimating the nutrient composition of farm wastes. The NBM has three components. The first is the intake of nutrients in the animal's diet. The second is retention, the nutrients the animal retains for the production of milk, meat or eggs. The third is the balance, the difference between nutrient intake and retention. Data on intake levels and their nutrient value for dairy cow, beef cattle, pig and poultry systems were assembled; literature searches and interviews with national experts were the primary sources of information. NBMs were then constructed for each production system. Summary tables of the nutrient values of the common diet constituents used in Irish animal production systems, the nutrient composition of the animal products and the NBMs (nutrient intake, retention and excretion) for a range of production systems were assembled. These represent the first comprehensive data set of this type for Irish animal production systems. There was generally good agreement between the derived NBM values and those published in the literature. The NBMs were validated on a number of farms. Data on animal numbers, fertiliser use, concentrate inputs and production output were recorded on seven farms. Using these data, a nutrient input/output balance was constructed for each farm and compared with the NBM estimate of the farm nutrient balance. The results showed good agreement between the measured balance and the NBM estimate, particularly for the pig and poultry farms. However, the validation emphasised the inherent risks associated with NBMs: the average values used for feed intake and production parameters in the NBMs may result in under- or overestimation of actual nutrient balances on individual farms where these variables are substantially different.
On the grassland farms there was a poor correlation between the input/output estimate and the NBM, possibly resulting from the omission of the soil's contribution to the nutrient balance. Nevertheless, the results indicate that the NBMs developed are a potentially useful tool for estimating nutrient balances. They will also serve to highlight the significant fraction of the nutrient inputs into farming systems that is retained on the farm. The potential of the NBM as a means of estimating the nutrient composition of farm wastes was evaluated on two farms. Feed intake and composition, animal production and slurry production were monitored during the indoor winter feeding period, and slurry samples were taken for analysis. The appropriate NBMs were used to estimate the nutrient balance for each farm, and the nutrient content of the slurry produced was calculated. There was good agreement between the NBM estimates and the measured values. This preliminary evaluation suggests that the NBM has the potential to provide the farmer with a simple means of estimating the nutrient value of slurry.
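The three-component structure described above (intake, retention, balance) reduces to a simple per-nutrient subtraction. A minimal sketch, with purely illustrative figures that are not taken from the study:

```python
def nutrient_balance(intake_kg, retention_kg):
    """NBM core identity: balance (excretion) = intake - retention,
    computed per nutrient (e.g. N, P, K), in kg per animal per year."""
    return {n: intake_kg[n] - retention_kg[n] for n in intake_kg}

# Hypothetical dairy-cow figures (illustrative only, not from the study):
intake = {"N": 150.0, "P": 25.0, "K": 120.0}
retention = {"N": 40.0, "P": 8.0, "K": 15.0}
balance = nutrient_balance(intake, retention)   # nutrients excreted
```

Summing such balances over all animals, and comparing against recorded fertiliser and concentrate inputs and production outputs, gives the farm-level input/output validation described above.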

Relevance: 30.00%

Abstract:

The public perception of the EU in Spain varies greatly. The most positive aspects of Spanish membership are associated with the consolidation of democracy, economic growth, the introduction of the euro, the growth in employment, the structural and cohesion funds, the increase in the female participation rate, and the equal opportunities policies. Analysts are in favour of common objectives in employment policy and of multi-level government. The less positive aspects of the EU are the risks of losing social protection and of employment losses in some sectors due to mergers of multinationals and the delocalization of companies towards Eastern Europe. The continuous demands for reform of the welfare state, the toughening of the conditions of access to social benefits and the reform of the labour market are also seen as problematic issues, as are the risks of competitive cuts and social dumping.

Relevance: 30.00%

Abstract:

Recently there has been renewed research interest in the properties of non-survey updates of input-output tables and social accounting matrices (SAM). Along with the venerable and well-known RAS scaling method, several alternative procedures related to entropy minimization and other metrics have been suggested, tested and used in the literature. Whether these procedures will eventually substitute for or merely complement the RAS approach is still an open question without a definite answer. The performance of many of the updating procedures has been tested using some kind of proximity or closeness measure to a reference input-output table or SAM. The first goal of this paper, in contrast, is to propose checking the operational performance of updating mechanisms by comparing the simulation results that ensue from adopting alternative databases for the calibration of a reference applied general equilibrium model. The second goal is to introduce a new updating procedure based on information retrieval principles. This new procedure is then compared, in terms of performance, to two well-known updating approaches: RAS and cross-entropy. The rationale for the suggested cross-validation is that the driving force for having more up-to-date databases is to be able to conduct more current, and hopefully more credible, policy analyses.
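For orientation, the RAS benchmark mentioned above is a simple biproportional scaling: rows and columns of the prior table are rescaled alternately until the target margins are met. A minimal numpy sketch (function name and tolerances are our own):

```python
import numpy as np

def ras(prior, row_targets, col_targets, tol=1e-10, max_iter=10_000):
    """Biproportional (RAS) update of a nonnegative matrix so that its
    row and column sums match the given targets. The targets must be
    consistent: sum(row_targets) == sum(col_targets)."""
    a = np.array(prior, dtype=float)
    row_targets = np.asarray(row_targets, dtype=float)
    col_targets = np.asarray(col_targets, dtype=float)
    for _ in range(max_iter):
        a *= (row_targets / a.sum(axis=1))[:, None]   # row scaling (R step)
        a *= col_targets / a.sum(axis=0)              # column scaling (S step)
        if np.allclose(a.sum(axis=1), row_targets, atol=tol):
            break
    return a

updated = ras([[1.0, 2.0], [3.0, 4.0]], [4.0, 6.0], [5.0, 5.0])
```

Entropy-based alternatives such as cross-entropy minimize a divergence from the prior subject to the same margin constraints; for strictly positive priors the RAS fixed point coincides with the cross-entropy solution.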

Relevance: 30.00%

Abstract:

This paper develops a structured dynamic factor model for the spreads between the London Interbank Offered Rate (LIBOR) and overnight index swap (OIS) rates for a panel of banks. Our model involves latent factors which reflect liquidity and credit risk. Our empirical results show that surges in the short-term LIBOR-OIS spreads during the 2007-2009 financial crisis were largely driven by liquidity risk. However, credit risk played a more significant role in the longer-term (twelve-month) LIBOR-OIS spread. The liquidity risk factors are more volatile than the credit risk factor. Most of the familiar events in the financial crisis are linked more to movements in liquidity risk than in credit risk.
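The paper's structured dynamic factor model is estimated formally; purely for orientation, the idea of a common latent factor driving a panel of bank spreads can be illustrated with a principal-component sketch on synthetic data (all numbers below are simulated, not LIBOR-OIS data):

```python
import numpy as np

rng = np.random.default_rng(1)
t_obs, n_banks = 500, 8
common = 0.05 * np.cumsum(rng.standard_normal(t_obs))    # latent common factor
spreads = common[:, None] + 0.02 * rng.standard_normal((t_obs, n_banks))

x = spreads - spreads.mean(axis=0)                       # demean the panel
u, s, vt = np.linalg.svd(x, full_matrices=False)
factor = u[:, 0] * s[0]                                  # first principal component
share = s[0] ** 2 / (s ** 2).sum()                       # variance it explains
```

When one factor dominates the panel, the first principal component recovers it up to sign, which is the intuition behind identifying liquidity and credit factors from co-movements in the spreads.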

Relevance: 30.00%

Abstract:

A stylized macroeconomic model is developed with an indebted, heterogeneous investment banking sector funded by borrowing from a retail banking sector. The government guarantees retail deposits. Investment banks choose how risky their activities should be. We compare the benefits of separated versus universal banking, the latter modelled as a vertical integration of the retail and investment banks. The incidence of banking default is considered under different constellations of shocks and degrees of competitiveness. The benefits of universal banking rise with the volatility of idiosyncratic shocks to trading strategies and remain positive even for very bad common shocks, even though government bailouts, which are costly, are larger than in the case of separated banking entities. The welfare assessment of the structure of banks may depend crucially on the kinds of shock hitting the economy as well as on the efficiency of government intervention.

Relevance: 30.00%

Abstract:

In this paper we investigate the ability of a number of different ordered probit models to predict ratings based on firm-specific data on business and financial risks. We investigate models based on momentum, drift and ageing and compare them against alternatives that take into account the initial rating of the firm and its previous actual rating. Using data on US bond issuing firms rated by Fitch over the years 2000 to 2007 we compare the performance of these models in predicting the rating in-sample and out-of-sample using root mean squared errors, Diebold-Mariano tests of forecast performance and contingency tables. We conclude that initial and previous states have a substantial influence on rating prediction.
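Two of the evaluation tools mentioned, root mean squared error and contingency tables of predicted versus actual rating categories, are straightforward to compute. A minimal sketch (ratings assumed coded as integers 0..K-1; the data below are invented):

```python
import numpy as np

def rmse(actual, predicted):
    """Root mean squared error between actual and predicted ratings."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return float(np.sqrt(np.mean((actual - predicted) ** 2)))

def contingency_table(actual, predicted, n_classes):
    """Rows: actual rating class; columns: predicted rating class."""
    table = np.zeros((n_classes, n_classes), dtype=int)
    for a, p in zip(actual, predicted):
        table[a, p] += 1
    return table

actual = [0, 1, 1, 2]
predicted = [0, 1, 0, 2]
table = contingency_table(actual, predicted, 3)
```

Correct predictions sit on the diagonal of the table; off-diagonal mass shows which rating classes a model confuses, complementing the single-number RMSE comparison.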

Relevance: 30.00%

Abstract:

Abstract: Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to reduce efficiently both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship. In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries.
For large open fields (> 10×10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used only for conventional treatments using large open fields. The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs. Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distributions. Absolute risks varied substantially, as did the ratio of risk between two treatment techniques, reflecting the large uncertainties involved with current risk models. Despite all these uncertainties, the hybrid IMRT technique investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.

Résumé: Breast cancer affects one in eight women during her lifetime.
Thanks to early screening and increasingly effective therapies, the cure rate has increased over time. Post-operative radiotherapy plays an important role in the treatment of breast cancer by reducing the recurrence rate and mortality. Unfortunately, radiotherapy can also induce late toxicities in cured patients. In particular, radiation-induced secondary cancers are a rare but severe complication of radiotherapy. In clinical routine, radiotherapy plans are essentially optimized for the highest possible local control while minimizing the late tissue reactions that are mainly associated with high doses (≫ 1 Gy). However, with the introduction of various new techniques and with increasing survival rates, it becomes imperative to evaluate and minimize the risks of secondary cancer for different treatment techniques. Such a risk evaluation is an arduous task given the many uncertainties in the dose-risk relationship. In contrast to tissue effects, secondary cancers can also be induced by low doses in organs lying outside the irradiation fields. These organs receive peripheral doses, typically below 1 Gy, which result from patient scatter and accelerator scatter. These doses are difficult to compute precisely, but Monte Carlo (MC) algorithms make it possible to estimate them with good accuracy. A detailed MC model of the Siemens Primus accelerator was built and validated with measurements. The accuracy of this model was also determined for dose reconstruction in epidemiology. Considering that patients included in large cohorts are treated on a variety of machines, the uncertainty in reconstructed peripheral dose was studied as a function of the variability of the peripheral dose across different accelerator types. For large fields (> 10×10 cm²), the uncertainty is below 50%, but for small fields and wedged fields the dose uncertainty can rise up to a factor of 10. In conclusion, such a model can be used only for conventional treatments with large fields. The MC model of the Primus accelerator was then used to determine the peripheral dose for different techniques in a whole-body phantom based on CT slices of a female patient. Current techniques using wedged fields as well as hybrid IMRT were studied and compared with older techniques. The doses computed by MC were compared with those obtained from a commercial treatment planning system (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which lie mostly outside the beams, we showed that these doses can be more or less accurate depending on the technique studied. The MC calculations show that the hybrid IMRT technique is dosimetrically equivalent to the wedged-field technique inside the treatment fields, but offers an important reduction of the dose to peripheral organs. Finally, different risk models were studied on the basis of the dose distributions computed by MC. The absolute risks, and the ratio of risks between two treatment techniques, vary greatly, which reflects the large uncertainties attached to the different risk models. Despite these uncertainties, the hybrid IMRT technique was shown to offer a systematic risk reduction compared with the other techniques. Pending additional epidemiological data on the dose-risk relationship, any technique offering a reduction of the peripheral doses to healthy organs deserves to be studied. Electron radiotherapy offers interesting possibilities in this respect.
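The risk estimation step described above, applying published risk models to calculated organ doses, can be illustrated with the simplest possible linear form, excess risk = Σ β_organ × D_organ. The coefficients and doses below are placeholders, not values from the thesis or from any published model:

```python
def excess_risk(organ_doses_gy, beta_per_gy):
    """Illustrative linear no-threshold (LNT) sketch: summed excess risk
    over out-of-field organs. Real risk models (e.g. BEIR VII) are age-,
    sex- and organ-specific and may be nonlinear in dose."""
    return sum(beta_per_gy[organ] * dose
               for organ, dose in organ_doses_gy.items())

# Placeholder out-of-field doses (Gy) and risk coefficients (per Gy):
risk = excess_risk({"contralateral breast": 0.3, "ipsilateral lung": 0.5},
                   {"contralateral breast": 0.02, "ipsilateral lung": 0.01})
```

Because different published dose-response models disagree at these low doses, the same dose distribution can yield substantially different absolute risks, which is exactly the uncertainty the abstract emphasizes.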

Relevance: 30.00%

Abstract:

Measuring the contribution of individual transactions to the total risk of a credit portfolio is a major issue in financial institutions. VaR contributions (VaRC) and Expected Shortfall contributions (ESC) have become two popular ways of quantifying these risks. However, the usual Monte Carlo (MC) approach is known to be a very time-consuming method for computing these risk contributions. In this paper we consider the Wavelet Approximation (WA) method for Value at Risk (VaR) computation presented in [Mas10] in order to calculate the Expected Shortfall (ES) and the risk contributions under the Vasicek one-factor model framework. We decompose the VaR and the ES as a sum of sensitivities representing the marginal impact on the total portfolio risk. Moreover, we present technical improvements to the WA that considerably reduce the computational effort of the approximation while at the same time increasing its accuracy.
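The paper computes these quantities with a wavelet approximation; for orientation only, here is a plain Monte Carlo sketch of VaR, ES and Euler-style ES contributions under a homogeneous Vasicek one-factor model (all portfolio parameters are invented):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)
n_obligors, n_scen, alpha = 50, 100_000, 0.99
ead = np.full(n_obligors, 1.0)             # exposures (LGD folded in)
rho = 0.15                                 # asset correlation
thresh = NormalDist().inv_cdf(0.02)        # default threshold for PD = 2%

z = rng.standard_normal(n_scen)                          # systematic factor
eps = rng.standard_normal((n_scen, n_obligors))          # idiosyncratic shocks
defaults = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps < thresh

losses = defaults @ ead                                  # portfolio loss per scenario
var = np.quantile(losses, alpha)                         # Value at Risk
tail = losses >= var
es = losses[tail].mean()                                 # Expected Shortfall
esc = defaults[tail].mean(axis=0) * ead                  # per-obligor ES contributions
# By construction the contributions add up to the total ES.
```

This brute-force approach is exactly the "very time-consuming" baseline the paper improves upon: accurate tail estimates need many scenarios, whereas the WA method approximates the loss distribution directly.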

Relevance: 30.00%

Abstract:

Most models on introgression from genetically modified (GM) plants have focused on small spatial scales, modelling gene flow from a field containing GM plants into a single adjacent population of a wild relative. Here, we present a model to study the effect of introgression from multiple plantations into the whole metapopulation of the wild relative. The most important result of the model is that even very low levels of introgression and selection can lead to a high probability that the transgene goes to fixation in the metapopulation. Furthermore, the overall frequency of the transgene in the metapopulation, after a certain number of generations of introgression, depends on the population dynamics. If there is a high rate of migration or a high rate of population turnover, the overall transgene frequency is much higher than with lower rates. However, under an island model of population structure, this increased frequency has only a very small effect on the probability of fixation of the transgene. Considering these results, studies on the potential ecological risks of introgression from GM plants should look not only at the rate of introgression and selection acting on the transgene, but also at the metapopulation dynamics of the wild relative.
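The qualitative claim, that even low introgression combined with selection can carry a transgene to high frequency across the metapopulation, can be illustrated with a toy haploid Wright-Fisher island-model simulation. All parameter values here are invented, and this sketch is far simpler than the model in the paper:

```python
import numpy as np

def transgene_frequency(n_demes=10, deme_size=100, m=0.05, s=0.2,
                        introgression=1e-3, generations=1000, seed=0):
    """Mean transgene frequency after repeated rounds of island-model
    migration (m), recurrent gene flow from GM plantations
    (introgression), selection on the transgene (s) and binomial
    drift within each deme."""
    rng = np.random.default_rng(seed)
    p = np.zeros(n_demes)                           # per-deme frequency
    for _ in range(generations):
        p = (1 - m) * p + m * p.mean()              # island-model migration
        p += introgression * (1 - p)                # gene flow from plantations
        p = p * (1 + s) / (p * (1 + s) + (1 - p))   # selection
        p = rng.binomial(deme_size, p) / deme_size  # Wright-Fisher drift
    return p.mean()
```

Even with introgression of only 0.1% per generation, moderate positive selection drives the transgene close to fixation in this toy model, while with no selection and no introgression it never appears, mirroring the paper's point that both the introgression rate and the metapopulation dynamics matter.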

Relevance: 30.00%

Abstract:

We present a standard model of financial innovation, in which intermediaries engineer securities with the cash flows that investors seek, but we modify two assumptions. First, investors (and possibly intermediaries) neglect certain unlikely risks. Second, investors demand securities with safe cash flows. Financial intermediaries cater to these preferences and beliefs by engineering securities that are perceived to be safe but are exposed to the neglected risks. Because the risks are neglected, security issuance is excessive. As investors eventually recognize these risks, they fly back to the safety of traditional securities, and markets become fragile, even without leverage, precisely because the volume of new claims is excessive.

Relevance: 30.00%

Abstract:

Over the last two decades, thanks to the discovery of several pharmaceutical agents, multiple sclerosis (MS) has been transformed into a treatable disorder, although the degree of therapeutic response may vary considerably. As more medications find their entry into the MS market, a clinician faces the mounting challenge of comparing the risk and benefit profiles of various agents in an attempt to find the best treatment approach for each individual patient. In this review, we aim to summarize the available data on the safety profiles of available MS therapies, focusing mostly on serious, medication-specific potential adverse events, without discussing the teratogenic potential of each agent (unless there is a black box warning) or hypersensitivity reactions. Our goal is to provide the clinician with guidance on ensuring appropriate safety monitoring for patients treated with one of the agents discussed. We also comment on the future of risk management in MS and discuss possible enhancements to the current model of the drug approval process and general strategies to improve patient safety.