869 results for Risk model
Abstract:
We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available. Two are conventional models, namely a multi-level model and a model based upon an approximate likelihood; the third is a newly developed profile likelihood model, which may be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that, in the case of baseline heterogeneity, the multi-level approach considerably over-estimates the number of clusters or components. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide further evidence, two simulation studies were conducted. The profile likelihood model can be considered a clear alternative to the approximate likelihood model. In the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour when compared with the multi-level model.
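For orientation, the approximate likelihood model referred to above is usually written as a two-level normal model; the following is the standard textbook formulation, not a reproduction of the paper's own notation:

\[ \hat\theta_i \mid \theta_i \sim N(\theta_i, s_i^2), \qquad \theta_i \sim N(\mu, \tau^2), \qquad i = 1, \dots, k, \]

where \(\hat\theta_i\) is the estimated treatment effect (e.g., a log odds ratio) in trial \(i\), \(s_i^2\) its within-trial variance treated as known, \(\mu\) the overall effect, and \(\tau^2\) the between-trial heterogeneity.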
Abstract:
Models of windblown pollen or spore movement are required to predict gene flow from genetically modified (GM) crops and the spread of fungal diseases. We suggest a simple form for a function describing the distance moved by a pollen grain or fungal spore, for use in generic models of dispersal. The function has power-law behaviour over sub-continental distances. We show that air-borne dispersal of rapeseed pollen in two experiments was inconsistent with an exponential model, but was fitted by power-law models, implying a large contribution from distant fields to the catches observed. After allowing for this 'background' by applying Fourier transforms to deconvolve the mixture of distant and local sources, the data were best fitted by power laws with exponents between 1.5 and 2. We also demonstrate that, for a simple model of area sources, the median dispersal distance is a function of field radius and that measurement from the source edge can be misleading. Using an inverse-square dispersal distribution deduced from the experimental data and the distribution of rapeseed fields deduced by remote sensing, we successfully predict observed rapeseed pollen density in the city centres of Derby and Leicester (UK).
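As a concrete illustration of the kind of model comparison described, the sketch below fits an exponential and a power-law kernel to hypothetical trap-catch data; the distances, counts, and fitted values are placeholders, not the experimental data from the paper:

```python
# Minimal sketch: compare exponential and power-law dispersal kernels on
# hypothetical trap-catch data (distances and counts are placeholders).
import numpy as np

dist = np.array([1., 2., 5., 10., 20., 50., 100., 200.])     # metres from source
catch = np.array([520., 270., 95., 40., 18., 6., 2.5, 1.2])  # grains per trap

# An exponential kernel a*exp(-b*r) is linear in (r, log catch);
# a power law a*r**(-b) is linear in (log r, log catch).
exp_coef, exp_res, *_ = np.polyfit(dist, np.log(catch), 1, full=True)
pow_coef, pow_res, *_ = np.polyfit(np.log(dist), np.log(catch), 1, full=True)

print(f"exponential fit residual: {exp_res[0]:.3f}")
print(f"power-law fit residual:   {pow_res[0]:.3f} (exponent = {-pow_coef[0]:.2f})")
```

A smaller residual for the log-log fit is the signature of power-law rather than exponential decay, and the slope of that fit gives the exponent directly.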
Abstract:
Natural exposure to prion disease is likely to occur through successive challenges, yet most experiments focus on single large doses of infectious material. We analyze the results from an experiment in which rodents were exposed to multiple doses of feed contaminated with the scrapie agent. We formally define hypotheses for how the doses combine in terms of statistical models. The competing hypotheses are that only the total dose of infectivity is important (cumulative model), that doses act independently, or, as a general alternative, that successive doses interact (raising or lowering the risk of infection). We provide sample size calculations to distinguish these hypotheses. In the experiment, a fixed total dose has a significantly reduced probability of causing infection if the material is presented as multiple challenges, and as the time between challenges lengthens. Incubation periods are shorter and less variable if all material is consumed on one occasion. We show that the probability of infection is inconsistent with the hypothesis that each dose acts as a cumulative or independent challenge. The incubation periods are inconsistent with the independence hypothesis. Thus, although a trend exists for the risk of infection with prion disease to increase with repeated doses, it does so to a lesser degree than is expected if challenges combine independently or in a cumulative manner.
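The competing hypotheses can be stated compactly. Writing \(p(d)\) for the probability that a single dose \(d\) causes infection (notation assumed here for illustration, not taken from the paper), for \(k\) challenges \(d_1, \dots, d_k\):

\[ \text{cumulative:}\quad P = p\!\Big(\sum_{i=1}^{k} d_i\Big), \qquad \text{independent:}\quad P = 1 - \prod_{i=1}^{k} \big[1 - p(d_i)\big], \]

with the general alternative allowing any departure from these, for example dependence on the time between challenges.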
Abstract:
Individuals with elevated levels of plasma low density lipoprotein (LDL) cholesterol (LDL-C) are considered to be at risk of developing coronary heart disease. LDL particles are removed from the blood by a process known as receptor-mediated endocytosis, which occurs mainly in the liver. A series of classical experiments delineated the major steps in the endocytotic process; apolipoprotein B-100 present on LDL particles binds to a specific receptor (LDL receptor, LDL-R) in specialized areas of the cell surface called clathrin-coated pits. The pit containing the LDL-LDL-R complex is internalized, forming a cytoplasmic endosome. Fusion of the endosome with a lysosome leads to degradation of the LDL into its constituent parts (that is, cholesterol, fatty acids, and amino acids), which are released for reuse by the cell, or are excreted. In this paper, we formulate a mathematical model of LDL endocytosis, consisting of a system of ordinary differential equations. We validate our model against existing in vitro experimental data, and we use it to explore differences in system behavior when a single bolus of extracellular LDL is supplied to cells, compared to when a continuous supply of LDL particles is available. Whereas the former situation is common to in vitro experimental systems, the latter better reflects the in vivo situation. We use asymptotic analysis and numerical simulations to study the long-time behavior of model solutions. The implications of model-derived insights for experimental design are discussed.
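A minimal caricature of such an ODE model is sketched below, contrasting a single bolus with a continuous supply; the compartments and rate constants are illustrative assumptions, not the paper's system or parameter values:

```python
# Minimal caricature of receptor-mediated LDL endocytosis as an ODE system.
# The three compartments and the rate constants are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

k_bind, k_int, k_deg = 0.5, 0.2, 0.1   # binding, internalisation, degradation (1/min)

def bolus(t, y):
    L, B, E = y          # extracellular LDL, receptor-bound LDL, endosomal LDL
    return [-k_bind * L,
            k_bind * L - k_int * B,
            k_int * B - k_deg * E]

def continuous(t, y, supply=0.3):
    dL, dB, dE = bolus(t, y)
    return [dL + supply, dB, dE]   # constant extracellular resupply

t = np.linspace(0, 60, 200)
sol_bolus = solve_ivp(bolus, (0, 60), [1.0, 0.0, 0.0], t_eval=t)
sol_cont = solve_ivp(continuous, (0, 60), [1.0, 0.0, 0.0], t_eval=t)
print("endosomal LDL at t=60:", sol_bolus.y[2, -1], "vs", sol_cont.y[2, -1])
```

Under a bolus, all compartments eventually decay to zero; under continuous supply the system instead approaches a non-trivial steady state, which is the qualitative contrast the paper explores.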
Abstract:
Our objective in this study was to develop and implement an effective intervention strategy to manipulate the amount and composition of dietary fat and carbohydrate (CHO) in free-living individuals in the RISCK study. The study was a randomized, controlled dietary intervention study that was conducted in 720 participants identified as at higher risk of or with metabolic syndrome. All followed a 4-wk run-in reference diet [high saturated fatty acids (SF)/high glycemic index (GI)]. Volunteers were randomized to continue this diet for a further 24 wk or to 1 of 4 isoenergetic prescriptions [high monounsaturated fatty acids (MUFA)/high GI; high MUFA/low GI; low fat (LF)/high GI; and LF/low GI]. We developed a food exchange model to implement each diet. Dietary records and plasma phospholipid fatty acids were used to assess the effectiveness of the intervention strategy. Reported fat intake from the LF diets was significantly reduced to 28% of energy (%E) compared with 38%E from the HM and reference diets. SF intake in the HM and LF diets was successfully decreased to ~10%E compared with 17%E in the reference diet (P = 0.001). Dietary MUFA in the HM diets was ~17%E, significantly higher than in the reference (12%E) and LF diets (10%E) (P = 0.001). Changes in plasma phospholipid fatty acids provided further evidence for the successful manipulation of fat intake. The GI of the HGI and LGI arms differed by ~9 points (P = 0.001). The food exchange model provided an effective dietary strategy for the design and implementation across multiple sites of 5 experimental diets with specific targets for the proportion of fat and CHO. J. Nutr. 139: 1534-1540, 2009.
Abstract:
Fecal water (FW) has been shown to exert, in cultured cells, cytotoxic and genotoxic effects that have implications for colorectal cancer (CRC) risk. We have investigated a further biological activity of FW, namely, the ability to affect gap junctions in Caco-2 cell monolayers as an index of mucosal barrier function, which is known to be disrupted in cancer. FW samples from healthy, free-living, European subjects that were divided into two broad age groups, adult (40 ± 9.7 yr; n = 53) and elderly (76 ± 7.5 yr; n = 55), were tested for effects on gap junctions using the transepithelial resistance (TER) assay. Overall, treatment of Caco-2 cells with FW samples from adults increased TER (+4%), whereas FW from elderly subjects decreased TER (-5%); the difference between the two groups was significant (P < 0.05). We also measured several components of FW potentially associated with modulation of TER, namely, short-chain fatty acids (SCFA) and ammonia. SCFAs (propionic, acetic, and n-butyric) were significantly lower in the elderly population (-30%, -35%, and -21%, respectively, all P ≤ 0.01). We consider that FW modulation of in vitro epithelial barrier function is a potentially useful noninvasive biomarker, but it requires further validation to establish its relationship to CRC risk.
Abstract:
The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameter models with universal approximation capabilities have been intensively studied and widely used due to the availability of many linear-learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameter models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best model generalisation performance from observational data only. The important concepts in achieving good model generalisation used in various non-linear system-identification algorithms are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means for identifying kernel models based on the structural risk minimisation principle. The developments on the convex optimisation-based model construction algorithms including the support vector regression algorithms are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
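As an illustration of the generalisation-driven model selection the review covers, the sketch below performs forward selection of regressors for a linear-in-the-parameters model, scored by the leave-one-out PRESS statistic; the data and candidate basis functions are hypothetical:

```python
# Minimal sketch: forward selection of basis terms for a linear-in-the-
# parameters model, scored by leave-one-out cross-validation (PRESS).
# The data and candidate regressors are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 40)
y = np.sin(2.5 * x) + 0.1 * rng.standard_normal(40)   # unknown system + noise

candidates = {f"x^{k}": x**k for k in range(6)}        # candidate regressors

def loo_press(X, y):
    """PRESS: sum of squared leave-one-out residuals of a linear model."""
    H = X @ np.linalg.pinv(X)                          # hat matrix
    resid = y - H @ y
    return np.sum((resid / (1 - np.diag(H)))**2)

selected, remaining, best_score = [], dict(candidates), np.inf
while remaining:
    scores = {name: loo_press(np.column_stack(
                  [candidates[n] for n in selected] + [col]), y)
              for name, col in remaining.items()}
    name = min(scores, key=scores.get)
    if scores[name] >= best_score:
        break                                          # no generalisation gain
    best_score = scores[name]
    selected.append(name)
    del remaining[name]

print("selected terms:", selected, "PRESS:", round(best_score, 4))
```

Selection stops as soon as adding a term no longer improves the cross-validated score, which is one simple way of finding a minimal model with good generalisation.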
Abstract:
We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from 16 climate models and mapping the proportions of model runs showing forest/nonforest shifts or exceedance of natural variability in wildfire frequency and freshwater supply. Our analysis does not assign probabilities to scenarios or weights to models. Instead, we consider distribution of outcomes within three sets of model runs grouped by the amount of global warming they simulate: <2°C (including simulations in which atmospheric composition is held constant, i.e., in which the only climate change is due to greenhouse gases already emitted), 2–3°C, and >3°C. High risk of forest loss is shown for Eurasia, eastern China, Canada, Central America, and Amazonia, with forest extensions into the Arctic and semiarid savannas; more frequent wildfire in Amazonia, the far north, and many semiarid regions; more runoff north of 50°N and in tropical Africa and northwestern South America; and less runoff in West Africa, Central America, southern Europe, and the eastern U.S. Substantially larger areas are affected for global warming >3°C than for <2°C; some features appear only at higher warming levels. A land carbon sink of ≈1 Pg of C per yr is simulated for the late 20th century, but for >3°C this sink converts to a carbon source during the 21st century (implying a positive climate feedback) in 44% of cases. The risks continue increasing over the following 200 years, even with atmospheric composition held constant.
Abstract:
The possibility of a rapid collapse in the strength of the Atlantic meridional overturning circulation (AMOC), with associated impacts on climate, has long been recognized. The suggested basis for this risk is the existence of two stable regimes of the AMOC (‘on’ and ‘off’), and such bistable behaviour has been identified in a range of simplified climate models. However, up to now, no state-of-the-art atmosphere-ocean coupled global climate model (AOGCM) has exhibited such behaviour, leading to the interpretation that the AMOC is more stable than simpler models indicate. Here we demonstrate AMOC bistability in the response to freshwater perturbations in the FAMOUS AOGCM - the most complex AOGCM to exhibit such behaviour to date. The results also support recent suggestions that the direction of the net freshwater transport at the southern boundary of the Atlantic by the AMOC may be a useful physical indicator of the existence of bistability. We also present new estimates for this net freshwater transport by the AMOC from a range of ocean reanalyses which suggest that the Atlantic AMOC is currently in a bistable regime, although with large uncertainties. More accurate observational constraints, and an improved physical understanding of this quantity, could help to narrow uncertainty in the future evolution of the AMOC and to assess the risk of a rapid AMOC collapse.
Abstract:
This paper compares a number of different extreme value models for determining the value at risk (VaR) of three LIFFE futures contracts. A semi-nonparametric approach is also proposed, where the tail events are modeled using the generalised Pareto distribution, and normal market conditions are captured by the empirical distribution function. The value at risk estimates from this approach are compared with those of standard nonparametric extreme value tail estimation approaches, with a small sample bias-corrected extreme value approach, and with those calculated from bootstrapping the unconditional density and bootstrapping from a GARCH(1,1) model. The results indicate that, for a holdout sample, the proposed semi-nonparametric extreme value approach yields superior results to other methods, but the small sample tail index technique is also accurate.
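The sketch below illustrates the general peaks-over-threshold idea behind such a semi-nonparametric approach — empirical distribution for the body, generalised Pareto distribution (GPD) for the tail — using simulated returns and the standard POT VaR estimator; it is not the paper's exact estimator or data:

```python
# Minimal sketch of the peaks-over-threshold approach: empirical body,
# generalised Pareto tail. Losses are simulated placeholders, not LIFFE data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
losses = rng.standard_t(df=4, size=2000)          # heavy-tailed daily losses

u = np.quantile(losses, 0.95)                     # tail threshold
excess = losses[losses > u] - u
xi, _, beta = genpareto.fit(excess, floc=0)       # GPD shape and scale

def var_pot(q, n=len(losses), n_u=len(excess)):
    """Semi-parametric VaR at confidence q (standard POT estimator)."""
    return u + (beta / xi) * ((n / n_u * (1 - q))**(-xi) - 1)

print(f"99% VaR: {var_pot(0.99):.3f}, 99.9% VaR: {var_pot(0.999):.3f}")
```

Below the threshold, quantiles come straight from the empirical distribution; only beyond it does the fitted GPD take over, which is what lets the method extrapolate to confidence levels beyond the range of the observed data.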
Abstract:
The increased frequency in reporting UK property performance figures, coupled with the acceptance of the IPD database as the market standard, has enabled property to be analysed on a comparable level with other more frequently traded assets. The most widely utilised theory for pricing financial assets, the Capital Asset Pricing Model (CAPM), gives market (systematic) risk, beta, centre stage. This paper seeks to measure the level of systematic risk (beta) across various property types, market conditions and investment holding periods. This paper extends the authors’ previous work on investment holding periods and how excess returns (alpha) relate to those holding periods. We draw on the uniquely constructed IPD/Gerald Eve transactions database, containing over 20,000 properties over the period 1983-2005. This research allows us to confirm our initial findings that properties held over longer periods perform in line with overall market performance. One implication of this is that, over the long term, performance may be no different from an index-tracking approach.
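For reference, beta in the CAPM is the slope of the excess-return regression (the standard formulation, not anything specific to this paper):

\[ R_{i,t} - r_{f,t} = \alpha_i + \beta_i \,(R_{m,t} - r_{f,t}) + \epsilon_{i,t}, \qquad \beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)}, \]

so a holding-period beta near one, with alpha near zero, is what "performing in line with overall market performance" corresponds to.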
Abstract:
Traditionally, the measure of risk used in portfolio optimisation models is the variance. However, alternative measures of risk have many theoretical and practical advantages, and it is therefore peculiar that they are not used more frequently. This may be because of the difficulty in deciding which measure of risk is best; any attempt to compare different risk measures may be a futile exercise until a common risk measure can be identified. To overcome this, another approach is considered: comparing the portfolio holdings produced by different risk measures, rather than the risk-return trade-off. In this way we can see whether the risk measures used produce asset allocations that are essentially the same or very different. The results indicate that the portfolio compositions produced by different risk measures vary quite markedly from measure to measure. These findings have a practical consequence for the investor or fund manager because they suggest that the choice of model depends very much on the individual’s attitude to risk rather than any theoretical and/or practical advantages of one model over another.
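The comparison of holdings can be made concrete with a small optimisation sketch: minimise two different risk measures, variance and mean absolute deviation (MAD), over the same return history and compare the resulting weights; the returns below are simulated placeholders:

```python
# Minimal sketch: compare the asset weights chosen by two risk measures,
# variance and mean absolute deviation (MAD), on the same return history.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
R = rng.multivariate_normal([0.001, 0.0008, 0.0012],
                            [[4e-4, 1e-4, 5e-5],
                             [1e-4, 2e-4, 8e-5],
                             [5e-5, 8e-5, 6e-4]], size=500)   # 3 assets, 500 days

def variance(w):
    return np.var(R @ w)

def mad(w):
    p = R @ w
    return np.mean(np.abs(p - p.mean()))

cons = [{"type": "eq", "fun": lambda w: w.sum() - 1}]         # fully invested
bounds = [(0, 1)] * 3                                         # long-only
w0 = np.full(3, 1 / 3)

for name, risk in [("variance", variance), ("MAD", mad)]:
    res = minimize(risk, w0, bounds=bounds, constraints=cons)
    print(name, "weights:", np.round(res.x, 3))
```

If the two weight vectors differ materially on the same data, the choice of risk measure matters for allocation even before any risk-return comparison is attempted, which is the paper's point.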
Abstract:
This paper reviews the evidence relating to the question: does the risk of fungicide resistance increase or decrease with dose? The development of fungicide resistance progresses through three key phases. During the ‘emergence phase’ the resistant strain has to arise through mutation and invasion. During the subsequent ‘selection phase’, the resistant strain is present in the pathogen population and the fraction of the pathogen population carrying the resistance increases due to the selection pressure caused by the fungicide. During the final phase of ‘adjustment’, the dose or choice of fungicide may need to be changed to maintain effective control over a pathogen population where resistance has developed to intermediate levels. Emergence phase: no experimental publications and only one model study report on the emergence phase, and we conclude that work in this area is needed. Selection phase: all the published experimental work, and virtually all model studies, relate to the selection phase. Seven peer reviewed and four non-peer reviewed publications report experimental evidence. All show increased selection for fungicide resistance with increased fungicide dose, except for one peer reviewed publication that does not detect any selection irrespective of dose and one conference proceedings publication which claims evidence for increased selection at a lower dose. In the mathematical models published, no evidence has been found that a lower dose could lead to a higher risk of fungicide resistance selection. We discuss areas of the dose rate debate that need further study. These include further work on pathogen-fungicide combinations where the pathogen develops partial resistance to the fungicide and work on the emergence phase.
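A textbook caricature of the selection phase (not one of the models reviewed in the paper) helps fix ideas: if \(\phi_t\) is the resistant fraction of the pathogen population and the fungicide confers a per-generation selective advantage \(s(D)\) that increases with dose \(D\), then

\[ \frac{\phi_{t+1}}{1-\phi_{t+1}} = \big(1 + s(D)\big)\,\frac{\phi_t}{1-\phi_t}, \]

so a higher dose multiplies the odds of resistance faster per generation, which matches the qualitative pattern reported in the experimental evidence cited above.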
Abstract:
Using UK equity index data, this paper considers the impact of news on time-varying measures of beta, the usual measure of undiversifiable risk. The empirical model implies that beta depends on news about the market and news about the sector. The asymmetric response of beta to news about the market is consistent across all sectors considered. Recent research is divided as to whether abnormalities in equity returns arise from changes in expected returns in an efficient market or over-reactions to new information. The evidence suggests that such abnormalities may be due to changes in expected returns caused by time-variation and asymmetry in beta.
Abstract:
Internal risk management models of the kind popularized by J. P. Morgan are now used widely by the world’s most sophisticated financial institutions as a means of measuring risk. Using the returns on three of the most popular futures contracts on the London International Financial Futures Exchange, in this paper we investigate the possibility of using multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models for the calculation of minimum capital risk requirements (MCRRs). We propose a method for the estimation of the value at risk of a portfolio based on a multivariate GARCH model. We find that the consideration of the correlation between the contracts can lead to more accurate, and therefore more appropriate, MCRRs compared with the values obtained from a univariate approach to the problem.
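For reference, the univariate GARCH(1,1) building block is (the standard formulation, not the paper's full multivariate specification):

\[ r_t = \mu + \epsilon_t, \qquad \epsilon_t = \sqrt{h_t}\, z_t, \qquad h_t = \omega + \alpha\, \epsilon_{t-1}^2 + \beta\, h_{t-1}, \]

with the multivariate extension modelling the full conditional covariance matrix \(H_t\), so that a portfolio with weights \(w\) has conditional variance \(w^\top H_t w\); it is the cross-contract correlations in \(H_t\), ignored by the univariate approach, that drive the more accurate MCRRs reported.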