982 results for Risk Modeling


Relevance:

30.00%

Publisher:

Abstract:

Dengue fever is currently the most important arthropod-borne viral disease in Brazil. Mathematical modeling of disease dynamics is a very useful tool for the evaluation of control measures. To be used in decision-making, however, a mathematical model must be carefully parameterized and validated with epidemiological and entomological data. In this work, we developed a simple dengue model to answer three questions: (i) which parameters are worth pursuing in the field in order to develop a dengue transmission model for Brazilian cities; (ii) how vector density spatial heterogeneity influences control efforts; (iii) what, within a stated degree of uncertainty, is the invasion potential of dengue virus type 4 (DEN-4) in the city of Rio de Janeiro. Our model consists of an expression for the basic reproductive number (R0) that incorporates vector density spatial heterogeneity. To deal with the uncertainty regarding parameter values, we parameterized the model using a priori probability density functions covering a range of plausible values for each parameter, and generated parameter values with the Latin Hypercube Sampling procedure. We conclude that, even in the presence of vector spatial heterogeneity, the two most important entomological parameters to estimate in the field are the mortality rate and the extrinsic incubation period. The spatial heterogeneity of the vector population increases the risk of epidemics and makes control strategies more complex. We further conclude that Rio de Janeiro is at risk of a DEN-4 invasion. Finally, we stress that epidemiologists, mathematicians, and entomologists need to interact more closely to find better approaches to measuring and interpreting the transmission dynamics of arthropod-borne diseases.
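
Below is a minimal sketch of the uncertainty-propagation step this abstract describes: Latin Hypercube Sampling over plausible parameter ranges, pushed through an expression for R0. The formula is a generic Ross-Macdonald-style stand-in rather than the paper's heterogeneity-adjusted expression, and every range and constant is an illustrative assumption.

    import numpy as np
    from scipy.stats import qmc

    # Latin Hypercube Sample over three uncertain entomological parameters.
    sampler = qmc.LatinHypercube(d=3, seed=0)
    u = sampler.random(n=1000)
    # Columns: vector mortality rate (1/day), extrinsic incubation period
    # (days), vector-to-human ratio. Ranges are illustrative, not the paper's.
    lower, upper = [0.05, 7.0, 0.5], [0.25, 14.0, 3.0]
    mu, eip, m = qmc.scale(u, lower, upper).T

    # Generic Ross-Macdonald-style R0 (not the paper's adjusted form):
    a, b, c, gamma = 0.5, 0.4, 0.4, 1.0 / 7.0  # biting rate, transmission probs, recovery
    R0 = m * a**2 * b * c * np.exp(-mu * eip) / (mu * gamma)

    print(f"median R0 = {np.median(R0):.2f}, P(R0 > 1) = {np.mean(R0 > 1):.2f}")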

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Long-lasting food impactions requiring endoscopic bolus removal occur frequently in patients with eosinophilic esophagitis (EoE) and carry a risk of severe esophageal injuries. We evaluated whether treatment with swallowed topical corticosteroids reduces the risk of this complication. METHODS: We analyzed data from the Swiss EoE Cohort Study. Patients with yearly clinic visits, during which standardized assessment of symptoms and of endoscopic, histologic, and laboratory findings was carried out, were included. RESULTS: A total of 206 patients (157 males) were analyzed. The median follow-up time was 5 years, with a total of 703 visits (mean 3.41 visits/patient). During the follow-up period, 33 patients (16% of the cohort) experienced 42 impactions requiring endoscopic bolus removal. We evaluated the following factors with respect to the outcome 'bolus impaction' by univariate logistic regression modeling: swallowed topical corticosteroid therapy (OR 0.503, 95%-CI 0.255-0.993, P = 0.048), presence of EoE symptoms (OR 1.150, 95%-CI 0.4668-2.835, P = 0.761), esophageal stricture (OR 2.832, 95%-CI 1.508-5.321, P = 0.001), peak eosinophil count >10 eosinophils/HPF (OR 0.724, 95%-CI 0.324-1.621, P = 0.433), blood eosinophilia (OR 1.532, 95%-CI 0.569-4.118, P = 0.398), and esophageal dilation (OR 1.852, 95%-CI 1.034-3.755, P = 0.017). In the multivariate model, the following factors were significantly associated with bolus impaction: swallowed topical corticosteroid therapy (OR 0.411, 95%-CI 0.203-0.835, P = 0.014) and esophageal stricture (OR 2.666, 95%-CI 1.259-5.645, P = 0.01). Increasing frequency of use of swallowed topical corticosteroids was associated with a lower risk of bolus impactions. CONCLUSIONS: Treatment of EoE with swallowed topical corticosteroids significantly reduces the risk of long-lasting bolus impactions.
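
As a minimal illustration of the univariate logistic regression used here, the sketch below regresses a simulated binary outcome (bolus impaction) on a simulated binary exposure (swallowed topical corticosteroid therapy) and reads the odds ratio as exp(coefficient). The data and the assumed true OR of 0.5 are synthetic, not the cohort's.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 206                                   # cohort size borrowed from the abstract
    steroids = rng.binomial(1, 0.5, n)
    # Assume a true OR of 0.5 for steroid therapy (synthetic ground truth).
    logit = -1.5 + np.log(0.5) * steroids
    impaction = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    X = sm.add_constant(steroids)
    fit = sm.Logit(impaction, X).fit(disp=False)
    or_est = np.exp(fit.params[1])
    lo, hi = np.exp(fit.conf_int()[1])
    print(f"OR = {or_est:.3f}, 95%-CI {lo:.3f}-{hi:.3f}")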

Relevance:

30.00%

Publisher:

Abstract:

Background: Germline genetic variation is associated with the differential expression of many human genes. The phenotypic effects of this type of variation may be important when considering susceptibility to common genetic diseases. Three regions at 8q24 have recently been identified to independently confer risk of prostate cancer. Variation at 8q24 has also recently been associated with risk of breast and colorectal cancer. However, none of the risk variants map at or relatively close to known genes, with c-MYC mapping a few hundred kilobases distally. Results: This study identifies cis-regulators of germline c-MYC expression in immortalized lymphocytes of HapMap individuals. Quantitative analysis of c-MYC expression in normal prostate tissues suggests an association between overexpression and variants in prostate cancer risk Region 1. Somatic c-MYC overexpression correlates with prostate cancer progression and more aggressive tumor forms, a pathological variable that was also associated with Region 1. Expression profiling analysis and modeling of transcriptional regulatory networks predicts a functional association between MYC and the prostate tumor suppressor KLF6. Analysis of MYC/Myc-driven cell transformation and tumorigenesis substantiates a model in which MYC overexpression promotes transformation by down-regulating KLF6. In this model, a feedback loop through E-cadherin down-regulation causes further transactivation of c-MYC. Conclusion: This study proposes that variation at putative 8q24 cis-regulator(s) of transcription can significantly alter germline c-MYC expression levels and, thus, contribute to prostate cancer susceptibility by down-regulating the prostate tumor suppressor KLF6 gene.
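
The feedback loop proposed in the conclusion can be made concrete with a toy dynamical sketch: MYC represses KLF6, and KLF6 (via E-cadherin) restrains MYC, so a stronger MYC drive settles at higher MYC and lower KLF6. The equations and rate constants below are illustrative choices, not a fitted model from the study.

    from scipy.integrate import solve_ivp

    def loop(t, y, myc_drive):
        myc, klf6 = y
        dmyc = myc_drive / (1.0 + klf6) - 0.5 * myc   # KLF6/E-cadherin restrains MYC
        dklf6 = 1.0 / (1.0 + myc) - 0.5 * klf6        # MYC represses KLF6
        return [dmyc, dklf6]

    for drive in (0.5, 2.0):                          # normal vs. overexpression drive
        sol = solve_ivp(loop, (0.0, 50.0), [1.0, 1.0], args=(drive,))
        print(f"drive={drive}: steady MYC={sol.y[0, -1]:.2f}, KLF6={sol.y[1, -1]:.2f}")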

Relevance:

30.00%

Publisher:

Abstract:

The goal of this paper is to estimate time-varying covariance matrices. Since the covariance matrix of financial returns is known to change through time and is an essential ingredient in risk measurement, portfolio selection, and tests of asset pricing models, this is a very important problem in practice. Our model of choice is the Diagonal-Vech version of the Multivariate GARCH(1,1) model. The problem is that the estimation of the general Diagonal-Vech model is numerically infeasible in dimensions higher than 5. The common approach is to estimate more restrictive models which are tractable but may not conform to the data. Our contribution is to propose an alternative estimation method that is numerically feasible, produces positive semi-definite conditional covariance matrices, and does not impose unrealistic a priori restrictions. We provide an empirical application in the context of international stock markets, comparing the new estimator to a number of existing ones.
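
For readers unfamiliar with the model, the sketch below spells out the Diagonal-Vech GARCH(1,1) covariance recursion, in which every element of the conditional covariance matrix follows its own GARCH(1,1)-type equation. The coefficient matrices are arbitrary illustrative choices; note that, as the paper points out, nothing in this naive recursion guarantees positive semi-definite covariance matrices or keeps estimation tractable in high dimensions.

    import numpy as np

    def dvech_garch(returns, Omega, A, B):
        """Diagonal-Vech GARCH(1,1): H[t] = Omega + A*(eps eps') + B*H[t-1],
        where '*' is the element-wise (Hadamard) product."""
        T, n = returns.shape
        H = np.empty((T, n, n))
        H[0] = np.cov(returns, rowvar=False)      # initialize at sample covariance
        for t in range(1, T):
            eps = returns[t - 1][:, None]         # returns assumed demeaned
            H[t] = Omega + A * (eps @ eps.T) + B * H[t - 1]
        return H

    # Illustrative usage on simulated returns for 3 assets.
    rng = np.random.default_rng(0)
    returns = 0.01 * rng.standard_normal((500, 3))
    S = np.cov(returns, rowvar=False)
    Omega, A, B = 0.02 * S, np.full((3, 3), 0.05), np.full((3, 3), 0.93)
    H = dvech_garch(returns, Omega, A, B)
    print(H[-1])                                  # last conditional covariance matrix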

Relevance:

30.00%

Publisher:

Abstract:

The potential ecological impact of ongoing climate change has been much discussed. High mountain ecosystems were identified early on as potentially very sensitive areas. Scenarios of upward species movement and vegetation shift are commonly discussed in the literature. Because mountains are characteristically conical in shape, impact scenarios usually assume that a smaller surface area will be available as species move up. However, as the frequency distribution of additional physiographic factors (e.g., slope angle) changes with increasing elevation (e.g., few gentle slopes are available at higher elevations), species migrating upslope may encounter increasingly unsuitable conditions. As a result, many species could suffer severe reduction of their habitat surface, which could in turn affect patterns of biodiversity. In this paper, results from static plant distribution modeling are used to derive climate change impact scenarios in a high mountain environment. Models are fitted to species presence/absence data. The environmental predictors used are annual mean air temperature, slope, indices of topographic position, geology, rock cover, modeled permafrost, and several indices of solar radiation and snow cover duration. Potential habitat distribution maps were drawn for 62 higher plant species, from which three separate climate change impact scenarios were derived. These scenarios show a great range of responses, depending on the species and the degree of warming. Alpine species would be at greatest risk of local extinction, whereas species with a large elevational range would run the lowest risk. Limitations of the models and scenarios are also discussed.
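
A minimal sketch of static presence/absence habitat modeling in the spirit described here, not the paper's actual model: a logistic model on two hypothetical predictors (temperature and slope), with a warming scenario applied by shifting mean annual temperature while topography stays fixed.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 2000
    temp = rng.uniform(-2, 10, n)        # annual mean air temperature (deg C)
    slope = rng.uniform(0, 60, n)        # slope angle (degrees)
    # Hypothetical alpine species: prefers cold sites on moderate slopes.
    p_true = 1.0 / (1.0 + np.exp(-(2.0 - 0.8 * temp + 0.02 * slope)))
    presence = rng.binomial(1, p_true)

    X = np.column_stack([temp, slope])
    model = LogisticRegression().fit(X, presence)

    # Climate change scenario: +3 deg C everywhere, same topography.
    X_warm = np.column_stack([temp + 3.0, slope])
    suitable_now = model.predict_proba(X)[:, 1] > 0.5
    suitable_warm = model.predict_proba(X_warm)[:, 1] > 0.5
    print("suitable cells now:", suitable_now.sum(), "after warming:", suitable_warm.sum())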

Relevance:

30.00%

Publisher:

Abstract:

The likelihood of significant exposure to drugs in infants through breast milk is poorly defined, given the difficulties of conducting pharmacokinetics (PK) studies. Using fluoxetine (FX) as an example, we conducted a proof-of-principle study applying population PK (popPK) modeling and simulation to estimate drug exposure in infants through breast milk. We simulated data for 1,000 mother-infant pairs, assuming conservatively that the FX clearance in an infant is 20% of the allometrically adjusted value in adults. The model-generated estimate of the milk-to-plasma ratio for FX (mean: 0.59) was consistent with those reported in other studies. The median infant-to-mother ratio of FX steady-state plasma concentrations predicted by the simulation was 8.5%. Although the disposition of the active metabolite, norfluoxetine, could not be modeled, popPK-informed simulation may be valid for other drugs, particularly those without active metabolites, thereby providing a practical alternative to conventional PK studies for exposure risk assessment in this population.
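
The sketch below mimics the popPK-informed simulation logic at a toy level: sample 1,000 mother-infant pairs, apply a milk-to-plasma ratio centered near the 0.59 quoted above, assume infant clearance is 20% of the allometrically scaled adult value as in the abstract, and compute the infant-to-mother steady-state concentration ratio. The adult clearance figure and milk intake are illustrative assumptions, not the study's fitted values.

    import numpy as np

    rng = np.random.default_rng(42)
    n_pairs = 1000

    # Maternal steady-state plasma concentration (arbitrary units/L), log-normal.
    c_mother = rng.lognormal(np.log(100.0), 0.4, n_pairs)
    # Milk-to-plasma ratio, centered near the 0.59 estimate quoted above.
    m_to_p = rng.lognormal(np.log(0.59), 0.3, n_pairs)

    milk_intake = 0.15                              # L/kg/day, a common assumption
    infant_dose = c_mother * m_to_p * milk_intake   # units/kg/day via milk

    # Infant clearance: 20% of an allometrically scaled adult value (the
    # abstract's assumption); the adult clearance figure is illustrative only.
    adult_cl, adult_wt, infant_wt = 200.0, 70.0, 5.0   # L/day, kg, kg
    infant_cl_per_kg = 0.2 * adult_cl * (infant_wt / adult_wt) ** 0.75 / infant_wt

    c_infant = infant_dose / infant_cl_per_kg       # steady state: rate in / CL
    ratio = c_infant / c_mother
    print(f"median infant-to-mother concentration ratio: {np.median(ratio):.1%}")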

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Ocular anatomy and radiation-associated toxicities provide unique challenges for external beam radiation therapy. For treatment planning, precise modeling of organs at risk and of the tumor volume is crucial. Development of a precise eye model and automatic adaptation of this model to patients' anatomy remain problematic because of organ shape variability. This work introduces the application of a 3-dimensional (3D) statistical shape model as a novel method for precise eye modeling for external beam radiation therapy of intraocular tumors. METHODS AND MATERIALS: Manual and automatic segmentations were compared for 17 patients, based on head computed tomography (CT) volume scans. A 3D statistical shape model of the cornea, lens, and sclera, as well as of the optic disc position, was developed. Furthermore, an active shape model was built to enable automatic fitting of the eye model to CT slice stacks. Cross-validation was performed with leave-one-out tests over all training shapes, measuring Dice coefficients and mean segmentation errors between automatic segmentation and manual segmentation by an expert. RESULTS: Cross-validation revealed a Dice similarity of 95% ± 2% for the sclera and cornea and 91% ± 2% for the lens. The overall mean segmentation error was 0.3 ± 0.1 mm. Average segmentation time was 14 ± 2 s on a standard personal computer. CONCLUSIONS: Our results show that the presented solution outperforms state-of-the-art methods in terms of accuracy, reliability, and robustness. Moreover, the eye model shape as well as its variability is learned from a training set rather than obtained by making shape assumptions (e.g., a spherical or elliptical model). Therefore, the model appears to be capable of modeling nonspherically and nonelliptically shaped eyes.
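
Two ingredients of this method are easy to sketch: a PCA-based statistical shape model built over aligned training shapes, and the Dice coefficient used to compare automatic against manual segmentations. The shapes and masks below are synthetic stand-ins, not the 17-patient CT data.

    import numpy as np

    def build_shape_model(shapes, n_modes=3):
        """shapes: (n_samples, n_coords) aligned, flattened landmark coordinates."""
        mean = shapes.mean(axis=0)
        _, s, vt = np.linalg.svd(shapes - mean, full_matrices=False)
        return mean, vt[:n_modes]                 # mean shape and main variation modes

    def synthesize(mean, modes, coeffs):
        return mean + coeffs @ modes              # new shape from mode coefficients

    def dice(mask_a, mask_b):
        inter = np.logical_and(mask_a, mask_b).sum()
        return 2.0 * inter / (mask_a.sum() + mask_b.sum())

    rng = np.random.default_rng(0)
    template = rng.standard_normal(30)            # 10 landmarks in 3D, flattened
    shapes = template + 0.1 * rng.standard_normal((17, 30))  # 17 training "eyes"
    mean, modes = build_shape_model(shapes)
    new_shape = synthesize(mean, modes, np.array([0.5, -0.2, 0.1]))

    mask = rng.random((64, 64)) > 0.5
    print("Dice of a mask with itself:", dice(mask, mask))   # 1.0 by construction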

Relevance:

30.00%

Publisher:

Abstract:

Exposure to various pesticides has been characterized in workers and the general population, but interpretation and assessment of biomonitoring data from a health risk perspective remain an issue. For workers, a Biological Exposure Index (BEI®) has been proposed for some substances, but most BEIs are based on urinary biomarker concentrations at Threshold Limit Value - Time Weighted Average (TLV-TWA) airborne exposure, whereas occupational exposure can potentially occur through multiple routes, particularly skin contact (e.g., captan, chlorpyrifos, malathion). Similarly, several biomonitoring studies have been conducted to assess environmental exposure to pesticides in different populations, but dose estimates or health risks related to these environmental exposures (mainly through the diet) were rarely characterized. Recently, biological reference values (BRVs) in the form of urinary pesticide metabolite concentrations have been proposed for both occupationally exposed workers and children. These BRVs were established using toxicokinetic models developed for each substance and correspond to safe levels of absorption in humans regardless of the exposure scenario. The purpose of this chapter is to present a review of a toxicokinetic modeling approach used to determine biological reference values, which are then used to facilitate health risk assessments and decision-making on occupational and environmental pesticide exposures. Such models have the ability to link the absorbed dose of the parent compound to exposure biomarkers and critical biological effects. To obtain the safest BRVs for the studied population, simulations of exposure scenarios were performed using a conservative reference dose such as a no-observed-effect level (NOEL). The various examples discussed in this chapter show the importance of knowledge on urine collections (i.e., spot samples and complete 8-h, 12-h, or 24-h collections), sampling strategies, metabolism, the relative proportions of the different metabolites in urine, the absorption fraction, the route of exposure, and the background contribution of prior exposures. They also show that relying on urinary measurements of specific metabolites is more accurate when applying this approach to occupational exposures. Conversely, relying on semi-specific metabolites (metabolites common to a category of pesticides) is more accurate for the health risk assessment of environmental exposures, given that the precise pesticides to which subjects are exposed are often unknown. In conclusion, the modeling approach to define BRVs for the relevant pesticides may be useful to public health authorities for managing issues related to health risks resulting from environmental and occupational exposures to pesticides.
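
A one-compartment sketch of the kind of toxicokinetic link the chapter describes: a repeated absorbed dose at a reference level (e.g., a NOEL) is propagated to the metabolite amount expected in a steady-state 24-h urine collection, which is the BRV-like quantity. Every parameter value below is hypothetical.

    import numpy as np

    half_life = 6.0                   # h, hypothetical elimination half-life
    ke = np.log(2) / half_life
    f_urine = 0.3                     # fraction excreted as the measured metabolite
    dose, bw = 0.1, 70.0              # reference absorbed dose (mg/kg), body weight (kg)

    dt = 0.1                          # h
    t = np.arange(0.0, 24.0 * 7, dt)  # one week of daily exposures
    steps_per_day = int(24 / dt)
    body = np.zeros_like(t)           # amount in body (mg)
    urine = np.zeros_like(t)          # cumulative metabolite in urine (mg)
    for i in range(1, len(t)):
        absorbed = dose * bw if i % steps_per_day == 0 else 0.0
        eliminated = ke * body[i - 1] * dt
        body[i] = body[i - 1] + absorbed - eliminated
        urine[i] = urine[i - 1] + f_urine * eliminated

    # BRV-like quantity: metabolite expected in a 24-h collection at steady state.
    last_day = urine[-1] - urine[len(t) - steps_per_day]
    print(f"24-h urinary metabolite at the reference dose: {last_day:.2f} mg")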

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Hierarchical modeling has been proposed as a solution to the multiple exposure problem. We estimate associations between metabolic syndrome and different components of antiretroviral therapy using both conventional and hierarchical models. STUDY DESIGN AND SETTING: We use discrete time survival analysis to estimate the association between metabolic syndrome and cumulative exposure to 16 antiretrovirals from four drug classes. We fit a hierarchical model in which the drug class provides a prior model of the association between metabolic syndrome and exposure to each antiretroviral. RESULTS: One thousand two hundred and eighteen patients were followed for a median of 27 months, with 242 cases of metabolic syndrome (20%) at a rate of 7.5 cases per 100 patient-years. Metabolic syndrome was more likely to develop in patients exposed to stavudine, but less likely to develop in those exposed to atazanavir. The estimate for exposure to atazanavir increased from a hazard ratio of 0.06 per 6 months' use in the conventional model to 0.37 in the hierarchical model (or from 0.57 to 0.81 when using spline-based covariate adjustment). CONCLUSION: These results are consistent with trials that show the disadvantage of stavudine and the advantage of atazanavir relative to other drugs in their respective classes. The hierarchical model gave more plausible results than the equivalent conventional model.
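
The shrinkage reported here (0.06 moving to 0.37) is the signature of partial pooling. The sketch below shows the mechanism with a simple empirical-Bayes calculation: each drug's log-hazard estimate is pulled toward its class mean, with the noisiest estimate pulled hardest. All numbers are illustrative, not the study's data.

    import numpy as np

    # Log hazard ratios per 6 months' exposure and standard errors for four
    # hypothetical drugs in one class; the first is a noisy outlier like 0.06.
    beta = np.log(np.array([0.06, 0.9, 1.1, 0.8]))
    se = np.array([1.2, 0.3, 0.25, 0.4])

    tau2 = 0.2 ** 2                                  # assumed between-drug variance
    class_mean = np.average(beta, weights=1.0 / (se**2 + tau2))

    w = tau2 / (tau2 + se**2)                        # weight on the drug's own estimate
    beta_shrunk = w * beta + (1 - w) * class_mean    # noisy estimates move the most
    print(np.round(np.exp(beta_shrunk), 2))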

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate the public health impact of statin prescribing strategies based on the Justification for the Use of Statins in Primary Prevention: An Intervention Trial Evaluating Rosuvastatin Study (JUPITER). METHODS: We studied 2268 adults aged 35-75 without cardiovascular disease in a population-based study in Switzerland in 2003-2006. We assessed eligibility for statins according to the Adult Treatment Panel III (ATPIII) guidelines, and by adding "strict" (hs-CRP ≥2.0 mg/L and LDL-cholesterol <3.4 mmol/L) and "extended" (hs-CRP ≥2.0 mg/L alone) JUPITER-like criteria. We estimated the proportion of CHD deaths potentially prevented over 10 years in the Swiss population. RESULTS: Fifteen percent were already taking statins, 42% were eligible by ATPIII guidelines, 53% by adding "strict" criteria, and 62% by adding "extended" criteria, with a total of 19% newly eligible. The number needed to treat with statins to avoid one CHD death over 10 years was 38 for ATPIII, 84 for "strict", and 92 for "extended" JUPITER-like criteria. ATPIII criteria would prevent 17% of CHD deaths, compared with 20% for ATPIII + "strict" and 23% for ATPIII + "extended" criteria (+6%). CONCLUSION: Implementing JUPITER-like strategies would make statin prescribing for primary prevention more common and less efficient than it is under current guidelines.
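
The number-needed-to-treat figures follow the standard arithmetic NNT = 1 / absolute risk reduction over the horizon. The sketch below reproduces the magnitude of the quoted NNTs using a relative risk reduction and baseline 10-year risks that are back-calculated assumptions for illustration, not the study's inputs.

    # NNT = 1 / ARR, with ARR = baseline risk x relative risk reduction.
    def nnt(baseline_risk_10y, relative_risk_reduction):
        return 1.0 / (baseline_risk_10y * relative_risk_reduction)

    # Progressively lower-risk newly eligible groups give progressively higher
    # NNTs, matching the 38 / 84 / 92 pattern above (illustrative numbers only).
    for risk in (0.08, 0.036, 0.033):
        print(round(nnt(risk, 0.33)))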

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, I develop analytical models to price the value of supply chain investments under demand uncertainty. The thesis includes three self-contained papers. In the first paper, we investigate the value of lead-time reduction under the risk of sudden and abnormal changes in demand forecasts. We first consider the risk of a complete and permanent loss of demand. We then provide a more general jump-diffusion model, where we add a compound Poisson process to a constant-volatility demand process to explore the impact of sudden changes in demand forecasts on the value of lead-time reduction. We use an Edgeworth series expansion to divide the lead-time cost into that arising from constant instantaneous volatility, and that arising from the risk of jumps. We show that the value of lead-time reduction increases substantially in the intensity and/or the magnitude of jumps. In the second paper, we analyze the value of quantity flexibility in the presence of supply-chain disintermediation problems. We use the multiplicative martingale model and the "contracts as reference points" theory to capture both positive and negative effects of quantity flexibility for the downstream level in a supply chain. We show that lead-time reduction reduces both supply-chain disintermediation problems and supply-demand mismatches. We furthermore analyze the impact of the supplier's cost structure on the profitability of quantity-flexibility contracts. When the supplier's initial investment cost is relatively low, supply-chain disintermediation risk becomes less important, and hence the contract becomes more profitable for the retailer. We also find that supply-chain efficiency increases substantially with the supplier's ability to disintermediate the chain when the initial investment cost is relatively high. In the third paper, we investigate the value of dual sourcing for products with heavy-tailed demand distributions. We apply extreme-value theory and analyze the effects of the tail heaviness of the demand distribution on the optimal dual-sourcing strategy. We find that the effects of tail heaviness depend on the characteristics of demand and profit parameters. When both the profit margin of the product and the cost differential between the suppliers are relatively high, it is optimal to buffer the mismatch risk by increasing both the inventory level and the responsive capacity as demand uncertainty increases. In that case, however, both the optimal inventory level and the optimal responsive capacity decrease as the tail of demand becomes heavier. When the profit margin of the product is relatively high, and the cost differential between the suppliers is relatively low, it is optimal to buffer the mismatch risk by increasing the responsive capacity and reducing the inventory level as demand uncertainty increases. In that case, however, it is optimal to buffer with more inventory and less capacity as the tail of demand becomes heavier. We also show that the optimal responsive capacity is higher for products with heavier tails when the fill rate is extremely high.
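
The jump-diffusion demand-forecast model of the first paper is easy to simulate in outline: a constant-volatility diffusion plus a compound Poisson jump component whose jumps capture sudden, abnormal forecast changes. All parameters below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(7)
    T, dt = 1.0, 1.0 / 252
    n = int(T / dt)
    mu, sigma = 0.0, 0.2             # diffusion drift and volatility
    lam = 2.0                        # jump intensity (expected jumps per year)
    jump_mu, jump_sigma = -0.3, 0.1  # jump sizes in log-space (sudden demand drops)

    D = np.empty(n + 1)
    D[0] = 100.0                     # initial demand forecast
    for i in range(n):
        diffusion = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        n_jumps = rng.poisson(lam * dt)
        jumps = rng.normal(jump_mu, jump_sigma, n_jumps).sum()
        D[i + 1] = D[i] * np.exp(diffusion + jumps)

    print(f"final forecast: {D[-1]:.1f}, minimum over horizon: {D.min():.1f}")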

Relevance:

30.00%

Publisher:

Abstract:

The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concentrate on decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of ML algorithms by analyzing the quality and quantity of the spatially structured information extracted from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be efficiently used in the decision-making process.
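
A minimal sketch of the hybrid MLRSS idea: fit an ML model (here a multilayer perceptron) to the long-range spatial trend, then extract the residuals, whose variogram would be modeled and fed to the geostatistical sequential-simulation step. The data are synthetic and the simulation step itself is omitted.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    xy = rng.uniform(0, 100, (500, 2))                 # measurement locations
    trend = 0.05 * xy[:, 0] + 0.02 * xy[:, 1] ** 1.2   # long-range spatial trend
    z = trend + rng.normal(0, 0.5, 500)                # observed contamination field

    mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                       random_state=0).fit(xy, z)
    residuals = z - mlp.predict(xy)

    # The residuals should be closer to stationary than the raw field; their
    # variogram would be modeled and sequentially simulated in the next step.
    print("raw variance:", z.var().round(2),
          "residual variance:", residuals.var().round(2))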

Relevance:

30.00%

Publisher:

Abstract:

Static process simulation has traditionally been used to model complex processes for various purposes. However, using static process simulators to prepare holistic examinations aimed at improving profit-making capability requires a lot of work, because producing results requires assessing the applicability of detailed data which may be irrelevant to the objective. The data relevant to the total assessment get buried by irrelevant data. Furthermore, the models do not include an examination of maintenance or risk management, and economic examination is often an extra property added to them which can be performed with a spreadsheet program. In this work, a process model applicable to holistic economic examinations has been developed. The model is based on the life cycle profit philosophy developed by Hagberg and Henriksson in 1996. The construction of the model utilized life cycle assessment and life cycle costing methodologies with a view to developing, above all, a model applicable to the economic examination of complete wholes, one requiring only information that focuses on aspects essential to the objectives. Life cycle assessment and costing differ from each other in their modeling principles, but features of both methodologies can be used in the development of economic process modeling. Methods applicable to the modeling of complex processes can be examined from the viewpoint of life cycle methodologies, because both involve the collection and management of large corpuses of information and the production of information for the needs of decision-makers. The results of the study show that, on the basis of the principles of life cycle modeling, a process model can be created which may be used to produce holistic efficiency examinations of the profit-making capability of a production line with fewer resources than traditional methods require. The calculations of the model are based to the maximum extent on the information system of the factory, which means that the accuracy of the results can be improved by developing information systems so that they provide the best information for this kind of examination.
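
At its core, the life cycle profit examination such a model automates is a discounted cash-flow calculation over the production line's life: revenues less operating and maintenance costs, netted against the initial investment. The sketch below shows that calculation with hypothetical figures.

    # Life cycle profit as a net present value over the line's life;
    # all cash-flow figures and the discount rate are hypothetical.
    def life_cycle_profit(investment, annual_revenue, annual_opex,
                          annual_maintenance, years, rate):
        npv = -investment
        for t in range(1, years + 1):
            cash = annual_revenue - annual_opex - annual_maintenance
            npv += cash / (1 + rate) ** t     # discount each year's net cash flow
        return npv

    print(round(life_cycle_profit(
        investment=5_000_000, annual_revenue=1_800_000, annual_opex=700_000,
        annual_maintenance=150_000, years=15, rate=0.08)))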