869 results for Risk model
Abstract:
Traditional approaches to the way people react to food risks often focus on ways in which the media distort information about risk, or on the deficiencies in people’s interpretation of this information. In this chapter Jones offers an alternative model which sees decisions regarding food risk as taking place at a complex nexus where different people, texts, objects and practices, each with their own histories, come together. Based on a case study of a food scandal involving a particular brand of Chinese candy, Jones argues that understanding why people respond the way they do to food risk requires tracing the itineraries along which different people, texts, objects and practices have traveled to converge at particular moments, and understanding the kinds of concrete social actions that these convergences make possible.
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generate insights into potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The ‘Hands-off’ scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The ‘Fire management’ scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production compared to the ‘Fire suppression’ scenario. Findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a ‘boundary object’ to facilitate collaboration and integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that are facing increased risk of large wildfires.
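The abstract does not reproduce the maps themselves, but the kind of scenario analysis it describes can be illustrated with a minimal fuzzy cognitive map iteration. The sketch below is an assumption-laden toy: the concept list, edge weights, clamped driver values, and sigmoid squashing function are all hypothetical stand-ins, not the FCMs elicited in the Chiquitania focus groups.

```python
import numpy as np

# Hypothetical concepts; the real FCMs in the study were built in focus groups.
concepts = ["drought", "uncontrolled_burning", "fire_management",
            "wildfire_risk", "agricultural_production"]

# Signed edge weights W[i, j]: influence of concept i on concept j (illustrative values).
W = np.array([
    [0.0,  0.3, 0.0,  0.6, -0.3],   # drought
    [0.0,  0.0, 0.0,  0.7,  0.2],   # uncontrolled burning
    [0.0, -0.6, 0.0, -0.5,  0.1],   # fire management
    [0.0,  0.0, 0.0,  0.0, -0.8],   # wildfire risk
    [0.0,  0.2, 0.0,  0.0,  0.0],   # agricultural production
])

def sigmoid(x, lam=2.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def run_scenario(clamped, steps=30, tol=1e-4):
    """Iterate x_{t+1} = f(x_t + x_t @ W), holding scenario drivers fixed."""
    x = np.full(len(concepts), 0.5)
    for name, value in clamped.items():
        x[concepts.index(name)] = value
    for _ in range(steps):
        x_new = sigmoid(x + x @ W)
        for name, value in clamped.items():
            x_new[concepts.index(name)] = value  # keep scenario drivers clamped
        if np.max(np.abs(x_new - x)) < tol:
            break
        x = x_new
    return dict(zip(concepts, np.round(x, 3)))

# 'Hands-off' under drier climate: drought high, little fire management.
print(run_scenario({"drought": 0.9, "fire_management": 0.1}))
# 'Fire management' under drier climate.
print(run_scenario({"drought": 0.9, "fire_management": 0.9}))
```

Clamping the drought driver high while varying the fire-management concept mimics, in a toy way, the comparison between the ‘Hands-off’ and ‘Fire management’ scenarios described in the abstract.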
Abstract:
Remotely sensed rainfall is increasingly being used to manage climate-related risk in gauge-sparse regions. Applications based on such data must make maximal use of the skill of the methodology in order to avoid doing harm by providing misleading information. This is especially challenging in regions, such as Africa, that lack gauge data for validation. In this study, we show how calibrated ensembles of equally likely rainfall can be used to infer uncertainty in remotely sensed rainfall estimates, and subsequently in assessments of drought. We illustrate the methodology through a case study of weather index insurance (WII) in Zambia. Unlike traditional insurance, which compensates proven agricultural losses, WII pays out in the event that a weather index is breached. As remotely sensed rainfall is used to extend WII schemes to large numbers of farmers, it is crucial to ensure that the indices being insured are skillful representations of local environmental conditions. In our study we drive a land surface model with rainfall ensembles in order to demonstrate how aggregation of rainfall estimates in space and time results in a clearer link with soil moisture, and hence a truer representation of agricultural drought. Although our study focuses on agricultural insurance, the methodological principles for application design are widely applicable in Africa and elsewhere.
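The core design point, that aggregating ensemble rainfall in space and time tightens the estimate, can be sketched with synthetic data. Everything below is illustrative: the gamma-distributed "ensemble", the region size, and the seasonal-total index are hypothetical stand-ins, not the calibrated ensembles or the land surface model used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a calibrated rainfall ensemble:
# 50 equally likely members, 36 dekads (10-day periods), 20 x 20 pixel region.
n_members, n_dekads, ny, nx = 50, 36, 20, 20
ensemble = rng.gamma(shape=2.0, scale=15.0, size=(n_members, n_dekads, ny, nx))

def relative_spread(index):
    """Ensemble spread (std / mean) of a rainfall index, member axis first."""
    return index.std(axis=0).mean() / index.mean()

# Index 1: seasonal rainfall total at a single pixel.
pixel_index = ensemble[:, :, 10, 10].sum(axis=1)

# Index 2: seasonal rainfall total aggregated over the whole region.
regional_index = ensemble.sum(axis=1).mean(axis=(1, 2))

print("relative spread, single pixel :", round(relative_spread(pixel_index), 3))
print("relative spread, aggregated   :", round(relative_spread(regional_index), 3))
```

In this toy case the spatial aggregation simply averages out independent noise; the study's point is the analogous (and more physically meaningful) effect when aggregated rainfall drives modelled soil moisture.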
Abstract:
Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding is strongly dependent on the modelling approach and the accuracy of topographic data. Here, the areas at risk of sea water flooding in London boroughs were quantified based on the projected SLR scenarios reported in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the UK Climate Projections 2009 (UKCP09), using a tidally-adjusted bathtub modelling approach. Medium- to very high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and DEMs used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models from SRTM data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and of the uncertainties in DEM-based bathtub-type flood inundation modelling for London boroughs.
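A tidally-adjusted bathtub model of the kind described reduces to two steps: threshold the DEM against the projected water level, then keep only the low-lying cells that are hydrologically connected to the sea. The sketch below is a minimal illustration on a toy DEM; the grid values, sea mask, and 1 m water level are invented, not the London analysis itself.

```python
import numpy as np
from scipy import ndimage

def bathtub_inundation(dem, sea_level, sea_mask):
    """Bathtub model: a cell floods if it lies below the projected water level
    AND belongs to a low-lying patch connected to the sea."""
    below = dem <= sea_level
    labels, _ = ndimage.label(below)                 # contiguous low-lying patches
    connected_ids = np.unique(labels[sea_mask & below])
    return np.isin(labels, connected_ids[connected_ids > 0])

# Toy DEM in metres above datum; the study used medium- to very high-resolution
# DEMs (DTM, DSM, SRTM) over Greater London.
dem = np.array([
    [0.2, 0.4, 2.5, 0.3],
    [0.5, 1.8, 2.6, 0.4],
    [0.6, 2.1, 2.7, 0.2],
])
sea = np.zeros_like(dem, dtype=bool)
sea[:, 0] = True                                     # leftmost column borders the estuary
flood = bathtub_inundation(dem, sea_level=1.0, sea_mask=sea)
print(flood.sum(), "of", dem.size, "cells inundated")
print(round(100 * flood.sum() / dem.size, 1), "% of area")
```

Note that the low cells on the right edge stay dry because they are not connected to the sea, which is exactly the behaviour that distinguishes a connectivity-aware bathtub model from a simple elevation threshold.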
Abstract:
This multicentric population-based study in Brazil is the first national effort to estimate the prevalence of hepatitis B (HBV) and its risk factors in the capital cities of the Northeast and Central-West regions and in the Federal District (2004-2005). Random multistage cluster sampling was used to select persons 13-69 years of age. Markers for HBV were tested by enzyme-linked immunosorbent assay. The HBV genotypes were determined by sequencing hepatitis B surface antigen (HBsAg). Multivariate analyses and a simple catalytic model were performed. Overall, 7,881 persons were included; less than 70% were not vaccinated. Positivity for HBsAg was less than 1% among non-vaccinated persons, and genotypes A, D, and F co-circulated. The incidence of infection increased with age, with a similar force of infection in all regions. Male sex and initiation of sexual activity were associated with HBV infection in the two settings; healthcare jobs and prior hospitalization were risk factors in the Federal District. Our survey classified these regions as areas with HBV endemicity and highlighted the differences in risk factors among the settings.
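The simple catalytic model mentioned treats the probability of having ever been infected by age a as p(a) = 1 - exp(-λa), where λ is a constant force of infection. The sketch below fits λ by binomial maximum likelihood to hypothetical age-stratified seroprevalence counts; the numbers are invented, not the survey's data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical age-stratified data: midpoint age, number tested, number ever infected.
age = np.array([15.0, 25.0, 35.0, 45.0, 55.0, 65.0])
n_tested = np.array([900, 850, 800, 700, 600, 400])
n_pos = np.array([25, 60, 95, 115, 125, 105])

def neg_log_lik(lam):
    """Binomial likelihood for the simple catalytic model p(a) = 1 - exp(-lam * a),
    i.e. a constant force of infection lam per year of age."""
    p = 1.0 - np.exp(-lam * age)
    return -np.sum(n_pos * np.log(p) + (n_tested - n_pos) * np.log(1.0 - p))

fit = minimize_scalar(neg_log_lik, bounds=(1e-6, 0.5), method="bounded")
print("estimated force of infection per year:", round(fit.x, 4))
```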
Abstract:
The present study investigated the effects of exercise training on arterial pressure, baroreflex sensitivity, cardiovascular autonomic control and metabolic parameters in female LDL-receptor knockout ovariectomized mice. Mice were divided into two groups: sedentary and trained. The trained group was submitted to an exercise training protocol. Blood cholesterol was measured. Arterial pressure (AP) signals were directly recorded in conscious mice. Baroreflex sensitivity was evaluated by tachycardic and bradycardic responses to AP changes. Cardiovascular autonomic modulation was measured in the frequency (FFT) and time domains. Maximal exercise capacity was increased in the trained group as compared to the sedentary group. Blood cholesterol was diminished in trained mice (191 +/- 8 mg/dL) when compared to sedentary mice (250 +/- 9 mg/dL, p<0.05). Mean AP and HR were reduced in the trained group (101 +/- 3 mmHg and 535 +/- 14 bpm, p<0.05) when compared with the sedentary group (125 +/- 3 mmHg and 600 +/- 12 bpm). Exercise training induced an improvement in the bradycardic reflex response in trained animals (-4.24 +/- 0.62 bpm/mmHg) in relation to sedentary animals (-1.49 +/- 0.15 bpm/mmHg, p<0.01); tachycardic reflex responses were similar between the studied groups. Exercise training increased the variance (34 +/- 8 vs. 6.6 +/- 1.5 ms² in sedentary, p<0.005) and the high-frequency band (HF) of the pulse interval (PI) (53 +/- 7% vs. 26 +/- 6% in sedentary, p<0.01). It is tempting to speculate that the results of this experimental study might represent a rationale for this non-pharmacological intervention in the management of cardiovascular risk factors in dyslipidemic post-menopausal women. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
Abstract:
The incidence of melanoma is increasing worldwide. It is one of the leading cancers in pregnancy and the most common malignancy to metastasize to the placenta and fetus. There are no publications about experimental models of melanoma and pregnancy. We propose a new experimental murine model to study the effects of melanoma on pregnancy and its metastatic process. We tested several doses of melanoma cells until we arrived at the optimal dose, which produced tumor growth and allowed animal survival to the end of pregnancy. Two control groups were used, control (C) and stress control (SC), and three different routes of inoculation: intravenous (IV), intraperitoneal (IP) and subcutaneous (SC). All the fetuses and placentas were examined macroscopically and microscopically. The results suggest that melanoma is a risk factor for intrauterine growth restriction but does not affect placental weight. When inoculated by the SC route, the tumor grew only at the site of implantation. The IP route produced peritoneal tumoral growth and also ovarian and uterine metastases in 60% of the cases. The IV route produced pulmonary tumors. No placental or fetal metastases were obtained, regardless of the inoculation route. The injection of melanoma cells by any route did not increase the rate of fetal resorptions. Surprisingly, animals in the IV groups had no resorptions and a significantly higher number of fetuses. This finding may indicate that tumoral factors released in the host organism to favor tumor survival may also have a pro-gestational action and consequently improve the reproductive performance of these animals.
Abstract:
In this paper we introduce a parametric model for handling lifetime data in which an early failure can be related to infant mortality or to wear processes, but we do not know which risk is responsible for the failure. Maximum likelihood and sampling-based approaches are used to obtain the inferences of interest. Some special cases of the proposed model are studied via Monte Carlo methods with respect to the size and power of hypothesis tests. To illustrate the proposed methodology, we present an example based on a real data set.
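One way to read this setup, with the failure cause latent, is that only the minimum of two latent lifetimes is observed: one risk with a decreasing (infant-mortality) hazard and one with an increasing (wear-out) hazard. The sketch below simulates such data and fits the parameters by maximum likelihood; the Weibull-plus-Weibull parameterisation is purely an illustrative assumption, not necessarily the parametric model proposed in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(1)

# Illustrative latent competing risks data (NOT the paper's model or data):
# an "infant-mortality" risk with decreasing hazard (Weibull shape < 1) and a
# "wear-out" risk with increasing hazard (shape > 1); only the minimum is seen.
t_infant = weibull_min.rvs(0.6, scale=40.0, size=300, random_state=rng)
t_wear = weibull_min.rvs(3.0, scale=20.0, size=300, random_state=rng)
t_obs = np.minimum(t_infant, t_wear)        # failure cause is not recorded

def neg_log_lik(params):
    k1, b1, k2, b2 = np.exp(params)          # log-parameterised for positivity
    hazard = (k1 / b1) * (t_obs / b1) ** (k1 - 1) + (k2 / b2) * (t_obs / b2) ** (k2 - 1)
    log_surv = -(t_obs / b1) ** k1 - (t_obs / b2) ** k2
    return -np.sum(np.log(hazard) + log_surv)

start = np.log([1.0, 30.0, 1.0, 30.0])
fit = minimize(neg_log_lik, start, method="Nelder-Mead")
print("estimated (shape1, scale1, shape2, scale2):", np.round(np.exp(fit.x), 2))
```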
Abstract:
In this paper we deal with robust inference in heteroscedastic measurement error models. Rather than the normal distribution, we postulate a Student t distribution for the observed variables. Maximum likelihood estimates are computed numerically. Consistent estimation of the asymptotic covariance matrices of the maximum likelihood and generalized least squares estimators is also discussed. Three test statistics are proposed for testing hypotheses of interest with the asymptotic chi-square distribution, which guarantees correct asymptotic significance levels. Results of simulations and an application to a real data set are also reported. (C) 2009 The Korean Statistical Society. Published by Elsevier B.V. All rights reserved.
Abstract:
In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments, including the mean and variance, variation coefficient, and modal value. The parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed in order to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one under different sample sizes and censoring percentages. The methodology is illustrated on four real datasets; we also compare the two modeling approaches. (C) 2011 Elsevier B.V. All rights reserved.
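The latent complementary risks construction described, in which only the maximum of a geometric number of exponential lifetimes is observed, is easy to simulate directly, and contrasting it with the Adamidis and Loukas minimum construction makes the "complementary" relationship concrete. The parameter values below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(7)

def rvs_complementary_exp_geom(lam, theta, size, rng):
    """Complementary exponential geometric lifetimes: the observed lifetime is
    the MAXIMUM of M iid Exponential(lam) latent risk lifetimes, with
    M ~ Geometric(theta) on {1, 2, ...}."""
    m = rng.geometric(theta, size=size)
    return np.array([rng.exponential(1.0 / lam, size=mi).max() for mi in m])

def rvs_exp_geom(lam, theta, size, rng):
    """Adamidis-Loukas exponential geometric: the MINIMUM of the latent lifetimes."""
    m = rng.geometric(theta, size=size)
    return np.array([rng.exponential(1.0 / lam, size=mi).min() for mi in m])

x_ceg = rvs_complementary_exp_geom(lam=0.5, theta=0.3, size=5000, rng=rng)
x_eg = rvs_exp_geom(lam=0.5, theta=0.3, size=5000, rng=rng)
print("mean lifetime, complementary EG:", round(x_ceg.mean(), 2))
print("mean lifetime, EG              :", round(x_eg.mean(), 2))
```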
Abstract:
Steatosis is diagnosed on the basis of the macroscopic aspect of the liver evaluated by the surgeon at the time of organ extraction or by means of a frozen biopsy. In the present study, the applicability of laser-induced fluorescence (LIF) spectroscopy was investigated as a method for the diagnosis of different degrees of steatosis experimentally induced in rats. Rats received a high-lipid diet for different periods of time. The animals were divided into groups according to the degree of induced steatosis diagnosed by histology. The concentration of fat in the liver was correlated with LIF by means of the steatosis fluorescence factor (SFF). The histology classification, according to liver fat concentration, was Severe Steatosis, Moderate Steatosis, Mild Steatosis and Control (no liver steatosis). Fluorescence intensity could be directly correlated with fat content. It was possible to estimate the average fluorescence intensity with distinct confidence intervals (P=95%) for each steatosis group. SFF was significantly higher in the Severe Steatosis group (P < 0.001) compared with the Moderate Steatosis, Mild Steatosis and Control groups. The various degrees of steatosis could be directly correlated with SFF. LIF spectroscopy proved to be a method capable of identifying the degree of hepatic steatosis in this animal model, and has the potential for clinical application in the non-invasive evaluation of the degree of steatosis.
Abstract:
The aim of this article is to discuss the estimation of the systematic risk in capital asset pricing models with heavy-tailed error distributions to explain the asset returns. Diagnostic methods for assessing departures from the model assumptions as well as the influence of observations on the parameter estimates are also presented. It may be shown that outlying observations are downweighted in the maximum likelihood equations of linear models with heavy-tailed error distributions, such as Student-t, power exponential, logistic II, and so on. This robustness aspect may also be extended to influential observations. An application in which the systematic risk estimate of Microsoft is compared under normal and heavy-tailed errors is presented for illustration.
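The estimation idea can be sketched by fitting the market model r_asset = α + β·r_mkt + ε by maximum likelihood under normal versus Student-t errors and comparing the β (systematic risk) estimates. The returns below are synthetic with heavy-tailed shocks, and the t degrees of freedom are fixed at 4 for simplicity; the article's actual data (Microsoft) and error families are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import t as student_t, norm

rng = np.random.default_rng(0)

# Synthetic excess returns with heavy-tailed noise (purely illustrative numbers).
n = 250
r_mkt = rng.normal(0.0005, 0.01, size=n)
r_asset = 0.0002 + 1.2 * r_mkt + rng.standard_t(4, size=n) * 0.01

def neg_log_lik(params, dist):
    alpha, beta, log_sigma = params
    resid = (r_asset - alpha - beta * r_mkt) / np.exp(log_sigma)
    logpdf = student_t.logpdf(resid, df=4) if dist == "t" else norm.logpdf(resid)
    return -(logpdf.sum() - n * log_sigma)      # include the scale Jacobian term

for dist in ("normal", "t"):
    fit = minimize(neg_log_lik, x0=[0.0, 1.0, np.log(0.01)], args=(dist,),
                   method="Nelder-Mead")
    print(f"{dist:7s} errors -> systematic risk (beta) = {fit.x[1]:.3f}")
```

With outliers in the returns, the normal fit is pulled towards them while the Student-t fit downweights them, which is the robustness property the abstract refers to.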
Abstract:
In many epidemiological studies it is common to resort to regression models relating the incidence of a disease to its risk factors. The main goal of this paper is to consider inference on such models with error-prone observations and variances of the measurement errors changing across observations. We suppose that the observations follow a bivariate normal distribution and the measurement errors are normally distributed. Aggregate data allow the estimation of the error variances. Maximum likelihood estimates are computed numerically via the EM algorithm. Consistent estimation of the asymptotic variance of the maximum likelihood estimators is also discussed. Test statistics are proposed for testing hypotheses of interest. Further, we implement a simple graphical device that enables an assessment of the model's goodness of fit. Results of simulations concerning the properties of the test statistics are reported. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease. Copyright (C) 2008 John Wiley & Sons, Ltd.
Abstract:
A high incidence of waterborne diseases is observed worldwide, and in order to address contamination problems prior to an outbreak, quantitative microbial risk assessment is a useful tool for estimating the risk of infection. The objective of this paper was to assess the probability of Giardia infection from consuming water from shallow wells in a peri-urban area. Giardia has been described as an important waterborne pathogen and reported in several water sources, including ground waters. Sixteen water samples were collected and examined according to US EPA Method 1623 (2005). A Monte Carlo method was used to estimate the potential risk as described by the exponential dose-response model. Giardia cysts occurred in 62.5% of the samples (0.1-36.1 cysts/l). A median risk of 10⁻¹ for the population was estimated, and adult ingestion was the highest risk driver. This study illustrates the vulnerability of shallow well water supply systems in peri-urban areas.
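The exponential dose-response calculation combined with Monte Carlo sampling of exposure can be sketched as below. The ingestion distribution, recovery correction, and dose-response parameter r ≈ 0.02 for Giardia are illustrative assumptions rather than the study's fitted inputs; only the cyst concentration range (0.1-36.1 cysts/l) comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(123)
n_sim = 10_000

# Illustrative exposure inputs (not the study's fitted values).
concentration = rng.uniform(0.1, 36.1, size=n_sim)       # cysts per litre
ingestion = rng.triangular(0.5, 1.0, 2.0, size=n_sim)     # litres per day, adults
recovery = 0.3                                            # assumed method recovery
r = 0.02                                                  # exponential model parameter

dose = (concentration / recovery) * ingestion
p_daily = 1.0 - np.exp(-r * dose)                         # exponential dose-response
p_annual = 1.0 - (1.0 - p_daily) ** 365                   # annualised risk

print("median daily risk of infection :", round(np.median(p_daily), 3))
print("median annual risk of infection:", round(np.median(p_annual), 3))
```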
Abstract:
This paper traces the developments of credit risk modeling in the past 10 years. Our work can be divided into two parts: selecting articles and summarizing results. On the one hand, by constructing an ordered logit model on the historical Journal of Economic Literature (JEL) codes of articles about credit risk modeling, we sort out the articles most closely related to our topic. The result indicates that JEL codes have become the standard for classifying research in credit risk modeling. On the other hand, comparing with the classical review of Altman and Saunders (1998), we observe some important changes in the research methods of credit risk. The main finding is that the focus of credit risk modeling has moved from static individual-level models to dynamic portfolio models.
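An ordered logit of an ordinal relevance score on JEL-code indicators, in the spirit of the article-selection step described, can be sketched with statsmodels. The data below are synthetic and the chosen JEL codes (G21, G32, C58) are arbitrary examples, not the authors' specification.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(5)
n = 400

# Synthetic article records: dummy indicators for a few JEL codes.
df = pd.DataFrame({
    "jel_G21": rng.integers(0, 2, n),   # banks, depository institutions
    "jel_G32": rng.integers(0, 2, n),   # financing policy, financial risk
    "jel_C58": rng.integers(0, 2, n),   # financial econometrics
})

# Ordinal relevance score generated from a latent logistic index (illustrative).
latent = 1.2 * df.jel_G21 + 0.8 * df.jel_G32 + 1.5 * df.jel_C58 + rng.logistic(size=n)
df["relevance"] = pd.cut(latent, bins=[-np.inf, 0.8, 2.0, np.inf],
                         labels=["unrelated", "partly_related", "core"])

# Ordered logit: P(relevance <= k) = logistic(cut_k - X @ beta).
model = OrderedModel(df["relevance"], df[["jel_G21", "jel_G32", "jel_C58"]],
                     distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.params)   # JEL-code coefficients followed by the threshold parameters
```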