55 results for Relative risk aversion
in CentAUR: Central Archive University of Reading - UK
Abstract:
Developing models to predict the effects of social and economic change on agricultural landscapes is an important challenge. Model development often involves making decisions about which aspects of the system require detailed description and which are reasonably insensitive to the assumptions made. However, important components of the system are often left out because parameter estimates are unavailable. In particular, the relative influence of different objectives, such as risk and environmental management, on farmer decision making has proven difficult to quantify. We describe a model that can make predictions of land use on the basis of profit alone or with the inclusion of explicit additional objectives. Importantly, our model is specifically designed to use parameter estimates for additional objectives obtained via farmer interviews. By statistically comparing the outputs of this model with a large farm-level land-use data set, we show that cropping patterns in the United Kingdom contain a significant contribution from farmers' preferences for objectives other than profit. In particular, we found that risk aversion had an effect on the accuracy of model predictions, whereas preference for a particular number of crops grown was less important. While nonprofit objectives have frequently been identified as factors in farmers' decision making, our results take this analysis further by demonstrating the relationship between these preferences and actual cropping patterns.
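As an illustration only (not the authors' model), the sketch below shows how interview-derived weights for non-profit objectives might be combined with expected profit to rank candidate cropping plans; the function name, weights, areas and margins are all hypothetical.

```python
# Hypothetical sketch of a multi-objective land-use score: expected profit is
# penalised by risk (profit variability) and by deviation from a preferred
# number of crops, with weights of the kind that might be elicited in farmer
# interviews. Names and numbers are illustrative, not the paper's model.

def land_use_score(plan, risk_weight=0.5, crop_count_weight=10.0, preferred_n_crops=4):
    """Score a cropping plan: higher is better."""
    expected_profit = sum(c["area"] * c["margin"] for c in plan)
    profit_variance = sum((c["area"] * c["margin_sd"]) ** 2 for c in plan)
    crop_count_penalty = abs(len(plan) - preferred_n_crops)
    return (expected_profit
            - risk_weight * profit_variance ** 0.5
            - crop_count_weight * crop_count_penalty)

plan_a = [{"area": 40, "margin": 600, "margin_sd": 150},   # e.g. winter wheat
          {"area": 20, "margin": 450, "margin_sd": 80}]    # e.g. oilseed rape
plan_b = plan_a + [{"area": 10, "margin": 300, "margin_sd": 40},
                   {"area": 10, "margin": 350, "margin_sd": 60}]

best = max([plan_a, plan_b], key=land_use_score)
print(f"preferred plan has {len(best)} crops")
```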
Abstract:
We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available. Two are conventional models, namely a multi-level model and a model based upon an approximate likelihood; the third is a newly developed profile likelihood model, which might be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that with the multi-level approach, in the case of baseline heterogeneity, the number of clusters or components is considerably over-estimated. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide more evidence, two simulation studies were conducted. The profile likelihood can be considered a clear alternative to the approximate likelihood model. In the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour compared with the multi-level model. Copyright (C) 2006 John Wiley & Sons, Ltd.
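For context, the profile likelihood model above is described as an extension of the Mantel-Haenszel approach. The sketch below computes the classical Mantel-Haenszel pooled risk ratio across trials; the trial counts are invented and the code is not the paper's model.

```python
# Background sketch (not the paper's profile likelihood model): the classical
# Mantel-Haenszel pooled risk ratio across k trials, which the profile
# likelihood approach is described as extending. Trial counts are invented.

def mantel_haenszel_rr(trials):
    """trials: list of (events_treated, n_treated, events_control, n_control)."""
    num = sum(a * n0 / (n1 + n0) for a, n1, c, n0 in trials)
    den = sum(c * n1 / (n1 + n0) for a, n1, c, n0 in trials)
    return num / den

trials = [(12, 100, 20, 100),   # trial 1: 12/100 treated vs 20/100 control
          (8, 80, 15, 85),      # trial 2
          (5, 60, 9, 55)]       # trial 3
print(f"Pooled RR (MH) = {mantel_haenszel_rr(trials):.2f}")  # < 1 favours treatment
```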
Abstract:
This study compares relative and absolute forms of presenting risk information about influenza and the need for vaccination. It investigates whether differences in people's risk estimates and their evaluations of risk information, as a result of the different presentation formats, are still apparent when they are provided with information about the baseline level of risk. The results showed that, in the absence of baseline information, the relative risk format resulted in higher ratings of satisfaction, perceived effectiveness of vaccination, and likelihood of being vaccinated. However, these differences were not apparent when baseline information was presented. Overall, provision of baseline information resulted in more accurate risk estimates and more positive evaluations of the risk messages. It is recommended that, in order to facilitate shared and fully informed decision making, information about baseline level of risk should be included in all health communications specifying risk reductions, irrespective of the particular format adopted.
Abstract:
Objective: To assess the effectiveness of absolute risk, relative risk, and number needed to harm (NNH) formats for communicating medicine side effects, with and without the provision of baseline risk information. Methods: A two-factor, risk increase format (relative, absolute, and NNH) × baseline (present/absent) between-participants design was used. A sample of 268 women was given a scenario about an increase in side effect risk with third-generation oral contraceptives and answered written questions assessing their understanding, satisfaction, and likelihood of continuing to take the drug. Results: Provision of baseline information significantly improved risk estimates and increased satisfaction, although the estimates were still considerably higher than the actual risk. No differences between presentation formats were observed when baseline information was presented. Without baseline information, absolute risk led to the most accurate performance. Conclusion: The findings support the importance of informing people about the baseline level of risk when describing risk increases. In contrast, they offer no support for using number needed to harm. Practice implications: Health professionals should provide baseline risk information when presenting information about risk increases or decreases. More research is needed before numbers needed to harm (or treat) are given to members of the general population. (c) 2005 Elsevier Ireland Ltd. All rights reserved.
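The three formats compared above are simple arithmetic transformations of one another once the baseline risk is known. The sketch below illustrates the conversions with invented numbers, not the figures used in the oral-contraceptive scenario.

```python
# Illustrative arithmetic linking the three presentation formats in the study
# (numbers are invented, not taken from the oral-contraceptive scenario).

baseline_risk = 1 / 7000        # risk without the drug
treatment_risk = 2 / 7000       # risk with the drug

absolute_risk_increase = treatment_risk - baseline_risk    # ~0.014 %
relative_risk = treatment_risk / baseline_risk             # 2.0, i.e. "risk doubled"
number_needed_to_harm = 1 / absolute_risk_increase         # ~7000 women

print(f"ARI = {absolute_risk_increase:.5%}, RR = {relative_risk:.1f}, "
      f"NNH = {number_needed_to_harm:.0f}")
```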
Abstract:
We apply experimental methods to study the role of risk aversion in players' behavior in repeated prisoners' dilemma games. Faced with quantitatively equal discount factors, the most risk-averse players will choose Nash strategies more often in the presence of uncertainty than when future profits are discounted in a deterministic way. Overall, we find that risk aversion is negatively related to the frequency of collusive outcomes.
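A minimal sketch of the underlying intuition (not the paper's experimental design): with a concave utility, a random-termination continuation is worth less than a deterministically discounted continuation of the same expected value, which makes the future rewards that sustain collusion less attractive to risk-averse players. The CRRA utility and all numbers below are purely illustrative.

```python
# Minimal sketch (not the paper's design): with a concave CRRA utility, a
# random-termination continuation is worth less than a deterministic discount
# of the same expected value, so more risk-averse players lean towards the
# stage-game Nash action. All numbers are illustrative.

def crra(x, rho=0.5):
    """CRRA utility for rho in (0, 1); u(0) = 0."""
    return x ** (1 - rho) / (1 - rho)

delta, V = 0.8, 100.0                       # continuation probability / payoff
u_deterministic = crra(delta * V)           # sure, discounted continuation
u_stochastic = delta * crra(V) + (1 - delta) * crra(0.0)  # random termination

print(f"deterministic: {u_deterministic:.2f}, stochastic: {u_stochastic:.2f}")
# Jensen's inequality: u_stochastic < u_deterministic for strictly concave utility.
```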
Abstract:
We present and experimentally test a theoretical model of majority threshold determination as a function of voters' risk preferences. The experimental results confirm the theoretical prediction of a positive correlation between a voter's risk aversion and the corresponding preferred majority threshold. Furthermore, the experimental results show that a voter's preferred majority threshold is negatively related to the voter's confidence about how others will vote. Moreover, in a treatment in which individuals receive a private signal about others' voting behaviour, the confidence-related motivation of behaviour loses ground to the signal's strength.
Abstract:
Following the 1995 “pill scare” relating to the risk of venous thrombosis from taking second- or third-generation oral contraceptives, the Committee on Safety of Medicines (CSM) withdrew their earlier recommended restrictions on the use of third-generation pills and published recommended wording to be used in patient information leaflets. However, the effectiveness of this wording has not been tested. An empirical study (with 186 pill users, past users, and non-users) was conducted to assess understanding, based on this wording, of the absolute and relative risk of thrombosis in pill users and in pregnancy. The results showed that less than 12% of women in the (higher education) group fully understood the absolute levels of risk from taking the pill and from being pregnant. Relative risk was also poorly understood, with less than 40% of participants showing full understanding, and 20% showing no understanding. We recommend that the CSM revisit the wording currently provided to millions of women in the UK.
Abstract:
Objectives: We examined the characteristics and CHD risks of people who accessed the free Healthy Heart Assessment (HHA) service operated by a large UK pharmacy chain from August 2004 to April 2006. Methods: Associations between participants' gender, age, and socioeconomic group were explored in relation to calculated 10-year CHD risk by cross-tabulation of the data. Specific associations were tested by forming contingency tables and using Pearson's chi-square (χ2) test. Results: Data from 8,287 records were analysable; 5,377 participants were at low and 2,910 at moderate-to-high CHD risk. The likelihood of moderate-to-high risk was significantly higher for male than for female participants, with a relative risk ratio (RRR) of 1.72 (P < 0.001). A higher percentage of those in the socioeconomic categories 'constrained by circumstances' (RRR 1.15; P < 0.05) and 'blue collar communities' (RRR 1.13; P < 0.05) were assessed as being at moderate-to-high risk compared with those in 'prospering suburbs'. Conclusions: People from 'hard-to-reach' sectors of the population, men and people from less advantaged communities, accessed the HHA service and were more likely to return a moderate-to-high CHD risk. Pharmacists prioritised the provision of lifestyle information above the sale of a product. Our study supports the notion that pharmacies can serve as suitable environments for the delivery of similar screening services.
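As an illustration of the kind of contingency-table analysis described in the Methods, the sketch below computes a relative risk ratio and a Pearson chi-square for gender versus risk category; the cell counts are invented (chosen only to be consistent with the reported totals) and are not the HHA data.

```python
# Illustrative 2x2 contingency-table analysis of the kind described in the
# Methods (invented counts, not the HHA service data).
import numpy as np
from scipy.stats import chi2_contingency

#                    moderate-to-high risk   low risk
table = np.array([[1400, 2000],    # male participants
                  [1510, 3377]])   # female participants

risk_male = table[0, 0] / table[0].sum()
risk_female = table[1, 0] / table[1].sum()
rrr = risk_male / risk_female                      # relative risk ratio

chi2, p, dof, _ = chi2_contingency(table)
print(f"RRR = {rrr:.2f}, chi2({dof}) = {chi2:.1f}, p = {p:.3g}")
```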
Abstract:
The farm-level success of Bt-cotton in developing countries is well documented. However, the literature has only recently begun to recognise the importance of accounting for the effects of the technology on production risk, in addition to the mean effect estimated by previous studies. The risk effects of the technology are likely to be very important to smallholder farmers in the developing world because of their risk aversion. We advance the emerging literature on Bt-cotton and production risk by using panel data methods to control for possible endogeneity of Bt adoption. We estimate two models: the first is a fixed-effects version of the Just and Pope model with additive individual and time effects; the second is a variation of that model in which inputs and variety choice are allowed to affect the variance of the time effect and its correlation with the idiosyncratic error. The models are applied to panel data on smallholder cotton production in India and South Africa. Our results suggest a risk-reducing effect of Bt-cotton in India, but an inconclusive picture in South Africa.
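The sketch below is not the authors' fixed-effects specification; it shows the basic two-step Just and Pope idea on simulated data: fit a mean yield function, then regress log squared residuals on the same inputs, so that the sign of the Bt coefficient in the second step indicates whether the technology is risk-increasing or risk-reducing.

```python
# Not the paper's fixed-effects estimator: a minimal two-step Just and Pope
# sketch on simulated data. Step 1 fits the mean yield function; step 2
# regresses log squared residuals on the same inputs, so a negative coefficient
# on the Bt dummy indicates a risk- (variance-) reducing technology.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
bt = rng.integers(0, 2, n)                    # Bt adoption dummy
inputs = rng.normal(1.0, 0.3, n)              # e.g. a fertiliser index
X = sm.add_constant(np.column_stack([bt, inputs]))

# Simulate yields in which Bt raises the mean and lowers the variance
sd = np.exp(0.5 * (0.4 - 0.6 * bt))
y = 2.0 + 0.5 * bt + 0.8 * inputs + sd * rng.normal(size=n)

mean_fit = sm.OLS(y, X).fit()                            # step 1: mean function
var_fit = sm.OLS(np.log(mean_fit.resid ** 2), X).fit()   # step 2: variance function
print(f"Bt effect on mean: {mean_fit.params[1]:.2f}, "
      f"on log variance: {var_fit.params[1]:.2f}")
```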
Abstract:
This study proposes a utility-based framework for the determination of optimal hedge ratios (OHRs) that can allow for the impact of higher moments on hedging decisions. We examine the entire hyperbolic absolute risk aversion family of utilities, which includes quadratic, logarithmic, power, and exponential utility functions. We find that for both moderate and large spot (commodity) exposures, the performance of out-of-sample hedges constructed allowing for nonzero higher moments is better than that of the simpler OLS hedge ratio. The picture is, however, not uniform across our seven spot commodities, as there is one instance (cotton) for which the modeling of higher moments decreases welfare out-of-sample relative to the simpler OLS. We support our empirical findings with a theoretical analysis of optimal hedging decisions, and we uncover a novel link between OHRs and the minimax hedge ratio, that is, the ratio which minimizes the largest loss of the hedged position. © 2011 Wiley Periodicals, Inc.
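For reference, the OLS benchmark mentioned above is the minimum-variance hedge ratio, i.e. the slope from regressing spot returns on futures returns. The sketch below computes it on simulated return series, not the paper's commodity data.

```python
# The OLS benchmark against which the higher-moment hedges are compared:
# the minimum-variance hedge ratio is the slope of spot returns on futures
# returns. The return series here are simulated, not the paper's commodities.
import numpy as np

rng = np.random.default_rng(1)
futures = rng.normal(0.0, 0.02, 1000)
spot = 0.9 * futures + rng.normal(0.0, 0.01, 1000)    # correlated spot returns

ols_hedge_ratio = np.cov(spot, futures)[0, 1] / np.var(futures, ddof=1)
hedged = spot - ols_hedge_ratio * futures              # hedged position return
print(f"h* = {ols_hedge_ratio:.2f}, "
      f"variance reduction = {1 - hedged.var() / spot.var():.1%}")
```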
Abstract:
Purpose: Few robust randomised controlled trials investigating fruit and vegetable (F&V) intake in people at risk of cardiovascular disease (CVD) exist. We aimed to design and validate a dietary strategy of increasing flavonoid-rich versus flavonoid-poor F&V consumption and to assess its effect on nutrient biomarker profile. Methods: A parallel, randomised, controlled, dose–response dietary intervention study. Participants with a CVD relative risk of 1.5, assessed by risk scores, were randomly assigned to one of three groups: habitual (control, CT), high-flavonoid (HF) or low-flavonoid (LF) diets. While the CT group (n = 57) consumed their habitual diet throughout, the HF (n = 58) and LF (n = 59) groups sequentially increased their daily F&V intake by an additional 2, 4 and 6 portions over successive 6-week periods during the 18-week study. Results: Compliance with the target numbers and types of F&V was broadly met and was verified by dietary records and by plasma and urinary biomarkers. The mean (±SEM) number of F&V portions/day consumed by the HF and LF groups at baseline (3.8 ± 0.3 and 3.4 ± 0.3), 6 weeks (6.3 ± 0.4 and 5.8 ± 0.3), 12 weeks (7.0 ± 0.3 and 6.8 ± 0.3) and 18 weeks (7.6 ± 0.4 and 8.1 ± 0.4), respectively, was similar at baseline yet higher than in the CT group (3.9 ± 0.3, 4.3 ± 0.3, 4.6 ± 0.4, 4.5 ± 0.3) (P = 0.015). There was a dose-dependent increase in dietary and urinary flavonoids in the HF group, with no change in the other groups (P = 0.0001). Significantly higher dietary intakes of folate (P = 0.035), non-starch polysaccharides (P = 0.001), vitamin C (P = 0.0001) and carotenoids (P = 0.0001) were observed in both intervention groups compared with CT, which was broadly supported by nutrient biomarker analysis. Conclusions: The success of improving nutrient profile by active encouragement of F&V intake in an intervention study implies the need for a more hands-on public health approach.
Abstract:
1. It has been postulated that climate warming may pose the greatest threat to species in the tropics, where ectotherms have evolved more thermal-specialist physiologies. Although species could rapidly respond to environmental change through adaptation, little is known about the potential for thermal adaptation, especially in tropical species. 2. In the light of the limited empirical evidence available and predictions from mutation-selection theory, we might expect tropical ectotherms to have limited genetic variance to enable adaptation. However, as a consequence of thermodynamic constraints, we might expect this disadvantage to be at least partially offset by a fitness advantage, that is, the 'hotter-is-better' hypothesis. 3. Using an established quantitative genetics model and metabolic scaling relationships, we integrate the consequences of the opposing forces of thermal specialization and thermodynamic constraints on adaptive potential by evaluating extinction risk under climate warming. We conclude that the potential advantage of a higher maximal development rate can in theory more than offset the potential disadvantage of lower genetic variance associated with a thermal-specialist strategy. 4. Quantitative estimates of extinction risk are fundamentally very sensitive to estimates of generation time and genetic variance. However, our qualitative conclusion that the relative risk of extinction is likely to be lower for tropical than for temperate species is robust to assumptions regarding the effects of effective population size, mutation rate and birth rate per capita. 5. With a view to improving ecological forecasts, we use this modelling framework to review the sensitivity of our predictions to the model's underpinning theoretical assumptions and to the empirical basis of the macroecological patterns that suggest thermal specialization and fitness increase towards the tropics. We conclude by suggesting priority areas for further empirical research.
Abstract:
A method is presented to calculate economic optimum fungicide doses accounting for the risk aversion of growers responding to variability in disease severity between crops. Simple dose-response and disease-yield loss functions are used to estimate net disease-related costs (fungicide cost plus disease-induced yield loss) as a function of dose and untreated severity. With fairly general assumptions about the shapes of the probability distribution of disease severity and of the other functions involved, we show that a choice of fungicide dose which minimises net costs on average across seasons results in occasional large net costs caused by inadequate control in high disease seasons. This may be unacceptable to a grower with limited capital. A risk-averse grower can choose to reduce the size and frequency of such losses by applying a higher dose as insurance. For example, a grower may decide to accept 'high loss' years one year in ten or one year in twenty (i.e. specifying the proportion of years in which disease severity and net costs will be above a specified level). Our analysis shows that taking into account disease severity variation and risk aversion will usually increase the dose applied by an economically rational grower. The analysis is illustrated with data on septoria tritici leaf blotch of wheat caused by Mycosphaerella graminicola. Observations from untreated field plots at sites across England over three years were used to estimate the probability distribution of disease severities at mid-grain filling. In the absence of a fully reliable disease forecasting scheme, reducing the frequency of 'high loss' years requires substantially higher doses to be applied to all crops. Disease resistant cultivars reduce both the optimal dose at all levels of risk and the disease-related costs at all doses.
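A numerical sketch of the idea only, with an invented dose-response curve, cost parameters and severity distribution (not the fitted functions from the septoria data): it compares the dose that minimises the mean net cost across seasons with the dose that minimises the 90th-percentile cost, i.e. limits 'high loss' years to roughly one in ten.

```python
# Numerical sketch of the idea only (invented dose-response, cost and severity
# distribution, not the paper's fitted functions): compare the dose that
# minimises the mean net disease-related cost across seasons with the dose that
# minimises the 90th-percentile cost, i.e. limits 'high loss' years to 1 in 10.
import numpy as np

rng = np.random.default_rng(2)
severity = rng.lognormal(mean=1.0, sigma=0.8, size=10_000)   # untreated severity (%)

def net_cost(dose, severity, fungicide_cost=30.0, loss_per_unit=25.0, k=1.5):
    controlled = severity * np.exp(-k * dose)     # exponential dose-response
    return fungicide_cost * dose + loss_per_unit * controlled

doses = np.linspace(0.0, 2.0, 201)
mean_cost = [net_cost(d, severity).mean() for d in doses]
p90_cost = [np.percentile(net_cost(d, severity), 90) for d in doses]

print(f"dose minimising mean cost: {doses[np.argmin(mean_cost)]:.2f}")
print(f"dose minimising 90th-percentile cost: {doses[np.argmin(p90_cost)]:.2f}")
```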
Abstract:
We propose, first, a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using Principal Component Analysis, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e. subjects' average risk taking and their sensitivity towards variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency to risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared to low-risk lotteries, compatible with the over- (under-) weighting of small (large) probabilities predicted in PT; and gender differences, i.e. males being consistently less risk averse than females but both genders being similarly responsive to increases in the risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to an increase in the return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. We therefore conclude from our data that an "economic anomaly" emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that although in many domains paid subjects probably do exert extra mental effort which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms (p. 635). Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity towards variations in the return to risk, are desirable not only to describe behavior under risk but also to explain behavior in other contexts, as illustrated by an example.
In the second study, we propose three additional treatments intended to elicit risk attitudes under high stakes and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, both at the aggregate and at the within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking, that is, in all treatments females prefer safer lotteries compared to males. Regarding our second dimension of risk attitudes, we observe, in all treatments, an increase in risk taking in response to risk-premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments compared to low-stake treatments, and a lower degree of risk taking when losses are incorporated into the large-stake lotteries. These results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; Weber & Chapman, 2005; Wik et al., 2007) and domain effects (e.g., Brooks & Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, by contrast, we find that the effect of incorporating losses into the outcomes is not so clear: at the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, whilst at the within-subjects level the effect weakens. Finally, regarding responses to the risk premium, we find that, compared to gains-only treatments, sensitivity is lower in the mixed-lottery treatments (SL and LL). In general, sensitivity to risk-return is more affected by the domain than by the stake size.
Having described the properties of risk attitudes as captured by the SGG risk elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond their incompatibility with modern economic theories such as PT and CPT, all of which call for tests with multiple degrees of freedom. Being faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful to describe behavior under uncertainty and to explain behavior in other contexts. Hopefully, this will contribute to the creation of large datasets containing a multidimensional description of individual risk attitudes, while at the same time allowing for a robust context, compatible with present and even future, more complex descriptions of human attitudes towards risk.
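A minimal sketch of the kind of analysis described in the first study, using simulated choices rather than the SGG data: principal component analysis of a subjects-by-panels matrix of lottery choices, from which an average-risk-taking dimension and a risk-premium-sensitivity dimension can be recovered.

```python
# Sketch of the kind of analysis described (simulated choices, not the SGG
# data): PCA on a subjects x panels matrix of chosen lottery riskiness. The
# first component captures average risk taking across panels; the second
# reflects how choices shift as the risk premium rises across panels.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_subjects, n_panels = 300, 4
risk_premium = np.linspace(0.0, 1.0, n_panels)        # rises across panels

base_risk_taking = rng.normal(0.0, 1.0, n_subjects)   # subject-level average
sensitivity = rng.normal(0.5, 0.3, n_subjects)        # response to the premium
choices = (base_risk_taking[:, None]
           + sensitivity[:, None] * risk_premium[None, :]
           + rng.normal(0.0, 0.3, (n_subjects, n_panels)))

pca = PCA(n_components=2)
scores = pca.fit_transform(choices)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("corr(PC1, average risk taking):",
      round(np.corrcoef(scores[:, 0], base_risk_taking)[0, 1], 2))
```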