842 results for Risk taking.
Abstract:
This article examines the moment of exchange between artist, audience and culture in Live Art. Drawing on historical and contemporary examples, including examples from the Exist in 08 Live Art Event in Brisbane, Australia, in October 2008, it argues that Live Art - be it body art, activist art, site-specific performance, or other sorts of performative intervention in the public sphere - is characterised by a common set of claims about activating audiences, asking them to reflect on cultural norms challenged in the work. Live Art presents risky actions, in a context that blurs the boundaries between art and reality, to position audients as ‘witnesses’ who are personally implicated in, and responsible for, the actions unfolding before them. This article problematises assumptions about the way the uncertainties embedded in the Live Art encounter contribute to its deconstructive agenda. It uses the ethical theory of Emmanuel Levinas, Hans-Thies Lehmann and Dwight Conquergood to examine the mechanics of the reductive, culturally-recuperative readings that can limit the efficacy of the Live Art encounter. It argues that, though ‘witnessing’ in Live Art depends on a relation to the real - real people, taking real risks, in real places - if it fails to foreground the theatrical frame, it is difficult for audients to develop the dual consciousness of the content, and of their complicity in that content, that is the starting point for reflexivity, and response-ability, in the ethical encounter.
Abstract:
Increased crash risk is associated with sedative medications, and researchers and health professionals have called for improvements to medication warnings about driving. The tiered warning system used in France since 2005 indicates risk level, uses a color-coded pictogram, and advises the user to seek the advice of a doctor before driving. In Queensland, Australia, the mandatory warning on medications that may cause drowsiness advises users not to drive or operate machinery if they self-assess that they are affected, and calls attention to possible increased impairment when the medication is combined with alcohol. Objectives: The aims of the study were to establish and compare risk perceptions associated with the Queensland and French warnings among medication users. It was conducted to complement the work of DRUID in reviewing the effectiveness of existing campaigns and practice guidelines. Methods: Medication users in France and Queensland were surveyed using warnings about driving from both contexts to compare risk perceptions associated with each label. Both samples were assessed for perceptions of the warning that carried the strongest message of risk. The Queensland study also included perceptions of the likelihood of crash and level of impairment associated with the warning. Results: Findings from the French study (N = 75) indicate that when all labels were compared, the majority of respondents perceived the French Level-3 label as the strongest warning about risk concerning driving. Respondents in Queensland had significantly stronger perceptions of potential impairment to driving ability, z = -13.26, p < .001 (n = 325), and of the potential chance of having a crash, z = -11.87, p < .001 (n = 322), after taking a medication that displayed the strongest French warning, compared with the strongest Queensland warning. Conclusions: Evidence suggests that warnings about driving displayed on medications can influence risk perceptions associated with use of medication. Further analyses will determine whether risk perceptions influence compliance with the warnings.
Abstract:
This dissertation examines the compliance and performance of a large sample of faith-based (religious) ethical funds - Shari'ah-compliant equity funds (SEFs), which may be viewed as a form of ethical investing. SEFs screen their investments for compliance with Islamic law, where riba (conventional interest expense), maysir (gambling), gharar (excessive uncertainty), and non-halal (non-ethical) products are prohibited. Using a set of stringent Shari'ah screens similar to those of MSCI Islamic, we first examine the extent to which SEFs comply with Shari'ah law. Results show that only about 27% of the equities held by SEFs are Shari'ah-compliant. While most of the fund holdings pass the business screens, only about 42% pass the total debt to total assets ratio screen. This finding suggests that, in order to avoid a significant reduction in the investment opportunity set, Shari'ah principles are compromised, with SEFs adopting lax screening rules so as to achieve competitive financial performance. While younger funds, funds that charge higher fees, and funds domiciled in predominantly Muslim countries are more Shari'ah-compliant, we find little evidence of a positive relationship between fund disclosure of the Shari'ah compliance framework and Shari'ah compliance. Clearly, Shari'ah compliance remains a major challenge for fund managers, and SEF investors should be aware of Shari'ah-compliance risk, since fund managers do not always fulfill their fiduciary obligation as promised in their prospectuses. Employing a matched-firm approach for a survivorship-free sample of 387 SEFs, we then examine an issue that has been heavily debated in the literature: does ethical screening reduce investment performance? Results show that it does, but only by an average of 0.04% per month if benchmarked against matched conventional funds - a relatively small price to pay for religious faith. Cross-sectional regressions show an inverse relationship between Shari'ah compliance and fund performance: every one-percentage-point increase in total compliance decreases fund performance by 0.01% per month. However, compliance fails to explain differences in performance between SEFs and matched funds. Although SEFs do not generally perform better during crisis periods, further analysis shows evidence of better performance relative to conventional funds only during the recent Global Financial Crisis; the latter is consistent with popular media claims.
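A minimal sketch of the kind of quantitative screen described above, assuming illustrative thresholds and field names (the one-third debt cap mirrors common industry practice, e.g. MSCI Islamic, and is not necessarily the paper's exact rule):

```python
# Hypothetical Shari'ah equity screen; the sector list, field names and the
# one-third debt cap are illustrative assumptions, not the paper's rules.
PROHIBITED_SECTORS = {"alcohol", "gambling", "tobacco", "conventional banking"}
DEBT_RATIO_CAP = 1 / 3  # total debt / total assets

def passes_screens(stock: dict) -> bool:
    """True if a holding passes both the business and financial screens."""
    business_ok = stock["sector"] not in PROHIBITED_SECTORS
    financial_ok = stock["total_debt"] / stock["total_assets"] <= DEBT_RATIO_CAP
    return business_ok and financial_ok

holdings = [
    {"name": "A", "sector": "technology", "total_debt": 20.0, "total_assets": 100.0},
    {"name": "B", "sector": "technology", "total_debt": 55.0, "total_assets": 100.0},
]
compliant_share = sum(passes_screens(s) for s in holdings) / len(holdings)
print(f"Share of holdings passing the screens: {compliant_share:.0%}")
```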
Abstract:
This paper proposes a technique that supports process participants in making risk-informed decisions, with the aim of reducing process risks. Risk reduction involves decreasing the likelihood of a process fault occurring and the severity of its effects. Given a process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we prompt the participant with the expected risk that a given fault will occur given that particular input. These risks are predicted by traversing decision trees generated from the logs of past process executions, considering process data, involved resources, task durations and contextual information such as task frequencies. The approach has been implemented in the YAWL system and its effectiveness evaluated. The results show that the process instances executed in the tests complete with substantially fewer faults and lower fault severities when the recommendations provided by our technique are taken into account.
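As a rough illustration of the prediction step (not the authors' implementation: the features, data and use of scikit-learn are assumptions), a decision tree trained on features extracted from past execution logs can score each candidate input a participant might provide:

```python
# Hypothetical sketch of fault-risk prediction from process logs; the
# feature set and library choice (scikit-learn) are illustrative only.
from sklearn.tree import DecisionTreeClassifier

# One row per completed process instance:
# [task_duration_s, resource_id, task_frequency, amount]
X = [
    [120, 0, 0.9, 1000],
    [300, 1, 0.2, 50000],
    [90,  0, 0.8, 2000],
    [400, 1, 0.1, 80000],
]
y = [0, 1, 0, 1]  # 1 if the instance ended in a fault

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

# At a decision point, score each input the participant could choose and
# surface the predicted fault likelihood as a recommendation.
candidates = {"approve": [150, 0, 0.7, 60000], "escalate": [200, 1, 0.7, 60000]}
for action, features in candidates.items():
    fault_prob = tree.predict_proba([features])[0][1]
    print(f"{action}: predicted fault likelihood {fault_prob:.2f}")
```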
Abstract:
This paper proposes a concrete approach for the automatic mitigation of risks that are detected during process enactment. Given a process model exposed to risks, e.g. a financial process exposed to the risk of approval fraud, we enact this process and, as soon as the likelihood of the associated risk(s) is no longer tolerable, we generate a set of possible mitigation actions to reduce the risks' likelihood, ideally eliminating the risks altogether. A mitigation action is a sequence of controlled changes applied to the running process instance, taking into account a snapshot of the process resources and data, and the current status of the system in which the process is executed. These actions are proposed as recommendations to help process administrators mitigate process-related risks as soon as they arise. The approach has been implemented in the YAWL environment and its performance evaluated. The results show that it is possible to mitigate process-related risks within a few minutes.
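A highly simplified sketch of this mitigate-on-threshold loop; the candidate actions, the likelihood estimator and the tolerance value are invented for illustration (the actual approach applies controlled changes inside the YAWL engine):

```python
# Hypothetical sketch of threshold-triggered risk mitigation; the action
# set and likelihood estimator are illustrative assumptions.
TOLERANCE = 0.3  # maximum tolerable fault likelihood

def risk_likelihood(instance: dict) -> float:
    """Toy estimator; in the real approach this comes from a risk monitor."""
    return instance["risk"]

def candidate_actions(instance: dict):
    """Enumerate controlled changes to the running process instance."""
    yield ("reassign approver", {**instance, "risk": instance["risk"] * 0.5})
    yield ("add second approval task", {**instance, "risk": instance["risk"] * 0.4})

def recommend_mitigations(instance: dict) -> list:
    if risk_likelihood(instance) <= TOLERANCE:
        return []  # risk still tolerable, nothing to recommend
    # Rank actions by how far they reduce the predicted likelihood.
    ranked = sorted(candidate_actions(instance),
                    key=lambda action: risk_likelihood(action[1]))
    return [name for name, changed in ranked
            if risk_likelihood(changed) < risk_likelihood(instance)]

print(recommend_mitigations({"risk": 0.6}))  # both actions recommended
```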
Abstract:
Recent road safety statistics show that the decades-long decreasing trend in fatalities is stopping and stagnating. Statistics further show that crashes are mostly driven by human error, compared with other factors such as environmental conditions and mechanical defects. Within human error, the dominant error source is perceptive errors, which represent about 50% of the total. The next two sources are interpretation and evaluation, which together with perception account for more than 75% of human-error-related crashes. These statistics show that allowing drivers to perceive and understand their environment better, or supplementing them when they are clearly at fault, supports a good assessment of road risk and, as a consequence, can further decrease fatalities. To address this problem, currently deployed driving assistance systems combine more and more information from diverse sources (sensors) to enhance the driver's perception of their environment. However, because of inherent limitations in range and field of view, these systems' perception of their environment remains largely limited to a small zone of interest around a single vehicle. Such limitations can be overcome by increasing the zone of interest through a cooperative process. Cooperative Systems (CS), a specific subset of Intelligent Transportation Systems (ITS), aim at compensating for local systems' limitations by combining embedded information technology and intervehicular communication technology (IVC). With CS, information sources are no longer limited to a single vehicle. From this distribution arises the concept of extended, or augmented, perception. Augmented perception extends an actor's perceptive horizon beyond its "natural" limits, not only by fusing information from multiple in-vehicle sensors but also by incorporating information obtained from remote sensors. The end result of an augmented perception and data fusion chain is known as an augmented map: a repository where any relevant information about objects in the environment, and the environment itself, can be stored in a layered architecture. This thesis aims at demonstrating that augmented perception performs better than non-cooperative approaches and that it can be used to successfully identify road risk. It was necessary to evaluate the performance of augmented perception in order to obtain better knowledge of its limitations. Indeed, while many promising results have already been obtained, neither the feasibility of building an augmented map from exchanged local perception information, and then using this information beneficially for road users, nor the limitations of augmented perception and its underlying technologies, has been thoroughly assessed. Most notably, many questions remain unanswered as to IVC performance and its ability to deliver appropriate quality of service to support life-saving critical systems. This is especially true as the road environment is a complex, highly variable setting where many sources of imperfections and errors exist, not limited to IVC. We first provide a discussion of these limitations and a performance model built to incorporate them, created from empirical data collected on test tracks. Our results are more pessimistic than the existing literature, suggesting that IVC limitations have been underestimated. Then, we develop a new CS-applications simulation architecture.
This architecture is used to obtain new results on the safety benefits of a cooperative safety application (EEBL), and then to support further study of augmented perception. At first, we confirm earlier results in terms of decreases in crash numbers, but raise doubts about benefits in terms of crash severity. In the next step, we implement an augmented perception architecture tasked with creating an augmented map. Our approach aims at providing a generalist architecture that can use many different types of sensors to create the map and that is not limited to any specific application. The data association problem is tackled with an MHT approach based on Belief Theory. Then, augmented and single-vehicle perception are compared in a reference driving scenario for risk assessment, taking into account the IVC limitations obtained earlier; we show their impact on the augmented map's performance. Our results show that augmented perception performs better than non-cooperative approaches, almost tripling the advance warning time before a crash. IVC limitations appear to have no significant effect on this performance, although this finding might be valid only for our specific scenario. Eventually, we propose a new approach that uses augmented perception to identify road risk through a surrogate: near-miss events. A CS-based approach is designed and validated to detect near-miss events, and then compared to a non-cooperative approach based on vehicles equipped with local sensors only. The cooperative approach shows a significant improvement in the number of events that can be detected, especially at higher rates of system deployment.
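A toy sketch of the layered augmented-map idea; the layer names, track format and "keep the freshest estimate" fusion rule are invented for illustration, and stand in for the thesis's far richer MHT/Belief Theory data association:

```python
# Hypothetical sketch of an augmented map: a layered store merging object
# tracks from local sensors with tracks received over IVC. The naive
# freshest-wins fusion rule below is an illustrative placeholder.
from dataclasses import dataclass

@dataclass
class Track:
    object_id: str
    x: float          # position (m) in a common reference frame
    y: float
    timestamp: float  # seconds
    source: str       # "local" or "remote"

class AugmentedMap:
    LAYERS = ("static_environment", "moving_objects")

    def __init__(self):
        self.layers = {name: {} for name in self.LAYERS}

    def update(self, layer: str, track: Track) -> None:
        current = self.layers[layer].get(track.object_id)
        # Keep the most recent estimate regardless of source, so remote
        # (IVC) observations extend perception beyond onboard sensor range.
        if current is None or track.timestamp > current.timestamp:
            self.layers[layer][track.object_id] = track

amap = AugmentedMap()
amap.update("moving_objects", Track("car-7", 12.0, 3.0, 10.0, "local"))
amap.update("moving_objects", Track("car-7", 14.0, 3.1, 10.5, "remote"))
print(amap.layers["moving_objects"]["car-7"])  # fresher remote update wins
```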
Abstract:
OBJECTIVE The study investigates the knowledge, intentions, and driving behavior of persons prescribed medications that display a warning about driving. It also examines their confidence that they can self-assess possible impairment, as is required by the Australian labeling system. METHOD We surveyed 358 outpatients in an Australian public hospital pharmacy, representing a well-advised group taking a range of medications, including those displaying a warning label about driving. A brief telephone follow-up survey was conducted with a subgroup of the participants. RESULTS The sample had a median age of 53.2 years and was 53 percent male. Nearly three quarters (73.2%) had taken a potentially impairing class of medication and more than half (56.1%) had taken more than one such medication in the past 12 months. Knowledge of the potentially impairing effects of medication was relatively high for most items; however, participants underestimated the possibility of increased impairment from exceeding the prescribed dose and when commencing treatment. Participants' responses to the safety implications of taking drugs with the highest level of warning varied. Around two thirds (62.8%) indicated that they would consult a health practitioner for advice and around half would modify their driving in some way. However, one fifth (20.9%) would drive when the traffic was thought to be less heavy and over a third (37.7%) would modify their medication regime so that they could drive. The findings from the follow-up survey of a subsample taking target drugs at the time of the first interview were also of concern. Only just over half (51%) recalled seeing the warning label on their medications and, of this group, around three quarters (78%) reported following the warning label advice. These findings indicate that a large proportion of people either did not notice or did not consider the warning when deciding whether to drive. There was a very high level of confidence in this group that they could determine whether they were personally affected by the medication, which may be a problem from a safety perspective. CONCLUSION This study involved persons who should have had a very high level of knowledge and awareness of medication warning labeling. Even in this group there was a lack of informed response to potential impairment. A review of the Australian warning system and wider dissemination of information on medication treatment effects would be useful. Clarifying the importance of potential risk in the general community context is recommended for further research.
Abstract:
INTRODUCTION: The first South African National Burden of Disease study quantified the underlying causes of premature mortality and morbidity experienced in South Africa in the year 2000. This was followed by a Comparative Risk Assessment to estimate the contributions of 17 selected risk factors to the burden of disease in South Africa. This paper describes the health impact of exposure to four selected environmental risk factors: unsafe water, sanitation and hygiene; indoor air pollution from household use of solid fuels; urban outdoor air pollution; and lead exposure. METHODS: The study followed World Health Organization comparative risk assessment methodology. Population-attributable fractions were calculated and applied to revised burden of disease estimates (deaths and disability-adjusted life years [DALYs]) from the South African Burden of Disease study to obtain the attributable burden for each selected risk factor. The burden attributable to the joint effect of the four environmental risk factors was also estimated, taking into account competing risks and common pathways. Monte Carlo simulation-modeling techniques were used to quantify sampling uncertainty. RESULTS: Almost 24 000 deaths were attributable to the joint effect of these four environmental risk factors, accounting for 4.6% (95% uncertainty interval 3.8-5.3%) of all deaths in South Africa in 2000. Overall, the burden due to these environmental risks was equivalent to 3.7% (95% uncertainty interval 3.4-4.0%) of the total disease burden for South Africa, with unsafe water, sanitation and hygiene the main contributor to the joint burden. The joint attributable burden was especially high in children under 5 years of age, accounting for 10.8% of total deaths in this age group and 9.7% of the burden of disease. CONCLUSION: This study highlights the public health impact of exposure to environmental risks and the significant burden of preventable disease attributable to exposure to these four major environmental risk factors in South Africa. Evidence-based policies and programs must be developed and implemented to address these risk factors at individual, household, and community levels.
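For context, the comparative risk assessment methodology referenced here rests on the population-attributable fraction; a standard formulation, stated from the general WHO CRA literature rather than from this paper's text, is:

```latex
% Population-attributable fraction for a risk factor with exposure
% categories i, prevalence P_i and relative risk RR_i:
\[
  \mathrm{PAF} = \frac{\sum_i P_i (RR_i - 1)}{\sum_i P_i (RR_i - 1) + 1}
\]
% Attributable burden = PAF x total burden (deaths or DALYs). Assuming
% independent risk factors, the joint PAF is often approximated as
\[
  \mathrm{PAF}_{\mathrm{joint}} = 1 - \prod_k \left(1 - \mathrm{PAF}_k\right),
\]
% which the study then adjusts for competing risks and common pathways.
```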
Abstract:
The disadvantages of continuous cereal cropping, concern about nutrient leaching, and the price of nitrogen (N) fertilizer have all increased during recent decades. An undersown crop, which grows together with a main crop and continues after harvest, could mitigate all of these problems. The aim of this study was to develop undersowing in Finnish conditions so that it suits spring cereal farming as well as possible and improves care of the soil and environment, especially where control of N is concerned. In total, 17 plant species were undersown in spring cereals during field experiments between 1991 and 1999 at four sites in South and Central Finland; after selection, eight of them were studied more thoroughly. Two legumes, one grass species and one mixture of them were included in long-term trials in order to study annually repeated undersowing. Further, simultaneous broadcasting of seeds instead of separate undersowing was studied. Grain yield response, the capacity of the undersown crop to absorb soil N or fix N from the atmosphere, and the release of N were of greatest interest. Seeding rates of undersown crops and N fertilization rates during annually repeated undersowing were also studied. Italian ryegrass (Lolium multiflorum Lam., IR) absorbed soil nitrate N (NO3-N) most efficiently in autumn, and timothy (Phleum pratense L.) in spring. The capacity of the other grass species to absorb N was low, or insufficient considering the negative effect on grain yield. Red clover (Trifolium pratense L.) and white clover (Trifolium repens L.) suited annually repeated undersowing well, supplying fixed N for cereals without a markedly increased risk of N leaching. The autumn-oriented growth rhythm of the studied legumes was optimal for undersowing, whereas the growth rhythm of the grasses was less suited but varied between species. A model of an adaptive undersowing system was outlined in order to emphasize the allocation of measures according to needs. After defining the goal of undersowing, many decisions must be made. When diminishing N leaching is primarily sought, a mixture of IR and timothy is advantageous. Clovers are suited to replacing N fertilization, as the positive residual effect is greater than the negative effect caused by competition. A mixture of legume and non-legume is a good choice when increased diversity is the main target. Seeding rate is an efficient means for adjusting competition and N effects. Broadcasting with soil-covering equipment can be used to establish an undersown crop. In addition, the timing and method of cover crop termination have an important role in the outcome. Continuous observation of the system is needed since, for instance, conditions significantly affect the growth of the undersown crop and, on the other hand, N release from crop residues may increase in the long run.
Abstract:
Tibolone, a synthetic steroid, is effective in the treatment of postmenopausal symptoms. Its cardiovascular safety profile has been questioned, because tibolone reduces the levels of high-density lipoprotein (HDL) cholesterol. Soy-derived isoflavones may offer health benefits, particularly as regards lipids and other cardiovascular disease (CVD) risk factors. The soy-isoflavone metabolite equol is thought to be key to the soy-related beneficial effects. We studied the effects of soy supplementation on various CVD risk factors in postmenopausal monkeys and postmenopausal women using tibolone. In addition, the impact of equol production capability was studied. A total of 18 monkeys received casein/lactalbumin (C/L) (placebo), tibolone, soy (a woman's equivalent dose of 138 mg of isoflavones), or soy with tibolone in a randomized order for 14-week periods, with a 4-week washout (C/L) between treatments. Postmenopausal women using tibolone (N=110) were screened by means of a one-week soy challenge to find 20 women with equol production capability (4-fold elevation from baseline equol level) and 20 control women, who were treated in a randomized cross-over trial with a soy powder (52 g of soy protein containing 112 mg of isoflavones) or placebo for 8 weeks. Before and after the treatments, lipids and lipoproteins were assessed in both monkeys and women. In addition, blood pressure, arterial stiffness, endothelial function, sex steroids, sex hormone-binding globulin (SHBG), and vascular inflammation markers were assessed. A 14% increase in plasma low-density lipoprotein (LDL) + very low-density lipoprotein (VLDL) cholesterol was observed in tibolone-treated monkeys vs. placebo. Soy treatment resulted in an 18% decrease in LDL+VLDL cholesterol, and concomitant supplementation with tibolone did not negate the LDL+VLDL cholesterol-lowering effect of soy. A 30% increase in HDL cholesterol was observed in monkeys fed soy, whereas HDL cholesterol levels were reduced (by 48%) after tibolone. Interestingly, the Soy+Tibolone diet preserved HDL cholesterol levels. Tibolone alone increased the total cholesterol (TC):HDL cholesterol ratio, whereas it was reduced by Soy or Soy+Tibolone. In postmenopausal women using tibolone, reductions in the levels of total cholesterol and LDL cholesterol were seen after soy supplementation compared with placebo, but there was no effect on HDL cholesterol, blood pressure, arterial stiffness or endothelial function. Soy supplementation decreased the levels of estrone in equol producers, and those of testosterone in the entire study population. No changes were seen in the levels of androstenedione, dehydroepiandrosterone sulfate, or SHBG. The levels of vascular cell adhesion molecule-1 increased, and P-selectin decreased, after soy treatment, whereas C-reactive protein and intercellular adhesion molecule-1 remained unchanged. At baseline, and unrelated to soy treatment, equol producers had lower systolic, diastolic and mean arterial pressures, less arterial stiffness and better endothelial function than non-producers. To conclude, soy supplementation reversed the tibolone-induced fall in HDL cholesterol in postmenopausal monkeys, but this effect was not seen in women taking tibolone. Equol production capability was associated with beneficial cardiovascular changes; thus, this characteristic may offer cardiovascular benefits, at least in women using tibolone.
Abstract:
We study risk-sensitive control of continuous time Markov chains taking values in a discrete state space. We study both finite and infinite horizon problems. In the finite horizon problem we characterize the value function via the Hamilton-Jacobi-Bellman equation and obtain an optimal Markov control. We do the same for the infinite horizon discounted cost case. In the infinite horizon average cost case we establish the existence of an optimal stationary control under a certain Lyapunov condition. We also develop a policy iteration algorithm for finding an optimal control.
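To make the finite horizon characterization concrete, a typical risk-sensitive formulation for a controlled continuous time Markov chain with transition rates $q(x,y,u)$, running cost $c(x,u)$ and risk parameter $\theta > 0$ takes the following standard textbook form (stated here as an illustration, not as the paper's exact equations):

```latex
% Risk-sensitive value function on the finite horizon [t, T]:
\[
  \varphi(t,x) = \inf_{u(\cdot)}
    \mathbb{E}\left[ \exp\!\Big(\theta \int_t^T c(X_s, u_s)\, ds\Big)
    \,\Big|\, X_t = x \right]
\]
% Associated Hamilton-Jacobi-Bellman equation (multiplicative dynamic
% programming), with terminal condition \varphi(T,x) = 1 and the
% convention q(x,x,u) = -\sum_{y \neq x} q(x,y,u):
\[
  \partial_t \varphi(t,x)
  + \inf_{u} \Big\{ \sum_{y} q(x,y,u)\, \varphi(t,y)
  + \theta\, c(x,u)\, \varphi(t,x) \Big\} = 0
\]
```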
Abstract:
Infrastructure project sustainability assessment typically entails the use of specialised assessment tools to measure and rate project performance against a set of criteria. This paper looks beyond the prevailing approaches to sustainability assessment and explores sustainability principles in terms of project risks and opportunities. Taking a risk management approach to applying sustainability concepts to projects has the potential to reconceptualise decision structures for sustainability, from bespoke assessments into a standard part of the project decision-making process. By integrating issues of sustainability into project risk management for project planning, design and construction, sustainability is considered within a more traditional business and engineering language. Currently, there is no widely practised approach for objectively considering the environmental and social context of projects alongside the more traditional project risk assessments of time, cost and quality. A risk-based approach would not solve all the issues associated with existing sustainability assessments, but it would place sustainability concerns alongside other key risks and opportunities, integrating sustainability with other project decisions.
Abstract:
Background: Infection with multiple types of human papillomavirus (HPV) is one of the main risk factors associated with the development of cervical lesions. In this study, cervical samples collected from 1,810 women with diverse sociocultural backgrounds, who attended their cervical screening program in different geographical regions of Colombia, were examined for the presence of cervical lesions and HPV by Papanicolaou testing and DNA PCR detection, respectively. Principal Findings: The negative binomial distribution model used in this study showed differences between the observed and expected values within some risk factor categories analyzed. Particularly in the case of single infection and coinfection with more than 4 HPV types, observed frequencies were smaller than expected, while the number of women infected with 2 to 4 viral types was higher than expected. Data analysis according to a negative binomial regression showed an increase in the risk of acquiring more HPV types in women of indigenous ethnicity (+37.8%), while this risk decreased in women who had given birth more than 4 times (-31.1%), or were of mestizo (-24.6%) or black (-40.9%) ethnicity. Conclusions: According to a theoretical probability distribution, the observed number of women having either a single infection or more than 4 viral types was smaller than expected, while for those infected with 2-4 HPV types it was larger than expected. Given that this study showed a higher HPV coinfection rate in women of indigenous ethnicity, the role of underlying factors should be assessed in detail in future studies.
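A minimal sketch of fitting a negative binomial regression of this kind on a count outcome (number of HPV types per woman); the variable names, the invented data and the use of statsmodels are assumptions for illustration:

```python
# Hypothetical negative binomial regression on the number of HPV types
# detected per woman; column names and data are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "n_hpv_types": [0, 1, 2, 4, 1, 3, 0, 2],
    "ethnicity":   ["indigenous", "mestizo", "indigenous", "indigenous",
                    "black", "mestizo", "black", "mestizo"],
    "parity_gt4":  [0, 1, 0, 0, 1, 0, 1, 0],  # more than 4 births
})

model = smf.glm("n_hpv_types ~ C(ethnicity) + parity_gt4", data=df,
                family=sm.families.NegativeBinomial()).fit()

# Exponentiated coefficients are multiplicative effects on the expected
# count, i.e. percentage changes in the risk of carrying more HPV types.
print(model.params.apply(lambda b: f"{np.exp(b) - 1:+.1%}"))
```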
Abstract:
Background: Previous studies have not demonstrated a consistent association between potentially inappropriate medicines (PIMs) in older patients as defined by Beers criteria and avoidable adverse drug events (ADEs). This study aimed to assess whether PIMs defined by new STOPP (Screening Tool of Older Persons’ potentially inappropriate Prescriptions) criteria are significantly associated with ADEs in older people with acute illness.
Methods: We prospectively studied 600 consecutive patients 65 years or older who were admitted with acute illness to a university teaching hospital over a 4-month interval. Potentially inappropriate medicines were defined by both Beers and STOPP criteria. Adverse drug events were defined by World Health Organization–Uppsala Monitoring Centre criteria and verified by a local expert consensus panel, which also assessed whether ADEs were causal or contributory to current hospitalization. Hallas criteria defined ADE avoidability. We compared the proportions of patients taking Beers criteria PIMs and STOPP criteria PIMs with avoidable ADEs that were causal or contributory to admission.
Results: A total of 329 ADEs were detected in 158 of 600 patients (26.3%); 219 of the 329 ADEs (66.6%) were considered causal or contributory to admission. Of these 219 ADEs, 151 (68.9%) were avoidable or potentially avoidable. After adjusting for age, sex, comorbidity, dementia, baseline activities of daily living function, and number of medications, the likelihood of a serious avoidable ADE increased significantly when STOPP criteria PIMs were prescribed (odds ratio, 1.847; 95% confidence interval [CI], 1.506-2.264; P < .001); prescription of Beers criteria PIMs did not significantly increase ADE risk (odds ratio, 1.276; 95% CI, 0.945-1.722; P = .11).
Conclusion: STOPP criteria PIMs, unlike Beers criteria PIMs, are significantly associated with avoidable ADEs in older people that cause or contribute to urgent hospitalization.
Abstract:
Background Previous research has shown that home ownership is associated with a reduced risk of admission to institutional care. The extent to which this reflects associations between wealth and health, between wealth and the ability to buy in care, or an increased motivation to avoid admission related to charging policies is unclear. Taking account of the value of the home, as well as housing tenure, may provide some clarification as to the relative importance of these factors.
Aims To analyse the probability of admission to residential and nursing home care according to housing tenure and house value.
Methods Cox regression was used to examine the association between home ownership, house value and the risk of care home admission over 6 years of follow-up among a cohort of 51 619 people aged 65 years or older drawn from the Northern Ireland Longitudinal Study, a representative sample of approximately 28% of the population of Northern Ireland.
Results 4% of the cohort (2138 people) were admitted during follow-up. Homeowners were less likely than renters to be admitted to care homes (HR 0.77, 95% CI 0.70 to 0.85, after adjusting for age, sex, health, living arrangement and urban/rural differences). There was a strong association between house value/tenure and health, with those in the highest-value houses having the lowest odds of less-than-good health or limiting long-term illness. However, there was no difference in the probability of admission according to house value: HRs were 0.78 (95% CI 0.67 to 0.90) and 0.81 (95% CI 0.70 to 0.95), respectively, for the lowest- and highest-value houses compared with renters.
Conclusions The requirement for people in the UK with capital resources to contribute to their care is a significant disincentive to institutional admission. This may place an additional burden on carers.
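A minimal sketch of the kind of Cox proportional hazards model described in the Methods above; the lifelines library, column names and data are assumptions for illustration:

```python
# Hypothetical Cox proportional hazards analysis of care home admission by
# housing tenure; library choice (lifelines), columns and data are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_followed": [6.0, 2.3, 5.1, 4.1, 6.0, 1.2, 3.5, 6.0],
    "admitted":       [0, 1, 1, 1, 0, 1, 0, 0],   # admitted to care home
    "homeowner":      [1, 0, 1, 0, 1, 0, 0, 1],   # vs. renter
    "age":            [68, 81, 75, 86, 70, 90, 73, 77],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="admitted")

# Hazard ratios: exp(coef) < 1 for "homeowner" would indicate a lower
# admission risk for owners than for renters, as the abstract reports.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```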