846 results for Risk-need-responsivity
Abstract:
Objective: The objective of this paper is to describe the population served in mental health institutions for mental illness relapse, and the process of identifying risk factors in relapsing patients diagnosed with severe mental illness. To this end, a descriptive, exploratory, multicenter, multistage epidemiological study was carried out in mental health institutions of the Order of San Juan de Dios Hospital (OHSJD) with hospitalized relapsing patients diagnosed with a severe mental disorder. This study comes from a nationwide working network of Psychology professionals in the OHSJD. Materials and methods: The population sample was 1005 patients diagnosed with severe mental disorders who had relapsed during the last year. First, the characterization of the general population was conducted; it was then narrowed down to the centers, taking into account similarities and differences found according to the clinical and demographic variables. Results: The major risk factors for relapse found in patients diagnosed with severe mental disorders were: being between 38 and 58 years of age, female, single, a graduate, and unemployed, with a prevalence of bipolar affective disorder diagnosis, between 2 and 10 hospitalizations, 2 to 6 drugs at the time of leaving hospital, severe difficulties relating to others, and difficulties in adherence to treatment. The need for a caregiver was also found, as well as a limited number of received psychological interventions. It was identified how the patients' belief systems affect the disease and contribute to poor adherence to treatment. Conclusions: These results indicate the need to design team intervention strategies ranging from assessment at admission, through definition of therapeutic action plans, to post-hospitalization (discharge) follow-up. There is a poor support network and limited adherence to comprehensive treatment.
Abstract:
High resolution descriptions of plant distribution have utility for many ecological applications but are especially useful for predictive modelling of gene flow from transgenic crops. Difficulty lies in the extrapolation errors that occur when limited ground survey data are scaled up to the landscape or national level. This problem is epitomized by the wide confidence limits generated in a previous attempt to describe the national abundance of riverside Brassica rapa (a wild relative of cultivated rapeseed) across the United Kingdom. Here, we assess the value of airborne remote sensing to locate B. rapa over large areas and so reduce the need for extrapolation. We describe results from flights over the river Nene in England acquired using Airborne Thematic Mapper (ATM) and Compact Airborne Spectrographic Imager (CASI) imagery, together with ground truth data. It proved possible to detect 97% of flowering B. rapa on the basis of spectral profiles. This included all stands of plants that occupied >2 m square (>5 plants), which were detected using single-pixel classification. It also included very small populations (<5 flowering plants, 1-2 m square) that generated mixed pixels, which were detected using spectral unmixing. The high detection accuracy for flowering B. rapa was coupled with a rather large false positive rate (43%). The latter could be reduced by using the image detections to target fieldwork to confirm species identity, or by acquiring additional remote sensing data such as laser altimetry or multitemporal imagery.
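As an aside on the method: the sub-pixel detection described above rests on linear spectral unmixing, which models a mixed pixel's spectrum as a weighted sum of pure "endmember" spectra. A minimal sketch follows; the bands, endmember spectra and pixel values are hypothetical, not the ATM/CASI data from the study.

```python
# Illustrative sketch of linear spectral unmixing, the technique used
# to detect sub-pixel stands of flowering B. rapa. All spectra here
# are hypothetical; real work would use ATM/CASI band libraries.
import numpy as np
from scipy.optimize import nnls

# Columns are endmember spectra (reflectance per band): flowering
# B. rapa, green vegetation, water. Five hypothetical bands.
endmembers = np.array([
    [0.12, 0.10, 0.05],
    [0.35, 0.15, 0.04],
    [0.55, 0.20, 0.03],
    [0.60, 0.45, 0.02],
    [0.58, 0.50, 0.01],
])

# Observed mixed-pixel spectrum (hypothetical).
pixel = np.array([0.11, 0.24, 0.37, 0.52, 0.53])

# Non-negative least squares gives per-endmember abundances;
# normalising them to sum to 1 yields sub-pixel cover fractions.
fractions, residual = nnls(endmembers, pixel)
fractions /= fractions.sum()
print(dict(zip(["B. rapa", "vegetation", "water"], fractions.round(2))))
```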
Abstract:
Introduction Health promotion (HP) aims to enhance good health while preventing ill-health at three levels of activity: primary (preventative), secondary (diagnostic) and tertiary (management).1 It can range from simple provision of health education to ongoing support, but the effectiveness of HP is ultimately dependent on its ability to influence change. HP as part of the Community Pharmacy Contract (CPC) aims to increase public knowledge and target ‘hard-to-reach’ individuals by focusing mainly on primary and tertiary HP. The CPC does not include screening programmes (secondary HP) as a service. Coronary heart disease (CHD) is a significant cause of morbidity and mortality in the UK. While there is evidence to support the effectiveness of some community pharmacy HP strategies in CHD, there is a paucity of research in relation to screening services.2 Against this background, Alliance Pharmacy introduced a free CHD risk screening programme to provide tailored HP advice as part of a participant–pharmacist consultation. The aim of this study is to report on the CHD risk levels of participants and to provide a qualitative indication of consultation outcomes. Methods Case records for 12 733 people who accessed a free CHD risk screening service between August 2004 and April 2006, offered at 217 community pharmacies, were obtained. The service involved initial self-completion of the Healthy Heart Assessment (HHA) form and measurement of height, weight, body mass index, blood pressure, total cholesterol and high-density lipoprotein levels by pharmacists to calculate CHD risk.3 Action taken by pharmacists (lifestyle advice, statin recommendation or general practitioner (GP) referral) and qualitative statements of advice were recorded, and a copy provided to the participants. The service did not include follow-up of participants. All participants consented to taking part in evaluations of the service. Ethical committee scrutiny was not required for this service development evaluation. Results Case records for 10 035 participants (3658 male) were evaluable; 5730 (57%) were at low CHD risk (<15%); 3636 (36%) at moderate-to-high CHD risk (≥15%); and 669 (7%) had existing heart disease. A significantly higher proportion of male (48% versus 30% female) participants were at moderate-to-high risk of CHD (chi-square test; P < 0.005). A range of outcomes resulted from consultations. Lifestyle advice was provided irrespective of participants’ CHD risk or existing disease. In the moderate-to-high-risk group, of which 52% received prescribed medication, lifestyle advice was recorded for 62%, 16% were referred and 34% were advised to have a re-assessment. Statin recommendations were made in 1% of all cases. There was evidence of supportive and motivational statements in the advice recorded. Discussion Pharmacists were able to identify individuals’ level of CHD risk and provide them with bespoke advice. Identification of at-risk participants did not automatically result in referrals or statin recommendations. One-third of those accessing the screening service had moderate-to-high risk of CHD, a significantly higher proportion of whom were men. It is not known whether these individuals had previously been exposed to HP, but by accessing this service they may have been contemplating change.
As the effectiveness of HP advice will depend, among other factors, on its ability to influence change, future consultations may need to explore patients’ attitudes towards change in relation to the Transtheoretical Model4 to better tailor HP advice. The high uptake of the service by those at moderate-to-high CHD risk indicates a need for this type of screening programme in community pharmacy, perhaps specifically to reach men, who access medical services less.
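To illustrate the reported sex comparison, the chi-square test can be reconstructed approximately from the abstract's rounded figures (48% of 3658 men versus 30% of 6377 women at moderate-to-high risk). The cell counts below are derived from those rounded percentages, so totals differ slightly from the paper's 3636.

```python
# Sketch of the sex-difference test reported above: a chi-square test
# on moderate-to-high CHD risk by sex, with counts reconstructed from
# the abstract's rounded percentages.
from scipy.stats import chi2_contingency

men, women = 3658, 10035 - 3658
men_at_risk = round(0.48 * men)        # ~1756
women_at_risk = round(0.30 * women)    # ~1913

table = [
    [men_at_risk, men - men_at_risk],
    [women_at_risk, women - women_at_risk],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # p << 0.005, as reported
```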
Abstract:
Consumers' attitudes to trust and risk are key issues in food safety research and attention needs to be focused on clearly defining a framework for analysing consumer behaviour in these terms. In order to achieve this, a detailed review of the recent literature surrounding risk, trust and the relationship between the two must be conducted. This paper aims to collate the current social sciences literature in the fields of food safety, trust and risk. It provides an insight into the economic and other modelling procedures available to measure consumers' attitudes to risk and trust in food safety and specifically notes the need for future research to concentrate on examining risk and trust as inter-related variables rather than two distinct, mutually exclusive concepts. A framework is proposed which it is hoped will assist in devising more effective research to support risk communication to consumers.
Abstract:
This study compares relative and absolute forms of presenting risk information about influenza and the need for vaccination. It investigates whether differences in people's risk estimates and their evaluations of risk information, as a result of the different presentation formats, are still apparent when they are provided with information about the baseline level of risk. The results showed that, in the absence of baseline information, the relative risk format resulted in higher ratings of satisfaction, perceived effectiveness of vaccination, and likelihood of being vaccinated. However, these differences were not apparent when baseline information was presented. Overall, provision of baseline information resulted in more accurate risk estimates and more positive evaluations of the risk messages. It is recommended that, in order to facilitate shared and fully informed decision making, information about baseline level of risk should be included in all health communications specifying risk reductions, irrespective of the particular format adopted.
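As a worked illustration of the formats compared above: the same hypothetical vaccination benefit expressed as a relative risk reduction looks far larger than as an absolute risk reduction unless the baseline risk is given. The numbers below are invented for illustration, not taken from the study.

```python
# Worked example (hypothetical numbers) of the framing effect studied
# above: the same vaccination benefit stated in relative and absolute
# terms. Assume a 4% baseline risk of influenza and a vaccine that
# halves it.
baseline = 0.04                    # risk without vaccination
vaccinated = 0.02                  # risk with vaccination

rrr = (baseline - vaccinated) / baseline   # relative risk reduction
arr = baseline - vaccinated                # absolute risk reduction
nnt = 1 / arr                              # number needed to vaccinate

print(f"Relative risk reduction: {rrr:.0%}")     # "50%" sounds large
print(f"Absolute risk reduction: {arr:.1%}")     # "2.0%" sounds small
print(f"Number needed to vaccinate: {nnt:.0f}")  # 50
```

Without the 4% baseline, the "50%" message carries no information about how many people actually benefit, which is the abstract's argument for always reporting baseline risk.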
Abstract:
Background A significant proportion of women who are vulnerable to postnatal depression refuse to engage in treatment programmes. Little is known about them, other than some general demographic characteristics. In particular, their access to health care and their own and their infants' health outcomes are uncharted. Methods We conducted a nested cohort case-control study, using data from computerized health systems, and general practitioner (GP) and maternity records, to identify the characteristics, health service contacts, and maternal and infant health outcomes for primiparous antenatal clinic attenders at high risk for postnatal depression who either refused (self-exclusion group) or agreed (take-up group) to receive additional Health Visiting support in pregnancy and the first 2 months postpartum. Results Women excluding themselves from Health Visitor support were younger and less highly educated than women willing to take up the support. They were less likely to attend midwifery, GP and routine Health Visitor appointments, but were more likely to book in late and to attend the accident and emergency department (A&E). Their infants had poorer outcomes in terms of gestation, birthweight and breastfeeding. Differences between the groups persisted for midwifery contacts, A&E attendance and gestation when age and education were taken into account; the difference in the initiation of breastfeeding was attenuated, but not wholly explained, by age and education. Conclusion A subgroup of psychologically vulnerable childbearing women are at particular risk for poor access to health care and adverse infant outcomes. Barriers to take-up of services need to be understood in order to deliver care better.
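The adjusted comparison described above (group differences persisting after age and education are taken into account) is typically done with a regression model. Below is a minimal sketch using logistic regression on simulated data; the variable names, effect sizes and sample are invented for illustration and are not drawn from the study.

```python
# Sketch of the adjustment step: testing whether the group difference
# (self-exclusion vs take-up) in an outcome such as A&E attendance
# persists once age and education are controlled for. The dataset is
# simulated; the study used health-system and maternity records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
self_excluded = rng.integers(0, 2, n)           # 1 = refused support
age = rng.normal(27, 5, n) - 2 * self_excluded  # refusers slightly younger
education = rng.integers(0, 2, n)               # 1 = higher education
logit = -1.0 + 0.8 * self_excluded - 0.02 * age - 0.3 * education
attended_ae = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([self_excluded, age, education]))
model = sm.Logit(attended_ae, X).fit(disp=0)
# The coefficient on self_excluded is the adjusted (log-odds) group
# difference; a significant value mirrors the finding that the A&E
# difference held after accounting for age and education.
print(model.summary(xname=["const", "self_excluded", "age", "education"]))
```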
Abstract:
Objective: To determine whether the use of verbal descriptors suggested by the European Union (EU) such as "common" (1-10% frequency) and "rare" (0.01-0.1%) effectively conveys the level of risk of side effects to people taking a medicine. Design: Randomised controlled study with unconcealed allocation. Participants: 120 adults taking simvastatin or atorvastatin after cardiac surgery or myocardial infarction. Setting: Cardiac rehabilitation clinics at two hospitals in Leeds, UK. Intervention: A written statement about one of the side effects of the medicine (either constipation or pancreatitis). Within each side effect condition half the patients were given the information in verbal form and half in numerical form (for constipation, "common" or 2.5%; for pancreatitis, "rare" or 0.04%). Main outcome measure: The estimated likelihood of the side effect occurring. Other outcome measures related to the perceived severity of the side effect, its risk to health, and its effect on decisions about whether to take the medicine. Results: The mean likelihood estimate given for the constipation side effect was 34.2% in the verbal group and 8.1% in the numerical group; for pancreatitis it was 18% in the verbal group and 2.1% in the numerical group. The verbal descriptors were associated with more negative perceptions of the medicine than their equivalent numerical descriptors. Conclusions: Patients want and need understandable information about medicines and their risks and benefits. This is essential if they are to become partners in medicine taking. The use of verbal descriptors to improve the level of information about side effect risk leads to overestimation of the level of harm and may lead patients to make inappropriate decisions about whether or not they take the medicine.
Abstract:
Patients want and need comprehensive and accurate information about their medicines so that they can participate in decisions about their healthcare. In particular, they require information about the likely risks and benefits that are associated with the different treatment options. However, providing this information in a form that people can readily understand and use is a considerable challenge to healthcare professionals. One recent attempt to standardise the language of risk has been to produce sets of verbal descriptors that correspond to specific probability ranges, such as those outlined in the European Commission (EC) Pharmaceutical Committee guidelines in 1998 for describing the incidence of adverse effects. This paper provides an overview of a number of studies involving members of the general public, patients, and hospital doctors that evaluated the utility of the EC guideline descriptors (very common, common, uncommon, rare, very rare). In all studies it was found that people significantly overestimated the likelihood of adverse effects occurring, given specific verbal descriptors. This in turn resulted in significantly higher ratings of their perceived risks to health and significantly lower ratings of their likelihood of taking the medicine. Such problems of interpretation are not restricted to the EC guideline descriptors. Similar levels of misinterpretation have also been demonstrated with two other recently advocated risk scales (Calman's verbal descriptor scale and Barclay, Costigan and Davies' lottery scale). In conclusion, the challenge for risk communicators and for future research will be to produce a language of risk that is sufficiently flexible to take into account different perspectives, as well as changing circumstances and contexts of illness and its treatments. In the meantime, we urge the EC and other legislative bodies to stop recommending the use of specific verbal labels or phrases until there is a stronger evidence base to support their use.
Abstract:
Although the use of climate scenarios for impact assessment has grown steadily since the 1990s, uptake of such information for adaptation is lagging by nearly a decade in terms of scientific output. Nonetheless, integration of climate risk information in development planning is now a priority for donor agencies because of the need to prepare for climate change impacts across different sectors and countries. This urgency stems from concerns that progress made against Millennium Development Goals (MDGs) could be threatened by anthropogenic climate change beyond 2015. Up to this time the human signal, though detectable and growing, will be a relatively small component of climate variability and change. This implies the need for a twin-track approach: on the one hand, vulnerability assessments of social and economic strategies for coping with present climate extremes and variability, and, on the other hand, development of climate forecast tools and scenarios to evaluate sector-specific, incremental changes in risk over the next few decades. This review starts by describing the climate outlook for the next couple of decades and the implications for adaptation assessments. We then review ways in which climate risk information is already being used in adaptation assessments and evaluate the strengths and weaknesses of three groups of techniques. Next we identify knowledge gaps and opportunities for improving the production and uptake of climate risk information for the 2020s. We assert that climate change scenarios can meet some, but not all, of the needs of adaptation planning. Even then, the choice of scenario technique must be matched to the intended application, taking into account local constraints of time, resources, human capacity and supporting infrastructure. We also show that much greater attention should be given to improving and critiquing models used for climate impact assessment, as standard practice. Finally, we highlight the over-arching need for the scientific community to provide more information and guidance on adapting to the risks of climate variability and change over nearer time horizons (i.e. the 2020s). Although the focus of the review is on information provision and uptake in developing regions, it is clear that many developed countries are facing the same challenges.
Abstract:
This paper reviews the evidence relating to the question: does the risk of fungicide resistance increase or decrease with dose? The development of fungicide resistance progresses through three key phases. During the ‘emergence phase’ the resistant strain has to arise through mutation and invasion. During the subsequent ‘selection phase’, the resistant strain is present in the pathogen population and the fraction of the pathogen population carrying the resistance increases due to the selection pressure caused by the fungicide. During the final phase of ‘adjustment’, the dose or choice of fungicide may need to be changed to maintain effective control over a pathogen population where resistance has developed to intermediate levels. Emergence phase: there are no experimental publications and only one model study on the emergence phase, and we conclude that work in this area is needed. Selection phase: all the published experimental work, and virtually all model studies, relate to the selection phase. Seven peer-reviewed and four non-peer-reviewed publications report experimental evidence. All show increased selection for fungicide resistance with increased fungicide dose, except for one peer-reviewed publication that does not detect any selection irrespective of dose and one conference proceedings publication that claims evidence for increased selection at a lower dose. In the mathematical models published, no evidence has been found that a lower dose could lead to a higher risk of fungicide resistance selection. We discuss areas of the dose-rate debate that need further study. These include further work on pathogen–fungicide combinations where the pathogen develops partial resistance to the fungicide, and work on the emergence phase.
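The selection-phase reasoning above can be made concrete with a toy two-strain model in which a higher dose suppresses the sensitive strain more strongly than the resistant one, so the fitness differential, and hence selection, grows with dose. All parameters below are illustrative and not drawn from the reviewed studies.

```python
# Minimal, hypothetical model of the 'selection phase': per-generation
# growth of the resistant fraction when dose reduces the sensitive
# strain's fitness more than the resistant strain's.
import numpy as np

def resistant_fraction(dose, generations=10, r0=1e-3):
    """Resistant fraction after repeated treatment at a given dose."""
    # Dose-response: survival of each strain under the fungicide.
    sensitive_fitness = np.exp(-2.0 * dose)   # strongly suppressed
    resistant_fitness = np.exp(-0.2 * dose)   # weakly suppressed
    r = r0
    for _ in range(generations):
        num = r * resistant_fitness
        r = num / (num + (1 - r) * sensitive_fitness)
    return r

for dose in (0.25, 0.5, 1.0):
    print(f"dose {dose}: resistant fraction = {resistant_fraction(dose):.3f}")
# Higher dose -> larger fitness differential -> faster selection,
# consistent with the experimental evidence summarised above.
```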
Abstract:
Pesticide risk indicators provide simple support in the assessment of environmental and health risks from pesticide use, and can therefore inform policies to foster a sustainable interaction of agriculture with the environment. Because of their relative simplicity, indicators may be particularly useful under conditions of limited data availability and resources, such as in Less Developed Countries (LDCs). However, indicator complexity can vary significantly, in particular between indicators that rely on an exposure–toxicity ratio (ETR) and those that do not. In addition, pesticide risk indicators are usually developed for Western contexts, which may cause incorrect risk estimates in LDCs. This study investigated the appropriateness of seven pesticide risk indicators for use in LDCs, with reference to smallholding agriculture in Colombia. Seven farm-level indicators, of which three relied on an ETR (POCER, EPRIP, PIRI) and four on a non-ETR approach (EIQ, PestScreen, OHRI, Dosemeci et al., 2002), were calculated and then compared by means of the Spearman rank correlation test. Indicators were also compared with respect to key indicator characteristics, i.e. user-friendliness and ability to represent the system under study. The comparison of the indicators in terms of total environmental risk suggests that indicators not relying on an ETR approach cannot be used as a reliable proxy for the more complex, i.e. ETR, indicators. ETR indicators, when user-friendly, show a comparative advantage over non-ETR indicators in combining the need for a relatively simple tool usable in contexts of limited data availability and resources with a reliable estimation of environmental risk. Non-ETR indicators remain useful and accessible tools to discriminate between different pesticides prior to application. Concerning human health, simple algorithms seem more appropriate for assessing health risk in LDCs. However, further research on health risk indicators and their validation under LDC conditions is needed.
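To make the comparison concrete: an ETR-style indicator divides predicted exposure by a toxicity endpoint, and agreement between indicators can be assessed with the Spearman rank correlation, as in the study. The sketch below uses invented values for a handful of hypothetical pesticides.

```python
# Sketch of the study's comparison logic: an exposure-toxicity ratio
# (ETR) indicator versus a simpler non-ETR score, compared by Spearman
# rank correlation across pesticides. All values are hypothetical.
from scipy.stats import spearmanr

pesticides = ["A", "B", "C", "D", "E"]
exposure = [0.8, 0.2, 1.5, 0.05, 0.6]   # predicted environmental conc.
toxicity = [0.1, 0.5, 0.2, 0.01, 2.0]   # e.g. LC50 of indicator species
dose_based = [3.0, 1.0, 2.5, 0.5, 4.0]  # non-ETR score (dose-driven)

etr = [e / t for e, t in zip(exposure, toxicity)]  # risk rises with ETR

rho, p = spearmanr(etr, dose_based)
print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")
# A weak correlation, as the study found, means the simpler score
# cannot stand in for the ETR indicator's risk ranking.
```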
Abstract:
Although the independence of the association and causality have not been fully established, non-fasting (postprandial) triglyceride (TG) concentrations have emerged as a clinically significant cardiovascular disease (CVD) risk factor. In the current review, findings from three insightful prospective studies in the area, namely the Women's Health Study, the Copenhagen City Heart Study and the Norwegian Counties Study, are discussed. An overview is provided as to the likely etiological basis for the association between postprandial TG and CVD, with a focus on both lipid and non-lipid (inflammation, hemostasis and vascular function) risk factors. The impact of various lifestyle and physiological determinants is considered, in particular genetic variation and meal fat composition. Furthermore, although data are limited, some information is provided as to the relative and interactive impact of a number of modulators of lipemia. It is evident that, relative to age, gender and body mass index (known modulators of postprandial lipemia), the contribution of identified gene variants to the heterogeneity observed in the postprandial response is likely to be relatively small. Finally, we highlight the need for the development of a standardised ‘fat tolerance test’ for use in clinical trials, to allow the integration and comparison of data from individual studies.
Abstract:
By employing Moody’s corporate default and rating transition data spanning the last 90 years, we explore how much capital banks should hold against their corporate loan portfolios to withstand historical stress scenarios. Specifically, we focus on the worst-case scenario over the observation period, the Great Depression. We find that migration risk and the length of the investment horizon are critical factors when determining bank capital needs in a crisis. We show that capital may need to rise more than three times when the horizon is increased from 1 year, as required by current and future regulation, to 3 years. Increases are still important, but of a lower magnitude, when migration risk is introduced in the analysis. Further, we find that the new bank capital requirements under the so-called Basel 3 agreement would enable banks to absorb Great Depression-style losses. However, such losses would dent regulatory capital considerably, far beyond the capital buffers that have been proposed to ensure that banks survive crisis periods without government support.
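The horizon effect described above can be illustrated under a standard Markov chain assumption for rating migrations: the 3-year transition matrix is the cube of the 1-year matrix, so default probabilities compound as downgrades accumulate. The toy matrix below is hypothetical, not the Moody's data used in the paper.

```python
# Sketch of the migration-risk mechanics: under a Markov assumption,
# the 3-year transition matrix is the cube of the 1-year matrix, so
# multi-year default probabilities compound. This toy 3-state matrix
# (investment grade / speculative grade / default) is hypothetical.
import numpy as np

P1 = np.array([
    [0.92, 0.07, 0.01],   # investment grade
    [0.10, 0.80, 0.10],   # speculative grade
    [0.00, 0.00, 1.00],   # default is absorbing
])

P3 = np.linalg.matrix_power(P1, 3)

for i, grade in enumerate(["investment", "speculative"]):
    print(f"{grade}: 1y PD = {P1[i, 2]:.1%}, 3y PD = {P3[i, 2]:.1%}")
# Default probabilities grow more than linearly with the horizon once
# downgrades (migration) are allowed, which is why capital needs can
# rise so sharply at a 3-year horizon.
```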
Abstract:
Purpose Few robust randomised controlled trials have investigated fruit and vegetable (F&V) intake in people at risk of cardiovascular disease (CVD). We aimed to design and validate a dietary strategy of increasing flavonoid-rich versus flavonoid-poor F&V consumption and to assess its effect on the nutrient biomarker profile. Methods A parallel, randomised, controlled, dose–response dietary intervention study. Participants with a CVD relative risk of 1.5 assessed by risk scores were randomly assigned to one of three groups: habitual (control, CT), high-flavonoid (HF) or low-flavonoid (LF) diets. While the CT group (n = 57) consumed their habitual diet throughout, the HF (n = 58) and LF (n = 59) groups sequentially increased their daily F&V intake by an additional 2, 4 and 6 portions for successive 6-week periods during the 18-week study. Results Compliance with target numbers and types of F&V was broadly met and verified by dietary records and by plasma and urinary biomarkers. The mean (±SEM) number of F&V portions/day consumed by the HF and LF groups at baseline (3.8 ± 0.3 and 3.4 ± 0.3), 6 weeks (6.3 ± 0.4 and 5.8 ± 0.3), 12 weeks (7.0 ± 0.3 and 6.8 ± 0.3) and 18 weeks (7.6 ± 0.4 and 8.1 ± 0.4), respectively, was similar at baseline but subsequently higher than in the CT group (3.9 ± 0.3, 4.3 ± 0.3, 4.6 ± 0.4, 4.5 ± 0.3) (P = 0.015). There was a dose-dependent increase in dietary and urinary flavonoids in the HF group, with no change in the other groups (P = 0.0001). Significantly higher dietary intakes of folate (P = 0.035), non-starch polysaccharides (P = 0.001), vitamin C (P = 0.0001) and carotenoids (P = 0.0001) were observed in both intervention groups compared with CT, which were broadly supported by nutrient biomarker analysis. Conclusions The success of improving nutrient profile by active encouragement of F&V intake in an intervention study implies the need for a more hands-on public health approach.