16 results for Risk perception
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
QUESTIONS UNDER STUDY: To determine the perception of primary care physicians regarding the risk of subsequent atherothrombotic events in patients with established cardiovascular (CV) disease, and to correlate this perception with documented antithrombotic therapy. METHODS: In a cross-sectional study of the general practice population in Switzerland, 381 primary care physicians screened 127 040 outpatients during 15 consecutive workdays in 2006. Perception of subsequent atherothrombotic events in patients with established CV disease was assessed using a tick box questionnaire allowing choices between low, moderate, high or very high risk. Logistic regression models were used to determine the relationship between risk perception and antithrombotic treatment. RESULTS: Overall, 13 057 patients (10.4%) were identified as having established CV disease and 48.8% of those were estimated to be at high to very high risk for subsequent atherothrombotic events. Estimated higher risk for subsequent atherothrombotic events was associated with a shift from aspirin monotherapy to clopidogrel, vitamin K antagonist or aspirin plus clopidogrel (p <0.001 for trend). Clopidogrel (12.7% vs 6.8%, p <0.001), vitamin K antagonist (24.5% vs 15.6%, p <0.001) or aspirin plus clopidogrel (10.2% vs 4.2%, p <0.001) were prescribed in patients estimated to be at high to very high risk more often than in those at low to moderate risk. CONCLUSIONS: Perception of primary care physicians regarding risk of subsequent atherothrombotic events varies in patients with CV disease, and as a result antithrombotic therapy is altered in patients with anticipated high to very high risk even though robust evidence and clear guidelines are lacking.
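The treatment comparisons above (e.g. clopidogrel in 12.7% vs 6.8% of patients) can be turned into a crude effect size with 2×2-table arithmetic. The sketch below is illustrative only: the split of the 13 057 patients into risk groups is reconstructed from the reported 48.8%, and the study itself used logistic regression models, not this calculation.

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 table:
       high-risk group:         a treated, b untreated
       low/moderate-risk group: c treated, d untreated
    """
    return (a / b) / (c / d)

# Reconstructed (assumed) group sizes: 48.8% of 13 057 patients
high_n = round(0.488 * 13_057)   # at high / very high perceived risk
low_n = 13_057 - high_n          # at low / moderate perceived risk

# Reported clopidogrel prescription proportions applied to those sizes
a = round(0.127 * high_n)
c = round(0.068 * low_n)
or_clopidogrel = odds_ratio(a, high_n - a, c, low_n - c)
print(f"crude odds ratio for clopidogrel ~ {or_clopidogrel:.2f}")
```

A crude odds ratio near 2 is consistent with the direction of effect in the abstract; the study's adjusted estimates come from its logistic models, not from this table.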
Abstract:
Background Public information about prevention of zoonoses should be based on the problem as perceived by the public and should be adapted to regional circumstances. Growing fox populations have led to increasing concern about human alveolar echinococcosis (AE), which is caused by the fox tapeworm Echinococcus multilocularis. In order to plan information campaigns, public knowledge about this zoonotic tapeworm was assessed. Methods By means of representative telephone interviews (N = 2041), a survey of public knowledge about the risk and prevention of alveolar echinococcosis was carried out in the Czech Republic, France, Germany and Switzerland in 2004. Results For all five questions, significant country-specific differences were found. Fewer people had heard of E. multilocularis in the Czech Republic (14%) and France (18%) than in Germany (63%) and Switzerland (70%). The same pattern was observed when only highly endemic regions were considered (Czech Republic: 20%, France: 17%, Germany: 77%, Switzerland: 61%). In France, 17% of people who knew the parasite felt reasonably informed; in the other countries, the majority felt reasonably informed (54–60%). The percentage who perceived E. multilocularis as a high risk ranged from 12% (Switzerland) to 43% (France). In some countries, promising measures such as deworming of dogs (Czech Republic, Switzerland) were not recognized as prevention options. Conclusion Our results and the current epidemiological circumstances of AE call for proactive information programs. This communication should enable the public to achieve a realistic risk perception, give clear information on how people can minimize their infection risk, and prevent exaggerated reactions and anxiety.
Abstract:
When talking about flow, most people probably think of a highly desirable state associated with a broad variety of positive outcomes in terms of positive motivation, well-being, and performance. In contrast, this chapter suggests that the characteristics of flow also have the potential to be evil. First, we will explain how flow can lead to addiction when exercising, playing games, and using the Internet. Then we will consider how flow is linked to impaired risk perception and risky behavior. As a third negative facet of flow, we will outline how it can also be experienced in antisocial contexts and during combat. This chapter ends with some broader comments on the dark and bright sides of flow, including flow as a universal experience, the implications for practical interventions, ethical questions related to flow, and future research questions.
Abstract:
INTRODUCTION Late presentation to HIV care leads to increased morbidity and mortality. We explored risk factors and reasons for late HIV testing and presentation to care in the nationally representative Swiss HIV Cohort Study (SHCS). METHODS Adult patients enrolled in the SHCS between July 2009 and June 2012 were included. An initial CD4 count <350 cells/µl or an AIDS-defining illness defined late presentation. Demographic and behavioural characteristics of late presenters (LPs) were compared with those of non-late presenters (NLPs). Information on self-reported, individual barriers to HIV testing and care was obtained during face-to-face interviews. RESULTS Of 1366 patients included, 680 (49.8%) were LPs. Seventy-two percent of eligible patients took part in the survey. LPs were more likely to be female (p<0.001) or from sub-Saharan Africa (p<0.001) and less likely to be highly educated (p=0.002) or men who have sex with men (p<0.001). LPs were more likely to have their first HIV test following a doctor's suggestion (p=0.01), and NLPs in the context of a regular check-up (p=0.02) or after a specific risk situation (p<0.001). The main reasons for late HIV testing were "did not feel at risk" (72%), "did not feel ill" (65%) and "did not know the symptoms of HIV" (51%). Seventy-one percent of the participants were symptomatic during the year preceding HIV diagnosis, and the majority consulted a physician for these symptoms. CONCLUSIONS In Switzerland, late presentation to care is driven by late HIV testing due to low risk perception and lack of awareness about HIV. Tailored HIV testing strategies and enhanced provider-initiated testing are urgently needed.
Abstract:
Tajikistan is judged to be highly vulnerable to risk, including food-insecurity and climate-change risks. By some vulnerability measures it is the most vulnerable of all 28 countries in the World Bank's Europe and Central Asia Region – ECA (World Bank 2009). The rural population, with its relatively high incidence of poverty, is particularly vulnerable. The Pilot Program for Climate Resilience (PPCR) in Tajikistan (2011) provided an opportunity to conduct a farm-level survey with the objective of assessing various dimensions of the rural population's vulnerability to risk and their perception of constraints to farming operations and livelihoods. The survey is accordingly referred to as the 2011 PPCR survey. The rural population in Tajikistan is highly agrarian, with about 50% of family income deriving from agriculture (see Figure 4.1; also LSMS 2007 – own calculations). Tajikistan's agriculture essentially consists of two groups of producers: small household plots – the successors of Soviet "private agriculture" – and dehkan (or "peasant") farms – new family farming structures that began to be created under relevant legislation passed after 1992 (Lerman and Sedik, 2008). The household plots manage 20% of arable land and produce 65% of gross agricultural output (GAO). Dehkan farms manage 65% of arable land and produce close to 30% of GAO. The remaining 15% of arable land is held by agricultural enterprises – the rapidly shrinking sector of corporate farms that succeeded the Soviet kolkhozes and sovkhozes and today produces less than 10% of GAO (TajStat 2011). The survey, conducted in May 2011, focused on dehkan farms, as budgetary constraints precluded the inclusion of household plots. A total of 142 dehkan farms were surveyed in face-to-face interviews. They were sampled from 17 districts across all four regions – Sughd, Khatlon, RRP, and GBAO.
The districts were selected so as to represent different agro-climatic zones, different vulnerability zones (based on the World Bank (2011) vulnerability assessment), and different food-insecurity zones (based on WFP/IPC assessments). Within each district, 3-4 jamoats were chosen at random, and 2-3 farms were selected in each jamoat from lists provided by the jamoat administrations so as to maximize variability in farm characteristics. The sample design by region/district is presented in Table A, which also shows the agro-climatic zone and the food-security phase for each district. The sample districts are superimposed on a map of food-security phases based on the IPC assessment of April 2011.
Abstract:
Background Chronic localized pain syndromes, especially chronic low back pain (CLBP), are common reasons for consultation in general practice. In some cases, chronic localized pain syndromes can appear in combination with chronic widespread pain (CWP). Numerous studies have shown a strong association between CWP and several physical and psychological factors. These studies are population-based and cross-sectional and do not allow chronology to be assessed. There are very few prospective studies that explore predictors for the onset of CWP, and their main focus is identifying risk factors for CWP incidence. Until now there have been no studies focusing on preventive factors that keep patients from developing CWP. Our aim is to perform a cross-sectional study on the epidemiology of CLBP and CWP in general practice and to look for distinctive features regarding resources such as resilience, self-efficacy and coping strategies. A subsequent cohort study is designed to identify risk and protective factors for pain generalization (development of CWP) in primary care CLBP patients. Methods/Design Fifty-nine general practitioners consecutively recruit, during a 5-month period, all patients consulting their family doctor because of chronic low back pain (pain that has lasted for 3 months). Patients are asked to fill out a questionnaire on pain anamnesis, pain perception, co-morbidities, course of therapy, medication, sociodemographic data and psychosomatic symptoms. We assess resilience, coping resources, stress management and self-efficacy as potential protective factors against pain generalization. Furthermore, we assess risk factors for pain generalization such as anxiety, depression, trauma and critical life events. During a twelve-month follow-up period, a cohort of CLBP patients without CWP will be screened regularly (every 3 months) for pain generalization (outcome: incident CWP).
Discussion This cohort study will be the largest study to prospectively analyze predictors of the transition from CLBP to CWP in a primary care setting. In contrast to the typically researched risk factors, which increase the probability of pain generalization, this study also focuses intensively on protective factors, which decrease the probability of pain generalization.
Abstract:
The development of a clinical decision tree based on knowledge about risks and reported outcomes of therapy is a necessity for successful planning and outcome of periodontal therapy. This requires a well-founded knowledge of the disease entity and a broad knowledge of how different risk conditions contribute to periodontitis. The infectious etiology, a complex immune response, and influence from a large number of co-factors are challenging conditions in clinical periodontal risk assessment. The difficult relationship between independent and dependent risk conditions, paired with limited information on periodontitis prevalence, adds to the difficulties in periodontal risk assessment. The current information on periodontitis risk attributed to smoking habits, socio-economic conditions, general health and subjects' self-perception of health is not comprehensive, and this contributes to limited success in periodontal risk assessment. New models for risk analysis have been advocated. Their utility for the estimation of periodontal risk and prognosis should be tested. The present review addresses several of these issues associated with periodontal risk assessment.
Abstract:
BACKGROUND: In industrialized countries vaccination coverage remains suboptimal, partly because of perception of an increased risk of asthma. Epidemiologic studies of the association between childhood vaccinations and asthma have provided conflicting results, possibly for methodologic reasons such as unreliable vaccination data, biased reporting, and reverse causation. A recent review stressed the need for additional, adequately controlled large-scale studies. OBJECTIVE: Our goal was to determine if routine childhood vaccination against pertussis was associated with subsequent development of childhood wheezing disorders and asthma in a large population-based cohort study. METHODS: In 6811 children from the general population born between 1993 and 1997 in Leicestershire, United Kingdom, respiratory symptom data from repeated questionnaire surveys up to 2003 were linked to independently collected vaccination data from the National Health Service database. We compared incident wheeze and asthma between children of different vaccination status (complete, partial, and no vaccination against pertussis) by computing hazard ratios. Analyses were based on 6048 children, 23 201 person-years of follow-up, and 2426 cases of new-onset wheeze. RESULTS: There was no evidence for an increased risk of wheeze or asthma in children vaccinated against pertussis compared with nonvaccinated children. Adjusted hazard ratios comparing fully and partially vaccinated with nonvaccinated children were close to one for both incident wheeze and asthma. CONCLUSION: This study provides no evidence of an association between vaccination against pertussis in infancy and an increased risk of later wheeze or asthma and does not support claims that vaccination against pertussis might significantly increase the risk of childhood asthma.
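As a hedged sketch of the person-time arithmetic underlying the reported follow-up (2426 cases of new-onset wheeze over 23 201 person-years): the crude incidence rate follows directly from those totals, while the split by vaccination status below is invented solely to illustrate how a rate ratio close to one would be computed.

```python
def incidence_rate(cases: int, person_years: float) -> float:
    """Crude incidence rate per person-year of follow-up."""
    return cases / person_years

# Totals reported in the abstract
crude = incidence_rate(2426, 23_201)   # roughly 0.10 per person-year

# Assumed (invented) split between groups, consistent with the totals above
vaccinated = incidence_rate(2280, 21_800)
non_vaccinated = incidence_rate(146, 1_401)
rate_ratio = vaccinated / non_vaccinated   # near 1 -> no excess risk
```

The study's actual comparison used adjusted hazard ratios from time-to-event models, which this crude ratio only approximates in spirit.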
Abstract:
PURPOSE OF REVIEW: The surgical procedure remains the key element in the multidisciplinary treatment of a wide variety of degenerative, traumatic, tumorous, congenital, and vascular diseases, resulting in an estimated 234 million surgical interventions worldwide each year. Undesired effects are inherent in any medical intervention, but are of particular interest in an invasive procedure for both the patient and the responsible physician. Major topics in current complication research include perception of key factors responsible for complication development, prediction, and whenever possible, prevention of complications. RECENT FINDINGS: For many years, the technical aspects of surgery and the skills of the surgeon her/himself were evaluated and considered as the main sources of surgical complications. However, recent studies identified many nontechnical perspectives, which could improve the overall quality of surgical interventions. SUMMARY: This article reviews selected, recently published data in this field and aims to point out the complexity and multidimensional facets of surgery-related risk factors.
Abstract:
BACKGROUND The objective of this study was to assess the incidence and impact of asymptomatic arrhythmia in patients with highly symptomatic atrial fibrillation (AF) who qualified for radiofrequency (RF) catheter ablation. METHODS AND RESULTS In this prospective study, 114 patients with at least 3 documented AF episodes together with corresponding symptoms and an ineffective trial of at least 1 antiarrhythmic drug were selected for RF ablation. With the use of CARTO, circumferential lesions around the pulmonary veins and linear lesions at the roof of the left atrium and along the left atrial isthmus were placed. A continuous, 7-day, Holter session was recorded before ablation, right after ablation, and after 3, 6, and 12 months of follow-up. During each 7-day Holter monitoring, the patients recorded quality and duration of any complaints by using a detailed symptom log. More than 70,000 hours of ECG recording were analyzed. In the 7-day Holter records before ablation, 92 of 114 patients (81%) had documented AF episodes. All episodes were symptomatic in 35 patients (38%). In 52 patients (57%), both symptomatic and asymptomatic episodes were recorded, whereas in 5 patients (5%), all documented AF episodes were asymptomatic. After ablation, the percentage of patients with only asymptomatic AF recurrences increased to 37% (P<0.05) at the 6-month follow-up. An analysis of patient characteristics and arrhythmia patterns failed to identify a specific subset who were at high risk for the development of asymptomatic AF. CONCLUSIONS Even in patients presenting with highly symptomatic AF, asymptomatic episodes may occur and significantly increase after catheter ablation. A symptom-only-based follow-up would substantially overestimate the success rate. Objective measures such as long-term Holter monitoring are needed to identify asymptomatic AF recurrences after ablation.
Abstract:
OBJECTIVE This study aimed to test the prediction from the Perception and Attention Deficit model of complex visual hallucinations (CVH) that impairments in visual attention and perception are key risk factors for complex hallucinations in eye disease and dementia. METHODS Two studies ran concurrently to investigate the relationship between CVH and impairments in perception (picture naming using the Graded Naming Test) and attention (Stroop task plus a novel Imagery task). The studies were run in two populations: older patients with dementia (n = 28) and older people with eye disease (n = 50), with a shared control group (n = 37). The same methodology was used in both studies, and the North East Visual Hallucinations Inventory was used to identify CVH. RESULTS A reliable relationship was found for older patients with dementia between impaired perceptual and attentional performance and CVH. A reliable relationship was not found in the population of people with eye disease. CONCLUSIONS The results add to previous research indicating that object perception and attentional deficits are associated with CVH in dementia, but that risk factors for CVH in eye disease are inconsistent, suggesting that dynamic rather than static impairments in attentional processes may be key in this population.
Abstract:
Snow avalanches pose a threat to settlements and infrastructure in alpine environments. Owing to the catastrophic events of recent years, the public has become more aware of this phenomenon. Alpine settlements have always been confronted with natural hazards, but changes in land use and in dealing with avalanche hazards have led to a changing perception of this threat. In this study, a multi-temporal risk assessment is presented for three avalanche tracks in the municipality of Galtür, Austria. Changes in avalanche risk as well as changes in the risk-influencing factors (process behaviour, values at risk (buildings) and vulnerability) between 1950 and 2000 are quantified. An additional focus is placed on the interconnection between these factors and their influence on the resulting risk. The avalanche processes were calculated using different simulation models (SAMOS as well as ELBA+). For each avalanche track, different scenarios were calculated according to the development of mitigation measures. Because the focus of the study was on multi-temporal risk assessment, the models used could be replaced by other snow avalanche models providing the same functionality. The monetary values of buildings were estimated using the volume of the buildings and average prices per cubic meter. The changing size of the buildings over time was inferred from construction plans. The vulnerability of the buildings is understood as the degree of loss to a given element within the area affected by natural hazards. A vulnerability function for different construction types of buildings, dependent on avalanche pressure, was used to assess the degree of loss. No general risk trend could be determined for the studied avalanche tracks. Owing to the high complexity of the variations in risk, small changes in one of several influencing factors can cause considerable differences in the resulting risk.
This multi-temporal approach leads to a better understanding of today's risk by identifying the main changes and the underlying processes. Furthermore, this knowledge can be incorporated into strategies for sustainable development in Alpine settlements.
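The risk formulation described above can be sketched as the product of event probability, value at risk, and a pressure-dependent vulnerability (degree of loss in [0, 1]). All numbers and the piecewise-linear vulnerability curve below are invented assumptions for illustration, not values from the Galtür assessment:

```python
def vulnerability(pressure_kpa: float) -> float:
    """Assumed degree-of-loss curve: no loss below 5 kPa, total loss above 30 kPa."""
    if pressure_kpa <= 5:
        return 0.0
    if pressure_kpa >= 30:
        return 1.0
    return (pressure_kpa - 5) / 25

def building_risk(annual_probability: float, volume_m3: float,
                  price_per_m3: float, pressure_kpa: float) -> float:
    """Expected annual loss = probability * monetary value * degree of loss."""
    value = volume_m3 * price_per_m3   # value estimated from building volume
    return annual_probability * value * vulnerability(pressure_kpa)

# Same (hypothetical) building at two points in time: by 2000 it is larger
# and more valuable, but mitigation measures reduce frequency and pressure.
risk_1950 = building_risk(1 / 30, 400, 300, 25)
risk_2000 = building_risk(1 / 100, 900, 450, 12)
```

The interplay the abstract describes shows up directly here: growth in value pushes risk up while mitigation pushes it down, so small changes in any one factor can flip the overall trend.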
Abstract:
Objective The validity of current ultra-high-risk (UHR) criteria is under-examined in help-seeking minors, particularly in children below the age of 12 years. Thus, the present study investigated predictors of one-year outcome in children and adolescents (CAD) with UHR status. Method Thirty-five children and adolescents (age 9–17 years) meeting UHR criteria according to the Structured Interview for Psychosis-Risk Syndromes were followed up for 12 months. Regression analyses were employed to detect baseline predictors of conversion to psychosis and of the outcome of non-converters (remission versus persistence of UHR status). Results At one-year follow-up, 20% of patients had developed schizophrenia and 25.7% had remitted from their UHR status, which, consequently, had persisted in 54.3%. No patient had fully remitted from mental disorders, even if UHR status was not maintained. Conversion was best predicted by any transient psychotic symptom and a disorganized-communication score. No prediction model for outcome beyond conversion was identified. Conclusions Our findings provide the first evidence for the predictive utility of UHR criteria in CAD in terms of brief intermittent psychotic symptoms (BIPS) when accompanied by signs of cognitive impairment, i.e. disorganized communication. However, because attenuated psychotic symptoms (APS) related to thought content and perception were indicative of non-conversion at 1-year follow-up, their use in the early detection of psychosis in CAD needs further study. Overall, the findings further highlight the need for more in-depth studies of developmental peculiarities in the early detection and treatment of psychoses with onset in childhood and early adolescence.
Abstract:
The aim of this study was to examine whether heart drawings by patients with acute myocardial infarction reflect acute distress symptoms and negative illness beliefs and predict posttraumatic stress symptoms 3 months post-myocardial infarction. In total, 84 patients aged over 18 years drew pictures of their heart. The larger the area drawn as damaged, the greater the levels of acute distress (r = 0.36; p < 0.05), negative illness perceptions (r = 0.42, p < 0.05), and posttraumatic stress symptoms (r = 0.54, p < 0.01). Heart drawings may offer a tool to identify maladaptive cognitions and thus patients at risk of posttraumatic stress disorder.
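The associations reported above are Pearson correlation coefficients. As a hedged, self-contained sketch of that computation (the data below are invented, not the study's measurements):

```python
import math

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation: covariance divided by the product of spreads."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: area drawn as damaged vs a posttraumatic stress score
damage_cm2 = [1.2, 3.4, 0.5, 5.1, 2.2, 4.0]
pts_score = [10, 22, 8, 30, 15, 21]
r = pearson_r(damage_cm2, pts_score)
```

With strongly co-varying toy data the coefficient comes out close to 1; the study's reported r values (0.36 to 0.54) indicate moderate, not near-perfect, association.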