882 results for non-diversifiable risk


Relevance: 30.00%

Publisher:

Abstract:

Hepatocellular carcinoma is the main liver-related cause of death in patients with compensated cirrhosis. The early phases are asymptomatic and the prognosis is poor, which makes prevention essential. We propose that non-selective beta-blockers decrease the incidence and growth of hepatocellular carcinoma via a reduction of the inflammatory load from the gut to the liver and inhibition of angiogenesis. Due to their effect on the portal pressure, non-selective beta-blockers are used for prevention of esophageal variceal bleeding. Recently, non-hemodynamic effects of beta-blockers have received increasing attention. Blockage of β-adrenoceptors in the intestinal mucosa and gut lymphatic tissue together with changes in type and virulence of the intestinal microbiota lead to reduced bacterial translocation and a subsequent decrease in the portal load of pathogen-associated molecular patterns. This may reduce hepatic inflammation. Blockage of β-adrenoceptors also decreases angiogenesis by inhibition of vascular endothelial growth factors. Because gut-derived inflammation and neo-angiogenesis are important in hepatic carcinogenesis, non-selective beta-blockers can potentially reduce the development and growth of hepatocellular carcinoma. Rodent and in vitro studies support the hypothesis, but clinical verification is needed. Different study designs may be considered. The feasibility of a randomized controlled trial is limited due to the necessary large number of patients and long follow-up. Observational studies carry a high risk of bias. The meta-analytic approach may be used if the incidence and mortality of hepatocellular carcinoma can be extracted from trials on variceal bleeding and if the combined sample size and follow-up are sufficient.


AIM To investigate risk factors for the loss of multi-rooted teeth (MRT) in subjects treated for periodontitis and enrolled in supportive periodontal therapy (SPT). MATERIAL AND METHODS A total of 172 subjects were examined before (T0) and after active periodontal therapy (APT) (T1) and following a mean of 11.5 ± 5.2 (SD) years of SPT (T2). The association of risk factors with loss of MRT was analysed with multilevel logistic regression. The tooth was the unit of analysis. RESULTS Furcation involvement (FI) = 1 before APT was not a risk factor for tooth loss compared with FI = 0 (p = 0.37). Between T0 and T2, MRT with FI = 2 (OR: 2.92, 95% CI: 1.68, 5.06, p = 0.0001) and FI = 3 (OR: 6.85, 95% CI: 3.40, 13.83, p < 0.0001) were at a significantly higher risk of being lost compared with those with FI = 0. During SPT, smokers lost significantly more MRT compared with non-smokers (OR: 2.37, 95% CI: 1.05, 5.35, p = 0.04). Non-smoking and compliant subjects with FI = 0/1 at T1 lost significantly fewer MRT during SPT compared with non-compliant smokers with FI = 2 (OR: 10.11, 95% CI: 2.91, 35.11, p < 0.0001) and FI = 3 (OR: 17.18, 95% CI: 4.98, 59.28, p < 0.0001) respectively. CONCLUSIONS FI = 1 was not a risk factor for tooth loss compared with FI = 0. FI = 2/3, smoking and lack of compliance with regular SPT represented risk factors for the loss of MRT in subjects treated for periodontitis.
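As a sketch of how such odds ratios arise, a logistic-regression coefficient (a log odds ratio) and its standard error can be converted into an OR with a Wald 95% CI. The coefficient and standard error below are hypothetical values chosen so the output roughly reproduces the FI = 2 estimate reported above (OR 2.92, 95% CI 1.68, 5.06):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (log odds ratio)
    and its standard error into an odds ratio with a Wald 95% CI."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical beta and se, chosen so exp(beta) ~ 2.92, loosely
# mirroring the FI = 2 estimate in the abstract.
or_, lo, hi = odds_ratio_ci(beta=1.072, se=0.281)
```

The same conversion applies to any of the intervals quoted above; only the fitted coefficient and its standard error change.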


BACKGROUND The number of older adults in the global population is increasing. This demographic shift leads to an increasing prevalence of age-associated disorders, such as Alzheimer's disease and other types of dementia. With the progression of the disease, the risk for institutional care increases, which contrasts with the desire of most patients to stay in their home environment. Despite doctors' and caregivers' awareness of the patient's cognitive status, they are often uncertain about its consequences on activities of daily living (ADL). To provide effective care, they need to know how patients cope with ADL, in particular, the estimation of risks associated with the cognitive decline. The occurrence, performance, and duration of different ADL are important indicators of functional ability. The patient's ability to cope with these activities is traditionally assessed with questionnaires, an approach that has disadvantages (e.g., lack of reliability and sensitivity). Several groups have proposed sensor-based systems to recognize and quantify these activities in the patient's home. Combined with Web technology, these systems can inform caregivers about their patients in real-time (e.g., via smartphone). OBJECTIVE We hypothesize that a non-intrusive system, which does not use body-mounted sensors, video-based imaging, or microphone recordings, would be better suited for use in dementia patients. Since it does not require the patient's attention and compliance, such a system might be well accepted by patients. We present a passive, Web-based, non-intrusive, assistive technology system that recognizes and classifies ADL. METHODS The components of this novel assistive technology system were wireless sensors distributed in every room of the participant's home and a central computer unit (CCU). The environmental data were acquired for 20 days (per participant) and then stored and processed on the CCU. In consultation with medical experts, eight ADL were classified.
RESULTS In this study, 10 healthy participants (6 women, 4 men; mean age 48.8 years; SD 20.0 years; age range 28-79 years) were included. For explorative purposes, one female patient with Alzheimer's disease (Montreal Cognitive Assessment score=23, Timed Up and Go=19.8 seconds, Trail Making Test A=84.3 seconds, Trail Making Test B=146 seconds) was measured in parallel with the healthy subjects. In total, 1317 ADL were performed by the participants, 1211 ADL were classified correctly, and 106 ADL were missed. This led to an overall sensitivity of 91.27% and a specificity of 92.52%. Each subject performed an average of 134.8 ADL (SD 75). CONCLUSIONS The non-intrusive wireless sensor system can acquire environmental data essential for the classification of activities of daily living. By analyzing the retrieved data, it is possible to distinguish and assign data patterns to subjects' specific activities and to identify eight different activities of daily living. The Web-based technology allows the system to improve care and provides valuable information about the patient in real-time.
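The reported sensitivity and specificity follow the standard confusion-matrix definitions. A minimal sketch follows; the true-negative and false-positive counts are hypothetical, since the abstract reports only the 1211 correctly classified and 106 missed activities, and the published overall figures were presumably aggregated per activity class, so this pooled calculation is only approximate:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# tp/fn from the abstract (1211 classified, 106 missed);
# tn/fp are hypothetical stand-ins for non-events.
sens, spec = sensitivity_specificity(tp=1211, fn=106, tn=990, fp=80)
```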


BACKGROUND Refinements in stent design affecting strut thickness, surface polymer, and drug release have improved clinical outcomes of drug-eluting stents. We aimed to compare the safety and efficacy of a novel, ultrathin strut cobalt-chromium stent releasing sirolimus from a biodegradable polymer with a thin strut durable polymer everolimus-eluting stent. METHODS We did a randomised, single-blind, non-inferiority trial with minimum exclusion criteria at nine hospitals in Switzerland. We randomly assigned (1:1) patients aged 18 years or older with chronic stable coronary artery disease or acute coronary syndromes undergoing percutaneous coronary intervention to treatment with biodegradable polymer sirolimus-eluting stents or durable polymer everolimus-eluting stents. Randomisation was via a central web-based system and stratified by centre and presence of ST segment elevation myocardial infarction. Patients and outcome assessors were masked to treatment allocation, but treating physicians were not. The primary endpoint, target lesion failure, was a composite of cardiac death, target vessel myocardial infarction, and clinically-indicated target lesion revascularisation at 12 months. A margin of 3·5% was defined for non-inferiority of the biodegradable polymer sirolimus-eluting stent compared with the durable polymer everolimus-eluting stent. Analysis was by intention to treat. The trial is registered with ClinicalTrials.gov, number NCT01443104. FINDINGS Between Feb 24, 2012, and May 22, 2013, we randomly assigned 2119 patients with 3139 lesions to treatment with sirolimus-eluting stents (1063 patients, 1594 lesions) or everolimus-eluting stents (1056 patients, 1545 lesions). 407 (19%) patients presented with ST-segment elevation myocardial infarction. 
Target lesion failure with biodegradable polymer sirolimus-eluting stents (69 cases; 6·5%) was non-inferior to durable polymer everolimus-eluting stents (70 cases; 6·6%) at 12 months (absolute risk difference -0·14%, upper limit of one-sided 95% CI 1·97%, p for non-inferiority <0·0004). No significant differences were noted in rates of definite stent thrombosis (9 [0·9%] vs 4 [0·4%], rate ratio [RR] 2·26, 95% CI 0·70-7·33, p=0·16). In pre-specified stratified analyses of the primary endpoint, biodegradable polymer sirolimus-eluting stents were associated with improved outcome compared with durable polymer everolimus-eluting stents in the subgroup of patients with ST-segment elevation myocardial infarction (7 [3·3%] vs 17 [8·7%], RR 0·38, 95% CI 0·16-0·91, p=0·024, p for interaction=0·014). INTERPRETATION In a patient population with minimum exclusion criteria and high adherence to dual antiplatelet therapy, biodegradable polymer sirolimus-eluting stents were non-inferior to durable polymer everolimus-eluting stents for the combined safety and efficacy outcome target lesion failure at 12 months. The noted benefit in the subgroup of patients with ST-segment elevation myocardial infarction needs further study. FUNDING Clinical Trials Unit, University of Bern, and Biotronik, Bülach, Switzerland.
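The non-inferiority claim above can be sanity-checked with a crude patient-level Wald calculation from the reported counts (69/1063 vs 70/1056 target lesion failures). The trial's own upper limit (1·97%) came from a pre-specified model-based analysis, so this sketch only approximates it:

```python
import math

def noninferiority_wald(e1, n1, e2, n2, margin, z=1.645):
    """One-sided Wald assessment of non-inferiority for a risk
    difference: non-inferior if the upper limit of the one-sided
    95% CI for (p1 - p2) lies below the pre-specified margin."""
    p1, p2 = e1 / n1, e2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    upper = diff + z * se
    return diff, upper, upper < margin

# Counts from the abstract; 3.5% margin as pre-specified.
diff, upper, noninf = noninferiority_wald(69, 1063, 70, 1056, margin=0.035)
```

The risk difference comes out near the reported -0·14%, and the upper confidence limit stays well below the 3·5% margin, consistent with the trial's conclusion.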


Background: Accurate information about the prevalence of Chlamydia trachomatis is needed to assess national prevention and control measures. Methods: We systematically reviewed population-based cross-sectional studies that estimated chlamydia prevalence in European Union/European Economic Area (EU/EEA) Member States and non-European high income countries from January 1990 to August 2012. We examined results in forest plots, explored heterogeneity using the I2 statistic, and conducted random effects meta-analysis if appropriate. Metaregression was used to examine the relationship between study characteristics and chlamydia prevalence estimates. Results: We included 25 population-based studies from 11 EU/EEA countries and 14 studies from five other high income countries. Four EU/EEA Member States reported on nationally representative surveys of sexually experienced adults aged 18-26 years (response rates 52-71%). In women, chlamydia point prevalence estimates ranged from 3.0% to 5.3%; the pooled average of these estimates was 3.6% (95% CI 2.4, 4.8, I2 0%). In men, estimates ranged from 2.4% to 7.3% (pooled average 3.5%; 95% CI 1.9, 5.2, I2 27%). Estimates in EU/EEA Member States were statistically consistent with those in other high income countries (I2 0% for women, 6% for men). There was statistical evidence of an association between survey response rate and estimated chlamydia prevalence; estimates were higher in surveys with lower response rates (p=0.003 in women, 0.018 in men). Conclusions: Population-based surveys that estimate chlamydia prevalence are at risk of participation bias owing to low response rates. Estimates obtained in nationally representative samples of the general population of EU/EEA Member States are similar to estimates from other high income countries.
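The pooling described above (inverse-variance random-effects meta-analysis with the I2 heterogeneity statistic) can be sketched as follows. The four prevalence estimates and variances are hypothetical stand-ins, not the surveys' actual data:

```python
import math

def random_effects_pool(estimates, variances):
    """Inverse-variance random-effects pooling (DerSimonian-Laird)
    with the I^2 heterogeneity statistic (as a percentage)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # Between-study variance (tau^2), truncated at zero.
    tau2 = max(0.0, (q - df) /
               (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, i2

# Hypothetical prevalence estimates (proportions) and variances,
# standing in for the four national surveys pooled in the review.
est = [0.030, 0.035, 0.040, 0.053]
var = [0.00004, 0.00005, 0.00006, 0.00012]
pooled, lo, hi, i2 = random_effects_pool(est, var)
```

With I2 near 0%, tau^2 shrinks toward zero and the random-effects pooled value approaches the fixed-effect average, which is why the review's pooled estimates sit close to the individual survey figures.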


OBJECTIVES To assess the available evidence on the effectiveness of accelerated orthodontic tooth movement through surgical and non-surgical approaches in orthodontic patients. METHODS Randomized controlled trials and controlled clinical trials were identified through electronic and hand searches (last update: March 2014). Orthognathic surgery, distraction osteogenesis, and pharmacological approaches were excluded. Risk of bias was assessed using the Cochrane risk of bias tool. RESULTS Eighteen trials involving 354 participants were included for qualitative and quantitative synthesis. Eight trials reported on low-intensity laser, one on photobiomodulation, one on pulsed electromagnetic fields, seven on corticotomy, and one on interseptal bone reduction. Two studies on corticotomy and two on low-intensity laser, which had low or unclear risk of bias, were mathematically combined using the random effects model. Higher canine retraction rate was evident with corticotomy during the first month of therapy (WMD=0.73; 95% CI: 0.28, 1.19, p<0.01) and with low-intensity laser (WMD=0.42 mm/month; 95% CI: 0.26, 0.57, p<0.001) in a period longer than 3 months. The quality of evidence supporting the interventions is moderate for laser therapy and low for corticotomy intervention. CONCLUSIONS There is some evidence that low-intensity laser therapy and corticotomy are effective, whereas the evidence is weak for interseptal bone reduction and very weak for photobiomodulation and pulsed electromagnetic fields. Overall, the results should be interpreted with caution given the small number, quality, and heterogeneity of the included studies. Further research is required in this field with additional attention to application protocols, adverse effects, and cost-benefit analysis.
CLINICAL SIGNIFICANCE From the qualitative and quantitative synthesis of the studies, it could be concluded that there is some evidence that low-intensity laser therapy and corticotomy are associated with accelerated orthodontic tooth movement, while further investigation is required before routine application.


Bovine mastitis is a frequent problem in Swiss dairy herds. One of the main pathogens causing significant economic loss is Staphylococcus aureus. Various Staph. aureus genotypes with different biological properties have been described. Genotype B (GTB) of Staph. aureus was identified as the most contagious and one of the most prevalent strains in Switzerland. The aim of this study was to identify risk factors associated with the herd-level presence of Staph. aureus GTB and Staph. aureus non-GTB in Swiss dairy herds with an elevated yield-corrected herd somatic cell count (YCHSCC). One hundred dairy herds with a mean YCHSCC between 200,000 and 300,000 cells/mL in 2010 were recruited and each farm was visited once during milking. A standardized protocol investigating demography, mastitis management, cow husbandry, milking system, and milking routine was completed during the visit. A bulk tank milk (BTM) sample was analyzed by real-time PCR for the presence of Staph. aureus GTB to classify the herds into 2 groups: Staph. aureus GTB-positive and Staph. aureus GTB-negative. Moreover, quarter milk samples were aseptically collected for bacteriological culture from cows with a somatic cell count ≥150,000 cells/mL on the last test-day before the visit. The culture results allowed us to allocate the Staph. aureus GTB-negative farms to Staph. aureus non-GTB and Staph. aureus-free groups. Multivariable multinomial logistic regression models were built to identify risk factors associated with the herd-level presence of Staph. aureus GTB and Staph. aureus non-GTB. The prevalence of Staph. aureus GTB herds was 16% (n=16), whereas that of Staph. aureus non-GTB herds was 38% (n=38). Herds that sent lactating cows to seasonal communal pastures had significantly higher odds of being infected with Staph. aureus GTB (odds ratio: 10.2, 95% CI: 1.9-56.6), compared with herds without communal pasturing.
Herds that purchased heifers had significantly higher odds of being infected with Staph. aureus GTB (rather than Staph. aureus non-GTB) compared with herds without purchase of heifers. Furthermore, herds that did not use udder ointment as supportive therapy for acute mastitis had significantly higher odds of being infected with Staph. aureus GTB (odds ratio: 8.5, 95% CI: 1.6-58.4) or Staph. aureus non-GTB (odds ratio: 6.1, 95% CI: 1.3-27.8) than herds that used udder ointment occasionally or regularly. Herds in which the milker performed unrelated activities during milking had significantly higher odds of being infected with Staph. aureus GTB (rather than Staph. aureus non-GTB) compared with herds in which the milker did not perform unrelated activities at milking. Awareness of 4 potential risk factors identified in this study guides implementation of intervention strategies to improve udder health in both Staph. aureus GTB and Staph. aureus non-GTB herds.


BACKGROUND Polypharmacy, defined as the concomitant use of multiple medications, is very common in the elderly and may trigger drug-drug interactions and increase the risk of falls in patients receiving vitamin K antagonists. OBJECTIVE To examine whether polypharmacy increases the risk of bleeding in elderly patients who receive vitamin K antagonists for acute venous thromboembolism (VTE). DESIGN We used a prospective cohort study. PARTICIPANTS In a multicenter Swiss cohort, we studied 830 patients aged ≥65 years with VTE. MAIN MEASURES We defined polypharmacy as the prescription of more than four different drugs. We assessed the association between polypharmacy and the time to a first major and clinically relevant non-major bleeding, accounting for the competing risk of death. We adjusted for known bleeding risk factors (age, gender, pulmonary embolism, active cancer, arterial hypertension, cardiac disease, cerebrovascular disease, chronic liver and renal disease, diabetes mellitus, history of major bleeding, recent surgery, anemia, thrombocytopenia) and periods of vitamin K antagonist treatment as a time-varying covariate. KEY RESULTS Overall, 413 (49.8%) patients had polypharmacy. The mean follow-up duration was 17.8 months. Patients with polypharmacy had a significantly higher incidence of major (9.0 vs. 4.1 events/100 patient-years; incidence rate ratio [IRR] 2.18, 95% confidence interval [CI] 1.32-3.68) and clinically relevant non-major bleeding (14.8 vs. 8.0 events/100 patient-years; IRR 1.85, 95% CI 1.27-2.71) than patients without polypharmacy. After adjustment, polypharmacy was significantly associated with major (sub-hazard ratio [SHR] 1.83, 95% CI 1.03-3.25) and clinically relevant non-major bleeding (SHR 1.60, 95% CI 1.06-2.42). CONCLUSIONS Polypharmacy is associated with an increased risk of both major and clinically relevant non-major bleeding in elderly patients receiving vitamin K antagonists for VTE.
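An incidence rate ratio follows from events divided by person-time, with a Wald CI usually built on the log scale using standard error sqrt(1/e1 + 1/e2). The event counts and person-times below are hypothetical values consistent with the reported rates (9.0 vs 4.1 major bleeds per 100 patient-years), since the abstract does not give the raw counts:

```python
import math

def incidence_rate_ratio(e1, pt1, e2, pt2, z=1.96):
    """Incidence rate ratio with a Wald 95% CI on the log scale."""
    irr = (e1 / pt1) / (e2 / pt2)
    se = math.sqrt(1 / e1 + 1 / e2)
    return irr, irr * math.exp(-z * se), irr * math.exp(z * se)

# Hypothetical counts/person-years chosen to match the reported
# rates of 9.0 vs 4.1 events per 100 patient-years (IRR ~ 2.2).
irr, lo, hi = incidence_rate_ratio(e1=55, pt1=611, e2=25, pt2=610)
```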


BACKGROUND Although the possibility of bleeding during anticoagulant treatment may limit patients from taking part in physical activity, the association between physical activity and anticoagulation-related bleeding is uncertain. OBJECTIVES To determine whether physical activity is associated with bleeding in elderly patients taking anticoagulants. PATIENTS/METHODS In a prospective multicenter cohort study of 988 patients aged ≥65 years receiving anticoagulants for venous thromboembolism, we assessed patients' self-reported physical activity level. The primary outcome was the time to a first major bleeding, defined as fatal bleeding, symptomatic bleeding in a critical site, or bleeding causing a fall in hemoglobin or leading to transfusions. The secondary outcome was the time to a first clinically relevant non-major bleeding. We examined the association between physical activity level and time to a first bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS During a mean follow-up of 22 months, patients with a low, moderate, and high physical activity level had an incidence of major bleeding of 11.6, 6.3, and 3.1 events per 100 patient-years, and an incidence of clinically relevant non-major bleeding of 14.0, 10.3, and 7.7 events per 100 patient-years, respectively. A high physical activity level was significantly associated with a lower risk of major bleeding (adjusted sub-hazard ratio 0.40, 95% CI 0.22-0.72). There was no association between physical activity and non-major bleeding. CONCLUSIONS A high level of physical activity is associated with a decreased risk of major bleeding in elderly patients receiving anticoagulant therapy.


Porcine reproductive and respiratory syndrome virus (PRRSV) is widespread in pig populations globally. In many regions of Europe with intensive pig production and high herd densities, the virus is endemic and can cause disease and production losses. This fuels discussion about the feasibility and sustainability of virus elimination from larger geographic regions. The implementation of a program aiming at virus elimination for areas with high pig density is unprecedented and its potential success is unknown. The objective of this work was to approach pig population data with a simple method that could support assessing the feasibility of a sustainable regional PRRSV elimination. Based on known risk factors such as pig herd structure and neighborhood conditions, an index characterizing individual herds' potential for endemic virus circulation and reinfection was designed. This index was subsequently used to compare data of all pig herds in two regions with different pig- and herd-densities in Lower Saxony (North-West Germany) where PRRSV is endemic. Distribution of the indexed herds was displayed using GIS. Clusters of high herd index densities forming potential risk hot spots were identified which could represent key target areas for surveillance and biosecurity measures under a control program aimed at virus elimination. In an additional step, for the study region with the higher pig density (2463 pigs/km² of farmland), the potential distribution of PRRSV-free and non-free herds during the implementation of a national control program aiming at national virus elimination was modeled. Complex herd and trade network structures suggest that PRRSV elimination in regions with intensive pig farming like that of central Europe would have to involve legal regulation and be accompanied by important trade and animal movement restrictions. The proposed methodology of risk index mapping could be adapted to areas varying in size, herd structure and density.
Interpreted in the regional context, this could help to classify the density of risk and to accordingly target resources and measures for elimination.
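The abstract does not specify how the herd index was constructed; a minimal illustrative sketch of an additive risk index over herd-level indicators (all factor names and weights here are hypothetical) might look like:

```python
def herd_risk_index(herd, weights):
    """Illustrative additive risk index: weighted sum of herd-level
    risk-factor indicators. The actual index used in the study is
    not specified in the abstract; this is a hypothetical sketch."""
    return sum(weights[k] * herd.get(k, 0) for k in weights)

weights = {                      # hypothetical weights
    "sow_herd": 3,               # breeding herds can sustain circulation
    "large_herd": 2,
    "neighbours_within_1km": 1,  # scored per neighbouring herd
}

# One hypothetical herd: a large sow herd with four close neighbours.
index = herd_risk_index(
    {"sow_herd": 1, "large_herd": 1, "neighbours_within_1km": 4}, weights)
```

Indexed herds could then be mapped in GIS and smoothed into the density clusters the study describes.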


BACKGROUND Bolt-kit systems are increasingly used as an alternative to conventional external cerebrospinal fluid (CSF) drainage systems. Since 2009 we regularly utilize bolt-kit external ventricular drainage (EVD) systems with silver-bearing catheters inserted manually with a hand drill and skull screws for emergency ventriculostomy. For non-emergency situations, we use conventional ventriculostomy with subcutaneous tunneled silver-bearing catheters, performed in the operating room with a pneumatic drill. This retrospective analysis compared the two techniques in terms of infection rates. METHODS 152 patients (aged 17-85 years, mean=55.4 years) were included in the final analysis; 95 received bolt-kit silver-bearing catheters and 57 received conventionally implanted silver-bearing catheters. The primary endpoint combined infection parameters: occurrence of positive CSF culture, colonization of catheter tips, or elevated CSF white blood cell counts (>4/μl). Secondary outcome parameters were presence of microorganisms in CSF or on catheter tips. Incidence of increased CSF cell counts and number of patients with catheter malposition were also compared. RESULTS The primary outcome, defined as analysis of combined infection parameters (occurrence of either positive CSF culture, colonization of the catheter tips or raised CSF white blood cell counts >4/μl) was not significantly different between the groups (58.9% bolt-kit group vs. 63.2% conventionally implanted group, p=0.61, chi-square test). The bolt-kit group was non-inferior and not superior to the conventional group (relative risk reduction of 6.7%; 90% confidence interval: -19.9% to 25.6%). Secondary outcomes showed no statistically significant difference in the incidence of microorganisms in CSF (2.1% bolt-kit vs. 5.3% conventionally implanted; p=0.30; chi-square test).
CONCLUSIONS This analysis indicates that silver-bearing EVD catheters implanted with a bolt-kit system outside the operating room do not significantly elevate the risk of CSF infection as compared to conventional implant methods.
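The reported relative risk reduction can be back-calculated from the percentages: 58.9% of 95 patients is 56 events and 63.2% of 57 is 36, giving RRR = 1 - (56/95)/(36/57) ≈ 6.7%. A log-RR Wald sketch follows; the paper's own 90% CI (-19.9% to 25.6%) was presumably computed by a different method, so this only approximates it:

```python
import math

def relative_risk_reduction(e1, n1, e2, n2, z=1.645):
    """Relative risk reduction (1 - RR) with a 90% CI derived
    from a Wald interval for the log relative risk."""
    rr = (e1 / n1) / (e2 / n2)
    se = math.sqrt(1 / e1 - 1 / n1 + 1 / e2 - 1 / n2)
    rr_lo, rr_hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return 1 - rr, 1 - rr_hi, 1 - rr_lo  # RRR and its 90% CI

# Event counts back-calculated from the reported percentages.
rrr, lo, hi = relative_risk_reduction(56, 95, 36, 57)
```

Because the interval spans zero, the data are compatible with both a small benefit and a small harm, which is why the authors claim non-inferiority rather than superiority.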


BACKGROUND Few contemporary data exist on traditional risk factor (TRF) and non-traditional risk factor (NTRF) burden in patients with premature acute coronary syndrome (ACS). METHODS The prevalence of TRFs and NTRFs was measured in 1015 young (55 years old or younger) ACS patients recruited from 26 centres in Canada, the United States, and Switzerland. Risk factors were compared across sex and family history categories, and against a sample of the general Canadian population based on the 2000-2001 Canadian Community Health Survey. The 10- and 30-year risks of cardiovascular disease (CVD) were estimated using Framingham Risk Scores. RESULTS Risk factors were more prevalent in premature ACS patients compared with the general population. Young women with a family history of coronary artery disease showed the greatest risk factor burden, including TRFs of hypertension (67%), dyslipidemia (67%), obesity (53%), smoking (42%), and diabetes (33%), and NTRFs of anxiety (55%), low household income (44%), and depression (37%). The estimated median 10-year risk of CVD was 7% (interquartile range [IQR], 3%-9%) in women and 13% (IQR, 7%-17%) in men, whereas the 30-year risk of CVD was 36% (IQR, 22%-49%) in women and 44% (IQR, 31%-57%) in men. CONCLUSIONS Patients with premature ACS, especially women with a positive family history, are characterized by a very high risk factor burden that is poorly captured by 10-year risk estimates but better captured by 30-year estimates. Consideration of NTRFs and use of 30-year risk estimates might better estimate risk in young individuals and improve the prevention of premature ACS.


AIMS Metformin use has been associated with a decreased risk of some cancers, although data on head and neck cancer (HNC) are scarce. We explored the relation between the use of antidiabetic drugs and the risk of HNC. METHODS We conducted a case-control analysis in the UK-based Clinical Practice Research Datalink (CPRD) of people with incident HNC between 1995 and 2013 below the age of 90 years. Six controls per case were matched on age, sex, calendar time, general practice, and number of years of active history in the CPRD prior to the index date. Other potential confounders, including body mass index (BMI), smoking, alcohol consumption and comorbidities, were also evaluated. The final analyses were adjusted for BMI, smoking and diabetes mellitus (or diabetes duration in a sensitivity analysis). Results are presented as odds ratios (ORs) with 95% confidence intervals (CIs). RESULTS Use of metformin was not associated with a statistically significant altered risk of HNC overall (1-29 prescriptions: adjusted OR 0.87, 95% CI 0.61-1.24; ≥30 prescriptions: adjusted OR 0.80, 95% CI 0.53-1.22), nor was long-term use of sulphonylureas (adjusted OR 0.87, 95% CI 0.59-1.30) or any insulin use (adjusted OR 0.92, 95% CI 0.63-1.35). However, we found a (statistically non-significant) decreased risk of laryngeal cancer associated with long-term metformin use (adjusted OR 0.41, 95% CI 0.17-1.03). CONCLUSIONS In this population-based study, the use of antidiabetic drugs was not associated with a materially altered risk of HNC. Our data suggest a protective effect of long-term metformin use for laryngeal cancer.


Aim: A major depressive episode is still a frequently discussed risk factor for suicidal behaviour. However, current studies suggest that depression is predictive of suicidal ideas but much less of suicidal acts (Nock et al., 2009). This implies that suicidal behaviour should not only be seen as a symptom of a depressive disorder, but should be understood as an independent behaviour, which must be examined separately. The present qualitative study focuses on typical Plans and motives of suicide attempters compared with non-suicidal depressive individuals. Methods: Plan Analysis (Caspar, 2007), a clinical case conceptualization approach, was used to analyze the instrumental relations between participants' behaviours and the hypothetical Plans and motives "behind" this behaviour. Videotaped narrative interviews of 17 suicide attempters and intake interviews of 17 non-suicidal depressive patients were investigated with the Plan Analysis procedure, and a Plan structure was developed for each participant. These were used to establish a prototypical Plan structure for each clinical group. Results: Results indicate that suicidal behaviour serves various Plans and motives found only in suicide attempters. Furthermore, depressive patients pursue interpersonal control strategies, which may serve as a protective factor against developing suicidal behaviour. Discussion: Findings are discussed with respect to current theoretical models of suicidality as well as implications for suicide prevention.


Conventional risk assessments for crop protection chemicals compare the potential for causing toxicity (hazard identification) to anticipated exposure. New regulatory approaches have been proposed that would exclude exposure assessment and just focus on hazard identification based on endocrine disruption. This review comprises a critical analysis of hazard, focusing on the relative sensitivity of endocrine and non-endocrine endpoints, using a class of crop protection chemicals, the azole fungicides. These were selected because they are widely used on important crops (e.g. grains) and thereby can contact target and non-target plants and enter the food chain of humans and wildlife. Inhibition of lanosterol 14α-demethylase (CYP51) mediates the antifungal effect. Inhibition of other CYPs, such as aromatase (CYP19), can lead to numerous toxicological effects, which are also evident from high dose human exposures to therapeutic azoles. Because of its widespread use and substantial database, epoxiconazole was selected as a representative azole fungicide. Our critical analysis concluded that anticipated human exposure to epoxiconazole would yield a margin of safety of at least three orders of magnitude for reproductive effects observed in laboratory rodent studies that are postulated to be endocrine-driven (i.e. fetal resorptions). The most sensitive ecological species is the aquatic plant Lemna (duckweed), for which the margin of safety is less protective than for human health. For humans and wildlife, endocrine disruption is not the most sensitive endpoint. It is concluded that conventional risk assessment, considering anticipated exposure levels, will be protective of both human and ecological health. Although the toxic mechanisms of other azole compounds may be similar, large differences in potency will require a case-by-case risk assessment.
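A margin of safety (sometimes called a margin of exposure) is simply the ratio of the no-observed-adverse-effect level to the anticipated exposure; values above 1000 correspond to the "three orders of magnitude" the review cites for epoxiconazole's reproductive effects. The NOAEL and exposure figures below are hypothetical illustrations, not the review's actual values:

```python
def margin_of_safety(noael_mg_kg_day, exposure_mg_kg_day):
    """Margin of safety: ratio of the no-observed-adverse-effect
    level (NOAEL) to the anticipated exposure, same units."""
    return noael_mg_kg_day / exposure_mg_kg_day

# Hypothetical example: a rodent NOAEL of 5 mg/kg/day against an
# anticipated dietary exposure of 0.004 mg/kg/day yields a margin
# above three orders of magnitude.
mos = margin_of_safety(5.0, 0.004)
```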