61 results for average current control
Abstract:
Prevention of psychosis has been intensively investigated over the past two decades, and prediction in particular has advanced considerably. Depending on the risk indicators applied, current criteria are associated with average, yet significantly heterogeneous, transition rates of ≥30% within 3 years, which increase further with longer follow-up. Risk stratification offers a promising approach to advance current prediction, as it can help to reduce the heterogeneity of transition rates and to identify subgroups with specific needs and response patterns, enabling targeted intervention. It may also be suitable for improving risk enrichment. Current results suggest the future implementation of multi-step risk algorithms that combine sensitive risk detection by cognitive basic symptoms (COGDIS) and ultra-high-risk (UHR) criteria with additional individual risk estimation by a prognostic index relying on further predictors such as additional clinical indicators, functional impairment, neurocognitive deficits, and EEG and structural MRI abnormalities, while also considering resilience factors. Simply combining COGDIS and UHR criteria in a second step of risk stratification already produced a 4-year hazard rate of 0.66. With regard to prevention, two recent meta-analyses demonstrated that preventive measures enable a reduction in 12-month transition rates by 54-56%, with favorable numbers needed to treat of 9-10. Unfortunately, psychosocial functioning, another important target of preventive efforts, did not improve. However, these results are based on a relatively small number of trials; more methodologically sound studies and a stronger consideration of individual profiles of clinical needs through modular intervention programs are required.
Abstract:
The epidemiological situation of strongyle infections in adult horses in Switzerland is characterized by a strong dominance of small strongyles (Cyathostominae) and an overall low level of egg shedding in the faeces. The prevailing attitude towards anthelmintic therapy considers neither husbandry conditions nor pasture hygiene measures. Instead, calendar-based routine medication, usually comprising 3 to 4 annual treatments, is the typical strategy. Such an approach, however, often results in an excessive administration of anthelmintics. In view of the continuous spread of drug-resistant cyathostomins, a change of strategy seems inevitable. A consensus has been reached between equine parasitologists and clinicians of the Vetsuisse Faculty in Zurich and Berne to focus on the concept of a selective control approach, with individual faecal egg counts as the central element. It is now recommended that clinically healthy horses (> 4 y) are treated only when their strongyle egg count is equal to or higher than 200 eggs per gram of faeces. Regular analysis of the strongyle population based on larval cultures, the control of drug efficacy, and quarantine measures for incoming horses are mandatory components of the concept. Recent experience on several pilot farms indicates that only 4 % of the McMaster analyses resulted in a deworming treatment. For horses that did not receive any nematicidal anthelmintic during the current season, a "safety" treatment is recommended at the end of the grazing period.
Abstract:
BACKGROUND Prostate cancer (PCa) is the second most common cancer among men worldwide. It is important to know survival outcomes and prognostic factors for this disease. Recruitment to the largest therapeutic randomised controlled trial in PCa, the Systemic Therapy in Advancing or Metastatic Prostate Cancer: Evaluation of Drug Efficacy: A Multi-Stage Multi-Arm Randomised Controlled Trial (STAMPEDE), includes men with newly diagnosed metastatic PCa who are commencing long-term androgen deprivation therapy (ADT); the control arm provides valuable data for a prospective cohort. OBJECTIVE To describe survival outcomes, along with current treatment standards and factors associated with prognosis, to inform future trial design in this patient group. DESIGN, SETTING, AND PARTICIPANTS The STAMPEDE trial control arm, comprising men newly diagnosed with M1 disease who were recruited between October 2005 and January 2014. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Overall survival (OS) and failure-free survival (FFS) were reported by primary disease characteristics using Kaplan-Meier methods. Hazard ratios and 95% confidence intervals (CIs) were derived from multivariate Cox models. RESULTS AND LIMITATIONS A cohort of 917 men with newly diagnosed M1 disease was recruited to the control arm in the specified interval. Median follow-up was 20 mo. Median age at randomisation was 66 yr (interquartile range [IQR]: 61-71), and median prostate-specific antigen level was 112 ng/ml (IQR: 34-373). Most men (n=574; 62%) had bone-only metastases, whereas 237 (26%) had both bone and soft tissue metastases; soft tissue metastasis was found mainly in distant lymph nodes. There were 238 deaths, 202 (85%) from PCa. Median FFS was 11 mo; 2-yr FFS was 29% (95% CI, 25-33). Median OS was 42 mo; 2-yr OS was 72% (95% CI, 68-76). Survival time was influenced by performance status, age, Gleason score, and metastases distribution. Median survival after an FFS event was 22 mo. Trial eligibility criteria meant men were younger and fitter than the general PCa population. CONCLUSIONS Survival remains disappointing in men presenting with M1 disease who are started on long-term ADT alone, despite active treatments being available at first failure of ADT. Importantly, men with M1 disease now spend the majority of their remaining life in a state of castration-resistant relapse. PATIENT SUMMARY Results from this control arm cohort found that survival is relatively short and highly influenced by patient age, fitness, and where the prostate cancer has spread in the body.
Abstract:
Objectives: Athletes differ in their ability to stay focused on performance and avoid distraction. Drawing on the strength model of self-control, we investigated whether athletes differ not only inter-individually in their disposition to stay focused and avoid distraction but also intra-individually in the situational availability of focused attention. Design/method: In the present experiment we hypothesized that basketball players (N = 40) with sufficient self-control resources would perform relatively better on a computer-based decision-making task under distraction conditions compared to a group whose self-control resources had been depleted by a prior task requiring self-control. Results: The results are in line with the strength model of self-control by demonstrating that an athlete's capability to focus attention relies on the situational availability of self-control strength. Conclusions: The current results indicate that having sufficient self-control strength in interference-rich sport settings is likely to be beneficial for decision making.
Abstract:
OBJECTIVE Use of diuretics has been associated with an increased risk of gout. Data on different types of diuretics are scarce. We undertook this study to investigate the association between use of loop diuretics, thiazide or thiazide-like diuretics, and potassium-sparing agents and the risk of developing incident gout. METHODS We conducted a retrospective population-based case-control analysis using the General Practice Research Database established in the UK. We identified case patients who were diagnosed as having incident gout between 1990 and 2010. One control patient was matched to each case patient for age, sex, general practice, calendar time, and years of active history in the database. We used conditional logistic regression to calculate odds ratios (ORs) and 95% confidence intervals (95% CIs), and we adjusted for potential confounders. RESULTS We identified 91,530 incident cases of gout and the same number of matched controls. Compared to past use of diuretics from each respective drug class, adjusted ORs for current use of loop diuretics, thiazide diuretics, thiazide-like diuretics, and potassium-sparing diuretics were 2.64 (95% CI 2.47-2.83), 1.70 (95% CI 1.62-1.79), 2.30 (95% CI 1.95-2.70), and 1.06 (95% CI 0.91-1.23), respectively. Combined use of loop diuretics and thiazide diuretics was associated with the highest relative risk estimates of gout (adjusted OR 4.65 [95% CI 3.51-6.16]). Current use of calcium channel blockers or losartan slightly attenuated the risk of gout in patients who took diuretics. CONCLUSION Use of loop diuretics, thiazide diuretics, and thiazide-like diuretics was associated with an increased risk of incident gout, although use of potassium-sparing agents was not.
Abstract:
AIMS To assess incidence rates (IRs) of and identify risk factors for incident severe hypoglycaemia in patients with type 2 diabetes newly treated with antidiabetic drugs. METHODS Using the UK-based General Practice Research Database, we performed a retrospective cohort study between 1994 and 2011 and a nested case-control analysis. Ten controls from the population at risk were matched to each case with recorded severe hypoglycaemia during follow-up, on general practice, years of history in the database, and calendar time. Using multivariate conditional logistic regression analyses, we adjusted for potential confounders. RESULTS Of 130,761 patients with newly treated type 2 diabetes (mean age 61.7 ± 13.0 years), 690 (0.5%) had an incident episode of severe hypoglycaemia recorded [estimated IR 11.97 (95% confidence interval, CI, 11.11-12.90) per 10,000 person-years (PYs)]. The IR was markedly higher in insulin users [49.64 (95% CI, 44.08-55.89) per 10,000 PYs] than in patients not using insulin [8.03 (95% CI, 7.30-8.84) per 10,000 PYs]. Based on the results of the nested case-control analysis, increasing age [≥ 75 vs. 20-59 years; adjusted odds ratio (OR), 2.27; 95% CI, 1.65-3.12], cognitive impairment/dementia (adjusted OR, 2.00; 95% CI, 1.37-2.91), renal failure (adjusted OR, 1.34; 95% CI, 1.04-1.71), current use of sulphonylureas (adjusted OR, 4.45; 95% CI, 3.53-5.60) and current insulin use (adjusted OR, 11.83; 95% CI, 9.00-15.54) were all associated with an increased risk of severe hypoglycaemia. CONCLUSIONS Severe hypoglycaemia was recorded in 12 cases per 10,000 PYs. Risk factors for severe hypoglycaemia included increasing age, renal failure, cognitive impairment/dementia, and current use of insulin or sulphonylureas.
Abstract:
Objectives: The dual-effects model of social control proposes that social control leads to better health practices but also arouses psychological distress. However, findings in relation to health behavior and psychological distress are inconsistent. Recent research suggests that the most effective control is unnoticed by the receiver (i.e., invisible). There is some evidence that invisible social control is beneficial for positive and negative affective reactions. Yet, investigations of the influence of invisible social control on daily smoking and distress have been limited. Using daily diaries, we investigated how invisible social control is associated with the number of cigarettes smoked and negative affect on a daily basis. Methods: Overall, 99 smokers (72.0% men, mean age M = 40.48, SD = 9.82) and their non-smoking partners completed electronic diaries for 22 consecutive days from a self-set quit date, within the hour before going to bed, reporting received and provided social control, daily number of cigarettes smoked, and negative affect. Results: Multilevel analyses indicated that between-person levels of invisible social control were associated with lower negative affect, whereas they were unrelated to the number of cigarettes smoked. On days with higher-than-average invisible social control, smokers reported fewer cigarettes smoked and more negative affect. Conclusions: The between-person findings indicate that invisible social control can be beneficial for negative affect. However, the within-person findings are in line with the assumptions of the dual-effects model of social control: invisible social control reduced daily smoking and simultaneously increased daily negative affect within person.
Abstract:
Colors have been found to affect psychological functioning. Empirical evidence suggests that, in test situations, brief perceptions of the color red or even the word "red" printed in black ink prime implicit anxious responses and consequently impair cognitive performance. However, we propose that this red effect depends on people's momentary capacity to exert control over their prepotent responses (i.e., self-control). In three experiments (Ns = 66, 78, and 130), participants' self-control strength was first manipulated. Participants were then primed with the color or word red versus gray prior to completing an arithmetic test or an intelligence test. As expected, self-control strength moderated the red effect. While red had a detrimental effect on the performance of participants with depleted self-control strength (ego depletion), it did not affect the performance of participants with intact self-control strength. We discuss implications of the present findings within the current debate on the robustness of priming results.
Abstract:
In the current study we investigated whether ego depletion negatively affects attention regulation under pressure in sports by assessing participants' dart-throwing performance and accompanying gaze behavior. According to the strength model of self-control, the most important aspect of self-control is attention regulation. Because higher levels of state anxiety are associated with impaired attention regulation, we chose a mixed design with ego depletion (yes vs. no) as the between-subjects factor and anxiety level (high vs. low) as the within-subjects factor. Participants performed a perceptual-motor task requiring selective attention, namely dart throwing. In line with our expectations, depleted participants in the high-anxiety condition performed worse and displayed a shorter final fixation on the bull's eye, demonstrating that when self-control strength is depleted, attention regulation under pressure cannot be maintained. This is the first study that directly supports the general assumption that ego depletion is a major factor influencing attention regulation under pressure.
Abstract:
Background: There is evidence that drinking during residential treatment is related to various factors, such as patients' general control beliefs and self-efficacy, as well as to external control of alcohol use by program staff and situations where there is temptation to drink. As alcohol use during treatment has been shown to be associated with the resumption of alcohol use after discharge from residential treatment, we aimed to investigate how these variables are related to alcohol use during abstinence-oriented residential treatment programs for alcohol use disorders (AUD). Methods: In total, 509 patients who entered 1 of 2 residential abstinence-oriented treatment programs for AUD were included in the study. After detoxification, patients completed a standardized diagnostic procedure including interviews and questionnaires. Drinking was assessed by patients' self-report of at least 1 standard drink or by positive breathalyzer testing. The 2 residential programs were categorized as high or low control according to the average number of tests per patient. Results: Regression analysis revealed a significant interaction effect between internal and external control, suggesting that patients with high internal locus of control and a high frequency of control by staff demonstrated the least alcohol use during treatment (16.7%), while patients with low internal locus of control in programs with low external control were more likely to use alcohol during treatment (45.9%). No effects were found for self-efficacy and temptation. Conclusions: As alcohol use during treatment is most likely associated with poor treatment outcomes, external control may improve treatment outcomes and particularly support patients with low internal locus of control, who show the highest risk for alcohol use during treatment. High external control may complement high internal control to improve alcohol use prevention while in treatment. Key Words: Alcohol Dependence, Alcohol Use, Locus of Control, Alcohol Testing.
Abstract:
OBJECTIVE Cochlear implants (CI) are standard treatment for prelingually deafened children and postlingually deafened adults. Computed tomography (CT) is the standard method for postoperative imaging of the electrode position. CT scans accurately reflect electrode depth and position, which is essential prior to use. However, routine CT examinations expose patients to radiation, which is especially problematic in children. We examined whether new CT protocols could reduce radiation doses while preserving diagnostic accuracy. METHODS To investigate whether the electrode position can be assessed by low-dose CT protocols, a cadaveric lamb model was used because its inner ear morphology is similar to that of humans. The scans were performed at various volumetric CT dose index (CTDIvol)/kV combinations. For each constant CTDIvol, the tube voltage was varied (i.e., 80, 100, 120 and 140 kV). This procedure was repeated at different CTDIvol values (21 mGy, 11 mGy, 5.5 mGy, 2.8 mGy and 1.8 mGy). To keep the CTDIvol constant at different tube voltages, the tube current values were adjusted. Independent evaluations of the images were performed by two experienced and blinded neuroradiologists. The criteria of diagnostic usefulness, image quality and artifacts (scaled 1-4) were assessed in 14 cochlear-implanted cadaveric lamb heads with variable tube voltages. RESULTS The results showed that the standard CT dose could be substantially reduced without sacrificing diagnostic accuracy of the electrode position. Assessment of the CI electrode position was feasible in almost all cases down to a CTDIvol of 2-3 mGy. The number of artifacts did not increase for images within this dose range compared to higher doses. The extent of the artifacts caused by the implanted metal-containing CI electrode does not depend on the radiation dose and is not perceptibly influenced by changes in the tube voltage. In summary, evaluation of the CI electrode position is possible even at a very low radiation dose. CONCLUSIONS CT imaging of the temporal bone for postoperative electrode position control of the CI is possible with a very low, significantly reduced radiation dose. The tube current-time product and voltage can be reduced by 50% without increasing artifacts. Low-dose postoperative CT scans are sufficient for localizing the CI electrode.
Abstract:
OBJECTIVE This study aims to assess the odds of developing incident gout in association with the use of postmenopausal estrogen-progestogen therapy, according to type, timing, duration, and route of administration of estrogen-progestogen therapy. METHODS We conducted a retrospective population-based case-control analysis using the United Kingdom-based Clinical Practice Research Datalink. We identified women (aged 45 y or older) who had a first-time diagnosis of gout recorded between 1990 and 2010. We matched one female control with each case on age, general practice, calendar time, and years of active history in the database. We used multivariate conditional logistic regression to calculate odds ratios (ORs) with 95% CIs (adjusted for confounders). RESULTS The adjusted OR for gout with current use of oral formulations of opposed estrogens (estrogen-progestogen) was 0.69 (95% CI, 0.56-0.86) compared with never use. Current use was associated with a decreased OR for gout in women without renal failure (adjusted OR, 0.71; 95% CI, 0.57-0.87) and in women with hypertension (adjusted OR, 0.62; 95% CI, 0.44-0.87) compared with never use. Tibolone was associated with a decreased OR for gout (adjusted OR, 0.77; 95% CI, 0.63-0.95) compared with never use. Estrogens alone did not alter the OR for gout. CONCLUSIONS Current use of oral opposed estrogens, but not unopposed estrogens, is associated with a decreased OR for incident gout in women without renal failure; the association is more pronounced in women with hypertension. Use of tibolone is associated with a decreased OR for incident gout. The decreased OR for gout may be related to the progestogen component rather than the estrogen component.
Abstract:
Traumatic experiences may affect an individual's ability to exercise self-control, which is an essential characteristic for successfully managing life. As a measure of self-control, we used the delay discounting paradigm, that is, the extent to which a person devalues delayed gratification. The aim of this study was to investigate the relationship between childhood trauma and delay discounting using a control group design with elderly participants (mean age 76.2 years). Swiss former indentured child laborers (n=103) who had been exposed to trauma during their childhood were compared with nontraumatized controls (n=50). The trauma-exposed group showed a considerably higher preference for smaller immediate rewards than the controls, indicating lower self-control. A hierarchical regression analysis revealed that a history of abuse, current self-efficacy, and education were significantly associated with delay discounting. Implications for future research are discussed.
Abstract:
Schmallenberg virus (SBV) was first detected in Switzerland in July 2012, and many Swiss dairy farmers reported acute clinical signs in dairy cattle during the spread of the virus until December 2012. The objectives of the present study were to investigate the effects of an acute infection with SBV on milk yield, fertility and veterinary costs in dairy farms with clinical signs of SBV infection (case farms), and to compare those farms to a matched control group of dairy farms in which cattle did not show clinical signs of SBV infection. Herd size was significantly (p<0.001) larger in case farms (33 cows, n=77) than in control farms (25 cows, n=84). Within case herds, 14.8% (median) of the cows showed acute clinical signs. Managers of case farms reported a higher abortion rate during the year with SBV (6.5%) than in the previous year (3.7%). Analysis of fertility parameters based on veterinary bills and data from the breeding associations showed no significant differences between case and control farms. General veterinary costs per cow from July to December 2012 were significantly higher (p=0.02) in case farms (CHF 19.80; EUR 16.50) than in control farms (CHF 15.90; EUR 13.25). No differences in milk yield were found between groups, but there was a significant decrease in milk production in case farms in the second half of 2012 compared with the same period in 2011 (p<0.001) and 2013 (p=0.009). The average daily milk yield per cow (both groups together) was 0.73 kg higher (p=0.03) in the second half of 2011 and 0.52 kg higher (p=0.12) in the second half of 2013 than in the same period of 2012. Fifty-seven percent of the cows with acute clinical signs (n=461) were treated by a veterinarian. The average calculated loss after SBV infection for a standardized farm was CHF 1606 (EUR 1338), which can be considered low at the national level, but losses fluctuated greatly between farms, so that individual farms could incur very high losses (>CHF 10,000; EUR 8333).
Abstract:
Occurring for the first time in 1986 in the United Kingdom, bovine spongiform encephalopathy (BSE), the so-called “mad-cow disease”, has had unprecedented consequences for veterinary public health. The implementation of drastic measures, including the ban on meat-and-bone meal in livestock feed and the removal of specified risk materials from the food chain, has eventually resulted in a significant decline of the epidemic. The disease was long thought to be caused by a single agent, but since the introduction of immunochemical diagnostic techniques, evidence of a phenotypic variation of BSE has emerged. Reviewing the literature available on the subject, this paper briefly summarizes the current knowledge about these atypical forms of BSE and discusses the consequences of their occurrence for disease control measures.