90 results for Age effect
Age affects the adjustment of cognitive control after a conflict: evidence from the bivalency effect
Abstract:
Age affects cognitive control. When facing a conflict, older adults are less able to activate goal-relevant information and inhibit irrelevant information. However, cognitive control also affects the events after a conflict. The purpose of this study was to determine whether age affects the adjustment of cognitive control following a conflict. To this end, we investigated the bivalency effect, that is, the performance slowing occurring after the conflict induced by bivalent stimuli (i.e., stimuli with features for two tasks). In two experiments, we tested young adults (aged 20-30) and older adults (aged 65-85) in a paradigm requiring alternations between three tasks, with bivalent stimuli occasionally occurring on one task. The young adults showed a slowing for all trials following bivalent stimuli. This indicates a widespread and long-lasting bivalency effect, replicating previous findings. In contrast, the older adults showed a more specific and shorter-lived slowing. Thus, age affects the adjustment of cognitive control following a conflict.
Abstract:
People differ in how open-ended or limited they perceive their future. We argue that individual differences in future time perspective affect the activation of implicit motives. Perceiving the time remaining for the satisfaction of one’s motives as limited should be associated with a higher activation of these motives than perceiving one’s future as more open-ended. Given that future time perspective decreases across adulthood, older adults should score higher on implicit motives than younger adults. This hypothesis was supported in a study with young (n = 53, age M = 25.60 years) and older adults (n = 55, age M = 68.05 years). Additionally, an experimental manipulation of future time perspective showed that age-related differences in implicit motives are influenced by future time perspective. These findings demonstrate that future time perspective is an important factor to explain the strength of motives.
Abstract:
QUESTIONS UNDER STUDY / PRINCIPLES: Interest groups advocate centre-specific outcome data as a useful tool for patients in choosing a hospital for their treatment and for decision-making by politicians and the insurance industry. Haematopoietic stem cell transplantation (HSCT) requires significant infrastructure and represents a cost-intensive procedure. It therefore qualifies as a prime target for such a policy. METHODS: We made use of the comprehensive database of the Swiss Blood Stem Cells Transplant Group (SBST) to evaluate potential use of mortality rates. Nine institutions reported a total of 4717 HSCT - 1427 allogeneic (30.3%), 3290 autologous (69.7%) - in 3808 patients between the years 1997 and 2008. Data were analysed for survival- and transplantation-related mortality (TRM) at day 100 and at 5 years. RESULTS: The data showed marked and significant differences between centres in unadjusted analyses. These differences were absent or marginal when the results were adjusted for disease, year of transplant and the EBMT risk score (a score incorporating patient age, disease stage, time interval between diagnosis and transplantation, and, for allogeneic transplants, donor type and donor-recipient gender combination) in a multivariable analysis. CONCLUSIONS: These data indicate comparable quality among centres in Switzerland. They show that comparison of crude centre-specific outcome data without adjustment for the patient mix may be misleading. Mandatory data collection and systematic review of all cases within a comprehensive quality management system might, in contrast, serve as a model to ascertain the quality of other cost-intensive therapies in Switzerland.
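The contrast between crude and case-mix-adjusted centre comparisons can be illustrated with a small survival-analysis sketch. The snippet below (Python with lifelines) fits a Cox model for survival with centre indicators, first without and then with a mock case-mix covariate standing in for disease, year of transplant and the EBMT risk score. The simulated data, centre labels and the `risk_score` variable are illustrative assumptions, not the SBST analysis.

```python
# Minimal sketch: crude vs case-mix-adjusted comparison of centres.
# All data are simulated; 'risk_score' is a stand-in for disease, transplant year
# and the EBMT risk score used in the study's multivariable analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 4000
centre = rng.integers(0, 9, n)                              # nine transplant centres
risk_score = rng.integers(0, 8, n) + 0.3 * (centre % 3)     # case mix differs between centres
time = rng.exponential(5.0 / (1 + 0.3 * risk_score))        # survival time in years
observed = (time < 5.0).astype(int)                         # administrative censoring at 5 years
time = np.minimum(time, 5.0)

df = pd.DataFrame({"time": time, "died": observed, "risk_score": risk_score})
df = pd.concat(
    [df, pd.get_dummies(centre, prefix="centre", drop_first=True).astype(float)],
    axis=1,
)

crude = CoxPHFitter().fit(df.drop(columns="risk_score"), "time", "died")
adjusted = CoxPHFitter().fit(df, "time", "died")
# Centre hazard ratios typically move towards 1 once the case-mix score is included.
print(np.exp(crude.params_.filter(like="centre")))
print(np.exp(adjusted.params_.filter(like="centre")))
```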
Abstract:
Potential drug-drug interactions (PDDIs) might expand with new combination antiretroviral therapies (ART) and polypharmacy related to increasing age and comorbidities. We investigated the prevalence of comedications and PDDIs within a large HIV cohort, and their effect on ART efficacy and tolerability.
Abstract:
Introduction: As a previous study revealed, arts speech therapy (AST) affects cardiorespiratory interaction [1]. The aim of the present study was to investigate whether AST also has effects on brain oxygenation and hemodynamics measured non-invasively using near-infrared spectroscopy (NIRS). Material and methods: NIRS measurements were performed on 17 subjects (8 men and 9 women, mean age: 35.6 ± 12.7 y) during AST. Each measurement lasted 35 min, comprising 8 min pre-baseline, 10 min recitation and 20 min post-baseline. For each subject, measurements were performed for three different AST recitation tasks (recitation of alliterative, hexameter and prose verse). Relative concentration changes of oxyhemoglobin (Δ[O2Hb]) and deoxyhemoglobin (Δ[HHb]) as well as the tissue oxygenation index (TOI) were measured using a Hamamatsu NIRO300 NIRS device and a sensor placed on the subject's forehead. Movement artifacts were removed using a novel method [2]. Statistical analysis (Wilcoxon test) was applied to the data to investigate (i) whether the recitation causes changes in the median values and/or in the Mayer wave power spectral density (MW-PSD, range: 0.07–0.13 Hz) of Δ[O2Hb], Δ[HHb] or TOI, and (ii) whether these changes vary between the three recitation forms. Results: For all three recitation styles a significant (p < 0.05) decrease in Δ[O2Hb] and TOI was found, indicating a decrease in blood flow. These decreases did not vary significantly between the three styles. MW-PSD increased significantly for Δ[O2Hb] when reciting the hexameter and prose verse, and for Δ[HHb] and TOI when reciting alliterations and hexameter, representing an increase in Mayer waves. The MW-PSD increase for Δ[O2Hb] was significantly larger for the hexameter verse compared to the alliterative and prose verse. Conclusion: The study showed that AST affects brain hemodynamics (oxygenation, blood flow and Mayer waves). Recitation caused a significant decrease in cerebral blood flow for all recitation styles as well as an increase in Mayer waves, particularly for the hexameter, which may indicate a sympathetic activation. References: 1. D. Cysarz, D. von Bonin, H. Lackner, P. Heusser, M. Moser, H. Bettermann. Am J Physiol Heart Circ Physiol, 287 (2) (2004), pp. H579–H587. 2. F. Scholkmann, S. Spichtig, T. Muehlemann, M. Wolf. Physiol Meas, 31 (5) (2010), pp. 649–662.
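As a rough illustration of the kind of analysis described above, the sketch below computes Mayer-wave band power (0.07–0.13 Hz) from Welch power spectral densities and compares two recording phases with a Wilcoxon signed-rank test. The sampling rate, phase durations and simulated Δ[O2Hb] traces are assumptions for illustration; this is not the authors' processing pipeline.

```python
# Illustrative sketch: Mayer-wave band power per phase, compared with a paired
# Wilcoxon signed-rank test. Sampling rate and signals are assumed, not measured data.
import numpy as np
from scipy.signal import welch
from scipy.stats import wilcoxon

FS = 10.0                       # assumed NIRS sampling rate in Hz
MW_BAND = (0.07, 0.13)          # Mayer-wave frequency band in Hz

def band_power(signal, fs=FS, band=MW_BAND):
    """Integrate the Welch PSD of a 1-D signal over the Mayer-wave band."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2048))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

rng = np.random.default_rng(0)
n_subjects = 17
# Simulated Delta[O2Hb] traces: 8 min pre-baseline and 10 min recitation per subject.
pre = [rng.standard_normal(int(8 * 60 * FS)) for _ in range(n_subjects)]
recitation = [rng.standard_normal(int(10 * 60 * FS)) for _ in range(n_subjects)]

pre_power = np.array([band_power(x) for x in pre])
recitation_power = np.array([band_power(x) for x in recitation])

# Paired comparison of Mayer-wave band power between the two phases.
stat, p = wilcoxon(pre_power, recitation_power)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")
```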
Abstract:
In many patients, optimal results after pallidal deep brain stimulation (DBS) for primary dystonia may appear over several months, possibly beyond 1 year after implant. In order to elucidate the factors predicting such protracted clinical effect, we retrospectively reviewed the clinical records of 44 patients with primary dystonia and bilateral pallidal DBS implants. Patients with fixed skeletal deformities, as well as those with a history of prior ablative procedures, were excluded. The Burke-Fahn-Marsden Dystonia Rating Scale (BFMDRS) scores at baseline, 1 and 3 years after DBS were used to evaluate clinical outcome. All subjects showed a significant improvement after DBS implants (mean BFMDRS improvement of 74.9% at 1 year and 82.6% at 3 years). Disease duration (DD, median 15 years, range 2-42) and age at surgery (AS, median 31 years, range 10-59) showed a significant negative correlation with DBS outcome at 1 and 3 years. A partition analysis, using DD and AS, clustered subjects into three groups: (1) younger subjects with shorter DD (n = 19, AS < 27, DD ≤ 17); (2) older subjects with shorter DD (n = 8, DD ≤ 17, AS ≥ 27); (3) older subjects with longer DD (n = 17, DD > 17, AS ≥ 27). Younger patients with short DD benefitted more and faster than older patients, who, however, continued to improve by an average of 10% beyond 1 year after DBS implantation. Our data suggest that subjects with short DD may expect to achieve a better general outcome than those with longer DD and that AS may influence the time necessary to achieve maximal clinical response.
Abstract:
PURPOSE. To evaluate the role of fellow eye status in determining progression of geographic atrophy (GA) in patients with age-related macular degeneration (AMD). METHODS. A total of 300 eyes with GA of 193 patients from the prospective, longitudinal, natural history FAM Study were classified into three groups according to the AMD manifestation in the fellow eye at baseline examination: (1) bilateral GA, (2) early/intermediate AMD, and (3) exudative AMD. GA areas were quantified based on fundus autofluorescence images using a semiautomated image-processing method, and progression rates (PR) were estimated using two-level, linear, mixed-effects models. RESULTS. Crude GA-PR in the bilateral GA group (mean, 1.64 mm²/y; 95% CI, 1.478-1.803) was significantly higher than in the fellow eye early/intermediate group (0.74 mm²/y, 0.146-1.342). Although baseline GA size differed significantly between groups (P = 0.0013, t-test) and GA-PR increased significantly by 0.11 mm²/y (0.05-0.17) per disc area (DA; 2.54 mm²) of baseline GA size, fellow-eye status contributed an additional mean change of -0.79 mm²/y (-1.43 to -0.15) to the PR beyond the effect of baseline GA size. However, this difference was only significant when GA size was ≥1 DA at baseline, with a GA-PR of 1.70 mm²/y (1.54-1.85) in the bilateral and 0.95 mm²/y (0.37-1.54) in the early/intermediate group. There was no significant difference in PR compared with that in the fellow eye exudative group. CONCLUSIONS. The results indicate that the AMD manifestation of the fellow eye at baseline serves as an indicator for disease progression in eyes with GA ≥ 1 DA. Predictive characteristics not only contribute to the understanding of pathophysiological mechanisms, but also are useful for the design of future interventional trials in GA patients.
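A minimal sketch of the modelling idea, under the simplifying assumption of one eye per patient: GA area is regressed on follow-up time with random intercepts and slopes per patient, and a group × time interaction captures the difference in progression rate between fellow-eye groups. Column names and the simulated values are illustrative, not the FAM Study data or code.

```python
# Minimal sketch of a linear mixed-effects model for GA progression rates.
# Simulated data; one eye per patient is assumed for simplicity.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for patient in range(60):
    group = rng.choice(["bilateral_GA", "early_intermediate"])
    true_rate = 1.6 if group == "bilateral_GA" else 0.75          # mm^2 per year
    baseline = rng.uniform(1.0, 6.0)                              # baseline GA area in mm^2
    slope = true_rate + rng.normal(0, 0.3)
    for t in np.arange(0, 3.5, 0.5):                              # visits every 6 months
        rows.append({"patient": patient, "group": group, "time": t,
                     "area": baseline + slope * t + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# Random intercept and random slope for time, grouped by patient.
model = smf.mixedlm("area ~ time * group", df, groups="patient", re_formula="~time")
result = model.fit()
print(result.summary())
# 'time' is the progression rate in the bilateral-GA reference group;
# 'time:group[T.early_intermediate]' is the difference in rate between groups.
```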
Abstract:
INTRODUCTION Age at onset of psychosis (AAO) may be younger in patients with cannabis use disorders (CUD) compared to those without CUD (NCUD). Previous studies included CUD co-morbid with other substance use disorders (SUD), and many did not control for confounders. METHODS Controlling for relevant confounders, differences in AAO between patients with and without CUD, excluding those with any other SUD, were analyzed in a large representative file audit of 625 first-episode psychosis (FEP) patients (aged 14 to 29 years) admitted to the Early Psychosis Prevention and Intervention Centre in Melbourne, Australia. RESULTS Three-quarters of the 625 FEP patients had a CUD. Cannabis use started before psychosis onset in 87.6% of patients. AAO was not significantly different between CUD (without other SUD, n=201) and NCUD (n=157). However, AAO was younger in those with early CUD (starting age 14 or younger) compared to NCUD (F(1)=5.2; p=0.024; partial η²=0.026). Earlier age at onset of cannabis use predicted earlier age at onset of psychosis (β=-0.49, R²-change=0.25, p<0.001). CONCLUSION Only CUD starting at age 14 or younger was associated with an earlier AAO, at a small effect size. These findings suggest that CUD may exert an indirect effect on brain maturation resulting in earlier AAO, potentially only in cannabis-sensitive subjects.
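The hierarchical-regression step reported above (an R²-change when age at onset of cannabis use is added to a confounder-only model of AAO) could look roughly like the following sketch. The confounder, variable names and simulated data are assumptions for illustration rather than the study's dataset.

```python
# Illustrative sketch of hierarchical regression with an R^2-change statistic.
# 'sex_male' is an assumed placeholder confounder; values are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 358
df = pd.DataFrame({
    "sex_male": rng.integers(0, 2, n),
    "cannabis_onset_age": rng.uniform(12, 25, n),
})
df["aao_psychosis"] = 14 + 0.5 * df["cannabis_onset_age"] + rng.normal(0, 2, n)

base = smf.ols("aao_psychosis ~ sex_male", df).fit()                        # confounders only
full = smf.ols("aao_psychosis ~ sex_male + cannabis_onset_age", df).fit()   # + predictor of interest

r2_change = full.rsquared - base.rsquared
print(f"R^2 change when adding cannabis onset age: {r2_change:.3f}")
print(full.params["cannabis_onset_age"], full.pvalues["cannabis_onset_age"])
```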
Abstract:
Background Loss to follow-up (LTFU) is common in antiretroviral therapy (ART) programmes. Mortality is a competing risk (CR) for LTFU; however, it is often overlooked in cohort analyses. We examined how the CR of death affected LTFU estimates in Zambia and Switzerland. Methods and Findings HIV-infected patients aged ≥18 years who started ART in 2004–2008 in observational cohorts in Zambia and Switzerland were included. We compared standard Kaplan-Meier curves with CR cumulative incidence. We calculated hazard ratios for LTFU across CD4 cell count strata using cause-specific Cox models and Fine and Gray subdistribution models, adjusting for age, gender, body mass index and clinical stage. 89,339 patients from Zambia and 1,860 patients from Switzerland were included. 12,237 patients (13.7%) in Zambia and 129 patients (6.9%) in Switzerland were LTFU and 8,498 (9.5%) and 29 patients (1.6%), respectively, died. In Zambia, the probability of LTFU was overestimated in Kaplan-Meier curves: estimates at 3.5 years were 29.3% for patients starting ART with CD4 cells <100 cells/µl and 15.4% among patients starting with ≥350 cells/µl. The estimates from CR cumulative incidence were 22.9% and 13.6%, respectively. Little difference was found between naïve and CR analyses in Switzerland since only a few patients died. The results from Cox and Fine and Gray models were similar: in Zambia the risk of loss to follow-up and death increased with decreasing CD4 counts at the start of ART, whereas in Switzerland there was a trend in the opposite direction, with patients with higher CD4 cell counts more likely to be lost to follow-up. Conclusions In ART programmes in low-income settings the competing risk of death can substantially bias standard analyses of LTFU. The CD4 cell count and other prognostic factors may be differentially associated with LTFU in low-income and high-income settings.
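The core methodological point, that a Kaplan-Meier estimate of LTFU treating death as censoring overstates the cumulative incidence obtained from a competing-risk (Aalen-Johansen) estimator, can be sketched as follows. The event coding, follow-up distributions and the use of lifelines are illustrative assumptions, not the cohort analysis code.

```python
# Sketch: naive Kaplan-Meier vs competing-risk cumulative incidence of LTFU.
# Simulated data; event codes are assumed as 0 = censored, 1 = LTFU, 2 = died.
import numpy as np
from lifelines import KaplanMeierFitter, AalenJohansenFitter

rng = np.random.default_rng(3)
n = 5000
time_ltfu = rng.exponential(8.0, n)     # years until loss to follow-up
time_death = rng.exponential(12.0, n)   # years until death
time_admin = rng.uniform(1.0, 5.0, n)   # staggered administrative censoring

time = np.minimum.reduce([time_ltfu, time_death, time_admin])
event = np.select([time == time_ltfu, time == time_death], [1, 2], default=0)

# Naive Kaplan-Meier: deaths treated as censored observations.
km = KaplanMeierFitter().fit(time, event_observed=(event == 1))
naive_ltfu = 1 - km.survival_function_at_times(3.5).iloc[0]

# Competing-risk cumulative incidence of LTFU (death as competing event).
aj = AalenJohansenFitter().fit(time, event, event_of_interest=1)
cr_ltfu = aj.cumulative_density_.loc[:3.5].iloc[-1, 0]

print(f"LTFU at 3.5 years: naive KM {naive_ltfu:.3f} vs competing-risk {cr_ltfu:.3f}")
```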
Abstract:
Background Synchronization programs have become standard in the dairy industry in many countries. In Switzerland, these programs are not routinely used for groups of cows, but predominantly as a therapy for individual problem cows. The objective of this study was to compare the effect of a CIDR-Select Synch and a 12-d CIDR protocol on the pregnancy rate in healthy, multiparous dairy cows on Swiss dairy farms. Methods Cows (N = 508) were randomly assigned to CIDR-Select Synch (N = 262) or 12-d CIDR (N = 246) protocols. Cows in the CIDR-Select Synch group received a CIDR and 2.5 ml of buserelin i.m. on d 0. On d 7, the CIDR insert was removed and 5 ml of dinoprost was administered i.m. Cows in the 12-d CIDR group received the CIDR on d 0 and it was removed on d 12 (the routine CIDR protocol in Swiss dairies). On d 0 a milk sample for progesterone analysis was taken. Cows were inseminated upon observed estrus. Pregnancy was determined 35 or more days after artificial insemination. As a first step, the two groups were compared as to indication for treatment, breed, stud book, stall, pasture, and farmer's business using chi-square tests or Fisher's exact test. Furthermore, groups were compared as to age, DIM, number of AIs, number of cows per farm, and yearly milk yield per cow using nonparametric ANOVA. A multiple logistic model was used to relate the success of the protocols to all of the available factors; in particular, treatment (CIDR-Select Synch/12-d CIDR), milk progesterone value, age, DIM, previous treatment of the uterus, previous gynecological treatment, and number of preceding inseminations. Results The pregnancy rate was higher in cows following the CIDR-Select Synch compared to the 12-d CIDR protocol (50.4% vs. 22.4%; P < 0.0001). Conclusion The CIDR-Select Synch protocol may be highly recommended for multiparous dairy cows. The reduced time span of the progesterone insert decreased the number of days open and improved the pregnancy rate compared to the 12-d CIDR protocol, and the cows did not have to be handled more often.
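A hedged sketch of the multiple logistic model described in the Methods: pregnancy (yes/no) modelled as a function of protocol, milk progesterone, age, days in milk and number of preceding inseminations. Variable names, units and the simulated herd are assumptions for illustration only, not the study's data or code.

```python
# Illustrative multiple logistic regression for pregnancy after synchronization.
# All data are simulated; covariate names and units are assumed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 508
df = pd.DataFrame({
    "protocol": rng.choice(["select_synch", "cidr_12d"], n),
    "progesterone": rng.uniform(0.1, 25.0, n),   # milk progesterone on d 0 (ng/ml, assumed)
    "age": rng.uniform(3.0, 10.0, n),            # years
    "dim": rng.uniform(60, 250, n),              # days in milk
    "n_prior_ai": rng.integers(0, 4, n),         # number of preceding inseminations
})
linpred = -1.2 + 1.2 * (df["protocol"] == "select_synch") - 0.2 * df["n_prior_ai"]
df["pregnant"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

model = smf.logit("pregnant ~ protocol + progesterone + age + dim + n_prior_ai", df).fit()
print(np.exp(model.params))   # odds ratios; protocol[T.select_synch] > 1 favours CIDR-Select Synch
```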
Abstract:
In animal experiments, animals, husbandry and test procedures are traditionally standardized to maximize test sensitivity and minimize animal use, assuming that this will also guarantee reproducibility. However, by reducing within-experiment variation, standardization may limit inference to the specific experimental conditions. Indeed, we have recently shown in mice that standardization may generate spurious results in behavioral tests, accounting for poor reproducibility, and that this can be avoided by population heterogenization through systematic variation of experimental conditions. Here, we examined whether a simple form of heterogenization effectively improves reproducibility of test results in a multi-laboratory situation. Each of six laboratories independently ordered 64 female mice of two inbred strains (C57BL/6NCrl, DBA/2NCrl) and examined them for strain differences in five commonly used behavioral tests under two different experimental designs. In the standardized design, experimental conditions were standardized as much as possible in each laboratory, while they were systematically varied with respect to the animals' test age and cage enrichment in the heterogenized design. Although heterogenization tended to improve reproducibility by increasing within-experiment variation relative to between-experiment variation, the effect was too weak to account for the large variation between laboratories. However, our findings confirm the potential of systematic heterogenization for improving reproducibility of animal experiments and highlight the need for effective and practicable heterogenization strategies.
Abstract:
Objective To examine the associations between pet keeping in early childhood and asthma and allergies in children aged 6–10 years. Design Pooled analysis of individual participant data of 11 prospective European birth cohorts that recruited a total of over 22,000 children in the 1990s. Exposure definition Ownership of only cats, dogs, birds, rodents, or cats/dogs combined during the first 2 years of life. Outcome definition Current asthma (primary outcome), allergic asthma, allergic rhinitis and allergic sensitization during 6–10 years of age. Data synthesis Three-step approach: (i) common definition of outcome and exposure variables across cohorts; (ii) calculation of adjusted effect estimates for each cohort; (iii) pooling of effect estimates using random-effects meta-analysis models. Results We found no association between furry and feathered pet keeping early in life and asthma at school age. For example, the odds ratio for asthma comparing cat ownership with "no pets" (10 studies, 11489 participants) was 1.00 (95% confidence interval 0.78 to 1.28) (I² = 9%; p = 0.36). The odds ratio for asthma comparing dog ownership with "no pets" (9 studies, 11433 participants) was 0.77 (0.58 to 1.03) (I² = 0%, p = 0.89). Owning both cat(s) and dog(s) compared to "no pets" resulted in an odds ratio of 1.04 (0.59 to 1.84) (I² = 33%, p = 0.18). Similarly, for allergic asthma and for allergic rhinitis we did not find associations regarding any type of pet ownership early in life. However, we found some evidence for an association between ownership of furry pets during the first 2 years of life and reduced likelihood of becoming sensitized to aero-allergens. Conclusions Pet ownership in early life did not appear to either increase or reduce the risk of asthma or allergic rhinitis symptoms in children aged 6–10. Advice from health care practitioners to avoid or to specifically acquire pets for primary prevention of asthma or allergic rhinitis in children should not be given.
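Step (iii) of the data synthesis, pooling cohort-specific effect estimates with a random-effects model, can be written out compactly. The sketch below implements DerSimonian-Laird pooling of log odds ratios; the per-cohort estimates are made up for illustration and are not the study's numbers.

```python
# Self-contained sketch of random-effects (DerSimonian-Laird) pooling of log odds ratios.
import numpy as np

def dersimonian_laird(log_or, se):
    """Pool log odds ratios with a DerSimonian-Laird random-effects model."""
    log_or, se = np.asarray(log_or), np.asarray(se)
    w_fixed = 1.0 / se**2
    pooled_fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)
    # Cochran's Q and the between-study variance tau^2
    q = np.sum(w_fixed * (log_or - pooled_fixed) ** 2)
    df = len(log_or) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights and pooled estimate
    w_re = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_re * log_or) / np.sum(w_re)
    pooled_se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, pooled_se, i2

# Hypothetical per-cohort odds ratios for "cat ownership vs no pets" (illustration only).
ors = np.array([0.9, 1.1, 1.3, 0.8, 1.0])
ses = np.array([0.25, 0.30, 0.20, 0.35, 0.28])     # standard errors of log(OR)
pooled, se, i2 = dersimonian_laird(np.log(ors), ses)
lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"Pooled OR = {np.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f}), I^2 = {i2:.0f}%")
```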
Abstract:
We report on a patient who developed, from 5 months of age, multiple seizure types, including myoclonic seizures, associated with severe psychomotor delay, leading to the diagnosis of Dravet syndrome. Over the years, he developed refractory epilepsy and was implanted with a vagus nerve stimulator at the age of 19. After 3 months, he experienced a progressive improvement of partial and generalized seizures, with a >90% reduction, and better alertness. This meaningful clinical improvement is discussed in light of the risk of sudden unexpected death in epilepsy, which is high in this setting and appears markedly diminished in our patient given the reduction of generalized convulsions.
Abstract:
Survival after surgical treatment using competing-risk analysis has been previously examined in patients with prostate cancer (PCa). However, the combined effect of age and comorbidities has not been assessed in patients with high-risk PCa who might have heterogeneous rates of competing mortality despite the presence of aggressive disease.
Abstract:
Brain-derived neurotrophic factor (BDNF) has been implicated in the pathophysiology of psychiatric and neurological disorders and in the mechanisms of antidepressant pharmacotherapy. Psychiatric and neurological conditions have also been associated with reduced brain levels of N-acetyl-aspartate (NAA), which has been used as a putative marker of neural integrity. However, few studies have explored the relationship between BDNF polymorphisms and NAA levels directly. Here, we present data from a single-voxel proton magnetic resonance spectroscopy study of 64 individuals and explore the relationship between BDNF polymorphisms and prefrontal NAA level. Our results indicate an association between a single nucleotide polymorphism (SNP) within BDNF, known as rs1519480, and reduced NAA level (p = 0.023). NAA levels were further predicted by age and Asian ancestry. There was a significant rs1519480 × age interaction on NAA level (p = 0.031). Specifically, the effect of rs1519480 on NAA level became significant at age ⩾34.17 yr. NAA level decreased with advancing age for genotype TT (p = 0.001) but not for genotype CT (p = 0.82) or CC (p = 0.34). Additional in silico analysis of 142 post-mortem brain samples revealed an association between the same SNP and reduced BDNF mRNA expression in the prefrontal cortex. The rs1519480 SNP influences BDNF mRNA expression and has an impact on prefrontal NAA level over time. This genetic mechanism may contribute to inter-individual variation in cognitive performance seen during normal ageing, as well as contributing to the risk for developing psychiatric and neurological conditions.
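The genotype × age interaction reported above corresponds to a linear model with an interaction term. A minimal sketch is given below, with an assumed categorical coding of rs1519480 genotype, a placeholder ancestry covariate and simulated NAA values, purely for illustration; it is not the study's analysis code.

```python
# Illustrative OLS model with a genotype x age interaction on NAA level.
# Genotype coding, covariates and all values are simulated assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 64
df = pd.DataFrame({
    "age": rng.uniform(20, 60, n),
    "genotype": rng.choice(["CC", "CT", "TT"], n),
    "asian_ancestry": rng.integers(0, 2, n),
})
# Simulate an age-related NAA decline only in TT carriers.
df["naa"] = 10 - 0.03 * df["age"] * (df["genotype"] == "TT") + rng.normal(0, 0.4, n)

model = smf.ols("naa ~ genotype * age + asian_ancestry", df).fit()
print(model.summary().tables[1])
# 'genotype[T.TT]:age' tests whether the age slope differs between TT and CC carriers.
```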