960 results for "long yearling age"


Relevance: 30.00%

Abstract:

BACKGROUND: Genetic variants in the NOD2 gene may play a more important role in disease activity, behaviour and treatment of pediatric-onset than of adult-onset Crohn's disease (CD). METHODS: 85 pediatric-onset and 117 adult-onset CD patients were tested for the three main NOD2 CD-associated variants (p.R702W, p.G908R and p.1007fs), and clinical data from at least two years of follow-up were compared with regard to disease behaviour and activity, response to therapy and bone mineral density (BMD). RESULTS: A chronically active and moderate to severe course of CD was associated with pediatric onset (p=0.0001) and with NOD2 variant alleles (p=0.0001). In pediatric-onset CD the average PCDAI score was significantly higher in patients carrying NOD2 variants (p=0.0008). In addition, underweight during the course of the disease (p=0.012) was associated with NOD2 variants. Interestingly, osteoporosis was found more frequently in patients carrying NOD2 variant alleles (p=0.033), especially in pediatric-onset CD patients with homozygous NOD2 variants (p=0.037). Accordingly, low BMD in pediatric-onset CD was associated with a higher PCDAI (p=0.0092), chronic active disease (p=0.0148), and underweight at diagnosis (p=0.0271) and during follow-up (p=0.0109). Furthermore, pediatric-onset CD patients with NOD2 variants were more frequently steroid-dependent or steroid-refractory (p=0.048) and needed long-term immunosuppressive therapy (p=0.0213). CONCLUSIONS: These data suggest that the presence of any of the main NOD2 variants in CD is associated with osteoporosis and with an age-of-onset-dependent tendency towards underweight, higher disease activity and more intensive immunosuppressive therapy. This observation supports an early intensive treatment strategy in child and adolescent CD patients with NOD2 gene variants.

Relevance: 30.00%

Abstract:

The importance of the cerebellum for non-motor functions is becoming increasingly evident. The influence of acquired cerebellar lesions during childhood on cognitive functions, however, is not well known. We present follow-up data from 24 patients who were operated upon during childhood for benign cerebellar tumours. The benign histology of these tumours required neither radiotherapy nor chemotherapy. Post-operatively, these children were of normal intelligence, with a mean IQ of 99.1, performance intelligence quotient (PIQ) of 101.3 and verbal intelligence quotient (VIQ) of 96.8. However, 57% of patients showed abnormalities in subtesting. In addition, more extensive neuropsychological testing revealed significant problems with attention, memory, processing speed and interference. Visuo-constructive problems were marked for copying the Rey figure, but less pronounced for recall of the figure. Verbal fluency was more affected than design fluency. Behavioural deficits were detected in 33% of patients. Attention-deficit problems were marked in 12.5%, whereas others demonstrated psychiatric symptoms such as mutism, addiction problems, anorexia, uncontrolled temper tantrums and phobia. Age at tumour operation and size of tumour had no influence on outcome. Vermis involvement was related to an increase in neuropsychological and psychiatric problems. The observation that patients with left-sided cerebellar tumours were more affected than patients with right-sided tumours is probably also influenced by more pronounced vermian involvement in the former group. In summary, this study confirms the importance of the cerebellum for cognitive development and points to the need for careful follow-up of these children, so that they receive the help required to achieve full integration into professional life.

Relevance: 30.00%

Abstract:

BACKGROUND: There are differences in the literature regarding outcomes of premature small-for-gestational-age (SGA) and appropriate-for-gestational-age (AGA) infants, possibly due to failure to take gestational age (GA) at birth into account. OBJECTIVE: To compare mortality and respiratory morbidity of SGA and AGA premature newborn infants. DESIGN/METHODS: A retrospective study was done of the 2,487 infants born without congenital anomalies. RESULTS: Controlling for GA, premature SGA infants were at higher risk for mortality (odds ratio 3.1, p = 0.001) and at lower risk of respiratory distress syndrome (RDS) (OR = 0.71, p = 0.02) than AGA infants. However, multivariate logistic regression modeling found that the odds of having RDS varied between SGA and AGA infants by GA; in SGA infants born at ≥32 wk GA the risk of RDS was reduced (OR = 0.41, 95% CI 0.27-0.63; p < 0.01). After controlling for GA, SGA infants were at a significantly higher risk of developing chronic lung disease than AGA infants (OR = 2.2, 95% CI 1.2-3.9, p = 0.01). There was no significant difference between SGA and AGA infants in total days on ventilator. Among infants who survived, mean length of hospital stay was significantly longer in SGA infants born between 26-36 wk GA than in AGA infants. CONCLUSIONS: Premature SGA infants have significantly higher mortality, a significantly higher risk of developing chronic lung disease and longer hospital stay than premature AGA infants. Even the reduced risk of RDS in SGA infants born at ≥32 wk GA (possibly conferred by intrauterine stress leading to accelerated lung maturation) appears to be transient and is counterbalanced by the adverse effects of poor intrauterine growth on long-term pulmonary outcomes such as chronic lung disease.

Relevance: 30.00%

Abstract:

Understanding the causes and consequences of wildfires in forests of the western United States requires integrated information about fire, climate changes, and human activity on multiple temporal scales. We use sedimentary charcoal accumulation rates to construct long-term variations in fire during the past 3,000 y in the American West and compare this record to independent fire-history data from historical records and fire scars. There has been a slight decline in burning over the past 3,000 y, with the lowest levels attained during the 20th century and during the Little Ice Age (LIA, ca. 1400-1700 CE [Common Era]). Prominent peaks in forest fires occurred during the Medieval Climate Anomaly (ca. 950-1250 CE) and during the 1800s. Analysis of climate reconstructions beginning from 500 CE and of population data shows that temperature and drought predict changes in biomass burning up to the late 1800s CE. Since the late 1800s, human activities and the ecological effects of recent high fire activity have caused a large, abrupt decline in burning similar to the LIA fire decline. Consequently, there is now a forest "fire deficit" in the western United States attributable to the combined effects of human activities, ecological changes, and climate changes. Large fires in the late 20th and early 21st centuries have begun to address the fire deficit, but it is continuing to grow.

Relevance: 30.00%

Abstract:

BACKGROUND Copper and its main transport protein ceruloplasmin have been suggested to promote the development of atherosclerosis. Most of the data come from experimental and animal-model studies. Copper and mortality have not been evaluated simultaneously in patients undergoing coronary angiography. METHODS AND RESULTS We examined whether serum copper and ceruloplasmin concentrations are associated with angiographic coronary artery disease (CAD) and with mortality from all causes and from cardiovascular causes in 3253 participants of the Ludwigshafen Risk and Cardiovascular Health Study. Age- and sex-adjusted hazard ratios (HR) for death from any cause were 2.23 (95% CI, 1.85-2.68) for copper and 2.63 (95% CI, 2.17-3.20) for ceruloplasmin when the highest quartile was compared with the lowest. Corresponding HRs for death from cardiovascular causes were 2.58 (95% CI, 2.05-3.25) and 3.02 (95% CI, 2.36-3.86), respectively. Further adjustment for various risk factors and clinical variables considerably attenuated these associations, which nevertheless remained statistically significant, and the results were consistent across subgroups. CONCLUSIONS Elevated concentrations of both copper and ceruloplasmin are independently associated with increased risk of mortality from all causes and from cardiovascular causes.

Relevance: 30.00%

Abstract:

A growing body of evidence suggests a link between early childhood trauma, post-traumatic stress disorder (PTSD) and a higher risk of dementia in old age. The aim of the present study was to investigate the association between childhood trauma exposure, PTSD and neurocognitive function in a unique cohort of former indentured Swiss child laborers in their late adulthood. To the best of our knowledge, this is the first study conducted on former indentured child laborers and the first to investigate the relationship between childhood versus adulthood trauma and cognitive function. According to PTSD symptoms and whether they experienced childhood trauma (CT) or adulthood trauma (AT), participants (n = 96) were categorized into one of four groups: CT/PTSD+, CT/PTSD-, AT/PTSD+, AT/PTSD-. Cognitive function was assessed using the Structured Interview for Diagnosis of Dementia of Alzheimer Type, Multi-infarct Dementia and Dementia of other Etiology according to ICD-10 and DSM-III-R, the Mini-Mental State Examination, and a vocabulary test. Depressive symptoms were investigated as a potential mediator of neurocognitive functioning. Individuals screening positive for PTSD symptoms performed worse on all cognitive tasks than healthy individuals, independent of whether they reported childhood or adulthood adversity. When controlling for depressive symptoms, the relationship between PTSD symptoms and poor cognitive function became stronger. Overall, the results tentatively indicate that PTSD is accompanied by cognitive deficits that appear to be independent of earlier childhood adversity. Our findings suggest that cognitive deficits in old age may be partly a consequence of PTSD or at least aggravated by it. However, several study limitations need to be considered. Consideration of cognitive deficits when treating PTSD patients and victims of lifespan trauma (even without a diagnosis of a psychiatric condition) is crucial. Furthermore, early intervention may prevent long-term deficits in memory function and the development of dementia in adulthood.

Relevance: 30.00%

Abstract:

OBJECTIVE To assess long-term clinical outcomes of consecutive high-risk patients with severe aortic stenosis according to treatment allocation to transcatheter aortic valve implantation (TAVI), surgical aortic valve replacement (SAVR) or medical treatment (MT). METHODS Patients with severe aortic stenosis were consecutively enrolled into a prospective single-centre registry. RESULTS Among 442 patients (median age 83 years, median STS score 4.7) allocated to MT (n=78), SAVR (n=107) or TAVI (n=257), all-cause mortality amounted to 81%, 37% and 43%, respectively, after a median follow-up of 3.9 years (p<0.001). Rates of major adverse cerebro-cardiovascular events were lower in patients undergoing SAVR or TAVI than with MT (SAVR vs MT: HR 0.31, 95% CI 0.21 to 0.46; TAVI vs MT: HR 0.34, 95% CI 0.25 to 0.46), with no significant difference between SAVR and TAVI (HR 0.88, 95% CI 0.62 to 1.25). Whereas SAVR (HR 0.39, 95% CI 0.24 to 0.61), TAVI (HR 0.37, 95% CI 0.26 to 0.52) and female gender (HR 0.72, 95% CI 0.53 to 0.99) were associated with improved survival, body mass index ≤20 kg/m² (HR 1.60, 95% CI 1.04 to 2.47), diabetes (HR 1.48, 95% CI 1.03 to 2.12), peripheral vascular disease (HR 2.01, 95% CI 1.44 to 2.81), atrial fibrillation (HR 1.74, 95% CI 1.28 to 2.37) and pulmonary hypertension (HR 1.43, 95% CI 1.03 to 2.00) were identified as independent predictors of mortality. CONCLUSIONS Among high-risk patients with severe aortic stenosis, long-term clinical outcome through 5 years was comparable between patients allocated to SAVR and TAVI. In contrast, patients receiving MT had a dismal prognosis.

Relevance: 30.00%

Abstract:

BACKGROUND Evidence-based guidelines are needed to guide effective long-term follow-up (LTFU) of childhood cancer survivors (CCS) at risk of late adverse effects (LAEs). We aimed to ascertain the use of LTFU guidelines throughout Europe and to seek views on the need for pan-European LTFU guidelines. PROCEDURES One expert clinician from each of 44 European countries was invited to participate in an online survey. Information was sought regarding the use and content of LTFU guidelines in the respondent's centre and country, and their views on developing pan-European LTFU guidelines. RESULTS Thirty-one countries (70%) responded, including 24 of 26 full EU countries (92%). LTFU guidelines were implemented nationally in 17 countries (55%). All guidelines included recommendations about physical LAEs, specific risk groups and frequency of surveillance, and the majority included recommendations about psychosocial LAEs (70%) and healthy lifestyle promotion (65%). A minority of guidelines included recommendations about transition to age-appropriate LTFU services (22%), where LTFU should be performed (22%) and by whom (30%). Most respondents (94%) agreed on the need for pan-European LTFU guidelines, specifically including recommendations about surveillance for specific physical LAEs (97%), action to be taken if a specific LAE is detected (90%), minimum requirements for LTFU (93%), and transition and health promotion (both 87%). CONCLUSIONS Guidelines are not universally used throughout Europe. However, there is strong support for developing pan-European LTFU guidelines for CCS. PanCareSurFup (www.pancare.eu) will collaborate with partners to develop such guidelines, including recommendations for hitherto relatively neglected topics such as minimum LTFU requirements, transition and health promotion.

Relevance: 30.00%

Abstract:

Background There is now extensive evidence for the efficacy of cognitive remediation therapy (CRT). Integrative approaches seem superior with regard to maintenance of proximal outcomes at follow-up as well as generalization to other areas of functioning. To date, only limited evidence is available about the efficacy of CRT in older schizophrenia patients. Integrated Neurocognitive Therapy (INT) is a newly developed cognitive remediation approach: a manualized group therapy targeting all 11 NIMH-MATRICS dimensions within one therapy concept. In this study we compared the effects of INT in an early-course group (duration of disease < 5 years) with those in a long-term group of schizophrenia outpatients (duration of disease > 15 years). Methods In an international multicenter study carried out in Germany, Switzerland and Austria, a total of 90 outpatients diagnosed with schizophrenia (DSM-IV-TR) were randomly assigned either to INT or to treatment as usual (TAU). Fifty of the 90 patients formed an early-course (EC) group, suffering from schizophrenia for less than 5 years (mean age 29 years, mean duration of illness 3.3 years). The other 40 formed a long-term course (LC) group, suffering from schizophrenia for longer than 15 years (mean age 45 years, mean duration of illness 22 years). Treatment comprised 15 biweekly sessions. An extensive assessment battery was administered before and after treatment and at follow-up (1 year). Multivariate general linear models (GLM; duration of illness × treatment × time) tested the hypothesis that an EC group of schizophrenia outpatients differs in proximal and distal outcomes from an LC group. Results Irrespective of the duration of illness, both groups (EC and LC) benefited from INT. INT was superior to TAU in most of the assessed domains. The dropout rate during the therapy phase was much higher in the EC group (21.4%) than in the LC group (8%). However, interaction effects showed that the LC group had significantly higher effects in the neurocognitive domains of speed of processing (F>3.6) and vigilance (F>2.4). In social cognition, the EC group showed significantly higher effects in social schema (F>2.5) and social attribution (blame; F>6.0) than the LC group. Regarding more distal outcomes, patients treated with INT showed a reduction in general symptoms, unaffected by the duration of illness, during the therapy phase and at follow-up (F>4.3). Discussion The results suggest that INT is a valid goal-oriented treatment to improve cognitive functions in schizophrenia outpatients. Significant treatment effects were evident irrespective of the duration of illness. Against common expectations, long-term, more chronic patients showed higher effects in basal cognitive functions than younger patients and patients without any active therapy (TAU). Consequently, more integrated therapy programmes are also recommended for long-term course schizophrenia patients.

Relevance: 30.00%

Abstract:

BACKGROUND Spinal myxopapillary ependymomas (MPEs) are slowly growing ependymal gliomas with preferential manifestation in young adults. The aim of this study was to assess the outcome of patients with MPE treated with surgery, radiotherapy (RT), and/or chemotherapy. METHODS The medical records of 183 MPE patients (59% male) treated at the MD Anderson Cancer Center and 11 institutions from the Rare Cancer Network were retrospectively reviewed. Mean patient age at diagnosis was 35.5 ± 15.8 years. Ninety-seven (53.0%) patients underwent surgery without RT, and 86 (47.0%) were treated with surgery and/or RT. Median RT dose was 50.4 Gy. Median follow-up was 83.9 months. RESULTS Fifteen (8.2%) patients died, 7 of unrelated causes. The estimated 10-year overall survival was 92.4% (95% CI: 87.7-97.1). Treatment failure was observed in 58 (31.7%) patients. Local failure, distant spinal relapse, and brain failure were observed in 49 (26.8%), 17 (9.3%), and 11 (6.0%) patients, respectively. The estimated 10-year progression-free survival was 61.2% (95% CI: 52.8-69.6). Age (<36 vs ≥36 y), treatment modality (surgery alone vs surgery and RT), and extent of surgery were prognostic factors for local control and progression-free survival on univariate and multivariate analysis. CONCLUSIONS In this series, treatment failure of MPE occurred in approximately one third of patients. The observed recurrence pattern of primary spinal MPE was mainly local, but a substantial number of patients failed non-locally. Younger patients and those not initially treated with adjuvant RT or not undergoing gross total resection were significantly more likely to present with tumor recurrence/progression.

Relevance: 30.00%

Abstract:

OBJECTIVE Sporadic late-onset nemaline myopathy (SLONM) is a rare, late-onset myopathy that progresses subacutely. If associated with a monoclonal gammopathy of undetermined significance (MGUS), the outcome is unfavorable: the majority of these patients die of respiratory failure within 1 to 5 years. This study aims to qualitatively assess the long-term treatment effect of high-dose melphalan (HDM) followed by autologous stem cell transplantation (SCT) in a series of 8 patients with SLONM-MGUS. METHODS We performed a retrospective case series study (n = 8) on the long-term (1-8 years) effect of HDM followed by autologous SCT (HDM-SCT) on survival, muscle strength, and functional capacities. RESULTS Seven patients showed a lasting moderate to good clinical response, 2 of them after a second HDM-SCT. All of them had a complete, very good partial, or partial hematologic response. One patient showed no clinical or hematologic response and died. CONCLUSIONS This case series shows the positive effect of HDM-SCT in this rare disorder. Factors that may portend an unfavorable outcome are a long disease course before the hematologic treatment and a poor hematologic response. Age at onset, level and type of M protein (κ vs λ), and severity of muscle weakness were not associated with a specific outcome. CLASSIFICATION OF EVIDENCE This study provides Class IV evidence that for patients with SLONM-MGUS, HDM-SCT increases the probability of survival and functional improvement.

Relevance: 30.00%

Abstract:

Changes in fire occurrence during the last decades in the southern Swiss Alps make knowledge of fire history essential for understanding the future evolution of ecosystem composition and functioning. In this context, palaeoecology provides useful insights into processes operating at decadal-to-millennial time scales, such as the response of plant communities to intensified fire disturbance during periods of cultural change. We provide a high-resolution macroscopic charcoal and pollen series from Guèr, a well-dated peat sequence at mid-elevation (832 m a.s.l.) in southern Switzerland, where local settlements have been documented since the late Bronze Age and the Iron Age. Quantitative fire reconstruction shows that fire activity sharply increased from the Neolithic period (1–3 episodes/1000 years) to the late Bronze and Iron Age (7–9 episodes/1000 years), leading to extensive clearance of the former mixed deciduous forest (Alnus glutinosa, Betula, deciduous Quercus). The increase in anthropogenic pollen indicators (e.g. Cerealia-type, Plantago lanceolata) together with macroscopic charcoal suggests anthropogenic rather than climatic forcing as the main cause of the observed vegetation shift. Fire and controlled burning were used extensively during late Roman times and the early Middle Ages to promote the introduction and establishment of chestnut (Castanea sativa) stands, which provided an important wood and food supply. Fire occurrence declined markedly (from 9 to 5–6 episodes/1000 years) during the late Middle Ages because of fire suppression, biomass removal by the human population, and landscape fragmentation. Land abandonment during the last decades has allowed the forest to partly re-expand (mainly Alnus glutinosa, Betula) and fire frequency to increase.

Relevance: 30.00%

Abstract:

Objectives We report our institutional experience and long-term results with the Sorin Freedom SOLO bovine pericardial stentless bioprosthesis. Methods Between January 2005 and November 2009, 149 patients (mean age 73.6±8.7 years; 68 [45.6%] female) underwent isolated (n=75) or combined (n=74) aortic valve replacement (AVR) using the SOLO at our institution. Follow-up was 100% complete, with an average follow-up time of 5.9±2.6 years (maximum 9.6 years) and a total of 885.3 patient-years. Results Operative (30-day) mortality was 2.7% (1.3% for isolated AVR [n=1] and 4.0% for combined procedures [n=3]). No deaths were valve-related. Preoperative peak (mean) gradients of 74.2±23.0 mmHg (48.6±16.3 mmHg) decreased to 15.6±5.4 mmHg (8.8±3.0 mmHg) after AVR and remained low for up to 9 years. The postoperative effective orifice area (EOA) was 1.6±0.57 cm², 1.90±0.45 cm², 2.12±0.48 cm² and 2.20±0.66 cm² for valve sizes 21, 23, 25 and 27, respectively, with no severe prosthesis-patient mismatch (PPM) and 0.7% (n=1) moderate PPM. During follow-up, 26 patients experienced structural valve deterioration (SVD) and 14 patients underwent explantation. Kaplan-Meier estimates of freedom from death, explantation and SVD at 9 years were 0.57 [0.47-0.66], 0.82 [0.69-0.90] and 0.70 [0.57-0.79], respectively. Conclusions The Freedom SOLO stentless aortic valve is safe to implant and shows excellent early and mid-term hemodynamic performance. However, SVD was observed in a substantial number of patients after only 5-6 years, and the need for explantation increased markedly, suggesting limited durability.

Relevance: 30.00%

Abstract:

OBJECTIVE Algorithms to predict the long-term risk of patients with stable coronary artery disease (CAD) are rare. The VIenna and Ludwigshafen CAD (VILCAD) risk score was one of the first scores specifically tailored to this clinically important patient population. The aim of this study was to refine risk prediction in stable CAD by creating a new prediction model encompassing various pathophysiological pathways. We therefore assessed the predictive power of 135 novel biomarkers for long-term mortality in patients with stable CAD. DESIGN, SETTING AND SUBJECTS We included 1275 patients with stable CAD from the LUdwigshafen RIsk and Cardiovascular health study with a median follow-up of 9.8 years to investigate whether the predictive power of the VILCAD score could be improved by the addition of novel biomarkers. Additional biomarkers were selected in a bootstrapping procedure based on Cox regression to determine the most informative predictors of mortality. RESULTS The final multivariable model encompassed nine clinical and biochemical markers: age, sex, left ventricular ejection fraction (LVEF), heart rate, N-terminal pro-brain natriuretic peptide, cystatin C, renin, 25-OH vitamin D3 and haemoglobin A1c. The extended VILCAD biomarker score achieved a significantly improved C-statistic (0.78 vs. 0.73; P = 0.035) and net reclassification index (14.9%; P < 0.001) compared with the original VILCAD score. Omitting LVEF, which might not be readily measurable in clinical practice, slightly reduced the accuracy of the new BIO-VILCAD score but still significantly improved risk classification (net reclassification improvement 12.5%; P < 0.001). CONCLUSION The VILCAD biomarker score, based on routine parameters complemented by novel biomarkers, outperforms previous risk algorithms and allows more accurate classification of patients with stable CAD, enabling physicians to choose more personalized treatment regimens for their patients.

Relevance: 30.00%

Abstract:

In a cohort study among 2751 members (71.5% female) of the German and Swiss RLS patient organizations, changes in restless legs syndrome (RLS) severity over time were assessed and the impact on quality of life, sleep quality and depressive symptoms was analysed. A standard set of scales (the IRLS severity scale, SF-36, Pittsburgh Sleep Quality Index and Center for Epidemiologic Studies Depression Scale) in mailed questionnaires was used repeatedly to assess RLS severity and health status over time, and a 7-day diary was used once to assess short-term variation. A clinically relevant change in RLS severity was defined as a change of at least 5 points on the IRLS scale. During 36 months of follow-up, minimal improvement in RLS severity between assessments was observed. Men consistently reported higher severity scores. RLS severity increased with age, reaching a plateau in the age group 45-54 years. Over 3 years, 60.2% of the participants had no relevant (±5 points) change in RLS severity. RLS worsening was significantly related to an increase in depressive symptoms and a decrease in sleep quality and quality of life. Short-term variation showed distinctive circadian patterns, with rhythm magnitudes strongly related to RLS severity. The majority of participants had a stable course of severe RLS over three years. An increase in RLS severity was accompanied by a small to moderate negative influence, and a decrease by a small positive influence, on quality of life, depressive symptoms and sleep quality.