Abstract:
BACKGROUND: Serologic methods have been widely used to test for celiac disease and have gained importance in diagnostic definition and in new epidemiologic findings. However, there is no standardization, and there are no reference protocols and materials. METHODS: The European working group on Serological Screening for Celiac Disease has defined robust noncommercial test protocols for immunoglobulin (Ig)G and IgA gliadin antibodies and for IgA autoantibodies against endomysium and tissue transglutaminase. Standard curves were linear in the decisive range, and intra-assay coefficients of variation were less than 5% to 10%. Calibration was performed with a group reference serum. Joint cutoff limits were used. Seven laboratories took part in the final collaborative study on 252 randomized sera classified by histology (103 pediatric and adult patients with active celiac disease, 89 disease control subjects, and 60 blood donors). RESULTS: IgA autoantibodies against endomysium and tissue transglutaminase rendered superior sensitivity (90% and 93%, respectively) and specificity (99% and 95%, respectively) over IgA and IgG gliadin antibodies. Tissue transglutaminase antibody testing showed superior receiver operating characteristic performance compared with gliadin antibodies. The kappa values for interlaboratory reproducibility showed superiority for IgA endomysium (0.93) in comparison with tissue transglutaminase antibodies (0.83) and gliadin antibodies (0.82 for IgG, 0.62 for IgA). CONCLUSIONS: Basic criteria of standardization and quality assessment must be fulfilled by any given test protocol proposed for serologic investigation of celiac disease. The working group has produced robust test protocols and reference materials available for standardization to further improve the reliability of serologic testing for celiac disease.
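The interlaboratory reproducibility figures above are kappa statistics, i.e. agreement between laboratories corrected for chance agreement. As an illustration only (hypothetical calls, not the study's data), a minimal sketch of Cohen's kappa for two laboratories making binary positive/negative calls on the same sera:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters giving binary (0/1) calls on the same items."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    # chance agreement expected from each rater's marginal positive rate
    pa1, pb1 = sum(a) / n, sum(b) / n
    expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (observed - expected) / (1 - expected)

# two hypothetical laboratories scoring the same 10 sera (1 = positive)
lab1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
lab2 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
print(round(cohens_kappa(lab1, lab2), 2))  # → 0.8
```

A kappa near 1 indicates near-perfect agreement beyond chance, which is the scale on which the 0.93 for IgA endomysium should be read.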
Abstract:
OBJECTIVE: To investigate the value of serum antitissue transglutaminase IgA antibodies (IgA-TTG) and IgA antiendomysial antibodies (IgA-EMA) in the diagnosis of coeliac disease in cohorts from different geographical areas in Europe. The setting allowed a further comparison between the antibody results and the conventional small-intestinal histology. METHODS: A total of 144 cases with coeliac disease [median age 19.5 years (range 0.9-81.4)], and 127 disease controls [median age 29.2 years (range 0.5-79.0)], were recruited, on the basis of biopsy, from 13 centres in nine countries. All biopsy specimens were re-evaluated and classified blindly a second time by two investigators. IgA-TTG were determined by ELISA with human recombinant antigen and IgA-EMA by an immunofluorescence test with human umbilical cord as antigen. RESULTS: The quality of the biopsy specimens was not acceptable in 29 (10.7%) of 271 cases and a reliable judgement could not be made, mainly due to poor orientation of the samples. The primary clinical diagnosis and the second classification of the biopsy specimens were divergent in nine cases, and one patient was initially enrolled in the wrong group. Thus, 126 coeliac patients and 106 controls, verified by biopsy, remained for final analysis. The sensitivity of IgA-TTG was 94% and IgA-EMA 89%, the specificity was 99% and 98%, respectively. CONCLUSIONS: Serum IgA-TTG measurement is effective and at least as good as IgA-EMA in the identification of coeliac disease. Due to a high percentage of poor histological specimens, the diagnosis of coeliac disease should not depend only on biopsy, but in addition the clinical picture and serology should be considered.
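Sensitivity and specificity figures such as those above follow directly from the 2×2 table of serology result against biopsy-verified diagnosis. A minimal sketch with hypothetical counts, chosen only to mirror the reported 94%/99% (they are not taken from the paper's tables):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts: of 126 coeliac patients, 118 test positive;
# of 106 controls, 105 test negative
sens, spec = sensitivity_specificity(tp=118, fn=8, tn=105, fp=1)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # → sensitivity 94%, specificity 99%
```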
Abstract:
PURPOSE: To compare the efficacy of paclitaxel versus doxorubicin given as single agents in first-line therapy of advanced breast cancer (primary end point, progression-free survival [PFS]) and to explore the degree of cross-resistance between the two agents. PATIENTS AND METHODS: Three hundred thirty-one patients were randomized to receive either paclitaxel 200 mg/m(2), 3-hour infusion every 3 weeks, or doxorubicin 75 mg/m(2), intravenous bolus every 3 weeks. Seven courses were planned unless progression or unacceptable toxicity occurred before the seven courses were finished. Patients who progressed within the seven courses underwent early cross-over to the alternative drug, while a delayed cross-over was optional for the remainder of patients at the time of disease progression. RESULTS: Objective response in first-line therapy was significantly better (P = .003) for doxorubicin (response rate [RR], 41%) than for paclitaxel (RR, 25%), with doxorubicin achieving a longer median PFS (7.5 months for doxorubicin v 3.9 months for paclitaxel, P < .001). In second-line therapy, cross-over to doxorubicin (91 patients) and to paclitaxel (77 patients) gave response rates of 30% and 16%, respectively. The median survival durations of 18.3 months for doxorubicin and 15.6 months for paclitaxel were not significantly different (P = .38). The doxorubicin arm had greater toxicity, but this was counterbalanced by better symptom control. CONCLUSION: At the dosages and schedules used in the present study, doxorubicin achieves better disease and symptom control than paclitaxel in first-line treatment. Doxorubicin and paclitaxel are not totally cross-resistant, which supports further investigation of these drugs in combination or in sequence, both in advanced disease and in the adjuvant setting.
Abstract:
We have performed a retrospective analysis to evaluate the impact of age, using a 70 year cutoff, on the safety and efficacy of pegylated liposomal doxorubicin (Caelyx) given at 60 mg/m(2) every 6 weeks (treatment A) or 50 mg/m(2) every 4 weeks (treatment B) to 136 metastatic breast cancer patients in two EORTC trials, of whom 65 were 70 years of age or older. No difference in terms of toxicity was observed between younger and older patients treated with the 4-week schedule, while a higher incidence of hematological toxicity, anorexia, asthenia, and stomatitis was observed in older patients when the 6-week schedule was used. Antitumor activity was not affected by age. In the older cohort of patients, no dependence was found between the incidence of grade 3-4 toxicity or antitumor activity and patients' baseline performance status, number and severity of comorbidities, or number of concomitant medications. The higher therapeutic index of Caelyx 50 mg/m(2) every 4 weeks makes it, of the two dose schedules investigated, the preferred regimen in the elderly.
Abstract:
PURPOSE: To compare health-related quality of life (HRQOL) in patients with metastatic breast cancer receiving the combination of doxorubicin and paclitaxel (AT) or doxorubicin and cyclophosphamide (AC) as first-line chemotherapy treatment. PATIENTS AND METHODS: Eligible patients (n = 275) with anthracycline-naive measurable metastatic breast cancer were randomly assigned to AT (doxorubicin 60 mg/m(2) as an intravenous bolus plus paclitaxel 175 mg/m(2) as a 3-hour infusion) or AC (doxorubicin 60 mg/m(2) plus cyclophosphamide 600 mg/m(2)) every 3 weeks for a maximum of six cycles. Dose escalation of paclitaxel (200 mg/m(2)) and cyclophosphamide (750 mg/m(2)) was planned at cycle 2 to reach equivalent myelosuppression in the two groups. HRQOL was assessed with the European Organization for Research and Treatment of Cancer (EORTC) Quality of Life Questionnaire C30 and the EORTC Breast Module at baseline and the start of cycles 2, 4, and 6, and 3 months after the last cycle. RESULTS: Seventy-nine percent of the patients (n = 219) completed a baseline measure. However, there were no statistically significant differences in HRQOL between the two treatment groups. In both groups, selected aspects of HRQOL were impaired over time, with increased fatigue, although some clinically significant improvements in emotional functioning were seen, as well as a reduction in pain over time. Overall, global quality of life was maintained in both treatment groups. CONCLUSION: This information is important when advising women patients of the expected HRQOL consequences of treatment regimens and should help clinicians and their patients make informed treatment decisions.
Abstract:
BACKGROUND: The potential cardiotoxicity of the doxorubicin-paclitaxel regimen, when paclitaxel is given shortly after the end of the anthracycline infusion, is an issue of concern, as suggested by small single institution Phase II studies. METHODS: In a large multicenter Phase III trial, 275 anthracycline naive metastatic breast carcinoma patients were randomized to receive either doxorubicin (60 mg/m(2)) followed 30 minutes later by paclitaxel (175 mg/m(2) 3-hour infusion; AT) or a standard doxorubicin-cyclophosphamide regimen (AC; 60/600 mg/m(2)). Both treatments were given once every 3 weeks for a maximum of six cycles. Close cardiac monitoring was implemented in the study design. RESULTS: Congestive heart failure (CHF) occurred in three patients in the AT arm and in one patient in the AC arm (P = 0.62). Decreases in left ventricular ejection fraction to below the limit of normal were documented in 33% AT and 19% AC patients and were not predictive of CHF development. CONCLUSIONS: AT is devoid of excessive cardiac risk among metastatic breast carcinoma patients, when the maximum planned cumulative dose of doxorubicin does not exceed 360 mg/m(2).
Abstract:
PURPOSE: To compare the efficacy and tolerability of the combination of doxorubicin and paclitaxel (AT) with a standard doxorubicin and cyclophosphamide (AC) regimen as first-line chemotherapy for metastatic breast cancer. PATIENTS AND METHODS: Eligible patients were anthracycline-naive and had bidimensionally measurable metastatic breast cancer. Two hundred seventy-five patients were randomly assigned to be treated with AT (doxorubicin 60 mg/m(2) as an intravenous bolus plus paclitaxel 175 mg/m(2) as a 3-hour infusion) or AC (doxorubicin 60 mg/m(2) plus cyclophosphamide 600 mg/m(2)) every 3 weeks for a maximum of six cycles. A paclitaxel (200 mg/m(2)) and cyclophosphamide (750 mg/m(2)) dose escalation was planned at cycle 2 if no grade >= 3 neutropenia occurred in cycle 1. The primary efficacy end point was progression-free survival (PFS). Secondary end points were response rate (RR), safety, overall survival (OS), and quality of life. RESULTS: A median number of six cycles was delivered in the two treatment arms. The relative dose-intensity and delivered cumulative dose of doxorubicin were lower in the AT arm. Dose escalation was only possible in 17% and 20% of the AT and AC patients, respectively. Median PFS was 6 months in the two treatment arms. RR was 58% versus 54%, and median OS was 20.6 versus 20.5 months in the AT and AC arms, respectively. The AT regimen was characterized by a higher incidence of febrile neutropenia, 32% versus 9% in the AC arm. CONCLUSION: No differences in the efficacy study end points were observed between the two treatment arms. Treatment-related toxicity compromised the doxorubicin-delivered dose-intensity in the paclitaxel-based regimen.
Abstract:
Childhood sexual abuse is prevalent among people living with HIV, and the experience of shame is a common consequence of childhood sexual abuse and HIV infection. This study examined the role of shame in health-related quality of life among HIV-positive adults who have experienced childhood sexual abuse. Data from 247 HIV-infected adults with a history of childhood sexual abuse were analyzed. Hierarchical linear regression was conducted to assess the impact of shame regarding both sexual abuse and HIV infection, while controlling for demographic, clinical, and psychosocial factors. In bivariate analyses, shame regarding sexual abuse and HIV infection were each negatively associated with health-related quality of life and its components (physical well-being, function and global well-being, emotional and social well-being, and cognitive functioning). After controlling for demographic, clinical, and psychosocial factors, HIV-related, but not sexual abuse-related, shame remained a significant predictor of reduced health-related quality of life, explaining up to 10% of the variance in multivariable models for overall health-related quality of life, emotional, function and global, and social well-being and cognitive functioning over and above that of other variables entered into the model. Additionally, HIV symptoms, perceived stress, and perceived availability of social support were associated with health-related quality of life in multivariable models. Shame is an important and modifiable predictor of health-related quality of life in HIV-positive populations, and medical and mental health providers serving HIV-infected populations should be aware of the importance of shame and its impact on the well-being of their patients.
Abstract:
Measuring the entorhinal cortex (ERC) is challenging due to lateral border discrimination from the perirhinal cortex. From a sample of 39 nondemented older adults who completed volumetric image scans and verbal memory indices, we examined reliability and validity concerns for three ERC protocols with different lateral boundary guidelines (i.e., Goncharova, Dickerson, Stoub, & deToledo-Morrell, 2001; Honeycutt et al., 1998; Insausti et al., 1998). We used three novice raters to assess inter-rater reliability on a subset of scans (216 total ERCs), with the entire dataset measured by one rater with strong intra-rater reliability on each technique (234 total ERCs). We found moderate to strong inter-rater reliability for two techniques with consistent ERC lateral boundary endpoints (Goncharova, Honeycutt), with negligible to moderate reliability for the technique requiring consideration of collateral sulcal depth (Insausti). Left ERC and story memory associations were moderate and positive for two techniques designed to exclude the perirhinal cortex (Insausti, Goncharova), with the Insausti technique continuing to explain 10% of memory score variance after additionally controlling for depression symptom severity. Right ERC-story memory associations were nonexistent after excluding an outlier. Researchers are encouraged to consider challenges of rater training for ERC techniques and how lateral boundary endpoints may impact structure-function associations.
Abstract:
This research tested if a 12-session coping improvement group intervention (n = 104) reduced depressive symptoms in HIV-infected older adults compared to an interpersonal support group intervention (n = 105) and an individual therapy upon request (ITUR) control condition (n = 86). Participants were 295 HIV-infected men and women 50-plus years of age living in New York City, Cincinnati, OH, and Columbus, OH. Using A-CASI assessment methodology, participants provided data on their depressive symptoms using the Geriatric Depression Screening Scale (GDS) at pre-intervention, post-intervention, and 4- and 8-month follow-up. Whether conducted with all participants (N = 295) or only a subset of participants diagnosed with mild, moderate, or severe depressive symptoms (N = 171), mixed models analyses of repeated measures found that both coping improvement and interpersonal support group intervention participants reported fewer depressive symptoms than ITUR controls at post-intervention, 4-month follow-up, and 8-month follow-up. The effect sizes of the differences between the two active interventions and the control group were greater when outcome analyses were limited to those participants with mild, moderate, or severe depressive symptoms. At no assessment period did coping improvement and interpersonal support group intervention participants differ in depressive symptoms.
Abstract:
To make adaptive choices, individuals must sometimes exhibit patience, forgoing immediate benefits to acquire more valuable future rewards [1-3]. Although humans account for future consequences when making temporal decisions [4], many animal species wait only a few seconds for delayed benefits [5-10]. Current research thus suggests a phylogenetic gap between patient humans and impulsive, present-oriented animals [9, 11], a distinction with implications for our understanding of economic decision making [12] and the origins of human cooperation [13]. On the basis of a series of experimental results, we reject this conclusion. First, bonobos (Pan paniscus) and chimpanzees (Pan troglodytes) exhibit a degree of patience not seen in other animals tested thus far. Second, humans are less willing to wait for food rewards than are chimpanzees. Third, humans are more willing to wait for monetary rewards than for food, and show the highest degree of patience only in response to decisions about money involving low opportunity costs. These findings suggest that core components of the capacity for future-oriented decisions evolved before the human lineage diverged from apes. Moreover, the different levels of patience that humans exhibit might be driven by fundamental differences in the mechanisms representing biological versus abstract rewards.
Abstract:
Adult humans, infants, pre-school children, and non-human animals appear to share a system of approximate numerical processing for non-symbolic stimuli such as arrays of dots or sequences of tones. Behavioral studies of adult humans implicate a link between these non-symbolic numerical abilities and symbolic numerical processing (e.g., similar distance effects in accuracy and reaction-time for arrays of dots and Arabic numerals). However, neuroimaging studies have remained inconclusive on the neural basis of this link. The intraparietal sulcus (IPS) is known to respond selectively to symbolic numerical stimuli such as Arabic numerals. Recent studies, however, have arrived at conflicting conclusions regarding the role of the IPS in processing non-symbolic numerosity arrays in adulthood, and very little is known about the brain basis of numerical processing early in development. Addressing the question of whether there is an early-developing neural basis for abstract numerical processing is essential for understanding the cognitive origins of our uniquely human capacity for math and science. Using functional magnetic resonance imaging (fMRI) at 4-Tesla and an event-related fMRI adaptation paradigm, we found that adults showed a greater IPS response to visual arrays that deviated from standard stimuli in their number of elements than to stimuli that deviated in local element shape. These results support previous claims that there is a neurophysiological link between non-symbolic and symbolic numerical processing in adulthood. In parallel, we tested 4-y-old children with the same fMRI adaptation paradigm as adults to determine whether the neural locus of non-symbolic numerical activity in adults shows continuity in function over development. We found that the IPS responded to numerical deviants similarly in 4-y-old children and adults. To our knowledge, this is the first evidence that the neural locus of adult numerical cognition takes form early in development, prior to sophisticated symbolic numerical experience. More broadly, this is also, to our knowledge, the first cognitive fMRI study to test healthy children as young as 4 y, providing new insights into the neurophysiology of human cognitive development.
Abstract:
This study assessed the sustained effect of a physical activity (PA) counseling intervention on PA one year after intervention, predictors of sustained PA participation, and three classes of post-intervention PA trajectories (improvers, maintainers, and decliners) in 238 older Veterans. Declines in minutes of PA from 12 to 24 months were observed for both the treatment and control arms of the study. PA at 12 months was the strongest predictor of post-intervention changes in PA. To our surprise, those who took up the intervention and increased PA levels the most, had significant declines in post-intervention PA. Analysis of the three post-intervention PA trajectories demonstrated that the maintenance group actually reflected a group of nonresponders to the intervention who had more comorbidities, lower self-efficacy, and worse physical function than the improvers or decliners. Results suggest that behavioral counseling/support must be ongoing to promote maintenance. Strategies to promote PA appropriately to subgroups of individuals are needed.