Abstract:
BACKGROUND Compared to food patterns, nutrient patterns have rarely been used, particularly at the international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. METHODOLOGY/PRINCIPAL FINDINGS We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied to 23 nutrients derived from country-specific FFQ, combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs, available for a representative sample of the EPIC populations (N = 34,436), provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified, explaining 67% of the total variance: principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micronutrients and proteins; PC3 by polyunsaturated fatty acids and vitamin D; and PC4 by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores.
CONCLUSION/SIGNIFICANCE The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research opportunities and perspectives for using nutrient patterns in future studies, particularly at the international level.
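The PCA step described above can be sketched in a few lines: nutrient intakes are standardized and decomposed so that each principal component captures a share of the total variance and each subject gets a pattern score per component. The data below are random stand-ins, not EPIC values, and the 500-subject sample size is purely illustrative.

```python
import numpy as np

# Hypothetical data: 500 subjects x 23 nutrients (the abstract's nutrient count).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 23))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize each nutrient
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)

explained = S**2 / np.sum(S**2)              # proportion of variance per component
scores = Xs @ Vt.T                           # pattern scores for each subject
loadings = Vt.T                              # nutrient loadings on each component

print(f"first 4 components explain {explained[:4].sum():.0%} of the variance")
```

In the study, quintiles of the `scores` columns are what the 24-HDR means are tabulated against; with real, correlated nutrient data the first four components would carry far more variance than they do for this random input.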
Abstract:
The diagnosis of mucocutaneous leishmaniasis (MCL) is hampered by the absence of a gold standard. An accurate diagnosis is essential because of the high toxicity of the medications for the disease. This study aimed to assess the ability of the polymerase chain reaction (PCR) to identify MCL and to compare these results with clinical research recently published by the authors. A systematic literature review based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement was performed using comprehensive search criteria and communication with the authors. A meta-analysis considering the estimates of the univariate and bivariate models was performed. Specificity near 100% was common among the papers; the primary reason for differences in accuracy was sensitivity. The meta-analysis, which was only possible for PCR of lesion-fragment samples, revealed a sensitivity of 71% [95% confidence interval (CI) = 0.59; 0.81] and a specificity of 93% (95% CI = 0.83; 0.98) in the bivariate model. The search for measures that could increase the sensitivity of PCR should be encouraged. The quality of the collected material and the optimisation of the amplification of genetic material should be prioritised.
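As a rough illustration of the univariate side of such a meta-analysis, per-study sensitivities can be pooled on the logit scale with inverse-variance weights. This is only a sketch: the paper's headline numbers come from a bivariate model that jointly models sensitivity and specificity, which is not reproduced here, and the study counts below are invented.

```python
import numpy as np

# Hypothetical per-study 2x2 counts for the diseased arm.
tp = np.array([30, 45, 20, 60])   # true positives per study (invented)
fn = np.array([12, 20, 10, 18])   # false negatives per study (invented)

p = tp / (tp + fn)                # per-study sensitivity
logit = np.log(p / (1 - p))
var = 1 / tp + 1 / fn             # approximate variance of a logit proportion
w = 1 / var                       # inverse-variance (fixed-effect) weights

pooled_logit = np.sum(w * logit) / np.sum(w)
pooled_sens = 1 / (1 + np.exp(-pooled_logit))
print(f"pooled sensitivity: {pooled_sens:.2f}")
```

Because the logistic transform is monotone, the pooled estimate always lands inside the range of the per-study sensitivities; a random-effects or bivariate model would additionally account for between-study heterogeneity.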
Abstract:
BACKGROUND Breast cancer survivors suffer physical impairment after oncology treatment. This impairment reduces quality of life (QoL) and increases the prevalence of handicaps associated with an unhealthy lifestyle (for example, decreased aerobic capacity and strength, weight gain, and fatigue). Recent work has shown that exercise adapted to the individual characteristics of patients is related to improved overall and disease-free survival. Nowadays, technological support using telerehabilitation systems is a promising strategy, with the great advantage of quick and efficient contact with the health professional. The role of telerehabilitation through therapeutic exercise as a support tool for implementing an active lifestyle, which has been shown to be an effective resource to improve fitness and reduce musculoskeletal disorders in these women, is not known. METHODS/DESIGN This study will use a two-arm, assessor-blinded, parallel randomized controlled trial design. People will be eligible if: their diagnosis is stage I, II, or IIIA breast cancer; they are without chronic disease or orthopedic issues that would interfere with the ability to participate in a physical activity program; they have access to the Internet and basic knowledge of computer use, or live with a relative who has this knowledge; they have completed adjuvant therapy except for hormone therapy and do not have a history of cancer recurrence; and they have an interest in improving their lifestyle. Participants will be randomized into e-CUIDATE or usual care groups. E-CUIDATE gives participants access to a range of contents: exercise plans arranged in series, with breathing, mobility, strength, and stretching exercises. All of these exercises will be assigned to women in the telerehabilitation group according to perceived needs. The control group will be asked to maintain their usual routine. Study endpoints will be assessed after 8 weeks (immediate effects) and after 6 months.
The primary outcome will be QoL, measured by the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire Core 30 (version 3.0) and its breast module, the European Organization for Research and Treatment of Cancer Breast Cancer-Specific Quality of Life questionnaire. The secondary outcomes will be: pain (algometry, Visual Analogue Scale, Brief Pain Inventory short form); body composition; physical measurements (abdominal test, handgrip strength, back muscle strength, and multiple sit-to-stand test); cardiorespiratory fitness (International Fitness Scale, 6-minute walk test, International Physical Activity Questionnaire-Short Form); fatigue (Piper Fatigue Scale and Borg Fatigue Scale); anxiety and depression (Hospital Anxiety and Depression Scale); cognitive function (Trail Making Test and Auditory Consonant Trigram); accelerometry; lymphedema; and anthropometric perimeters. DISCUSSION This study investigates the feasibility and effectiveness of a telerehabilitation system during adjuvant treatment of patients with breast cancer. If this treatment option is effective, telehealth systems could offer a choice of supportive care to cancer patients during the survivorship phase. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT01801527.
Abstract:
INTRODUCTION For critically ill patients, enteral immunonutrition results in notable reductions in infections and in length of hospital stay, but not in mortality, raising the question of whether this relates to the heterogeneous nature of critically ill patients or to the absence, within the immunonutrient mix, of specific nutrients with altered absorption (e.g. iron). Immune-associated functional iron deficiency (FID) is not only one of the many causes of anaemia in the critically ill, but also a cause of an inappropriate immune response, leading to longer episodes of systemic inflammatory response syndrome and poor outcome. OBJECTIVE This prospective cross-sectional study was undertaken to assess the prevalence of FID in critically ill patients during their stay in the intensive care unit (ICU), in order to identify the most appropriate population of patients who could benefit from iron therapy. METHODS Full blood cell counts, including reticulocytes (RETIC), serum iron (SI), transferrin levels (TRF) and saturation (satTRF), serum transferrin receptor (sTfR), ferritin (FRT) and C-reactive protein (CRP) were measured in venous blood samples from 131 random patients admitted to the ICU for at least 24 h (length of ICU stay, LIS; min: 1 day; max: 38 days). RESULTS Anaemia (Hb < 12 g/dL) was present in 76% of the patients (Hb < 10 g/dL in 33%), hypoferremia (SI < 45 μg/dL) in 69%, satTRF < 20% in 53%, FRT < 100 ng/mL in 23%, sTfR > 2.3 mg/dL in 13%, and CRP > 0.5 mg/dL in 88%. Statistically significant correlations (Pearson's r; *p < 0.05, **p < 0.01) were obtained between serum CRP levels and WBC**, Hb*, TRF**, satTRF*, and FRT**. There was also a strong correlation between TRF and FRT (-0.650**), but not between FRT and satTRF or SI. LIS correlated with Hb*, CRP**, TRF*, satTRF* and FRT**. CONCLUSIONS A large proportion of critically ill patients admitted to the ICU presented the typical functional iron deficiency (FID) of acute inflammation-related anaemia (AIRA).
This FID correlates with inflammatory status and with length of stay in the ICU. However, 21% of the ICU patients with AIRA had an associated real iron deficiency (satTRF < 20%, FRT < 100 ng/mL and sTfR > 2.3 mg/dL). Since oral iron supplementation seems to be ineffective, all these patients might benefit from IV iron therapy for correction of real or functional iron deficiency, which in turn might help to ameliorate their inflammatory status.
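The abstract's classification of an associated real iron deficiency rests on three simultaneous criteria (satTRF < 20%, FRT < 100 ng/mL, sTfR > 2.3 mg/dL). A minimal sketch of that rule, with hypothetical patient values:

```python
# Thresholds come from the abstract; the patient records are invented.
def real_iron_deficiency(sat_trf: float, frt: float, stfr: float) -> bool:
    """True when all three criteria from the abstract hold simultaneously."""
    return sat_trf < 20 and frt < 100 and stfr > 2.3

patients = [
    {"sat_trf": 15.0, "frt": 80.0, "stfr": 2.9},   # meets all three criteria
    {"sat_trf": 25.0, "frt": 250.0, "stfr": 1.5},  # functional deficiency only
]
flagged = [real_iron_deficiency(**p) for p in patients]
print(flagged)  # [True, False]
```

Applied to the study cohort, this rule is what separates the 21% of AIRA patients with a real iron deficit from those with purely functional iron deficiency.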
Abstract:
In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We show how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and the empirical logit approach, and how to select covariates for the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in an approximately threefold increase in the strength of association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model. Moreover, the extent of adjustment for error is influenced by the number and forms of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
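The core two-part idea can be sketched numerically: a single 24-HDR reference measurement is zero on non-consumption days, so expected intake is decomposed as P(consumed) × E[amount | consumed]. The sketch below estimates both parts within groups of an FFQ-based score; the actual calibration uses regression models (a logistic part and a linear part with covariates) rather than group means, and all data here are simulated.

```python
import numpy as np

# Simulated data: consumption is more likely, and amounts larger, in higher
# FFQ groups. None of these numbers come from EPIC.
rng = np.random.default_rng(1)
n = 2000
ffq_group = rng.integers(0, 4, size=n)              # e.g. quartile of FFQ intake
consumed = rng.random(n) < 0.2 + 0.15 * ffq_group   # part-1 mechanism
amount = np.where(consumed, rng.gamma(2.0, 40 + 10 * ffq_group), 0.0)

calibrated = np.empty(4)
for g in range(4):
    in_g = ffq_group == g
    p_consume = consumed[in_g].mean()                    # part 1: P(consumed)
    mean_amount = amount[in_g][consumed[in_g]].mean()    # part 2: E[amount | consumed]
    calibrated[g] = p_consume * mean_amount              # expected intake for the group

print(np.round(calibrated, 1))
```

The excess zeroes never enter the part-2 mean, which is exactly what distinguishes this from a one-part model that would average zeroes and positive amounts together.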
Abstract:
INTRODUCTION According to genome-wide association (GWA) studies as well as candidate gene approaches, Behçet's disease (BD) is associated with the human leukocyte antigen (HLA)-A and HLA-B gene regions. HLA-B51 has been consistently associated with the disease, but the role of other HLA class I molecules remains controversial. Recently, variants in non-HLA genes have also been associated with BD. The aims of this study were to further investigate the influence of the HLA region in BD and to explore the relationship with non-HLA genes recently described as associated in other populations. METHODS This study included 304 BD patients and 313 ethnically matched controls. HLA-A and HLA-B low-resolution typing was carried out by PCR-SSOP Luminex. Eleven tag single nucleotide polymorphisms (SNPs) located outside the HLA region, previously described as associated with the disease in GWA studies and having a minor allele frequency greater than 0.15 in Caucasians, were genotyped using TaqMan assays. Phenotypic and genotypic frequencies were estimated by direct counting, and distributions were compared using the χ² test. RESULTS In addition to HLA-B*51, HLA-B*57 was found to be a risk factor for BD, whereas B*35 was found to be protective. Other HLA-A and B specificities were suggestive of association with the disease as risk (A*02 and A*24) or protective factors (A*03 and B*58). Regarding the non-HLA genes, the three SNPs located in IL23R and one of the SNPs in IL10 were found to be significantly associated with susceptibility to BD in our population. CONCLUSION Different HLA specificities are associated with Behçet's disease in addition to B*51. Other non-HLA genes, such as IL23R and IL10, play a role in susceptibility to the disease.
Abstract:
BACKGROUND While pain is frequently associated with unipolar depression, few studies have investigated the link between pain and bipolar depression. In the present study we estimated the prevalence and characteristics of pain among patients with bipolar depression treated by psychiatrists in their regular clinical practice. The study was designed to identify factors associated with the manifestation of pain in these patients. METHODS Patients diagnosed with bipolar disorder (n=121) were selected to participate in a cross-sectional study in which DSM-IV-TR criteria were employed to identify depressive episodes. The patients were asked to describe any pain experienced during the study, and in the 6 weeks beforehand, by means of a Visual Analogue Scale (VAS). RESULTS Over half of the bipolar depressed patients (51.2%, 95% CI: 41.9%-60.6%), and two thirds of the women, experienced concomitant pain. The pain was of moderate to severe intensity and prolonged duration, and it occurred at multiple sites, significantly limiting the patients' everyday activities. The most important factors associated with the presence of pain were older age, sleep disorders and delayed diagnosis of bipolar disorder. CONCLUSIONS Chronic pain is common in bipolar depressed patients, and it is related to sleep disorders and delayed diagnosis of their disorder. More attention should be paid to studying the presence of pain in bipolar depressed patients, in order to achieve more accurate diagnoses and provide better treatment options.
Abstract:
The emergence of novel drugs calls for determining the effectiveness of the current treatments used in clinical practice. A retrospective observational study was conducted to evaluate the effectiveness of first-line treatments and to test the influence of the prognostic factors established using the Memorial Sloan-Kettering Cancer Center (MSKCC) criteria and the analysis of Mekhail's study for two or more metastatic sites. The primary endpoints were median progression-free survival (mPFS) and median overall survival (mOS) times. A total of 65 patients were enrolled; the mPFS and mOS of the patients treated with sunitinib (n=51) were 9.0 and 20.1 months, respectively, and those of the patients treated with temsirolimus (n=14) were 3.0 and 6.2 months, respectively. In the poor-prognosis (PP) group, a difference of 1.2 months (P=0.049) was found in mPFS depending on the first-line treatment. A difference of 4.1 months (P=0.023) was also found in mPFS when classified by histology (clear versus non-clear cell) in the sunitinib-treatment group. When stratified by prognostic group, differences of >7 months (P<0.001) were found between the groups. Therefore, it was concluded that the effectiveness of the treatments was reduced compared with previous studies and that differences existed in the PP group when classified by first-line drug and histology. Additionally, the influence of prognostic factors on OS and the value of stratifying patients using these factors were confirmed.
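Median survival figures such as the 9.0-month mPFS reported for sunitinib are typically read off a Kaplan-Meier curve as the first time point at which the survival estimate drops to 0.5 or below. A minimal sketch of that rule, with invented event times and censoring flags (not patient data from this study):

```python
import numpy as np

def km_median(times, events):
    """Kaplan-Meier median: first time the survival estimate falls to <= 0.5.

    times: follow-up times; events: 1 = progression/death observed, 0 = censored.
    Assumes distinct event times for simplicity.
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    at_risk = len(times)
    surv = 1.0
    for t, e in zip(times, events):
        if e:                        # survival drops only at observed events
            surv *= 1 - 1 / at_risk
        at_risk -= 1                 # censored subjects leave the risk set too
        if surv <= 0.5:
            return t
    return None                      # median not reached during follow-up

times = [2, 3, 5, 6, 8, 9, 10, 12, 15, 20]   # months (invented)
events = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
print(km_median(times, events))
```

Censored observations shrink the risk set without lowering the curve, which is why the median cannot simply be taken as the middle of the raw follow-up times.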
Abstract:
BACKGROUND Lung cancer remains one of the most prevalent forms of cancer. Radiotherapy, with or without other therapeutic modalities, is an effective treatment. Our objective was to report on the use of radiotherapy for lung cancer and its variability in our region, and to compare our results with the previous study done in 2004 in our region (VARA-I) and with other published data. METHODS We reviewed the clinical records and radiotherapy treatment sheets of all patients undergoing radiotherapy for lung cancer during 2007 in the 12 public hospitals in Andalusia, an autonomous region of Spain. Data were gathered on hospital, patient type and histological type, radiotherapy treatment characteristics, and tumor stage. RESULTS A total of 610 patients underwent initial radiotherapy. 37% of cases had stage III squamous cell lung cancer and were treated with radical therapy. 81% of patients with non-small cell and small cell lung cancer were treated with concomitant chemo-radiotherapy, and the administered total doses were ≥60 Gy and ≥45 Gy, respectively. The most common regimen for patients treated with palliative intent (44.6%) was 30 Gy. The total irradiation rate was 19.6%, with significant differences among provinces (range, 8.5-25.6%; p<0.001). These differences were significantly correlated with the geographical distribution of radiation oncologists (r=0.78; p=0.02). CONCLUSIONS Our results show no differences with respect to other published data or the data gathered in the VARA-I study. There is still wide variability in the application of radiotherapy for lung cancer in our setting, which significantly correlates with the geographical distribution of radiation oncologists.
Abstract:
BACKGROUND Excess body weight, physical activity, smoking, alcohol consumption and certain dietary factors are individually related to colorectal cancer (CRC) risk; however, little is known about their joint effects. The aim of this study was to develop a healthy lifestyle index (HLI) composed of five potentially modifiable lifestyle factors - healthy weight, physical activity, non-smoking, limited alcohol consumption and a healthy diet - and to explore the association of this index with CRC incidence using data collected within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. METHODS In the EPIC cohort, a total of 347,237 men and women, 25 to 70 years old, provided dietary and lifestyle information at study baseline (1992 to 2000). Over a median follow-up time of 12 years, 3,759 incident CRC cases were identified. The association between the HLI and CRC risk was evaluated using Cox proportional hazards regression models, and population attributable risks (PARs) were calculated. RESULTS After accounting for study centre, age, sex and education, compared with 0 or 1 healthy lifestyle factors, the hazard ratio (HR) for CRC was 0.87 (95% confidence interval (CI): 0.44 to 0.77) for two factors, 0.79 (95% CI: 0.70 to 0.89) for three factors, 0.66 (95% CI: 0.58 to 0.75) for four factors and 0.63 (95% CI: 0.54 to 0.74) for five factors; P-trend <0.0001. The associations were present for both colon and rectal cancer: HRs were 0.61 (95% CI: 0.50 to 0.74; P-trend <0.0001) for colon cancer and 0.68 (95% CI: 0.53 to 0.88; P-trend <0.0001) for rectal cancer (P-difference by cancer sub-site = 0.10). Overall, 16% of the new CRC cases (22% in men and 11% in women) were attributable to not adhering to a combination of all five healthy lifestyle behaviours included in the index. CONCLUSIONS Combined lifestyle factors are associated with a lower incidence of CRC in European populations characterized by western lifestyles.
Prevention strategies considering complex targeting of multiple lifestyle factors may provide practical means for improved CRC prevention.
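A healthy lifestyle index of this kind is just a count of factors met, scored 0-5. The sketch below illustrates the scoring logic; the abstract names the five factors but not the exact cut-offs, so the BMI and alcohol thresholds here are illustrative placeholders, not the study's definitions.

```python
# One point per factor met; thresholds for weight and alcohol are assumptions.
def hli_score(bmi, active, non_smoker, alcohol_g_day, healthy_diet):
    points = 0
    points += bmi < 25             # healthy weight (illustrative cut-off)
    points += bool(active)         # physically active
    points += bool(non_smoker)     # non-smoking
    points += alcohol_g_day < 12   # limited alcohol (illustrative cut-off)
    points += bool(healthy_diet)   # diet score above some threshold
    return points

print(hli_score(bmi=23.0, active=True, non_smoker=True,
                alcohol_g_day=5.0, healthy_diet=True))   # all five factors met
```

In the analysis, each increment of this score defines an exposure category for the Cox model (0-1 factors being the reference group with HR 1.0).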
Abstract:
INTRODUCTION Tolerability and convenience are crucial aspects for the long-term success of combined antiretroviral therapy (cART). The aim of this study was to investigate the impact, in routine clinical practice, of switching to the single-tablet regimen (STR) RPV/FTC/TDF in patients with intolerance to previous cART, in terms of patients' well-being, assessed by several validated measures. METHODS Prospective, multicenter study. Adult HIV-infected patients with a viral load under 1,000 copies/mL while receiving stable ART for at least the previous three months, switched to RPV/FTC/TDF due to intolerance of the previous regimen, were included. Analyses were performed by intention-to-treat (ITT). Presence/magnitude of symptoms (ACTG-HIV Symptom Index), quality of life (EQ-5D, EUROQoL & MOS-HIV), adherence (SMAQ), treatment preference and perceived ease of medication (ESTAR) were assessed through 48 weeks. RESULTS An interim analysis of 125 patients with 16 weeks of follow-up was performed. One hundred (80%) were male, with a mean age of 46 years. Mean CD4 count at baseline was 629.5±307.29, and 123 patients (98.4%) had a viral load <50 copies/mL; 15% were HCV co-infected. Ninety-two (73.6%) patients switched from an NNRTI (84.8% from EFV/FTC/TDF) and 33 (26.4%) from a PI/r. The most frequent reasons for switching were psychiatric disorders (51.2%), CNS adverse events (40.8%), gastrointestinal (19.2%) and metabolic disorders (19.2%). At the time of this analysis (week 16), four patients (3.2%) had discontinued treatment: one due to adverse events, two due to virologic failure and one with no data. A total of 104 patients (83.2%) were virologically suppressed (<50 copies/mL). The average degree of discomfort on the ACTG-HIV Symptom Index decreased significantly from baseline (21±15.55) to week 4 (10.89±12.36) and week 16 (10.81±12.62), p<0.001. In all patients, quality-of-life tools showed a significant benefit in well-being (Table 1).
Adherence to therapy (SMAQ) increased significantly and progressively from baseline (54.4%) to week 4 (68%, p<0.001) and week 16 (72.0%, p<0.001). CONCLUSIONS Switching to RPV/FTC/TDF from another ARV regimen due to toxicity significantly improved the quality of life of HIV-infected patients, in both the mental and physical components, and improved adherence to therapy while maintaining a good immune and virological response.
Abstract:
BACKGROUND: Phase-IV, open-label, single-arm study (NCT01203917) to assess the efficacy and safety/tolerability of first-line gefitinib in Caucasian patients with stage IIIA/B/IV, epidermal growth factor receptor (EGFR) mutation-positive non-small-cell lung cancer (NSCLC). METHODS: Treatment: gefitinib 250 mg/day until progression. Primary endpoint: objective response rate (ORR). Secondary endpoints: disease control rate (DCR), progression-free survival (PFS), overall survival (OS) and safety/tolerability. Pre-planned exploratory objective: EGFR mutation analysis in matched tumour and plasma samples. RESULTS: Of 1060 screened patients with NSCLC (859 with known mutation status; 118 positive, mutation frequency 14%), 106 with EGFR sensitising mutations were enrolled (female 70.8%; adenocarcinoma 97.2%; never-smoker 64.2%). At data cutoff: ORR 69.8% (95% confidence interval (CI) 60.5-77.7), DCR 90.6% (95% CI 83.5-94.8), median PFS 9.7 months (95% CI 8.5-11.0), median OS 19.2 months (95% CI 17.0-NC; 27% maturity). Most common adverse events (AEs; any grade): rash (44.9%), diarrhoea (30.8%); CTC (Common Toxicity Criteria) grade 3/4 AEs: 15%; SAEs: 19%. Baseline plasma 1 samples were available for 803 patients (784 with known mutation status; 82 positive; mutation frequency 10%). Plasma 1 EGFR mutation test sensitivity: 65.7% (95% CI 55.8-74.7). CONCLUSION: First-line gefitinib was effective and well tolerated in Caucasian patients with EGFR mutation-positive NSCLC. Plasma samples could be considered for mutation analysis if tumour tissue is unavailable.
Abstract:
Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency, and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators who evaluate methods for global gene expression analysis.
Abstract:
BACKGROUND Respiratory syncytial virus (RSV) is an important pathogen in lower respiratory tract infections (LRTI) in infants, but there are limited data concerning patients with underlying conditions and children older than 2 years of age. METHODS We designed a prospective observational multicenter national study performed in 26 Spanish hospitals (December 2011-March 2012). Investigational cases were defined as children with underlying chronic diseases and were compared with a group of previously healthy children (proportion 1:2). Clinical data were compared between the groups. RESULTS A total of 1763 children hospitalized due to RSV infection during the inclusion period were analyzed. Of these, 225 cases and 460 healthy children were enrolled in the study. The underlying diseases observed were respiratory (64%), cardiovascular (25%), and neurologic (12%), as well as chromosomal abnormalities (7.5%), immunodeficiencies (6.7%), and inborn errors of metabolism (3.5%). Cases were significantly older than previously healthy children (mean age: 16.3 versus 5.5 months). Cases experienced hypoxemia more frequently (P < 0.001), and patients with respiratory diseases required oxygen therapy more often (OR: 2.99; 95% CI: 1.03-8.65). Mechanical ventilation was used more often in patients with cardiac diseases (OR: 3.0; 95% CI: 1.07-8.44) and in those with inborn errors of metabolism (OR: 12.27; 95% CI: 2.11-71.47). This last subgroup also showed a higher risk of admission to the PICU (OR: 6.7; 95% CI: 1.18-38.04). A diagnosis of pneumonia was more frequent in cases (18.2% versus 9.3%; P < 0.01). CONCLUSIONS A significant percentage of children with RSV infection have underlying diseases, and their illness severity is higher than in healthy children.
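Odds ratios with 95% CIs like those above (e.g. OR 2.99 for oxygen therapy in the respiratory-disease subgroup) are typically derived from a 2x2 table using the Woolf log-OR interval. The counts below are invented for illustration; they are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Woolf interval. a,b = exposed with/without outcome; c,d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 40/60 exposed vs 30/75 unexposed children with the outcome.
or_, lo, hi = odds_ratio_ci(40, 20, 30, 45)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The interval is symmetric on the log scale, which is why published CIs around large ORs (such as 12.27 above) look strongly skewed on the natural scale.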
Abstract:
BACKGROUND Few epidemiological studies have examined the association between dietary trans fatty acids and weight gain, and the evidence remains inconsistent. The main objective of the study was to investigate the prospective association between a biomarker of industrial trans fatty acids and change in weight within the large European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. METHODS Baseline plasma fatty acid concentrations were determined in a representative EPIC sample from the 23 participating EPIC centers. A total of 1,945 individuals were followed for a median of 4.9 years to monitor weight change. The association between elaidic acid level and percent change in weight was investigated using a multinomial logistic regression model, adjusted for length of follow-up, age, energy, alcohol, smoking status, physical activity, and region. RESULTS In women, a doubling of elaidic acid level was associated with a decreased risk of weight loss (odds ratio (OR) = 0.69, 95% confidence interval (CI) = 0.55-0.88, p = 0.002), and a trend was observed toward an increased risk of weight gain during the 5-year follow-up (OR = 1.23, 95% CI = 0.97-1.56, p = 0.082) (p-trend <.0001). In men, a trend was observed for a doubling of elaidic acid level and risk of weight loss (OR = 0.82, 95% CI = 0.66-1.01, p = 0.062), while no significant association was found with risk of weight gain during the 5-year follow-up (OR = 1.08, 95% CI = 0.88-1.33, p = 0.454). No association was found for saturated or cis-monounsaturated fatty acids. CONCLUSIONS These data suggest that a high intake of industrial trans fatty acids may decrease the risk of weight loss, particularly in women. Prevention of obesity should consider limiting the consumption of highly processed foods, the main source of industrially produced trans fatty acids.
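The "per doubling" phrasing usually means the biomarker enters the regression on a log2 scale, so a one-unit increase corresponds to a doubling of concentration and the odds ratio per doubling is exp(beta). A small worked example, anchoring the coefficient to the OR of 0.69 reported for weight loss in women (the log2-scale parameterization itself is an assumption about the model, inferred from the phrasing):

```python
import math

# If the predictor is log2(elaidic acid), then OR per doubling = exp(beta).
beta = math.log(0.69)                     # log-odds change per log2 unit
or_per_doubling = math.exp(beta)
or_per_quadrupling = math.exp(2 * beta)   # two doublings = two log2 units

print(round(or_per_doubling, 2), round(or_per_quadrupling, 2))
```

Multiplicativity on the odds scale is the point: two doublings compound to 0.69² ≈ 0.48, not to 2 × 0.69's worth of change.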