1000 results for "spanish cohort"
Abstract:
OBJECTIVES To evaluate the rate of hospitalization for acute respiratory tract infection in children younger than 24 months with haemodynamically significant congenital cardiac disease, and to describe associated risk factors, preventive measures, aetiology, and clinical course. MATERIALS AND METHODS We followed 760 subjects from October 2004 through April 2005 in a prospective, observational, multicentre epidemiological follow-up study involving 53 Spanish hospitals. RESULTS Of our cohort, 79 patients (10.4%, 95% CI: 8.2%-12.6%) required a total of 105 admissions to hospital related to respiratory infections. The incidence rate was 21.4 new admissions per 1,000 patient-months. Significant risk factors for hospitalization, with odds ratios and 95% confidence intervals in parentheses, included: 22q11 deletion (8.2, 2.5-26.3), weight below the 10th centile (5.2, 1.6-17.4), previous respiratory disease (4.5, 2.3-8.6), incomplete immunoprophylaxis against respiratory syncytial virus (2.2, 1.2-3.9), trisomy 21 (2.1, 1.1-4.2), cardiopulmonary bypass (2.0, 1.1-3.4), and siblings younger than 11 years (1.7, 1.1-2.9). Bronchiolitis (51.4%), upper respiratory tract infections (25.7%), and pneumonia (20%) were the main diagnoses. An infectious agent was identified in 37 cases (35.2%): respiratory syncytial virus in 25, Streptococcus pneumoniae in 5, and Haemophilus influenzae in 4. The odds of hospitalization due to infection by the respiratory syncytial virus increased by a factor of 3.05 (95% CI: 2.14 to 4.35) in patients with incomplete prophylaxis. The median length of hospitalization was 7 days. In 18 patients (17.1%), the clinical course of the respiratory infection was complicated, and 2 died. CONCLUSIONS Hospital admissions for respiratory infection in young children with haemodynamically significant congenital cardiac disease are mainly associated with non-cardiac conditions (genetic, nutritional, or respiratory) and with cardiopulmonary bypass.
Respiratory syncytial virus was the most commonly identified infectious agent. Incomplete immunoprophylaxis against the virus increased the risk of hospitalization.
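The odds ratios quoted above come from standard 2×2 contingency-table arithmetic. As a minimal sketch, the snippet below computes an odds ratio with a Woolf (log-OR) 95% confidence interval; the cell counts are invented for illustration, since the abstract does not report the raw tables.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for one risk factor (illustration only, not study data)
or_, lo, hi = odds_ratio_ci(10, 69, 40, 641)
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}-{hi:.2f})")  # → OR = 2.32 (95% CI: 1.11-4.85)
```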
Abstract:
Escherichia coli, Klebsiella pneumoniae, and Enterobacter spp. are a major cause of infections in hospitalised patients. The aim of our study was to evaluate rates and trends of resistance to third-generation cephalosporins and fluoroquinolones in infected patients, the trends in use of these antimicrobials, and the potential correlation between the two trends. The database of the national point-prevalence study series of infections and antimicrobial use among patients hospitalised in Spain over the period 1999 to 2010 was analysed. On average, 265 hospitals and 60,000 patients were surveyed per year, yielding a total of 19,801 E. coli, 3,004 K. pneumoniae and 3,205 Enterobacter isolates. Over the twelve-year period, we observed a significant increase in the use of fluoroquinolones (5.8%-10.2%, p<0.001), but not of third-generation cephalosporins (6.4%-5.9%, p=NS). Resistance to third-generation cephalosporins increased significantly for E. coli (5%-15%, p<0.01) and K. pneumoniae infections (4%-21%, p<0.01), but not for Enterobacter spp. (24%). Resistance to fluoroquinolones increased significantly for E. coli (16%-30%, p<0.01), K. pneumoniae (5%-22%, p<0.01), and Enterobacter spp. (6%-15%, p<0.01). We found strong correlations between the rate of fluoroquinolone use and resistance to fluoroquinolones, resistance to third-generation cephalosporins, or co-resistance to both, for E. coli (R=0.97, p<0.01; R=0.94, p<0.01; and R=0.96, p<0.01, respectively) and for K. pneumoniae (R=0.92, p<0.01; R=0.91, p<0.01; and R=0.92, p<0.01, respectively). No correlation was found between the use of third-generation cephalosporins and resistance to any of these antimicrobials, and no significant correlations were found for Enterobacter spp. Knowledge of trends in antimicrobial resistance and antimicrobial use in the hospitalised population at the national level can help to develop prevention strategies.
Abstract:
BACKGROUND AND OBJECTIVE Patients from a previous study of neuropathic pain (NP) in the Spanish primary care setting still had symptoms despite treatment. Subsequently, patients were treated as prescribed by their physician and followed up for 3 months. Since pregabalin has been shown to be effective in NP, including refractory cases, the objective of this study was to assess the effectiveness of pregabalin therapy in patients with NP refractory to previous treatments. METHODS This was a post hoc analysis of pregabalin-naïve NP patients treated with pregabalin in a 3-month follow-up observational multicenter study to assess symptoms and satisfaction with treatment. Patients were evaluated with the Douleur Neuropathique en 4 questions (DN4), the Brief Pain Inventory (BPI) and the Treatment Satisfaction for Medication Questionnaire (SATMED-Q) overall satisfaction domain. RESULTS 1,670 patients (mean age 58 years, 59% women), previously untreated or treated with ≥1 drug other than pregabalin, were treated with pregabalin (37% on monotherapy). At 3 months, pain intensity and its interference with activities decreased by half (p < 0.0001), while the number of days with no or mild pain increased by a mean of 4.5 days (p < 0.0001). Treatment satisfaction increased twofold (p < 0.0001). Patients with a shorter history of pain and those with neuralgia and peripheral nerve compression syndrome (PCS) as etiologies had the highest proportion on monotherapy and showed the greatest improvements in pain-related parameters in their respective group categories. CONCLUSION Treatment with pregabalin (as monotherapy or combination therapy) provides benefits in pain and treatment satisfaction in patients with NP, including refractory cases. Shorter disease progression and neuralgia and PCS etiologies are favorable factors for pregabalin treatment response.
Abstract:
Objectives. To study the utility of the Mini-Cog test for detection of patients with cognitive impairment (CI) in primary care (PC). Methods. We pooled data from two phase III studies conducted in Spain. Patients with complaints or suspicion of CI were consecutively recruited by PC physicians. The cognitive diagnosis was performed by an expert neurologist, after formal neuropsychological evaluation. The Mini-Cog score was calculated post hoc, and its diagnostic utility was evaluated and compared with the utility of the Mini-Mental State (MMS), the Clock Drawing Test (CDT), and the sum of the MMS and the CDT (MMS + CDT) using the area under the receiver operating characteristic curve (AUC). The best cut points were obtained on the basis of diagnostic accuracy (DA) and kappa index. Results. A total sample of 307 subjects (176 CI) was analyzed. The Mini-Cog displayed an AUC (±SE) of 0.78 ± 0.02, which was significantly inferior to the AUC of the CDT (0.84 ± 0.02), the MMS (0.84 ± 0.02), and the MMS + CDT (0.86 ± 0.02). The best cut point of the Mini-Cog was 1/2 (sensitivity 0.60, specificity 0.90, DA 0.73, and kappa index 0.48 ± 0.05). Conclusions. The utility of the Mini-Cog for detection of CI in PC was very modest, clearly inferior to the MMS or the CDT. These results do not permit recommendation of the Mini-Cog in PC.
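The cut-point statistics quoted above follow directly from a 2×2 confusion table. The sketch below uses cell counts chosen to be consistent with the reported figures (307 subjects, 176 with CI, sensitivity 0.60, specificity 0.90); the exact counts are an assumption for illustration, not data from the studies.

```python
def cutpoint_stats(tp, fn, fp, tn):
    """Sensitivity, specificity, diagnostic accuracy and Cohen's kappa
    for a screening test dichotomized at a given cut point."""
    n = tp + fn + fp + tn
    sens = tp / (tp + fn)
    spec = tn / (fp + tn)
    da = (tp + tn) / n                      # diagnostic accuracy = observed agreement
    # Cohen's kappa: observed agreement corrected for chance agreement
    pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n**2
    kappa = (da - pe) / (1 - pe)
    return sens, spec, da, kappa

# Counts consistent with the abstract's reported values (assumed, for illustration)
sens, spec, da, kappa = cutpoint_stats(tp=106, fn=70, fp=13, tn=118)
print(f"sens={sens:.2f} spec={spec:.2f} DA={da:.2f} kappa={kappa:.2f}")
# → sens=0.60 spec=0.90 DA=0.73 kappa=0.48
```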
Abstract:
Although the relationship between personality and depressive illness is complex (Shea, 2005), there is empirical evidence that some personality features such as neuroticism, harm avoidance, introversion, dependency, self-criticism or perfectionism are related to depressive illness risk (Gunderson et al. 1999).
Abstract:
TRAIL and TRAIL receptor genes have been implicated in multiple sclerosis pathology as well as in the response to IFN beta therapy. The objective of our study was to evaluate the association of these genes with the age at disease onset (AAO) and with the clinical response to IFN beta treatment in Spanish MS patients. We carried out a candidate gene study of the TRAIL, TRAILR-1, TRAILR-2, TRAILR-3 and TRAILR-4 genes. A total of 54 SNPs were analysed in 509 MS patients under IFN beta treatment, and an additional cohort of 226 MS patients was used to validate the results. Associations of rs1047275 in TRAILR-2 and rs7011559 in TRAILR-4 with AAO under an additive model did not withstand Bonferroni correction. In contrast, patients with the TRAILR-1 rs20576-CC genotype showed a better clinical response to IFN beta therapy compared with patients carrying the A-allele (recessive model: p = 8.88×10⁻⁴, pc = 0.048, OR = 0.30). This SNP results in a nonsynonymous substitution of glutamic acid by alanine at position 228 (E228A), a change previously associated with susceptibility to different cancer types and risk of metastases, suggesting a lack of functionality of TRAILR-1. To unravel how this amino acid change in TRAILR-1 would affect the death signal, we performed molecular modelling with both alleles. Neither the TRAIL binding sites in the receptor nor the expression levels of TRAILR-1 in peripheral blood mononuclear cell subsets (monocytes, CD4+ and CD8+ T cells) were modified, suggesting that this SNP may alter the death signal by some other mechanism. These findings show a role for TRAILR-1 gene variations in the clinical outcome of IFN beta therapy that might have relevance as a biomarker to predict the response to IFN beta in MS.
Abstract:
BACKGROUND The effect of the macronutrient composition of the usual diet on long-term weight maintenance remains controversial. METHODS 373,803 subjects aged 25-70 years were recruited in 10 European countries (1992-2000) in the PANACEA project of the EPIC cohort. Diet was assessed at baseline using country-specific validated questionnaires, and weight and height were measured at baseline and self-reported at follow-up in most centers. The association between weight change after 5 years of follow-up and the iso-energetic replacement of 5% of energy from one macronutrient by 5% of energy from another macronutrient was assessed using multivariate linear mixed models. The risk of becoming overweight or obese after 5 years was investigated using multivariate Poisson regressions stratified according to initial Body Mass Index. RESULTS A higher proportion of energy from fat at the expense of carbohydrates was not significantly associated with weight change after 5 years. However, a higher proportion of energy from protein at the expense of fat was positively associated with weight gain. A higher proportion of energy from protein at the expense of carbohydrates was also positively associated with weight gain, especially when the carbohydrates were rich in fibre. The association between percentage of energy from protein and weight change was slightly stronger in overweight participants, former smokers, participants ≥60 years old, participants underreporting their energy intake and participants with a prudent dietary pattern. Compared to diets with no more than 14% of energy from protein, diets with more than 22% of energy from protein were associated with a 23-24% higher risk of becoming overweight or obese in normal weight and overweight subjects at baseline.
CONCLUSION Our results show that participants consuming an amount of protein above the protein intake recommended by the American Diabetes Association may experience a higher risk of becoming overweight or obese during adult life.
Abstract:
BACKGROUND Identifying individuals at high risk of excess weight gain may help targeting prevention efforts at those at risk of various metabolic diseases associated with weight gain. Our aim was to develop a risk score to identify these individuals and validate it in an external population. METHODS We used lifestyle and nutritional data from 53,758 individuals followed for a median of 5.4 years from six centers of the European Prospective Investigation into Cancer and Nutrition (EPIC) to develop a risk score to predict substantial weight gain (SWG) for the next 5 years (derivation sample). Assuming linear weight gain, SWG was defined as gaining ≥ 10% of baseline weight during follow-up. Proportional hazards models were used to identify significant predictors of SWG separately by EPIC center. Regression coefficients of predictors were pooled using random-effects meta-analysis. Pooled coefficients were used to assign weights to each predictor. The risk score was calculated as a linear combination of the predictors. External validity of the score was evaluated in nine other centers of the EPIC study (validation sample). RESULTS Our final model included age, sex, baseline weight, level of education, baseline smoking, sports activity, alcohol use, and intake of six food groups. The model's discriminatory ability measured by the area under a receiver operating characteristic curve was 0.64 (95% CI = 0.63-0.65) in the derivation sample and 0.57 (95% CI = 0.56-0.58) in the validation sample, with variation between centers. Positive and negative predictive values for the optimal cut-off value of ≥ 200 points were 9% and 96%, respectively. CONCLUSION The present risk score confidently excluded a large proportion of individuals from being at any appreciable risk to develop SWG within the next 5 years. Future studies, however, may attempt to further refine the positive prediction of the score.
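A score built this way is just a weighted sum of predictor values with a fixed cut-off, the weights being the pooled regression coefficients. A minimal sketch with invented weights and an invented base-point offset (the actual EPIC coefficients are not given in the abstract):

```python
# Hypothetical point weights per predictor; the real EPIC coefficients
# are not reported in the abstract.
BASE_POINTS = 280.0
WEIGHTS = {
    "age_years":      -1.5,   # illustrative sign/magnitude only
    "baseline_kg":    -0.8,
    "current_smoker": 40.0,   # baseline smoking was a predictor in the final model
    "sports_h_week":  -5.0,
}

def risk_score(subject):
    """Linear combination of predictors; >= 200 points flags high risk of SWG."""
    return BASE_POINTS + sum(WEIGHTS[k] * subject[k] for k in WEIGHTS)

subject = {"age_years": 30, "baseline_kg": 70, "current_smoker": 1, "sports_h_week": 2}
score = risk_score(subject)
print(score, score >= 200)  # → 209.0 True
```

With a 9% positive predictive value at this cut-off, a score of ≥ 200 is mainly useful for ruling risk out, as the conclusion notes.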
Abstract:
BACKGROUND Observational studies implicate higher dietary energy density (DED) as a potential risk factor for weight gain and obesity. It has been hypothesized that DED may also be associated with risk of type 2 diabetes (T2D), but limited evidence exists. Therefore, we investigated the association between DED and risk of T2D in a large prospective study with heterogeneity of dietary intake. METHODOLOGY/PRINCIPAL FINDINGS A case-cohort study was nested within the European Prospective Investigation into Cancer (EPIC) study of 340,234 participants contributing 3.99 million person-years of follow-up, identifying 12,403 incident diabetes cases and a random subcohort of 16,835 individuals from 8 European countries. DED was calculated as energy (kcal) from foods (except beverages) divided by the weight (gram) of foods, estimated from dietary questionnaires. Prentice-weighted Cox proportional hazard regression models were fitted by country. Risk estimates were pooled by random-effects meta-analysis and heterogeneity was evaluated. The estimated mean (SD) DED was 1.5 (0.3) kcal/g among cases and subcohort members, varying across countries (range 1.4-1.7 kcal/g). After adjustment for age, sex, smoking, physical activity, alcohol intake, energy intake from beverages and misreporting of dietary intake, no association was observed between DED and T2D (HR 1.02, 95% CI: 0.93-1.13), which was consistent across countries (I² = 2.9%). CONCLUSIONS/SIGNIFICANCE In this large European case-cohort study, no association between the DED of solid and semi-solid foods and risk of T2D was observed. However, although there is currently no conclusive evidence for an association between DED and T2D risk, choosing foods of low energy density should be promoted, as this supports current WHO recommendations to prevent chronic diseases.
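The DED definition used above (energy from foods, excluding beverages, divided by food weight) is one line of arithmetic; the sketch below applies it to an invented day of intake to make the kcal/g unit concrete.

```python
# (name, kcal, grams, is_beverage) - invented example intake, not study data
intake = [
    ("bread",        530, 200, False),
    ("vegetables",   100, 400, False),
    ("cheese",       270,  80, False),
    ("orange juice", 180, 400, True),   # beverages are excluded from DED
]

def dietary_energy_density(items):
    """DED in kcal/g: energy from solid/semi-solid foods divided by their weight."""
    kcal  = sum(e for _, e, _, bev in items if not bev)
    grams = sum(g for _, _, g, bev in items if not bev)
    return kcal / grams

ded = dietary_energy_density(intake)
print(f"DED = {ded:.2f} kcal/g")  # solid foods only: 900 kcal / 680 g → DED = 1.32 kcal/g
```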
Abstract:
The presence of transmitted human immunodeficiency virus (HIV)-1 drug resistance (TDR) at the time of antiretroviral therapy initiation is associated with failure to achieve viral load (VL) suppression. Here, we report TDR surveillance in a specific population of men who have sex with men (MSM) in Belo Horizonte, Brazil. In this study, the rate of TDR was evaluated in 64 HIV-infected individuals from a cohort of MSM between 1996 and June 2012. Fifty-four percent had a documented recent HIV infection, with a seroconversion time of less than 12 months. The median CD4+ T lymphocyte count and VL were 531 cells/mm³ and 17,746 copies/mL, respectively. Considering the surveillance drug resistance mutation criteria, nine (14.1%) patients presented TDR, of whom three (4.7%), five (7.8%) and four (6.2%) had mutations conferring resistance against protease inhibitors, nucleos(t)ide reverse-transcriptase inhibitors and non-nucleoside reverse-transcriptase inhibitors, respectively. Two of the patients had multidrug-resistant HIV-1. The most prevalent viral subtype was B (44, 68.8%), followed by subtype F (11, 17.2%). This study shows that TDR may vary according to the population studied and may be higher in clusters of MSM.
Abstract:
INTRODUCTION The Rasch model is increasingly used in the field of rehabilitation because it improves the accuracy of measurements of patient status and of changes after therapy. OBJECTIVE To determine the long-term effectiveness of a holistic neuropsychological rehabilitation program for Spanish outpatients with acquired brain injury (ABI) using Rasch analysis. METHODS Eighteen patients (ten with long evolution, i.e. patients who started the program more than 6 months after ABI, and eight with short evolution) and their relatives attended the program for 6 months. Patients' and relatives' answers to the European Brain Injury Questionnaire and the Frontal Systems Behavior Scale at 3 time points (pre-intervention, post-intervention and 12-month follow-up) were transformed into linear measures called logits. RESULTS The linear measures revealed significant improvements with large effects at the follow-up assessment in cognitive and executive functioning, social and emotional self-regulation, apathy and mood. At follow-up, the short evolution group achieved greater improvements in mood and cognitive functioning than the long evolution patients. CONCLUSIONS The program showed long-term effectiveness for most of the variables, and it was more effective for mood and cognitive functioning when patients were treated early. Relatives played a key role in the effectiveness of the rehabilitation program.
Abstract:
INTRODUCTION Selenium is an essential micronutrient for human health, being a cofactor for enzymes with antioxidant activity that protect the organism from oxidative damage. An inadequate intake of this mineral has been associated with the onset and progression of chronic diseases such as hypertension, diabetes, coronary disease, asthma, and cancer. For this reason, knowledge of the plasma and erythrocyte selenium levels of a population makes a relevant contribution to the assessment of its nutritional status. OBJECTIVE The objective of the present study was to determine the nutritional status of selenium and the risk of selenium deficiency in a healthy adult population in Spain by examining food and nutrient intake and analyzing biochemical parameters related to selenium metabolism, including plasma and erythrocyte levels and selenium-dependent glutathione peroxidase (GPx) enzymatic activity. MATERIAL AND METHODS We studied 84 healthy adults (31 males and 53 females) from the province of Granada, determining their plasma and erythrocyte selenium concentrations and the association of these levels with GPx enzymatic activity and with lifestyle factors. We also gathered data on their food and nutrient intake and the results of biochemical analyses. Correlations were studied among all of these variables. RESULTS The mean plasma selenium concentration was 76.6 ± 17.3 μg/L (87.3 ± 17.4 μg/L in males, 67.3 ± 10.7 μg/L in females), whereas the mean erythrocyte selenium concentration was 104.6 μg/L (107.9 ± 26.1 μg/L in males and 101.7 ± 21.7 μg/L in females). The nutritional status of selenium was defined by the plasma concentration required to reach maximum GPx activity, establishing 90 μg/L as the reference value. According to this criterion, 50% of the men and 53% of the women were selenium deficient. CONCLUSIONS Selenium is subject to multiple regulation mechanisms. Erythrocyte selenium is a good marker of longer-term selenium status, while plasma selenium appears to be a marker of short-term nutritional status. The present findings indicate a positive correlation between plasma selenium concentration and the practice of physical activity. Bioavailability studies are required to establish appropriate reference levels of this mineral for the Spanish population.
Abstract:
BACKGROUND The purpose of this multicenter Spanish study was to evaluate the response to immediate-release methylphenidate of children and adults diagnosed with attention-deficit/hyperactivity disorder (ADHD), as well as to obtain information on current therapy patterns and safety characteristics. METHODS This multicenter, observational, retrospective, noninterventional study included 730 patients aged 4-65 years with a diagnosis of ADHD. Information was obtained from a review of medical records for the years 2002-2006 in sequential order. RESULTS The ADHD predominantly inattentive subtype affected 29.7% of patients, the predominantly hyperactive-impulsive subtype was found in 5.2%, and the combined subtype in 65.1%. Overall, a significantly lower Clinical Global Impression (CGI) score and mean number of DSM-IV-TR (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) symptoms by subtype were found after one year of treatment with immediate-release methylphenidate: the CGI score decreased from 4.51 to 1.69, symptoms of inattention from 7.90 to 4.34, symptoms of hyperactivity from 6.73 to 3.39, and combined-subtype symptoms from 14.62 to 7.70. Satisfaction with immediate-release methylphenidate after one year was rated as "very satisfied" or "satisfied" by 86.90% of the sample; 25.75% of all patients reported at least one adverse effect. At the end of the study, 41.47% of all the patients treated with immediate-release methylphenidate were still receiving it, with a mean time on therapy of 3.80 years. CONCLUSION Good efficacy and safety results were found for immediate-release methylphenidate in patients with ADHD.
Abstract:
Epstein-Barr virus (EBV) is associated with several types of cancer, including Hodgkin's lymphoma (HL) and nasopharyngeal carcinoma (NPC). The EBV-encoded latent membrane protein 1 (LMP1), a multifunctional oncoprotein, is a powerful activator of the transcription factor NF-κB, a property that is essential for the survival of EBV-transformed lymphoblastoid cells. Previous studies reported LMP1 sequence variations and the induction of higher NF-κB activation levels by some variants compared to the prototype B95-8 LMP1. Here we used biopsies of EBV-associated cancers and blood of individuals included in the Swiss HIV Cohort Study (SHCS) to analyze LMP1 genetic diversity and the impact of sequence variations on the LMP1-mediated NF-κB activation potential. We found that a number of variants mediate higher NF-κB activation levels than B95-8 LMP1 and mapped three single polymorphisms responsible for this phenotype: F106Y, I124V and F144I. F106Y was present in all LMP1 isolated in this study, and its effect was variant dependent, suggesting that it was modulated by other polymorphisms. The two polymorphisms I124V and F144I were present in distinct phylogenetic groups and were linked with other specific polymorphisms nearby, I152L and D150A/L151I, respectively. The two sets of polymorphisms, I124V/I152L and F144I/D150A/L151I, which were markers of increased NF-κB activation in vitro, were not associated with EBV-associated HL in the SHCS. Taken together, these results highlight the importance of single polymorphisms for the modulation of LMP1 signaling activity and demonstrate that several groups of LMP1 variants, through distinct mutational paths, mediate enhanced NF-κB activation levels compared to B95-8 LMP1.
Abstract:
BACKGROUND The purpose of this study was to assess the incidence of neurological complications in patients with infective endocarditis, the risk factors for their development, their influence on the clinical outcome, and the impact of cardiac surgery. METHODS AND RESULTS This was a retrospective analysis of prospectively collected data on a multicenter cohort of 1,345 consecutive episodes of left-sided infective endocarditis from 8 centers in Spain. Cox regression models were developed to analyze variables predictive of neurological complications and associated mortality. Three hundred forty patients (25%) experienced such complications: 192 patients (14%) had ischemic events, 86 (6%) had encephalopathy/meningitis, 60 (4%) had hemorrhages, and 2 (1%) had brain abscesses. Independent risk factors associated with all neurological complications were vegetation size ≥3 cm (hazard ratio [HR] 1.91), Staphylococcus aureus as a cause (HR 2.47), mitral valve involvement (HR 1.29), and anticoagulant therapy (HR 1.31). This last variable was particularly related to a greater incidence of hemorrhagic events (HR 2.71). Overall mortality was 30%, and neurological complications had a negative impact on outcome (45% of deaths versus 24% in patients without these complications; P<0.01), although only moderate to severe ischemic stroke (HR 1.63) and brain hemorrhage (HR 1.73) were significantly associated with a poorer prognosis. Antimicrobial treatment reduced the risk of neurological complications by 33% to 75%. In patients with hemorrhage, mortality was higher when surgery was performed within 4 weeks of the hemorrhagic event (75% versus 40% for later surgery). CONCLUSIONS Moderate to severe ischemic stroke and brain hemorrhage were found to have a significant negative impact on the outcome of infective endocarditis. Early appropriate antimicrobial treatment is critical, and transitory discontinuation of anticoagulant therapy should be considered.