49 results for Nonparametric confidence interval
Abstract:
Objective To compute the burden of cancer attributable to current and former alcohol consumption in eight European countries, based on direct relative risk estimates from a cohort study. Design Combination of a prospective cohort study with representative population based data on alcohol exposure. Setting Eight countries (France, Italy, Spain, United Kingdom, the Netherlands, Greece, Germany, Denmark) participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) study. Participants 109 118 men and 254 870 women, mainly aged 37-70. Main outcome measures Hazard rate ratios expressing the relative risk of cancer incidence for former and current alcohol consumption among EPIC participants; hazard rate ratios combined with representative information on alcohol consumption to calculate alcohol attributable fractions of causally related cancers by country and sex; partial alcohol attributable fractions for consumption higher than the recommended upper limit (two drinks a day with about 24 g alcohol for men, one drink a day with about 12 g alcohol for women); and the estimated total annual number of cases of alcohol attributable cancer. Results If we assume causality, 10% (95% confidence interval 7 to 13%) of the incidence of total cancer in men and 3% (1 to 5%) in women was attributable to former and current alcohol consumption in the selected European countries. For selected cancers, the figures in men and women were, respectively, 44% (31 to 56%) and 25% (5 to 46%) for upper aerodigestive tract cancer, 33% (11 to 54%) and 18% (−3 to 38%) for liver cancer, and 17% (10 to 25%) and 4% (−1 to 10%) for colorectal cancer, plus 5.0% (2 to 8%) for female breast cancer. A substantial part of the alcohol attributable fraction in 2008 was associated with alcohol consumption higher than the recommended upper limit: 33 037 of 178 578 alcohol related cancer cases in men and 17 470 of 397 043 alcohol related cases in women.
Conclusions In western Europe, an important proportion of cancer cases can be attributed to alcohol consumption, especially consumption higher than the recommended upper limits. These data support current political efforts to reduce alcohol consumption, or to abstain from it, in order to reduce the incidence of cancer.
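The attributable fractions above come from combining relative risks with exposure prevalence. As a minimal illustration only (not the authors' full method, which stratifies by sex, country, and consumption level and uses hazard rate ratios), Levin's population attributable fraction for a single exposure level can be sketched as:

```python
def attributable_fraction(prevalence, relative_risk):
    """Levin's population attributable fraction for one exposure level:
    the fraction of cases that would not occur absent the exposure,
    given exposure prevalence and the relative risk in the exposed."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)
```

For example, an exposure with prevalence 0.5 and relative risk 2.0 yields an attributable fraction of 1/3; a relative risk of 1.0 yields 0, as expected.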
Abstract:
Objective To examine the association between pre-diagnostic circulating vitamin D concentration, dietary intake of vitamin D and calcium, and the risk of colorectal cancer in European populations. Design Nested case-control study. Setting The study was conducted within the EPIC study, a cohort of more than 520 000 participants from 10 western European countries. Participants 1248 cases of incident colorectal cancer, which developed after enrolment into the cohort, were matched to 1248 controls. Main outcome measures Circulating vitamin D concentration (25-hydroxy-vitamin D, 25-(OH)D) was measured by enzyme immunoassay. Dietary and lifestyle data were obtained from questionnaires. Incidence rate ratios and 95% confidence intervals for the risk of colorectal cancer by 25-(OH)D concentration and levels of dietary calcium and vitamin D intake were estimated from multivariate conditional logistic regression models, with adjustment for potential dietary and other confounders. Results 25-(OH)D concentration showed a strong inverse linear dose-response association with risk of colorectal cancer (P for trend <0.001). Compared with a pre-defined mid-level concentration of 25-(OH)D (50.0-75.0 nmol/l), lower levels were associated with higher colorectal cancer risk (<25.0 nmol/l: incidence rate ratio 1.32 (95% confidence interval 0.87 to 2.01); 25.0-49.9 nmol/l: 1.28 (1.05 to 1.56)), and higher concentrations were associated with lower risk (75.0-99.9 nmol/l: 0.88 (0.68 to 1.13); ≥100.0 nmol/l: 0.77 (0.56 to 1.06)). In analyses by quintile of 25-(OH)D concentration, patients in the highest quintile had a 40% lower risk of colorectal cancer than did those in the lowest quintile (P<0.001). Subgroup analyses showed a strong association for colon but not rectal cancer (P for heterogeneity=0.048). Greater dietary intake of calcium was associated with a lower colorectal cancer risk. Dietary vitamin D was not associated with disease risk.
Findings did not vary by sex and were not altered by corrections for season or month of blood donation. Conclusions The results of this large observational study indicate a strong inverse association between levels of pre-diagnostic 25-(OH)D concentration and risk of colorectal cancer in western European populations. Further randomised trials are needed to assess whether increases in circulating 25-(OH)D concentration can effectively decrease the risk of colorectal cancer.
Abstract:
Background. The study of the severity of occupational injuries is very important for the establishment of prevention plans. The aim of this paper is to analyze the distribution of occupational injuries by (a) individual factors, (b) workplace characteristics, and (c) working conditions, and to analyze the severity of occupational injuries by these characteristics in men and women in Andalusia. Methods. Injury data came from the accident registry of the Ministry of Labor and Social Issues in 2003. The dependent variable was the severity of the injury (slight, serious, very serious, or fatal); the independent variables were the characteristics of the worker, company data, and the accident itself. Bivariate and multivariate analyses were done to estimate the probability of serious, very serious, and fatal injury in relation to the other variables, through odds ratios (OR) with 95% confidence intervals (95% CI). Results. 82.4% of the records were men and 17.6% were women; 78.1% of the women were unskilled manual workers, compared with 44.9% of the men. Men belonging to class I had a higher probability of more severe lesions (OR = 1.67, 95% CI = 1.17-2.38). Conclusions. The severity of the injury is associated with sex, age, and type of injury. In men it is also related to the professional situation, the place where the accident happened, an unusual job, the size and characteristics of the company, and the social class; in women it is also related to the sector.
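Odds ratios with 95% confidence intervals, as quoted above (e.g. OR = 1.67, 95% CI 1.17-2.38), are standard outputs of this kind of analysis. A minimal sketch of the Wald-type interval for a single 2×2 table, with hypothetical counts (the paper's estimates come from multivariate models, not a raw table):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% confidence interval from a 2x2 table.
    a, b = exposed cases / non-cases; c, d = unexposed cases / non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

With equal cell counts the OR is 1.0 and the interval straddles 1, i.e. no association.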
Abstract:
Despite medical advances, mortality in infective endocarditis (IE) is still very high. Previous studies on prognosis in IE have reported conflicting results. The aim of this study was to identify predictors of in-hospital mortality in a large multicenter cohort of left-sided IE. Methods An observational multicenter study was conducted from January 1984 to December 2006 in seven hospitals in Andalusia, Spain. Seven hundred and five patients with left-sided IE were included. The main outcome measure was in-hospital mortality. Prognostic factors were analysed by univariate tests and then by a multivariate logistic regression model. Results The overall mortality was 29.5% (25.5% from 1984 to 1995 and 31.9% from 1996 to 2006; odds ratio 1.25; 95% confidence interval: 0.97-1.60; p = 0.07). In univariate analysis, age, comorbidity (especially chronic liver disease), prosthetic valve, virulent microorganisms such as Staphylococcus aureus, Streptococcus agalactiae and fungi, and complications (septic shock, severe heart failure, renal insufficiency, neurologic manifestations and perivalvular extension) were associated with higher mortality. Independent factors for mortality in multivariate analysis were: Charlson comorbidity score (OR: 1.2; 95% CI: 1.1-1.3), prosthetic endocarditis (OR: 1.9; CI: 1.2-3.1), Staphylococcus aureus aetiology (OR: 2.1; CI: 1.3-3.5), severe heart failure (OR: 5.4; CI: 3.3-8.8), neurologic manifestations (OR: 1.9; CI: 1.2-2.9), septic shock (OR: 4.2; CI: 2.3-7.7), perivalvular extension (OR: 2.4; CI: 1.3-4.5) and acute renal failure (OR: 1.69; CI: 1.0-2.6). Conversely, Streptococcus viridans group aetiology (OR: 0.4; CI: 0.2-0.7) and surgical treatment (OR: 0.5; CI: 0.3-0.8) were protective factors. Conclusions Several characteristics of left-sided endocarditis enable selection of a patient group at higher risk of mortality.
This group may benefit from more specialised attention in referral centers, and these predictors should help to identify patients who might benefit from more aggressive diagnostic and/or therapeutic procedures.
Abstract:
Objective: To determine the values of, and study the relationships among, central corneal thickness (CCT), intraocular pressure (IOP), and degree of myopia (DM) in an adult myopic population aged 20 to 40 years in Almeria (southeast Spain). To our knowledge this is the first study of this kind in this region. Methods: An observational, descriptive, cross-sectional study was done in which a sample of 310 myopic patients (620 eyes) aged 20 to 40 years was selected by gender- and age-stratified sampling, proportionally fixed to the size of the population strata, assuming a 20% prevalence of myopia, a 5% epsilon (margin of error), and a 95% confidence level. We studied IOP, CCT, and DM and their relationships by calculating the mean, standard deviation, 95% confidence interval for the mean, median, Fisher's asymmetry coefficient, range (maximum, minimum), and the Brown-Forsythe robust test for each variable (IOP, CCT, and DM). Results: In the adult myopic population of Almeria aged 20 to 40 years (mean 29.8 years), the mean overall CCT was 550.12 μm. The corneas of men were thicker than those of women (P = 0.014). CCT was stable, as no significant differences were seen in CCT values across the 20- to 40-year-old subjects. The mean overall IOP was 13.60 mmHg. Men had a higher IOP than women (P = 0.002). Subjects over 30 years (13.83 mmHg) had a higher IOP than those under 30 (13.38 mmHg) (P = 0.04). The mean overall DM was −4.18 diopters. Men had less myopia than women (P < 0.001). Myopia was stable in the 20- to 40-year-old study population (P = 0.089). A linear relationship was found between CCT and IOP (R2 = 0.152, P ≤ 0.001); CCT accounted for 15.2% of the variation in IOP. However, no linear relationship between DM and IOP, or between CCT and DM, was found. Conclusions: CCT was similar to that reported in other studies in different populations. IOP tends to increase after the age of 30, and this increase is not accounted for by alterations in CCT values.
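The sampling design above assumes a 20% prevalence, a 5% epsilon, and 95% confidence. A minimal sketch of the underlying simple-random-sampling size formula (the authors' 310-patient figure presumably also reflects stratification and a finite-population correction, which this sketch omits):

```python
import math

def sample_size_proportion(p, epsilon, z=1.96):
    """Minimum n to estimate a proportion p within +/- epsilon
    at the confidence level implied by z (1.96 for 95%)."""
    return math.ceil(z ** 2 * p * (1 - p) / epsilon ** 2)
```

With p = 0.2 and epsilon = 0.05 this gives 246 subjects; the conservative p = 0.5 gives 385.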
Abstract:
Goal: To learn more about the social support available to patients participating in a prison methadone maintenance program (PMM). Methodology: Descriptive, with controls. Setting: A penitentiary in Albolote (Granada). Population Sample: The total prison population was 1,579; 364 patients were included in the PMM, 35 female and 329 male. 60 patients (7 women and 53 men) were used as cases. 30 non-drug-dependent prisoners (3 women and 27 men), with no antecedents of problems with drug addiction, formed the control group. Interventions: Interviews with cases and controls to record their addictive antecedents, family structure, and socio-economic level; an interviewer-administered MOS social support questionnaire was also completed. Percentages for each social support variable were obtained and compared using the chi-squared test. Results: Overall support received was low in 38 cases (74.5%) and in 9 controls (30%): p = 0.0001; OR 0.1466, 95% confidence interval (0.0538-0.3989). Support received was normal in 13 cases (25%) and 21 controls (70%): p = 0.0007; OR 0.69, 95% confidence interval (0.44-0.93). All of the variables were statistically significant in favor of the non-drug addicts, except for emotional support, which was the same in both groups. Conclusion: Inmates participating in the methadone maintenance program perceived that they received less social support than the non-drug-dependent inmates.
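The case/control comparisons above use the chi-squared test on 2×2 tables. A minimal sketch of the Pearson statistic, illustrated with the low-support counts reported in the abstract (38 of 51 cases vs 9 of 30 controls):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for a
    2x2 table [[a, b], [c, d]], using the shortcut formula
    n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den
```

A table with identical rows gives a statistic of 0; the abstract's counts give a value well above 10.83, the 1-df critical value for p < 0.001, consistent with the reported p = 0.0001.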
Abstract:
OBJECTIVE: To evaluate the long-term impact of successive interventions on rates of methicillin-resistant Staphylococcus aureus (MRSA) colonization or infection and MRSA bacteremia in an endemic hospital-wide situation. DESIGN: Quasi-experimental, interrupted time-series analysis. The impact of the interventions was analyzed by use of segmented regression. Representative MRSA isolates were typed by use of pulsed-field gel electrophoresis. SETTING: A 950-bed teaching hospital in Seville, Spain. PATIENTS: All patients admitted to the hospital during the period from 1995 through 2008. METHODS: Three successive interventions were studied: (1) contact precautions, with no active surveillance for MRSA; (2) targeted active surveillance for MRSA in patients and healthcare workers in specific wards, prioritized according to clinical epidemiology data; and (3) targeted active surveillance for MRSA in patients admitted from other medical centers. RESULTS: Neither the preintervention rate of MRSA colonization or infection (0.56 cases per 1,000 patient-days [95% confidence interval {CI}, 0.49-0.62 cases per 1,000 patient-days]) nor the slope for the rate of MRSA colonization or infection changed significantly after the first intervention. The rate decreased significantly to 0.28 cases per 1,000 patient-days (95% CI, 0.17-0.40 cases per 1,000 patient-days) after the second intervention and to 0.07 cases per 1,000 patient-days (95% CI, 0.06-0.08 cases per 1,000 patient-days) after the third intervention, and the rate remained at a similar level for 8 years. The MRSA bacteremia rate decreased by 80%, whereas the rate of bacteremia due to methicillin-susceptible S. aureus did not change. Eighty-three percent of the MRSA isolates identified were clonally related. All MRSA isolates obtained from healthcare workers were clonally related to those recovered from patients who were in their care. CONCLUSION: Our data indicate that long-term control of endemic MRSA is feasible in tertiary care centers.
The use of targeted active surveillance for MRSA in patients and healthcare workers in specific wards (identified by means of analysis of clinical epidemiology data) and the use of decolonization were key to the success of the program.
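The study's rates are expressed per 1,000 patient-days with 95% CIs. As a hedged sketch of that rate computation only (the study's actual inference used segmented regression on the interrupted time series, which this does not reproduce), a normal approximation to the Poisson interval can be written as:

```python
import math

def rate_ci(cases, patient_days, z=1.96):
    """Incidence rate per 1,000 patient-days with a normal-approximation
    (Poisson) confidence interval; adequate when the case count is large."""
    rate = cases / patient_days
    se = math.sqrt(cases) / patient_days  # SE of the rate under Poisson counts
    return 1000 * rate, 1000 * (rate - z * se), 1000 * (rate + z * se)
```

For example, 100 cases over 100,000 patient-days gives 1.0 per 1,000 patient-days (95% CI roughly 0.80-1.20).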
Abstract:
BACKGROUND To assess and compare the effectiveness and costs of Phototest, Mini Mental State Examination (MMSE), and Memory Impairment Screen (MIS) to screen for dementia (DEM) and cognitive impairment (CI). METHODS A phase III study was conducted over one year in consecutive patients with suspicion of CI or DEM at four Primary Care (PC) centers. After undergoing all screening tests at the PC center, participants were extensively evaluated by researchers blinded to screening test results in a Cognitive-Behavioral Neurology Unit (CBNU). The gold standard diagnosis was established by consensus of expert neurologists. Effectiveness was assessed by the proportion of correct diagnoses (diagnostic accuracy [DA]) and by the kappa index of concordance between test results and gold standard diagnoses. Costs were based on public prices and hospital accounts. RESULTS The study included 140 subjects (48 with DEM, 37 with CI without DEM, and 55 without CI). The MIS could not be applied to 23 illiterate subjects (16.4%). For DEM, the maximum effectiveness of the MMSE was obtained with different cutoff points as a function of educational level [k = 0.31 (95% Confidence interval [95%CI], 0.19-0.43), DA = 0.60 (95%CI, 0.52-0.68)], and that of the MIS with a cutoff of 3/4 [k = 0.63 (95%CI, 0.48-0.78), DA = 0.83 (95%CI, 0.80-0.92)]. Effectiveness of the Phototest [k = 0.71 (95%CI, 0.59-0.83), DA = 0.87 (95%CI, 0.80-0.92)] was similar to that of the MIS and higher than that of the MMSE. Costs were higher with MMSE (275.9 ± 193.3€ [mean ± sd euros]) than with Phototest (208.2 ± 196.8€) or MIS (201.3 ± 193.4€), whose costs did not significantly differ. For CI, the effectiveness did not significantly differ between MIS [k = 0.59 (95%CI, 0.45-0.74), DA = 0.79 (95%CI, 0.64-0.97)] and Phototest [k = 0.58 (95%CI, 0.45-0.74), DA = 0.78 (95%CI, 0.64-0.95)] and was lowest for the MMSE [k = 0.27 (95%CI, 0.09-0.45), DA = 0.69 (95%CI, 0.56-0.84)]. 
Costs were higher for MMSE (393.4 ± 121.8€) than for Phototest (287.0 ± 197.4€) or MIS (300.1 ± 165.6€), whose costs did not significantly differ. CONCLUSION MMSE is not an effective instrument in our setting. For both DEM and CI, the Phototest and MIS are more effective and less costly, with no difference between them. However, MIS could not be applied to the appreciable percentage of our population who were illiterate.
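Effectiveness above is partly measured by the kappa index of concordance between each screening test and the gold standard diagnosis. A minimal sketch of unweighted Cohen's kappa from a confusion matrix (illustrative matrices, not the study's data):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: test result, cols: gold standard)."""
    n = sum(sum(row) for row in confusion)
    k = len(confusion)
    # Observed agreement: proportion on the diagonal
    po = sum(confusion[i][i] for i in range(k)) / n
    # Chance agreement: product of marginal proportions, summed over categories
    pe = sum(
        (sum(confusion[i]) / n) * (sum(row[i] for row in confusion) / n)
        for i in range(k)
    )
    return (po - pe) / (1 - pe)
```

Perfect agreement yields kappa = 1.0, and agreement at chance level yields 0.0, which frames the abstract's values (e.g. 0.71 for the Phototest vs 0.31 for the MMSE).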
Abstract:
Introduction. Critically ill patients suffer from oxidative stress caused by reactive oxygen species (ROS) and reactive nitrogen species (RNS). Although ROS/RNS are constantly produced under normal circumstances, critical illness can drastically increase their production. These patients have reduced plasma and intracellular levels of antioxidants and free electron scavengers or cofactors, and decreased activity of the enzymatic system involved in ROS detoxification. The pro-oxidant/antioxidant balance is of functional relevance during critical illness because it is involved in the pathogenesis of multiple organ failure. The objective of this study was to evaluate the relation between oxidative stress in critically ill patients, antioxidant vitamin intake, and severity of illness. Methods. Spectrophotometry was used to measure plasma total antioxidant capacity and levels of lipid peroxides, carbonyl groups, total protein, bilirubin and uric acid at two time points: at intensive care unit (ICU) admission and on day seven. Daily diet records were kept and compliance with the recommended dietary allowance (RDA) of antioxidant vitamins (A, C and E) was assessed. Results. Between admission and day seven in the ICU, significant increases in lipid peroxides and carbonyl groups were associated with decreased antioxidant capacity and greater deterioration in Sequential Organ Failure Assessment score. Worsening of oxidative stress parameters was significantly greater in patients who received antioxidant vitamins at below 66% of the RDA than in those who received them at above 66% of the RDA. An antioxidant vitamin intake from 66% to 100% of the RDA reduced the risk of worsening oxidative stress by 94% (odds ratio 0.06, 95% confidence interval 0.010 to 0.39), regardless of change in severity of illness (Sequential Organ Failure Assessment score). Conclusion. The critical condition of patients admitted to the ICU is associated with worsening oxidative stress.
Intake of antioxidant vitamins below 66% of the RDA and alterations in endogenous levels of substances with antioxidant capacity are related to redox imbalance in critically ill patients. Therefore, intake of antioxidant vitamins should be carefully monitored so that it is as close as possible to the RDA.
Abstract:
INTRODUCTION Hemodynamic resuscitation should be aimed at achieving not only adequate cardiac output but also sufficient mean arterial pressure (MAP) to guarantee adequate tissue perfusion pressure. Since the arterial pressure response to volume expansion (VE) depends on arterial tone, knowing whether a patient is preload-dependent provides only a partial solution to the problem. The objective of this study was to assess the ability of a functional evaluation of arterial tone by dynamic arterial elastance (Ea(dyn)), defined as the pulse pressure variation (PPV) to stroke volume variation (SVV) ratio, to predict the hemodynamic response in MAP to fluid administration in hypotensive, preload-dependent patients with acute circulatory failure. METHODS We performed a prospective clinical study in an adult medical/surgical intensive care unit in a tertiary care teaching hospital, including 25 patients with controlled mechanical ventilation who were monitored with the Vigileo(®) monitor, for whom the decision to give fluids was made because of the presence of acute circulatory failure, including arterial hypotension (MAP ≤65 mmHg or systolic arterial pressure <90 mmHg) and preserved preload responsiveness condition, defined as a SVV value ≥10%. RESULTS Before fluid infusion, Ea(dyn) was significantly different between MAP responders (MAP increase ≥15% after VE) and MAP nonresponders. VE-induced increases in MAP were strongly correlated with baseline Ea(dyn) (r(2) = 0.83; P < 0.0001). The only predictor of MAP increase was Ea(dyn) (area under the curve, 0.986 ± 0.02; 95% confidence interval (CI), 0.84-1). A baseline Ea(dyn) value >0.89 predicted a MAP increase after fluid administration with a sensitivity of 93.75% (95% CI, 69.8%-99.8%) and a specificity of 100% (95% CI, 66.4%-100%). 
CONCLUSIONS Functional assessment of arterial tone by Ea(dyn), measured as the PPV to SVV ratio, predicted the arterial pressure response to volume loading in hypotensive, preload-dependent patients under controlled mechanical ventilation.
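The 93.75% sensitivity and 100% specificity reported for the Ea(dyn) > 0.89 cutoff follow from classifying each patient against that threshold. A minimal sketch with hypothetical Ea(dyn) values and MAP-responder labels (the published figures come from the study's 25 patients, not these numbers):

```python
def sens_spec_at_cutoff(values, labels, cutoff):
    """Sensitivity and specificity of the rule 'predict responder if
    value > cutoff'. labels: 1 = MAP responder, 0 = non-responder."""
    tp = sum(1 for v, y in zip(values, labels) if y == 1 and v > cutoff)
    fn = sum(1 for v, y in zip(values, labels) if y == 1 and v <= cutoff)
    tn = sum(1 for v, y in zip(values, labels) if y == 0 and v <= cutoff)
    fp = sum(1 for v, y in zip(values, labels) if y == 0 and v > cutoff)
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 − specificity yields the ROC curve whose area (0.986 here) the study reports.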
Abstract:
BACKGROUND Evidence associating exposure to water disinfection by-products with reduced birth weight and altered duration of gestation remains inconclusive. OBJECTIVE We assessed exposure to trihalomethanes (THMs) during pregnancy through different water uses and evaluated the association with birth weight, small for gestational age (SGA), low birth weight (LBW), and preterm delivery. METHODS Mother-child cohorts set up in five Spanish areas during the years 2000-2008 contributed data on water ingestion, showering, bathing, and swimming in pools. We ascertained residential THM levels during pregnancy periods through ad hoc sampling campaigns (828 measurements) and regulatory data (264 measurements), which were modeled and combined with personal water use and uptake factors to estimate personal uptake. We defined outcomes following standard definitions and included 2,158 newborns in the analysis. RESULTS Median residential THM ranged from 5.9 μg/L (Valencia) to 114.7 μg/L (Sabadell), and speciation differed across areas. We estimated that 89% of residential chloroform and 96% of brominated THM uptakes were from showering/bathing. The estimated change of birth weight for a 10% increase in residential uptake was -0.45 g (95% confidence interval: -1.36, 0.45 g) for chloroform and 0.16 g (-1.38, 1.70 g) for brominated THMs. Overall, THMs were not associated with SGA, LBW, or preterm delivery. CONCLUSIONS Despite the high THM levels in some areas and the extensive exposure assessment, results suggest that residential THM exposure during pregnancy driven by inhalation and dermal contact routes is not associated with birth weight, SGA, LBW, or preterm delivery in Spain.
Abstract:
BACKGROUND. A growing body of research suggests that prenatal exposure to air pollution may be harmful to fetal development. We assessed the association between exposure to air pollution during pregnancy and anthropometric measures at birth in four areas within the Spanish Children's Health and Environment (INMA) mother and child cohort study. METHODS. Exposure to ambient nitrogen dioxide (NO2) and benzene was estimated for the residence of each woman (n = 2,337) for each trimester and for the entire pregnancy. Outcomes included birth weight, length, and head circumference. The association between residential outdoor air pollution exposure and birth outcomes was assessed with linear regression models controlled for potential confounders. We also performed sensitivity analyses for the subset of women who spent more time at home during pregnancy. Finally, we performed a combined analysis with meta-analysis techniques. RESULTS. In the combined analysis, an increase of 10 µg/m3 in NO2 exposure during pregnancy was associated with a change in birth length of -0.9 mm [95% confidence interval (CI), -1.8 to -0.1 mm]. For the subset of women who spent ≥15 hr/day at home, the association was stronger (-0.16 mm; 95% CI, -0.27 to -0.04). For this same subset of women, a reduction of 22 g in birth weight was associated with each 10-µg/m3 increase in NO2 exposure in the second trimester (95% CI, -45.3 to 1.9 g). We observed no significant relationship between benzene levels and birth outcomes. CONCLUSIONS. NO2 exposure was associated with reductions in both length and weight at birth. This association was clearer for the subset of women who spent more time at home.
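The combined analysis above pools area-specific regression coefficients with meta-analysis techniques. A minimal sketch of inverse-variance fixed-effect pooling, with illustrative numbers (the abstract does not state whether a fixed- or random-effects model was used):

```python
def fixed_effect_meta(estimates, std_errors):
    """Inverse-variance fixed-effect pooled estimate and its standard error.
    Each study is weighted by 1 / SE^2, so precise studies dominate."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return pooled, se_pooled
```

Two equally precise estimates are simply averaged, and the pooled standard error shrinks by a factor of sqrt(2) relative to either study alone.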
Abstract:
BACKGROUND Previous studies have demonstrated the efficacy of treatment for latent tuberculosis infection (TLTBI) in persons infected with the human immunodeficiency virus, but few studies have investigated the operational aspects of implementing TLTBI in the co-infected population.The study objectives were to describe eligibility for TLTBI as well as treatment prescription, initiation and completion in an HIV-infected Spanish cohort and to investigate factors associated with treatment completion. METHODS Subjects were prospectively identified between 2000 and 2003 at ten HIV hospital-based clinics in Spain. Data were obtained from clinical records. Associations were measured using the odds ratio (OR) and its 95% confidence interval (95% CI). RESULTS A total of 1242 subjects were recruited and 846 (68.1%) were evaluated for TLTBI. Of these, 181 (21.4%) were eligible for TLTBI either because they were tuberculin skin test (TST) positive (121) or because their TST was negative/unknown but they were known contacts of a TB case or had impaired immunity (60). Of the patients eligible for TLTBI, 122 (67.4%) initiated TLTBI: 99 (81.1%) were treated with isoniazid for 6, 9 or 12 months; and 23 (18.9%) with short-course regimens including rifampin plus isoniazid and/or pyrazinamide. In total, 70 patients (57.4%) completed treatment, 39 (32.0%) defaulted, 7 (5.7%) interrupted treatment due to adverse effects, 2 developed TB, 2 died, and 2 moved away. Treatment completion was associated with having acquired HIV infection through heterosexual sex as compared to intravenous drug use (OR:4.6; 95% CI:1.4-14.7) and with having taken rifampin and pyrazinamide for 2 months as compared to isoniazid for 9 months (OR:8.3; 95% CI:2.7-24.9). CONCLUSIONS A minority of HIV-infected patients eligible for TLTBI actually starts and completes a course of treatment. Obstacles to successful implementation of this intervention need to be addressed.
Abstract:
BACKGROUND Temporomandibular disorder (TMD) is a multifactorial syndrome related to a critical period of human life. TMD has been associated with psychological dysfunction, oxidative state and sexual dimorphism, with coincidental occurrence along pubertal development. In this work we study the association between TMD and genetic polymorphisms involved in folate metabolism, neurotransmission, and oxidative and hormonal metabolism. Folate metabolism, which depends on gene variants and diet, is directly involved in the genetic and epigenetic variations that can influence the changes of the last growth period of human development and the appearance of TMD. METHODS A case-control study was designed to evaluate the impact of the genetic polymorphisms described above on TMD. A total of 229 individuals (69% women) were included in the study: 86 patients with TMD and 143 healthy control subjects. Subjects underwent a clinical examination following the guidelines of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD). Genotyping of 20 single nucleotide polymorphisms (SNPs), divided into two groups, was performed by multiplex minisequencing preceded by multiplex PCR. Seven other genetic polymorphisms different from SNPs (deletions, insertions, tandem repeats, null genotypes) were typed by multiplex PCR. A chi-square test was performed to determine the differences in genotype and allelic frequencies between TMD patients and healthy subjects. To estimate TMD risk for those polymorphisms that showed significant differences, odds ratios (OR) with 95% confidence intervals were calculated. RESULTS Six of the polymorphisms showed statistical associations with TMD.
Four of them are related to enzymes of folate metabolism: allele G of Serine Hydroxymethyltransferase 1 (SHMT1) rs1979277 (OR = 3.99; 95%CI 1.72, 9.25; p = 0.002), allele G of SHMT1 rs638416 (OR = 2.80; 95%CI 1.51, 5.21; p = 0.013), allele T of Methylenetetrahydrofolate Dehydrogenase (MTHFD) rs2236225 (OR = 3.09; 95%CI 1.27, 7.50; p = 0.016) and allele A of Methionine Synthase Reductase (MTRR) rs1801394 (OR = 2.35; 95%CI 1.10, 5.00; p = 0.037). The other two are the null allele of an inflammatory oxidative stress enzyme, Glutathione S-Transferase Mu-1 (GSTM1) (OR = 2.21; 95%CI 1.24, 4.36; p = 0.030), and the long allele of the 48-bp repeat of a neurotransmission receptor, Dopamine Receptor D4 (DRD4) (OR = 3.62; 95%CI 0.76, 17.26; p = 0.161). CONCLUSIONS Genetic polymorphisms related to folate metabolism, inflammatory oxidative stress, and neurotransmission responses to pain have been significantly associated with the TMD syndrome.
Abstract:
BACKGROUND Challenges exist in the clinical diagnosis of drug-induced liver injury (DILI) and in obtaining information on hepatotoxicity in humans. OBJECTIVE (i) To develop a unified list that combines drugs incriminated in well vetted or adjudicated DILI cases from many recognized sources and drugs that have been subjected to serious regulatory actions due to hepatotoxicity; and (ii) to supplement the drug list with data on reporting frequencies of liver events in the WHO individual case safety report database (VigiBase). DATA SOURCES AND EXTRACTION We collected (i) drugs identified as causes of DILI at three major DILI registries; (ii) drugs identified as causes of drug-induced acute liver failure (ALF) in six different data sources, including major ALF registries and previously published ALF studies; and (iii) drugs subjected to serious governmental regulatory actions due to their hepatotoxicity in Europe or the US. The reporting frequency of adverse events was determined using VigiBase, computed as the Empirical Bayes Geometric Mean (EBGM) with a 90% confidence interval for two customized terms, 'overall liver injury' and 'ALF'. An EBGM ≥2 was considered a disproportionate increase in reporting frequency. The identified drugs were then characterized in terms of regional divergence, published case reports, serious regulatory actions, and reporting frequency of 'overall liver injury' and 'ALF' calculated from VigiBase. DATA SYNTHESIS After excluding herbs, supplements and alternative medicines, a total of 385 individual drugs were identified; 319 drugs were identified in the three DILI registries, 107 from the six ALF registries (or studies) and 47 drugs were subjected to suspension or withdrawal in the US or Europe due to their hepatotoxicity. The identified drugs varied significantly between Spain, the US and Sweden.
Of the 319 drugs identified in the DILI registries of adjudicated cases, 93.4% were found in published case reports, 1.9% were suspended or withdrawn due to hepatotoxicity and 25.7% were also identified in the ALF registries/studies. In VigiBase, 30.4% of the 319 drugs were associated with a disproportionately higher reporting frequency of 'overall liver injury' and 83.1% were associated with at least one reported case of ALF. CONCLUSIONS This newly developed list of drugs associated with hepatotoxicity and the multifaceted analysis of hepatotoxicity will aid in causality assessment and clinical diagnosis of DILI, and will provide a basis for further characterization of hepatotoxicity.
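The EBGM used above applies empirical-Bayes shrinkage (the MGPS algorithm) to a simple observed/expected reporting ratio. As an illustration of that underlying ratio only, with hypothetical counts (the shrinkage step that produces the actual EBGM is deliberately omitted):

```python
def relative_reporting_ratio(n_drug_event, n_drug, n_event, n_total):
    """Observed/expected count of drug-event report pairs, assuming
    independence of drug and event in the database. EBGM is a
    Bayesian-shrunk version of this ratio that pulls small-count
    signals toward 1."""
    expected = n_drug * n_event / n_total
    return n_drug_event / expected
```

For example, 10 reports pairing a drug (100 total reports) with a liver event (1,000 total reports) in a database of 100,000 reports gives a ratio of 10, well above the ≥2 disproportionality threshold.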