Abstract:
Objective To compute the burden of cancer attributable to current and former alcohol consumption in eight European countries based on direct relative risk estimates from a cohort study. Design Combination of prospective cohort study with representative population based data on alcohol exposure. Setting Eight countries (France, Italy, Spain, United Kingdom, the Netherlands, Greece, Germany, Denmark) participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) study. Participants 109 118 men and 254 870 women, mainly aged 37-70. Main outcome measures Hazard rate ratios expressing the relative risk of cancer incidence for former and current alcohol consumption among EPIC participants. Hazard rate ratios combined with representative information on alcohol consumption to calculate alcohol attributable fractions of causally related cancers by country and sex. Partial alcohol attributable fractions for consumption higher than the recommended upper limit (two drinks a day for men with about 24 g alcohol, one for women with about 12 g alcohol) and the estimated total annual number of cases of alcohol attributable cancer. Results If we assume causality, among men and women, 10% (95% confidence interval 7 to 13%) and 3% (1 to 5%) of the incidence of total cancer was attributable to former and current alcohol consumption in the selected European countries. For selected cancers the figures were 44% (31 to 56%) and 25% (5 to 46%) for upper aerodigestive tract, 33% (11 to 54%) and 18% (−3 to 38%) for liver, 17% (10 to 25%) and 4% (−1 to 10%) for colorectal cancer for men and women, respectively, and 5.0% (2 to 8%) for female breast cancer. A substantial part of the alcohol attributable fraction in 2008 was associated with alcohol consumption higher than the recommended upper limit: 33 037 of 178 578 alcohol related cancer cases in men and 17 470 of 397 043 alcohol related cases in women. Conclusions In western Europe, an important proportion of cases of cancer can be attributed to alcohol consumption, especially consumption higher than the recommended upper limits. These data support current political efforts to reduce or to abstain from alcohol consumption to reduce the incidence of cancer.
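To make the arithmetic concrete: alcohol attributable fractions of this kind come from combining category-specific relative risks with the population prevalence of each consumption category. A minimal sketch of the standard population attributable fraction formula (Levin's formula generalized to several exposure levels), using purely illustrative numbers rather than the EPIC estimates:

```python
# Population attributable fraction (PAF) for a categorical exposure:
#   PAF = sum_i p_i*(RR_i - 1) / (1 + sum_i p_i*(RR_i - 1))
# where p_i is the population prevalence of exposure category i and RR_i is
# its relative risk versus the unexposed. Numbers below are hypothetical.

def attributable_fraction(prevalences, relative_risks):
    excess = sum(p * (rr - 1.0) for p, rr in zip(prevalences, relative_risks))
    return excess / (1.0 + excess)

p = [0.30, 0.15, 0.05]   # hypothetical prevalence of three drinking levels
rr = [1.1, 1.4, 2.0]     # hypothetical relative risks vs non-drinkers
print(f"PAF = {attributable_fraction(p, rr):.1%}")   # -> PAF = 12.3%
```

The partial fractions for consumption above the recommended limit follow the same logic, restricting the sum to the categories above the threshold.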
Abstract:
Objective To examine the association between pre-diagnostic circulating vitamin D concentration, dietary intake of vitamin D and calcium, and the risk of colorectal cancer in European populations. Design Nested case-control study. Setting The study was conducted within the EPIC study, a cohort of more than 520 000 participants from 10 western European countries. Participants 1248 cases of incident colorectal cancer, which developed after enrolment into the cohort, were matched to 1248 controls. Main outcome measures Circulating vitamin D concentration (25-hydroxy-vitamin-D, 25-(OH)D) was measured by enzyme immunoassay. Dietary and lifestyle data were obtained from questionnaires. Incidence rate ratios and 95% confidence intervals for the risk of colorectal cancer by 25-(OH)D concentration and levels of dietary calcium and vitamin D intake were estimated from multivariate conditional logistic regression models, with adjustment for potential dietary and other confounders. Results 25-(OH)D concentration showed a strong inverse linear dose-response association with risk of colorectal cancer (P for trend <0.001). Compared with a pre-defined mid-level concentration of 25-(OH)D (50.0-75.0 nmol/l), lower levels were associated with higher colorectal cancer risk (<25.0 nmol/l: incidence rate ratio 1.32 (95% confidence interval 0.87 to 2.01); 25.0-49.9 nmol/l: 1.28 (1.05 to 1.56)), and higher concentrations were associated with lower risk (75.0-99.9 nmol/l: 0.88 (0.68 to 1.13); ≥100.0 nmol/l: 0.77 (0.56 to 1.06)). In analyses by quintile of 25-(OH)D concentration, patients in the highest quintile had a 40% lower risk of colorectal cancer than did those in the lowest quintile (P<0.001). Subgroup analyses showed a strong association for colon but not rectal cancer (P for heterogeneity=0.048). Greater dietary intake of calcium was associated with a lower colorectal cancer risk. Dietary vitamin D was not associated with disease risk. Findings did not vary by sex and were not altered by corrections for season or month of blood donation. Conclusions The results of this large observational study indicate a strong inverse association between levels of pre-diagnostic 25-(OH)D concentration and risk of colorectal cancer in western European populations. Further randomised trials are needed to assess whether increases in circulating 25-(OH)D concentration can effectively decrease the risk of colorectal cancer.
Abstract:
Objective: To determine the values of, and study the relationships among, central corneal thickness (CCT), intraocular pressure (IOP), and degree of myopia (DM) in an adult myopic population aged 20 to 40 years in Almeria (southeast Spain). To our knowledge this is the first study of this kind in this region. Methods: An observational, descriptive, cross-sectional study was done in which a sample of 310 myopic patients (620 eyes) aged 20 to 40 years was selected by gender- and age-stratified sampling, proportionally fixed to the size of the population strata, assuming a 20% prevalence of myopia, a 5% margin of error (epsilon), and a 95% confidence level. We studied IOP, CCT, and DM and their relationships by calculating the mean, standard deviation, 95% confidence interval for the mean, median, Fisher's skewness coefficient, and range (maximum, minimum), and by applying the Brown-Forsythe robust test for each variable (IOP, CCT, and DM). Results: In the adult myopic population of Almeria aged 20 to 40 years (mean age, 29.8 years), the mean overall CCT was 550.12 μm. The corneas of men were thicker than those of women (P = 0.014). CCT was stable, as no significant differences were seen in the 20- to 40-year-old subjects' CCT values. The mean overall IOP was 13.60 mmHg. Men had a higher IOP than women (P = 0.002). Subjects over 30 years (13.83 mmHg) had a higher IOP than those under 30 (13.38 mmHg) (P = 0.04). The mean overall DM was −4.18 diopters. Men had less myopia than women (P < 0.001). Myopia was stable in the 20- to 40-year-old study population (P = 0.089). A linear relationship was found between CCT and IOP (R2 = 0.152, P ≤ 0.001): CCT accounted for 15.2% of the variation in IOP. However, no linear relationship between DM and IOP, or between CCT and DM, was found. Conclusions: CCT was found to be similar to that reported in other studies of different populations. IOP tends to increase after the age of 30, an increase that is not accounted for by alterations in CCT values.
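The sampling described (a 20% assumed prevalence, 5% epsilon, and 95% confidence) corresponds to the usual normal-approximation formula for sizing a prevalence estimate. A quick order-of-magnitude check under that assumption:

```python
import math

# n = z^2 * p * (1 - p) / e^2  (normal approximation for a proportion)
z = 1.96   # 95% confidence level
p = 0.20   # assumed prevalence of myopia
e = 0.05   # epsilon: absolute margin of error

n = z ** 2 * p * (1 - p) / e ** 2
print(math.ceil(n))   # -> 246, before stratification adjustments
```

The 310 patients actually recruited are consistent with this order of magnitude once the gender- and age-stratified design is layered on top.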
Abstract:
OBJECTIVE: To evaluate the long-term impact of successive interventions on rates of methicillin-resistant Staphylococcus aureus (MRSA) colonization or infection and MRSA bacteremia in an endemic hospital-wide situation. DESIGN: Quasi-experimental, interrupted time-series analysis. The impact of the interventions was analyzed by use of segmented regression. Representative MRSA isolates were typed by use of pulsed-field gel electrophoresis. SETTING: A 950-bed teaching hospital in Seville, Spain. PATIENTS: All patients admitted to the hospital during the period from 1995 through 2008. METHODS: Three successive interventions were studied: (1) contact precautions, with no active surveillance for MRSA; (2) targeted active surveillance for MRSA in patients and healthcare workers in specific wards, prioritized according to clinical epidemiology data; and (3) targeted active surveillance for MRSA in patients admitted from other medical centers. RESULTS: Neither the preintervention rate of MRSA colonization or infection (0.56 cases per 1,000 patient-days [95% confidence interval {CI}, 0.49-0.62 cases per 1,000 patient-days]) nor the slope for the rate of MRSA colonization or infection changed significantly after the first intervention. The rate decreased significantly to 0.28 cases per 1,000 patient-days (95% CI, 0.17-0.40 cases per 1,000 patient-days) after the second intervention and to 0.07 cases per 1,000 patient-days (95% CI, 0.06-0.08 cases per 1,000 patient-days) after the third intervention, and the rate remained at a similar level for 8 years. The MRSA bacteremia rate decreased by 80%, whereas the rate of bacteremia due to methicillin-susceptible S. aureus did not change. Eighty-three percent of the MRSA isolates identified were clonally related. All MRSA isolates obtained from healthcare workers were clonally related to those recovered from patients who were in their care. CONCLUSION: Our data indicate that long-term control of endemic MRSA is feasible in tertiary care centers. The use of targeted active surveillance for MRSA in patients and healthcare workers in specific wards (identified by means of analysis of clinical epidemiology data) and the use of decolonization were key to the success of the program.
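Segmented regression for an interrupted time series of this kind is usually specified with a baseline trend plus a level-change and a slope-change term for each intervention. A minimal one-intervention sketch on hypothetical monthly rates, assuming statsmodels as the fitting tool:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly MRSA rates (cases per 1,000 patient-days);
# a single intervention is assumed to take effect at month 4.
rate = np.array([0.60, 0.55, 0.58, 0.52, 0.30, 0.28, 0.25, 0.24])
t = np.arange(len(rate))
post = (t >= 4).astype(float)   # 1 from the intervention onwards
t_post = post * (t - 4)         # months elapsed since the intervention

# rate = b0 + b1*t + b2*post + b3*t_post:
# b2 estimates the immediate level change, b3 the change in slope.
X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(rate, X).fit()
print(fit.params)  # [baseline level, baseline slope, level change, slope change]
```

A three-intervention analysis such as the one described would add one level/slope pair per intervention to the design matrix.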
Abstract:
BACKGROUND To assess and compare the effectiveness and costs of Phototest, Mini Mental State Examination (MMSE), and Memory Impairment Screen (MIS) to screen for dementia (DEM) and cognitive impairment (CI). METHODS A phase III study was conducted over one year in consecutive patients with suspicion of CI or DEM at four Primary Care (PC) centers. After undergoing all screening tests at the PC center, participants were extensively evaluated by researchers blinded to screening test results in a Cognitive-Behavioral Neurology Unit (CBNU). The gold standard diagnosis was established by consensus of expert neurologists. Effectiveness was assessed by the proportion of correct diagnoses (diagnostic accuracy [DA]) and by the kappa index of concordance between test results and gold standard diagnoses. Costs were based on public prices and hospital accounts. RESULTS The study included 140 subjects (48 with DEM, 37 with CI without DEM, and 55 without CI). The MIS could not be applied to 23 illiterate subjects (16.4%). For DEM, the maximum effectiveness of the MMSE was obtained with different cutoff points as a function of educational level [k = 0.31 (95% confidence interval [95% CI], 0.19-0.43), DA = 0.60 (95% CI, 0.52-0.68)], and that of the MIS with a cutoff of 3/4 [k = 0.63 (95% CI, 0.48-0.78), DA = 0.83 (95% CI, 0.80-0.92)]. Effectiveness of the Phototest [k = 0.71 (95% CI, 0.59-0.83), DA = 0.87 (95% CI, 0.80-0.92)] was similar to that of the MIS and higher than that of the MMSE. Costs were higher with MMSE (275.9 ± 193.3 € [mean ± sd, euros]) than with Phototest (208.2 ± 196.8 €) or MIS (201.3 ± 193.4 €), whose costs did not significantly differ. For CI, the effectiveness did not significantly differ between MIS [k = 0.59 (95% CI, 0.45-0.74), DA = 0.79 (95% CI, 0.64-0.97)] and Phototest [k = 0.58 (95% CI, 0.45-0.74), DA = 0.78 (95% CI, 0.64-0.95)] and was lowest for the MMSE [k = 0.27 (95% CI, 0.09-0.45), DA = 0.69 (95% CI, 0.56-0.84)]. Costs were higher for MMSE (393.4 ± 121.8 €) than for Phototest (287.0 ± 197.4 €) or MIS (300.1 ± 165.6 €), whose costs did not significantly differ. CONCLUSION MMSE is not an effective instrument in our setting. For both DEM and CI, the Phototest and MIS are more effective and less costly, with no difference between them. However, MIS could not be applied to the appreciable percentage of our population who were illiterate.
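Both effectiveness measures used here, diagnostic accuracy (DA) and the kappa index, are simple functions of the test-versus-gold-standard contingency table. A minimal sketch with hypothetical counts, not the study's data:

```python
# 2x2 table of screening test result vs gold standard (hypothetical counts):
#                  gold standard +   gold standard -
# test positive          tp                fp
# test negative          fn                tn
tp, fp, fn, tn = 40, 10, 8, 82
n = tp + fp + fn + tn

da = (tp + tn) / n   # diagnostic accuracy: proportion of correct diagnoses

# Cohen's kappa: observed agreement corrected for chance agreement.
p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
kappa = (da - p_chance) / (1 - p_chance)
print(f"DA = {da:.2f}, kappa = {kappa:.2f}")   # -> DA = 0.87, kappa = 0.72
```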
Abstract:
Introduction. Critically ill patients suffer from oxidative stress caused by reactive oxygen species (ROS) and reactive nitrogen species (RNS). Although ROS/RNS are constantly produced under normal circumstances, critical illness can drastically increase their production. These patients have reduced plasma and intracellular levels of antioxidants and free electron scavengers or cofactors, and decreased activity of the enzymatic system involved in ROS detoxification. The pro-oxidant/antioxidant balance is of functional relevance during critical illness because it is involved in the pathogenesis of multiple organ failure. The objective of this study was to evaluate the relation between oxidative stress, antioxidant vitamin intake, and severity of illness in critically ill patients. Methods. Spectrophotometry was used to measure in plasma the total antioxidant capacity and levels of lipid peroxide, carbonyl group, total protein, bilirubin and uric acid at two time points: at intensive care unit (ICU) admission and on day seven. Daily diet records were kept and compliance with the recommended dietary allowance (RDA) of antioxidant vitamins (A, C and E) was assessed. Results. Between admission and day seven in the ICU, significant increases in lipid peroxide and carbonyl group were associated with decreased antioxidant capacity and greater deterioration in Sequential Organ Failure Assessment score. There was significantly greater worsening in oxidative stress parameters in patients who received antioxidant vitamins at below 66% of RDA than in those who received antioxidant vitamins at above 66% of RDA. An antioxidant vitamin intake from 66% to 100% of RDA reduced the risk for worsening oxidative stress by 94% (odds ratio 0.06, 95% confidence interval 0.010 to 0.39), regardless of change in severity of illness (Sequential Organ Failure Assessment score). Conclusion. The critical condition of patients admitted to the ICU is associated with worsening oxidative stress. Intake of antioxidant vitamins below 66% of RDA and alteration in endogenous levels of substances with antioxidant capacity are related to redox imbalance in critically ill patients. Therefore, intake of antioxidant vitamins should be carefully monitored so that it is as close as possible to the RDA.
Abstract:
BACKGROUND Available screening tests for dementia are of limited usefulness because they are influenced by the patient's culture and educational level. The Eurotest, an instrument based on the knowledge and handling of money, was designed to overcome these limitations. The objective of this study was to evaluate the diagnostic accuracy of the Eurotest in identifying dementia in customary clinical practice. METHODS A cross-sectional, multi-center, naturalistic phase II study was conducted. The Eurotest was administered to consecutive patients, older than 60 years, in general neurology clinics. The patients' condition was classified as dementia or no dementia according to DSM-IV diagnostic criteria. We calculated sensitivity (Sn), specificity (Sp) and area under the ROC curves (aROC) with 95% confidence intervals. The influence of social and educational factors on scores was evaluated with multiple linear regression analysis, and the influence of these factors on diagnostic accuracy was evaluated with logistic regression. RESULTS Sixteen neurologists recruited a total of 516 participants: 101 with dementia, 380 without dementia, and 35 who were excluded. Of the 481 participants who took the Eurotest, 38.7% were totally or functionally illiterate and 45.5% had received no formal education. Mean time needed to administer the test was 8.2 ± 2.0 minutes. The best cut-off point was 20/21, with Sn = 0.91 (0.84-0.96), Sp = 0.82 (0.77-0.85), and aROC = 0.93 (0.91-0.95). Neither the scores on the Eurotest nor its diagnostic accuracy were influenced by social or educational factors. CONCLUSION This naturalistic and pragmatic study shows that the Eurotest is a rapid, simple and useful screening instrument, which is free from educational influences, and has appropriate internal and external validity.
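The sensitivity and specificity with 95% confidence intervals reported for the 20/21 cut-off are computed from the confusion matrix at that cut-off. A sketch using the normal-approximation (Wald) interval, with hypothetical counts chosen only to land near the reported point estimates:

```python
import math

def proportion_ci(k, n, z=1.96):
    """Wald 95% CI for a proportion k/n (normal approximation)."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical: 92 of 101 dementia cases score at or below the cut-off,
# and 312 of 380 non-demented participants score above it.
sn = proportion_ci(92, 101)    # sensitivity
sp = proportion_ci(312, 380)   # specificity
print("Sn = %.2f (%.2f-%.2f), Sp = %.2f (%.2f-%.2f)" % (sn + sp))
```

Exact (Clopper-Pearson) intervals, which the authors may have used instead, are slightly wider at these sample sizes.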
Abstract:
BACKGROUND Previous studies have demonstrated the efficacy of treatment for latent tuberculosis infection (TLTBI) in persons infected with the human immunodeficiency virus, but few studies have investigated the operational aspects of implementing TLTBI in the co-infected population. The study objectives were to describe eligibility for TLTBI as well as treatment prescription, initiation and completion in an HIV-infected Spanish cohort and to investigate factors associated with treatment completion. METHODS Subjects were prospectively identified between 2000 and 2003 at ten HIV hospital-based clinics in Spain. Data were obtained from clinical records. Associations were measured using the odds ratio (OR) and its 95% confidence interval (95% CI). RESULTS A total of 1242 subjects were recruited and 846 (68.1%) were evaluated for TLTBI. Of these, 181 (21.4%) were eligible for TLTBI either because they were tuberculin skin test (TST) positive (121) or because their TST was negative/unknown but they were known contacts of a TB case or had impaired immunity (60). Of the patients eligible for TLTBI, 122 (67.4%) initiated TLTBI: 99 (81.1%) were treated with isoniazid for 6, 9 or 12 months; and 23 (18.9%) with short-course regimens including rifampin plus isoniazid and/or pyrazinamide. In total, 70 patients (57.4%) completed treatment, 39 (32.0%) defaulted, 7 (5.7%) interrupted treatment due to adverse effects, 2 developed TB, 2 died, and 2 moved away. Treatment completion was associated with having acquired HIV infection through heterosexual sex as compared to intravenous drug use (OR: 4.6; 95% CI: 1.4-14.7) and with having taken rifampin and pyrazinamide for 2 months as compared to isoniazid for 9 months (OR: 8.3; 95% CI: 2.7-24.9). CONCLUSIONS A minority of HIV-infected patients eligible for TLTBI actually starts and completes a course of treatment. Obstacles to successful implementation of this intervention need to be addressed.
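The odds ratios with 95% confidence intervals used here are standard 2×2-table estimates. A minimal sketch of Woolf's logit method, with hypothetical counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI from a 2x2 table (Woolf's logit method).
    a/b = outcome present/absent in the exposed group,
    c/d = outcome present/absent in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts: completion vs non-completion by transmission group.
print("OR = %.1f (95%% CI %.1f-%.1f)" % odds_ratio_ci(20, 5, 30, 35))
```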
Abstract:
BACKGROUND Taxanes are among the most active drugs for the treatment of metastatic breast cancer, and, as a consequence, they have also been studied in the adjuvant setting. METHODS After breast cancer surgery, women with lymph node-positive disease were randomly assigned to treatment with fluorouracil, epirubicin, and cyclophosphamide (FEC) or with FEC followed by weekly paclitaxel (FEC-P). The primary endpoint of the study, 5-year disease-free survival (DFS), was assessed by Kaplan-Meier analysis. Secondary endpoints included overall survival and analysis of the prognostic and predictive value of clinical and molecular (hormone receptors by immunohistochemistry and HER2 by fluorescence in situ hybridization) markers. Associations and interactions were assessed with a multivariable Cox proportional hazards model for DFS for the following covariates: age, menopausal status, tumor size, number of positive lymph nodes, type of chemotherapy, HER2 status, and hormone receptor status. All statistical tests were two-sided. RESULTS Among the 1246 eligible patients, estimated rates of DFS at 5 years were 78.5% in the FEC-P arm and 72.1% in the FEC arm (difference = 6.4%, 95% confidence interval [CI] = 1.6% to 11.2%; P = .006). FEC-P treatment was associated with a 23% reduction in the risk of relapse compared with FEC treatment (146 relapses in the 614 patients in the FEC-P arm vs 193 relapses in the 632 patients in the FEC arm, hazard ratio [HR] = 0.77, 95% CI = 0.62 to 0.95; P = .022) and a 22% reduction in the risk of death (73 and 95 deaths, respectively, HR = 0.78, 95% CI = 0.57 to 1.06; P = .110). Among the 928 patients for whom tumor samples were centrally analyzed, type of chemotherapy (FEC vs FEC-P) (P = .017), number of involved axillary lymph nodes (P < .001), tumor size (P = .020), hormone receptor status (P = .004), and HER2 status (P = .006) were all associated with DFS. We found no statistically significant interaction between HER2 status and paclitaxel treatment or between hormone receptor status and paclitaxel treatment. CONCLUSIONS Among patients with operable breast cancer, FEC-P treatment statistically significantly reduced the risk of relapse compared with FEC as adjuvant therapy.
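The 5-year DFS rates quoted here are read off Kaplan-Meier curves. A minimal sketch with the lifelines library on an invented toy data set (times and event flags are made up):

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Toy follow-up data: time to relapse or censoring (years) and event flag.
df = pd.DataFrame({
    "years":   [5.1, 2.3, 4.8, 1.2, 5.0, 3.7, 5.2, 0.9],
    "relapse": [0,   1,   0,   1,   0,   1,   0,   1],
})

kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["relapse"])
print(kmf.predict(5.0))          # Kaplan-Meier DFS estimate at 5 years
print(kmf.confidence_interval_)  # pointwise 95% confidence band
```

The hazard ratios in the same paragraph come from the Cox model described in the methods; lifelines' CoxPHFitter fits that class of model as well.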
Abstract:
BACKGROUND A catheter-based approach after fibrinolysis is recommended if fibrinolysis is likely to be successful in patients with acute ST-elevation myocardial infarction. We designed a 2×2 randomized, open-label, multicenter trial to evaluate the efficacy and safety of the paclitaxel-eluting stent and tirofiban administered after fibrinolysis but before catheterization to optimize the results of this reperfusion strategy. METHODS AND RESULTS We randomly assigned 436 patients with acute ST-elevation myocardial infarction to (1) bare-metal stent without tirofiban, (2) bare-metal stent with tirofiban, (3) paclitaxel-eluting stent without tirofiban, and (4) paclitaxel-eluting stent with tirofiban. All patients were initially treated with tenecteplase and enoxaparin. Tirofiban was started 120 minutes after tenecteplase in those patients randomly assigned to tirofiban. Cardiac catheterization was performed within the first 3 to 12 hours after inclusion, and stenting (randomized paclitaxel or bare stent) was applied to the culprit artery. The primary objectives were the rate of in-segment binary restenosis of paclitaxel-eluting stent compared with that of bare-metal stent and the effect of tirofiban on epicardial and myocardial flow before and after mechanical revascularization. At 12 months, in-segment binary restenosis was similar between paclitaxel-eluting stent and bare-metal stent (10.1% versus 11.3%; relative risk, 1.06; 95% confidence interval, 0.74 to 1.52; P=0.89). However, late lumen loss (0.04 ± 0.055 mm versus 0.27 ± 0.057 mm, P=0.003) was reduced in the paclitaxel-eluting stent group. No evidence was found of any association between the use of tirofiban and any improvement in the epicardial and myocardial perfusion. Major bleeding was observed in 6.1% of patients receiving tirofiban and in 2.7% of patients not receiving it (relative risk, 2.22; 95% confidence interval, 0.86 to 5.73; P=0.14). CONCLUSIONS This trial does not provide evidence to support the use of tirofiban after fibrinolysis to improve epicardial and myocardial perfusion. Compared with bare-metal stent, paclitaxel-eluting stent significantly reduced late loss but appeared not to reduce in-segment binary restenosis. CLINICAL TRIAL REGISTRATION URL: http://clinicaltrials.gov. Unique identifier: NCT00306228.
Abstract:
BACKGROUND Waist circumference (WC) is a simple and reliable measure of fat distribution that may add to the prediction of type 2 diabetes (T2D), but previous studies have been too small to reliably quantify the relative and absolute risk of future diabetes by WC at different levels of body mass index (BMI). METHODS AND FINDINGS The prospective InterAct case-cohort study was conducted in 26 centres in eight European countries and consists of 12,403 incident T2D cases and a stratified subcohort of 16,154 individuals from a total cohort of 340,234 participants with 3.99 million person-years of follow-up. We used Prentice-weighted Cox regression and random effects meta-analysis methods to estimate hazard ratios for T2D. Kaplan-Meier estimates of the cumulative incidence of T2D were calculated. BMI and WC were each independently associated with T2D, with WC being a stronger risk factor in women than in men. Risk increased across groups defined by BMI and WC; compared to low normal weight individuals (BMI 18.5-22.4 kg/m²) with a low WC (<94/80 cm in men/women), the hazard ratio of T2D was 22.0 (95% confidence interval 14.3; 33.8) in men and 31.8 (25.2; 40.2) in women with grade 2 obesity (BMI ≥35 kg/m²) and a high WC (>102/88 cm). Among the large group of overweight individuals, WC measurement was highly informative and facilitated the identification of a subgroup of overweight people with high WC whose 10-y T2D cumulative incidence (men, 70 per 1,000 person-years; women, 44 per 1,000 person-years) was comparable to that of the obese group (50-103 per 1,000 person-years in men and 28-74 per 1,000 person-years in women). CONCLUSIONS WC is independently and strongly associated with T2D, particularly in women, and should be more widely measured for risk stratification. If targeted measurement is necessary for reasons of resource scarcity, measuring WC in overweight individuals may be an effective strategy, since it identifies a high-risk subgroup of individuals who could benefit from individualised preventive action.
Abstract:
OBJECTIVE To assess the association between consumption of fried foods and risk of coronary heart disease. DESIGN Prospective cohort study. SETTING Spanish cohort of the European Prospective Investigation into Cancer and Nutrition. PARTICIPANTS 40 757 adults aged 29-69 and free of coronary heart disease at baseline (1992-6), followed up until 2004. MAIN OUTCOME MEASURES Coronary heart disease events and vital status identified by record linkage with hospital discharge registers, population based registers of myocardial infarction, and mortality registers. RESULTS During a median follow-up of 11 years, 606 coronary heart disease events and 1135 deaths from all causes occurred. Compared with being in the first (lowest) quarter of fried food consumption, the multivariate hazard ratio of coronary heart disease in the second quarter was 1.15 (95% confidence interval 0.91 to 1.45), in the third quarter was 1.07 (0.83 to 1.38), and in the fourth quarter was 1.08 (0.82 to 1.43; P for trend 0.74). The results did not vary between those who used olive oil for frying and those who used sunflower oil. Likewise, no association was observed between fried food consumption and all cause mortality: multivariate hazard ratio for the highest versus the lowest quarter of fried food consumption was 0.93 (95% confidence interval 0.77 to 1.14; P for trend 0.98). CONCLUSION In Spain, a Mediterranean country where olive or sunflower oil is used for frying, the consumption of fried foods was not associated with coronary heart disease or with all cause mortality.
Abstract:
BACKGROUND Earlier analyses within the EPIC study showed that dietary fibre intake was inversely associated with colorectal cancer risk, but results from some large cohort studies do not support this finding. We explored whether the association remained after longer follow-up with a near threefold increase in colorectal cancer cases, and if the association varied by gender and tumour location. METHODOLOGY/PRINCIPAL FINDINGS After a mean follow-up of 11.0 years, 4,517 incident cases of colorectal cancer were documented. Total, cereal, fruit, and vegetable fibre intakes were estimated from dietary questionnaires at baseline. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazards models stratified by age, sex, and centre, and adjusted for total energy intake, body mass index, physical activity, smoking, education, menopausal status, hormone replacement therapy, oral contraceptive use, and intakes of alcohol, folate, red and processed meats, and calcium. After multivariable adjustments, total dietary fibre was inversely associated with colorectal cancer (HR per 10 g/day increase in fibre 0.87, 95% CI: 0.79-0.96). Similar linear associations were observed for colon and rectal cancers. The association between total dietary fibre intake and colorectal cancer risk did not differ by age, sex, or anthropometric, lifestyle, and dietary variables. Fibre from cereals and fibre from fruit and vegetables were similarly associated with colon cancer; but for rectal cancer, the inverse association was only evident for fibre from cereals. CONCLUSIONS/SIGNIFICANCE Our results strengthen the evidence for the role of high dietary fibre intake in colorectal cancer prevention.
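The headline estimate, an HR per 10 g/day increase in fibre, is the exponentiated coefficient of a Cox proportional hazards model with fibre entered in 10 g/day units. A minimal lifelines sketch on invented toy data; the column names are assumptions, and a real analysis would add the stratification and adjustment covariates listed above:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy data: follow-up (years), colorectal cancer event flag, and fibre
# intake rescaled to 10 g/day units so exp(coef) is the HR per 10 g/day.
df = pd.DataFrame({
    "years":   [11.2, 9.8, 10.5, 7.1, 11.0, 4.3, 10.9, 8.7],
    "event":   [0,    1,   0,    1,   0,    1,   0,    1],
    "fibre10": [2.8,  1.2, 3.1,  0.9, 1.4,  2.3, 2.2,  1.5],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
cph.print_summary()   # exp(coef) of fibre10 is the hazard ratio per 10 g/day
```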
Abstract:
PURPOSE We aimed to ascertain the degree of association between bladder cancer and human papillomavirus (HPV) infection. MATERIALS AND METHODS We performed a meta-analysis of observational studies with cases and controls with publication dates up to January 2011. The PubMed electronic database was searched by using the key words "bladder cancer and virus." Twenty-one articles were selected that met the required methodological criteria. We implemented an internal quality control system to verify the selected search method. We analyzed the pooled effect of all the studies and also analyzed the studies by technique as follows: 1) studies with DNA-based techniques, among which we found studies with polymerase chain reaction (PCR)-based techniques and studies with non-PCR-based techniques, and 2) studies with non-DNA-based techniques. RESULTS Taking into account the 21 studies that were included in the meta-analysis, we obtained a heterogeneity chi-squared value of Qexp = 26.45 (p = 0.383). The pooled odds ratio (OR) was 2.13 (95% confidence interval [CI], 1.54 to 2.95), which points to a significant association between HPV and bladder cancer. Fifteen studies assessed the presence of DNA. The overall effect showed a significant relationship between virus presence and bladder cancer, with a pooled OR of 2.19 (95% CI, 1.40 to 3.43). Of the other six studies, four examined the virus's capsid antigen and two detected antibodies in serum by Western blot. The estimated pooled OR in this group was 2.11 (95% CI, 1.27 to 3.51), which confirmed the relationship between the presence of virus and cancer. CONCLUSIONS The pooled OR value showed a moderate relationship between viral infection and bladder tumors.
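The pooled OR and the heterogeneity statistic reported here follow from inverse-variance weighting on the log-OR scale. A minimal fixed-effect sketch with hypothetical per-study estimates (a random-effects model would further inflate the variances by the between-study component):

```python
import math

# Hypothetical per-study odds ratios with 95% CIs as (OR, lo, hi).
studies = [(1.8, 1.1, 3.0), (2.5, 1.3, 4.8), (2.0, 0.9, 4.4)]

log_or = [math.log(o) for o, lo, hi in studies]
# Recover each SE from the CI width: se = (ln(hi) - ln(lo)) / (2 * 1.96).
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for o, lo, hi in studies]
w = [1 / s ** 2 for s in se]   # inverse-variance weights

pooled = sum(wi * y for wi, y in zip(w, log_or)) / sum(w)
pooled_se = math.sqrt(1 / sum(w))
print("pooled OR = %.2f (%.2f to %.2f)" % (
    math.exp(pooled),
    math.exp(pooled - 1.96 * pooled_se),
    math.exp(pooled + 1.96 * pooled_se)))

# Cochran's Q: heterogeneity of the study effects around the pooled estimate.
q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, log_or))
print("Q = %.2f on %d df" % (q, len(studies) - 1))
```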
Abstract:
BACKGROUND The number of copies of the HLA-DRB1 shared epitope, and the minor alleles of the STAT4 rs7574865 and the PTPN22 rs2476601 polymorphisms have all been linked with an increased risk of developing rheumatoid arthritis. In the present study, we investigated the effects of these genetic variants on disease activity and disability in patients with early arthritis. METHODOLOGY AND RESULTS We studied 640 patients with early arthritis (76% women; median age, 52 years), recording disease-related variables every 6 months during a 2-year follow-up. HLA-DRB1 alleles were determined by PCR-SSO, while rs7574865 and rs2476601 were genotyped with the Taqman 5' allelic discrimination assay. Multivariate analysis was performed using generalized estimating equations for repeated measures. After adjusting for confounding variables such as gender, age and ACPA, the TT genotype of rs7574865 in STAT4 was associated with increased disease activity (DAS28) as compared with the GG genotype (β coefficient [95% confidence interval] = 0.42 [0.01 to 0.83], p = 0.044). Conversely, the presence of the T allele of rs2476601 in PTPN22 was associated with diminished disease activity during follow-up in a dose-dependent manner (CT genotype = -0.27 [-0.56 to -0.01], p = 0.042; TT genotype = -0.68 [-1.64 to 0.27], p = 0.162). After adjustment for gender, age and disease activity, homozygosity for the T allele of rs7574865 in STAT4 was associated with greater disability as compared with the GG genotype. CONCLUSIONS Our data suggest that patients with early arthritis who are homozygous for the T allele of rs7574865 in STAT4 may develop a more severe form of the disease with increased disease activity and disability.
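The repeated-measures analysis described, generalized estimating equations with DAS28 recorded every six months, can be sketched with statsmodels. The file name, column names, and working correlation structure below are hypothetical placeholders, not the study's data or specification:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Assumed layout: one row per patient visit with columns das28 (outcome),
# genotype ('GG'/'GT'/'TT'), gender, age, acpa, and patient_id (cluster).
df = pd.read_csv("early_arthritis_visits.csv")   # hypothetical file

model = smf.gee(
    "das28 ~ C(genotype, Treatment('GG')) + gender + age + acpa",
    groups="patient_id",                       # repeated measures per patient
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),   # assumed working correlation
    family=sm.families.Gaussian(),             # linear model for DAS28
)
print(model.fit().summary())   # TT-vs-GG beta ~ adjusted DAS28 difference
```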