Abstract:
Cardiovascular disease is one of the leading causes of death around the world. Resting heart rate has been shown to be a strong and independent risk marker for adverse cardiovascular events and mortality, and yet its role as a predictor of risk is somewhat overlooked in clinical practice. With the aim of highlighting its prognostic value, the role of resting heart rate as a risk marker for death and other adverse outcomes was further examined in a number of different patient populations. A systematic review of studies that previously assessed the prognostic value of resting heart rate for mortality and other adverse cardiovascular outcomes was presented. New analyses of nine clinical trials were carried out. Both the original Cox model and the extended Cox model, which allows for time-dependent covariates, were used to evaluate and compare the predictive value of baseline and time-updated heart rate measurements for adverse outcomes in the CAPRICORN, EUROPA, PROSPER, PERFORM, BEAUTIFUL and SHIFT populations. Pooled individual patient meta-analyses of the CAPRICORN, EPHESUS, OPTIMAAL and VALIANT trials, and of the BEAUTIFUL and SHIFT trials, were also performed. The discrimination and calibration of the models applied were evaluated using Harrell's C-statistic and likelihood ratio tests, respectively. Finally, following on from the systematic review, meta-analyses of the relation between baseline and time-updated heart rate and the risk of death from any cause and from cardiovascular causes were conducted. Both elevated baseline and elevated time-updated resting heart rates were found to be associated with an increase in the risk of mortality and other adverse cardiovascular events in all of the populations analysed. In some cases, elevated time-updated heart rate was associated with risk of events where baseline heart rate was not.
Time-updated heart rate also contributed additional information about the risk of certain events beyond that provided by baseline heart rate or previous heart rate measurements. The addition of resting heart rate to the models where it was found to be associated with risk of outcome improved both discrimination and calibration, and in general the models including time-updated heart rate along with the baseline or previous heart rate measurement had the highest (and similar) C-statistics, and thus the greatest discriminative ability. The meta-analyses demonstrated that a 5 bpm higher baseline heart rate was associated with a 7.9% and an 8.0% increase in the risk of all-cause and cardiovascular death, respectively (both p < 0.001). Additionally, a 5 bpm higher time-updated heart rate (adjusted for baseline heart rate in eight of the ten studies included in the analyses) was associated with a 12.8% (p < 0.001) and a 10.9% (p < 0.001) increase in the risk of all-cause and cardiovascular death, respectively. These findings may motivate health care professionals to routinely assess resting heart rate in order to identify individuals at a higher risk of adverse events. The fact that the addition of time-updated resting heart rate improved the discrimination and calibration of models for certain outcomes, even if only modestly, strengthens the case for adding it to traditional risk models. The findings are of particular importance, and have the greatest implications, for the clinical management of patients with pre-existing disease. An elevated, or increasing, heart rate over time could be used as a tool, potentially alongside other established risk scores, to help doctors identify patient deterioration or those at higher risk, who might benefit from more intensive monitoring or treatment re-evaluation. Further exploration of the role of continuous recording of resting heart rate, say, when patients are at home, would be informative.
In addition, investigation into the cost-effectiveness and optimal frequency of resting heart rate measurement is required. One of the most vital areas for future research is the establishment of an objective cut-off value defining a high resting heart rate.
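The per-5 bpm risk increases reported above come from hazard ratios on a log-linear scale, so they can be rescaled to other increments. A minimal illustrative sketch (not the thesis's actual code; only the pooled HR of 1.079 per 5 bpm is taken from the abstract):

```python
import math

def rescale_hazard_ratio(hr, from_increment, to_increment):
    """Rescale a hazard ratio reported per `from_increment` units of a
    covariate to a hazard ratio per `to_increment` units, assuming the
    log-linear relationship of a proportional hazards model."""
    beta_per_unit = math.log(hr) / from_increment  # log-hazard per unit
    return math.exp(beta_per_unit * to_increment)

# Pooled estimate from the abstract: HR 1.079 per 5 bpm higher baseline
# heart rate (a 7.9% increase in the risk of all-cause death).
hr_per_10 = rescale_hazard_ratio(1.079, from_increment=5, to_increment=10)
```

Under this assumption, a 10 bpm higher baseline heart rate corresponds to roughly a 16% higher risk (1.079² ≈ 1.164).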
Abstract:
Objective: Liver transplantation has been associated with a high prevalence of osteoporosis, although most data rely on single-center studies with limited sample sizes, most of them dating back to the late 1990s and early 2000s. The present thesis aims to assess the prevalence of fragility fractures and contributing factors in a large modern cohort of liver transplant recipients managed in a referral Italian Liver Transplant Center. Design and Methods: Paper and electronic medical records of 429 consecutive patients receiving liver transplantation from 1/1/2010 to 31/12/2015 were reviewed, and 366 patients were selected. Clinically obtained electronic radiological images acquired within 6 months of the date of liver transplant surgery, such as lateral views of spine X-rays or abdominal CT scans, were opportunistically reviewed in a blinded fashion to screen for morphometric vertebral fractures. Clinical fragility fractures reported in the medical records, along with information on the etiology of cirrhosis and biochemistries at the time of liver surgery, were also recorded. Results: The prevalence of fragility fractures in the whole cohort was 155/366 (42.3%), with no significant difference between sexes. Of patients with fractures, most sustained vertebral fractures (145/155, 93.5%), the majority of which were mild or moderate wedges. Multiple vertebral fractures were common (41.3%). Fracture rates were similar across different etiologies of cirrhosis and were also comparable in patients with diabetes or exposed to glucocorticoids. Kidney function was significantly worse in women with fractures. Independent of age, sex, alcohol use, eGFR, and etiology of liver disease, lower BMI was the only independent risk factor for fractures (adjusted OR 1.058, 95% CI 1.001-1.118, P=0.046) in this study population. Conclusions: A considerable fracture burden was shown in a large and modern cohort of liver transplant recipients.
Given the remarkably high prevalence of fractures, metabolic bone disease screening should be implemented for every patient awaiting liver transplantation.
Abstract:
Hypertensive patients exhibit higher cardiovascular risk and reduced lung function compared with the general population. Whether this association stems from the coexistence of two highly prevalent diseases or from direct or indirect links between pathophysiological mechanisms is presently unclear. This study investigated the association between lung function and carotid features in non-smoking hypertensive subjects with supposedly normal lung function. Hypertensive patients (n = 67) were cross-sectionally evaluated by clinical, hemodynamic, laboratory, and carotid ultrasound analysis. Forced vital capacity, forced expiratory volume in 1 second and in 6 seconds, and lung age were estimated by spirometry. Subjects with ventilatory abnormalities according to current guidelines were excluded. Regression analysis adjusted for age and prior smoking history showed that lung age and the percentage of predicted spirometric parameters were associated with common carotid intima-media thickness, diameter, and stiffness. Further analyses, adjusted for additional potential confounders, revealed that lung age was the spirometric parameter exhibiting the most significant regression coefficients with carotid features. Conversely, plasma C-reactive protein and matrix metalloproteinase-2/9 levels did not influence this relationship. The present findings point toward lung age as a potential marker of vascular remodeling and indicate that lung and vascular remodeling might share common pathophysiological mechanisms in hypertensive subjects.
Abstract:
To evaluate associations between polymorphisms of the N-acetyltransferase 2 (NAT2), human 8-oxoguanine glycosylase 1 (hOGG1) and X-ray repair cross-complementing protein 1 (XRCC1) genes and the risk of upper aerodigestive tract (UADT) cancer. A case-control study involving 117 cases and 224 controls was undertaken. The NAT2 gene polymorphisms were genotyped by automated sequencing, and the XRCC1 Arg399Gln and hOGG1 Ser326Cys polymorphisms were determined by Polymerase Chain Reaction followed by Restriction Fragment Length Polymorphism (PCR-RFLP) methods. The slow-metabolization phenotype was significantly associated with the development of UADT cancer (p=0.038). Furthermore, the slow-metabolization haplotype was also associated with UADT cancer (p=0.014). The hOGG1 Ser326Cys polymorphism (CG or GG vs. CC genotypes) was shown to be a protective factor against UADT cancer in moderate smokers (p=0.031). The XRCC1 Arg399Gln polymorphism (GA or AA vs. GG genotypes), in turn, was a protective factor against UADT cancer only among never-drinkers (p=0.048). Interactions involving NAT2, XRCC1 Arg399Gln and hOGG1 Ser326Cys polymorphisms may modulate the risk of UADT cancer in this population.
Abstract:
To compare neonatal deaths and complications in infants born at 34 weeks to 36 weeks and six days (late preterm: LPT) with those born at term (37 weeks to 41 weeks and six days); to compare deaths of early term (37-38 weeks) versus late term (39 weeks to 41 weeks and six days) infants; and to search for any temporal trend in the LPT rate. A retrospective cohort study of live births was conducted at Campinas State University, Brazil, from January 2004 to December 2010. Multiple pregnancies, malformations and congenital diseases were excluded. Control for confounders was performed. The level of significance was set at p<0.05. After exclusions, there were 17,988 births (1653 late preterm and 16,345 term infants). A higher mortality in LPT versus term infants was observed, with an adjusted odds ratio (OR) of 5.29 (p<0.0001). Most complications were significantly associated with LPT birth. There was a significant increase in the LPT rate throughout the study period, but no significant trend in the rate of medically indicated deliveries. A higher mortality was also observed in early term versus late term infants (adjusted OR 2.43, p=0.038). LPT and early term infants have a significantly higher risk of death.
Abstract:
To analyze associations between mammographic arterial mammary calcifications in menopausal women and risk factors for cardiovascular disease. This was a cross-sectional retrospective study, in which we analyzed the mammograms and medical records of 197 patients treated between 2004 and 2005. Study variables were: breast arterial calcifications, stroke, acute coronary syndrome, age, obesity, diabetes mellitus, smoking, and hypertension. For statistical analysis, we used the Mann-Whitney, χ2 and Cochran-Armitage tests, and also evaluated the prevalence ratios between these variables and mammary artery calcifications. Data were analyzed with SAS version 9.1 software. In the group of 197 women, the prevalence of arterial calcifications on mammograms was 36.6%. Among the risk factors analyzed, the most frequent were hypertension (56.4%), obesity (31.9%), smoking (15.2%), and diabetes (14.7%). Acute coronary syndrome and stroke had prevalences of 5.6% and 2.0%, respectively. Among the mammograms of women with diabetes, the odds ratio of mammary artery calcifications was 2.1 (95% CI 1.0-4.1), with a p-value of 0.02. On the other hand, the mammograms of smokers showed a low occurrence of breast arterial calcification, with an odds ratio of 0.3 (95% CI 0.1-0.8). Hypertension, obesity, stroke and acute coronary syndrome were not significantly associated with breast arterial calcification. The occurrence of breast arterial calcification was positively associated with diabetes mellitus and negatively associated with smoking. The presence of calcification was independent of the other risk factors for cardiovascular disease analyzed.
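The diabetes odds ratio above (2.1, 95% CI 1.0-4.1) is the standard cross-tabulation estimate with a Wald interval. A sketch of that computation on a hypothetical 2×2 table (the counts below are illustrative only, not the study's data):

```python
import math

def odds_ratio_with_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% confidence interval for a 2x2 table:
       a = exposed with calcification,     b = exposed without,
       c = non-exposed with calcification, d = non-exposed without."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts (NOT the study's data), chosen only to illustrate
# an odds ratio near the reported 2.1 for diabetes:
or_, lower, upper = odds_ratio_with_ci(a=15, b=14, c=57, d=111)
```

The interval is computed on the log scale because log(OR) is approximately normal in large samples, then exponentiated back.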
Abstract:
What is the contribution of the provision, at no cost for users, of long-acting reversible contraceptive methods (LARC; the copper intrauterine device [IUD], the levonorgestrel-releasing intrauterine system [LNG-IUS], contraceptive implants and the depot-medroxyprogesterone acetate [DMPA] injection) towards the disability-adjusted life years (DALY) averted through a Brazilian university-based clinic established over 30 years ago? Over the last 10 years of evaluation, provision of LARC methods and DMPA by the clinic is estimated to have contributed to the DALY averted by preventing between 37 and 60 maternal deaths, 315-424 child mortalities, 634-853 combined cases of maternal morbidity and mortality and child mortality, and 1056-1412 unsafe abortions. LARC methods are associated with high contraceptive effectiveness when compared with contraceptive methods which need frequent attention, perhaps because LARC methods are independent of individual or couple compliance. However, in general, previous studies have evaluated contraceptive methods in clinical studies over a short period of time, or not more than 10 years. Furthermore, information regarding the estimation of the DALY averted is scarce. We reviewed 50 004 medical charts from women who consulted for the first time looking for a contraceptive method over the period from 2 January 1980 through 31 December 2012. Women who consulted at the Department of Obstetrics and Gynaecology, University of Campinas, Brazil were new users and users switching contraceptives, including the copper IUD (n = 13 826), the LNG-IUS (n = 1525), implants (n = 277) and DMPA (n = 9387). Estimation of the DALY averted included maternal morbidity and mortality, child mortality and unsafe abortions averted. We obtained 29 416 contraceptive segments of use, including 25 009 segments from 20 821 new users or switchers to any LARC method or DMPA with at least 1 year of follow-up.
The mean (± SD) age of the women at first consultation ranged from 25.3 ± 5.7 (range 12-47) years in the 1980s, to 31.9 ± 7.4 (range 16-50) years in 2010-2011. The most common contraceptive chosen at the first consultation was copper IUD (48.3, 74.5 and 64.7% in the 1980s, 1990s and 2000s, respectively). For an evaluation over 20 years, the cumulative pregnancy rates (SEM) were 0.4 (0.2), 2.8 (2.1), 4.0 (0.4) and 1.3 (0.4) for the LNG-IUS, the implants, copper IUD and DMPA, respectively and cumulative continuation rates (SEM) were 15.1 (3.7), 3.9 (1.4), 14.1 (0.6) and 7.3 (1.7) for the LNG-IUS, implants, copper IUD and DMPA, respectively (P < 0.001). Over the last 10 years of evaluation, the estimation of the contribution of the clinic through the provision of LARC methods and DMPA to DALY averted was 37-60 maternal deaths; between 315 and 424 child mortalities; combined maternal morbidity and mortality and child mortality of between 634 and 853, and 1056-1412 unsafe abortions averted. The main limitations are the number of women who never returned to the clinic (overall 14% among the four methods under evaluation); consequently the pregnancy rate could be different. Other limitations include the analysis of two kinds of copper IUD and two kinds of contraceptive implants as the same IUD or implant, and the low number of users of implants. In addition, the DALY calculation relies on a number of estimates, which may vary in different parts of the world. LARC methods and DMPA are highly effective and women who were well-counselled used these methods for a long time. The benefit of averting maternal morbidity and mortality, child mortality, and unsafe abortions is an example to health policy makers to implement more family planning programmes and to offer contraceptive methods, mainly LARC and DMPA, at no cost or at affordable cost for the underprivileged population. 
This study received partial financial support from the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP), grant #2012/12810-4, and from the National Research Council (CNPq), grant #573747/2008-3. B.F.B., M.P.G., and V.M.C. were fellows of the scientific initiation programme of FAPESP. Since 2001, all TCu380A IUDs have been donated by Injeflex, São Paulo, Brazil, and since 2006 all LNG-IUS have been donated by the International Contraceptive Access Foundation (ICA), Turku, Finland. Both donations are unrestricted grants. The authors declare that there are no conflicts of interest associated with this study.
Abstract:
There is an increasing rate of papillary thyroid carcinomas that may never progress to cause symptoms or death. Predicting outcome and determining tumour aggressiveness could help diminish the number of patients submitted to aggressive treatments. We aimed to evaluate whether markers of the immune system response and of tumour-associated inflammation could predict the outcome of differentiated thyroid cancer (DTC) patients. Retrospective cohort study. We studied 399 consecutive patients, including 325 papillary and 74 follicular thyroid carcinomas. Immune cell markers were evaluated using immunohistochemistry, including tumour-associated macrophages (CD68) and subsets of tumour-infiltrating lymphocytes (TIL), such as CD3, CD4, CD8, CD16, CD20, CD45RO, GRANZYME B, CD69 and CD25. We also investigated the expression of cyclooxygenase 2 (COX2) in tumour cells and the presence of concurrent lymphocytic infiltration characterizing chronic thyroiditis. Concurrent lymphocytic infiltration characterizing chronic thyroiditis was observed in 29% of the cases. Among all the immunological parameters evaluated, only the enrichment of CD8+ lymphocytes (P = 0.001) and expression of COX2 (P = 0.01) were associated with recurrence. A multivariate model analysis identified CD8+ TIL/COX2 as an independent risk factor for recurrence. A multivariate analysis using Cox's proportional-hazards model adjusted for the presence of concurrent chronic thyroiditis demonstrated that concurrent chronic thyroiditis had no effect on the prognostic prediction mediated by CD8+ TIL and COX2. In conclusion, we suggest the use of a relatively simple pathology tool to help select cases that may benefit from a more aggressive approach, sparing the majority of patients from unnecessary procedures.
Abstract:
Urinary tract infection (UTI) is the most common infection posttransplant. However, the risk factors for and the impact of UTIs remain controversial. The aim of this study was to identify the incidence of posttransplant UTIs in a series of renal transplant recipients from deceased donors. Secondary objectives were to identify: (1) the most frequent infectious agents; (2) risk factors related to the donor; (3) risk factors related to recipients; and (4) the impact of UTI on graft function. This was a retrospective analysis of medical records from renal transplant patients from January to December 2010. The local ethics committee approved the protocol. The incidence of UTI in this series was 34.2%. Risk factors for UTI were older age (independent of gender), biopsy-proven acute rejection episodes, and kidneys from expanded criteria deceased donors (United Network for Organ Sharing criteria). For female patients, the number of pretransplant pregnancies was an additional risk factor. Recurrent UTI was observed in 44% of patients from the UTI group. The most common infectious agents were Escherichia coli and Klebsiella pneumoniae, for both isolated and recurrent UTI. No difference in renal graft function or immunosuppressive therapy was observed between groups after the 1-year follow-up. In this series, older age, previous pregnancy, kidneys from expanded criteria donors, and biopsy-proven acute rejection episodes were risk factors for posttransplant UTI. Recurrence of UTI was observed in 44%, with no negative impact on graft function or survival.
Abstract:
The aim of this study was to determine the frequency of leukemia in parents of patients with nonsyndromic cleft lip and/or cleft palate (NSCL/P). This case-control study evaluated first-degree family members of 358 patients with NSCL/P and of 1,432 subjects without craniofacial alterations or syndromes. Statistical analysis was carried out using Fisher's exact test. Of the 358 subjects with NSCL/P, 3 had a first-degree relative with a history of leukemia, while 2 of the 1,432 subjects from the unaffected group had a family history of leukemia. The frequency of a positive family history of leukemia was not significantly increased in first-degree relatives of patients with NSCL/P.
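Fisher's test, used above to compare the 3/358 versus 2/1432 family-history frequencies, can be written from scratch with the hypergeometric distribution. A minimal sketch (not the authors' code):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    the margins are fixed and we sum the hypergeometric probabilities of
    every table that is no more likely than the observed one."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d
    denom = comb(n, col1)

    def prob(x):  # probability that the top-left cell equals x
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p for p in (prob(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# Counts from the abstract: 3 of 358 NSCL/P families versus 2 of 1,432
# control families with a history of leukemia.
p_value = fisher_exact_two_sided(3, 355, 2, 1430)
```

For these counts the two-sided p-value comes out just above 0.05, consistent with the abstract's non-significant result.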
Abstract:
This study tested whether myocardial extracellular volume (ECV) is increased in patients with hypertension and atrial fibrillation (AF) undergoing pulmonary vein isolation and whether there is an association between ECV and post-procedural recurrence of AF. Hypertension is associated with myocardial fibrosis, an increase in ECV, and AF. Data linking these findings are limited. T1 measurements pre-contrast and post-contrast in a cardiac magnetic resonance (CMR) study provide a method for quantification of ECV. Consecutive patients with hypertension and recurrent AF referred for pulmonary vein isolation underwent a contrast CMR study with measurement of ECV and were followed up prospectively for a median of 18 months. The endpoint of interest was late recurrence of AF. Patients had elevated left ventricular (LV) volumes, LV mass, left atrial volumes, and increased ECV (patients with AF, 0.34 ± 0.03; healthy control patients, 0.29 ± 0.03; p < 0.001). There were positive associations between ECV and left atrial volume (r = 0.46, p < 0.01) and LV mass and a negative association between ECV and diastolic function (early mitral annular relaxation [E'], r = -0.55, p < 0.001). In the best overall multivariable model, ECV was the strongest predictor of the primary outcome of recurrent AF (hazard ratio: 1.29; 95% confidence interval: 1.15 to 1.44; p < 0.0001) and the secondary composite outcome of recurrent AF, heart failure admission, and death (hazard ratio: 1.35; 95% confidence interval: 1.21 to 1.51; p < 0.0001). Each 10% increase in ECV was associated with a 29% increased risk of recurrent AF. In patients with AF and hypertension, expansion of ECV is associated with diastolic function and left atrial remodeling and is a strong independent predictor of recurrent AF post-pulmonary vein isolation.
Abstract:
Focal cryoablation (FC), brachytherapy (B) and active surveillance (AS) were offered to patients diagnosed with very low-risk prostate cancer (VLRPC) in an equal access protocol. Comprehensive validated self-report questionnaires assessed patients' erectile (IIEF-5) and voiding (IPSS) functions; Beck scales measured anxiety (BAI), hopelessness (BHS) and depression (BDI); and the SF-36 reflected patients' quality of life, complemented by the emotional thermometers comprising five visual analogue scales (distress, anxiety, depression, anger and need for help). Kruskal-Wallis or ANOVA tests and Spearman's correlations were obtained among groups and studied variables. Thirty patients were included, with a median follow-up of 18 months (15-21). Those on AS (n = 11) were older and presented higher hopelessness (BHS) and lower general health perceptions (SF-36) scores than patients opting for FC (n = 10) and B (n = 9); P = 0.0014, P = 0.0268 and P = 0.0168, respectively. Patients on B had higher IPSS scores compared to those under FC and AS, P = 0.0223. For all 30 included patients, Spearman's correlation (rs) was very strong between BHS and general health perceptions (rs = -0.800, P < 0.0001), and weak/moderate between age and BHS (rs = 0.405, P = 0.026) and between age and general health perceptions (rs = -0.564, P = 0.001). The sample power was >60%. To be considered in patients' counselling and care, the current study supports the hypothesis that VLRPC, even when untreated, undermines psychosocial domains.
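The Spearman correlations reported above rank both variables and then compute a Pearson correlation on the ranks. A compact stdlib-only sketch (illustrative, not the study's analysis code; ties receive averaged ranks):

```python
def _ranks(xs):
    """Average (mid) ranks, 1-based, with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend the run of tied values
        mid = (i + j) / 2 + 1  # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = mid
        i = j + 1
    return ranks

def spearman_rho(xs, ys):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(xs), _ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Because only ranks matter, any monotone relationship, even a nonlinear one, yields rs = 1 or -1; this is why it suits ordinal scores such as BHS and the SF-36 domains.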
Abstract:
Current Brazilian law regarding water fluoridation classification is dichotomous with respect to the risks and benefits for oral diseases, and fluoride (F) concentrations below 0.6 or above 0.8 mg F/L are considered outside the normal limits. Thus, the law does not consider that both caries and fluorosis depend on the dosage and duration of fluoride exposure, as both are chronic diseases. Therefore, this study evaluated the quality of water fluoridation in Maringá, PR, Brazil, considering a new classification for the concentration of F in the water supply, based on the anticaries benefit and the risk of fluorosis (CECOL/USP, 2011). Water samples (n = 325) were collected monthly over one year from 28 water distribution networks: 20 from treatment plants and 8 from artesian wells. F concentrations were determined using a specific ion electrode. The average F concentration was 0.77 mg F/L (ppm F), ranging from 0.44 to 1.22 mg F/L. Considering all of the water samples analyzed, 83.7% presented from 0.55 to 0.84 mg F/L and, according to the new classification used, would provide maximum anticaries benefit with a low risk of fluorosis. This percentage was lower in the water samples supplied by artesian wells (75.4%) than in those distributed by the treatment plants (86%). In conclusion, based on the new classification of water F concentrations, the quality of water fluoridation in Maringá is adequate and is within the range of the best balance between risk and benefit.
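The two classification schemes compared above can be expressed as simple threshold rules. A sketch using only the limits quoted in the abstract (the legal 0.6-0.8 mg F/L band and the 0.55-0.84 mg F/L best-balance band of the CECOL/USP 2011 proposal; the proposal's other bands are not reproduced here):

```python
def legal_classification(f_mg_per_l):
    """Dichotomous rule quoted from current Brazilian law: concentrations
    below 0.6 or above 0.8 mg F/L are outside the normal limits."""
    return "within limits" if 0.6 <= f_mg_per_l <= 0.8 else "outside limits"

def cecol_best_balance(f_mg_per_l):
    """True for the 0.55-0.84 mg F/L band that the CECOL/USP (2011)
    classification links to maximum anticaries benefit with a low risk
    of fluorosis."""
    return 0.55 <= f_mg_per_l <= 0.84

# Mean concentration reported in the abstract:
mean_ok = cecol_best_balance(0.77)
```

A 0.55 mg F/L sample illustrates the difference between the schemes: legally "outside limits", yet still within the best risk-benefit band under the proposed classification.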
Abstract:
A retrospective cohort study. To report the incidence of shoulder injuries diagnosed with magnetic resonance imaging (MRI) in tetraplegic athletes and sedentary tetraplegic individuals, and to evaluate whether sport practice increases the risk of shoulder injuries in tetraplegic individuals. Campinas, Sao Paulo, Brazil. Ten tetraplegic athletes with traumatic spinal cord injury were selected among quad rugby athletes and had both shoulders evaluated by MRI. They were compared with 10 sedentary tetraplegic individuals who were submitted to the same radiological protocol. All athletes were male, with a mean age of 32.1 years (range 25-44 years, s.d.=6.44). Time since injury ranged from 6 to 17 years, with a mean of 9.7 years and s.d. of 3.1 years. All sedentary individuals were male, with a mean age of 35.9 years (range 22-47 years, s.d.=8.36). Statistical analysis suggested a possible protective effect of sport on the development of shoulder injuries, with weak evidence for infraspinatus and subscapularis tendinopathy (P=0.09 and P=0.08, respectively) and muscle atrophy (P=0.08), and stronger evidence for acromioclavicular joint (ACJ) and labrum injuries (P=0.04), with sedentary individuals at a higher risk for these injuries. Tetraplegic athletes and sedentary individuals have a high incidence of supraspinatus tendinosis, bursitis and ACJ degeneration. Spinal Cord advance online publication, 17 March 2015; doi:10.1038/sc.2014.248.
Abstract:
Polycystic ovary syndrome (PCOS) has been associated with autoimmunity, either per se or by favoring the onset of autoimmune diseases through a stimulatory action on the inflammatory response. Thus, autoimmune thyroiditis (AIT) could be more prevalent among women with PCOS. The aim was to evaluate the prevalence of AIT in women with PCOS. This was a cross-sectional study in a tertiary center, including 65 women with PCOS and 65 women without this condition. Clinical and laboratory parameters were evaluated and a thyroid ultrasound scan was performed. Levels of thyroid-stimulating hormone (TSH), free thyroxine (FT4), free triiodothyronine (FT3), anti-thyroid peroxidase (anti-TPO) antibodies, and anti-thyroglobulin (anti-TG) antibodies, as well as thyroid ultrasound findings, were evaluated. The prevalence of subclinical hypothyroidism (SCH) was 16.9% in women with PCOS and 6.2% in the non-PCOS group. AIT was more common in the PCOS group than in the non-PCOS group (43.1% versus 26.2%). However, after adjustment for weight and insulin resistance, the difference in thyroiditis risk was no longer observed (OR 0.78, 95% CI 0.28-2.16), and AIT risk was similar in the PCOS and non-PCOS groups. SCH is more common in women with PCOS, highlighting the need for periodic monitoring of thyroid function.