938 results for RISK PATIENTS
Abstract:
Background: CAH patients have an increased risk of cardiovascular disease, and it remains unknown whether lifelong glucocorticoid (GC) treatment is a contributing factor. In the general population, glucocorticoid receptor gene (NR3C1) polymorphisms are associated with an adverse metabolic profile. Our aim was to analyze the association between NR3C1 polymorphisms and the metabolic profile of CAH patients. Methodology: Sixty-eight adult patients (34 SV/34 SW) with a mean age of 28.4 ± 9 years received dexamethasone (mean 0.27 ± 0.11 mg/day) to obtain normal androgen levels. SW patients also received fludrocortisone (50 μg/day). Metabolic syndrome (MetS) was defined by the NCEP ATP III criteria and obesity by BMI ≥ 30 kg/m². NR3C1 alleles were genotyped, and association analyses with phenotype were carried out with the chi-square test, t-test and regression analysis. Results: Obesity and MetS were observed in 23.5% and 7.3% of patients, respectively, and were not correlated with GC doses or treatment duration. BMI was positively correlated with blood pressure (BP), triglyceride (TG) and LDL-c levels and HOMA-IR, and inversely correlated with HDL-c levels. The BclI and A3669G variants were found in 26.4% and 9.6% of alleles, respectively. Heterozygotes for the BclI polymorphism presented with a higher BMI (29 ± 5.3 vs. 26 ± 5.3 kg/m²) and waist circumference (89 ± 12.7 vs. 81 ± 13 cm) compared to wild-type subjects. Hypertension was found in 12% of patients, and heterozygotes for the BclI polymorphism presented higher systolic BP than wild-type subjects. Low HDL-c and high TG levels were identified in 30% and 10% of patients, respectively, and were not associated with the NR3C1 polymorphisms. A3669G carriers and non-carriers did not differ. Conclusion: In addition to GC therapy, the BclI GR variant might play an important role in obesity susceptibility in CAH patients. Genotyping of GR polymorphisms could identify a subgroup of at-risk patients, allowing for the establishment of personalized treatment and the avoidance of long-term adverse consequences.
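The classifications used in this abstract (BMI-based obesity, NCEP ATP III metabolic syndrome, HOMA-IR) are easy to make concrete. Below is a minimal Python sketch assuming the commonly cited ATP III cut-offs and the standard HOMA-IR formula; the function names are ours, and the study's exact thresholds may differ.

```python
# Illustrative sketch of the abstract's metabolic classifications (not the
# authors' code). Thresholds are the commonly cited NCEP ATP III cut-offs
# and the standard HOMA-IR formula; the paper may have used variants.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def is_obese(bmi_value: float) -> bool:
    # Obesity defined in the abstract as BMI >= 30 kg/m^2
    return bmi_value >= 30.0

def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    # Standard HOMA-IR formula: glucose (mg/dL) x insulin (uU/mL) / 405
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

def has_mets(waist_cm: float, tg_mg_dl: float, hdl_mg_dl: float,
             sbp: float, dbp: float, glucose_mg_dl: float, male: bool) -> bool:
    # NCEP ATP III: metabolic syndrome = any 3 of the 5 criteria below
    criteria = [
        waist_cm > (102 if male else 88),
        tg_mg_dl >= 150,
        hdl_mg_dl < (40 if male else 50),
        sbp >= 130 or dbp >= 85,
        glucose_mg_dl >= 110,
    ]
    return sum(criteria) >= 3
```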
Abstract:
Objective: To analyze the association between maternal obesity and postnatal infectious complications in high-risk pregnancies. Methods: Prospective study from August 2009 through August 2010 with the following inclusion criteria: women up to the 5th postpartum day; age ≥ 18 years; high-risk pregnancy; singleton pregnancy with a live fetus at labor onset; delivery at the institution; maternal weight measured on the day of delivery. Nutritional status in late pregnancy was assessed by body mass index (BMI), applying the Atalah et al. curve, and patients were graded as underweight, adequate weight, overweight, or obese. Postpartum complications investigated during the hospital stay and 30 days post-discharge were: surgical wound infection and/or secretion, urinary infection, postpartum infection, fever, hospitalization, antibiotic use, and composite morbidity (at least one of the complications mentioned). Results: 374 puerperal women were included, graded according to final BMI as underweight (n = 54, 14.4%), adequate weight (n = 126, 33.7%), overweight (n = 105, 28.1%), or obese (n = 89, 23.8%). Maternal obesity showed a significant association with the following postpartum complications: surgical wound infection (16.7%, p = 0.042), urinary infection (9.0%, p = 0.004), antibiotic use (12.3%, p < 0.001), and composite morbidity (25.6%, p = 0.016). In a logistic regression model, obesity in late pregnancy was an independent predictor of composite morbidity (OR 2.09; 95% CI 1.15-3.80; p = 0.015). Conclusion: Maternal obesity during late pregnancy in high-risk patients is independently associated with postpartum infectious complications, which demonstrates the need for closer follow-up of maternal weight gain in these pregnancies.
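The nutritional grading step amounts to computing BMI and comparing it against gestational-age-specific cut-offs. The sketch below illustrates that logic only: the published Atalah et al. thresholds are not reproduced here, so the cut-offs must be supplied by the caller, and the placeholder triple is not the published values.

```python
# Sketch of BMI-based nutritional grading into underweight / adequate /
# overweight / obese. The Atalah et al. cut-offs depend on gestational age
# and are NOT reproduced here; the caller supplies them.

def classify_bmi(weight_kg: float, height_m: float,
                 cutoffs: tuple[float, float, float]) -> str:
    low, high, obese = cutoffs  # gestational-age-specific Atalah thresholds
    bmi = weight_kg / height_m ** 2
    if bmi < low:
        return "underweight"
    if bmi < high:
        return "adequate"
    if bmi < obese:
        return "overweight"
    return "obese"
```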
Abstract:
Background: The ankle-brachial index (ABI) can assess peripheral artery disease and predict mortality in prevalent patients on hemodialysis. However, ABI has not yet been tested in incident patients, who have significant mortality. Typically, ABI is measured by Doppler, which is not always available, limiting its use in most patients. We therefore hypothesized that ABI, evaluated by a simplified method, can predict mortality in an incident hemodialysis population. Methodology/Principal Findings: We studied 119 patients with ESRD who had started hemodialysis three times weekly. ABI was calculated by using two oscillometric blood pressure devices simultaneously. Patients were followed until death or the end of the study. ABI was categorized into two groups: normal (0.9-1.3) or abnormal (<0.9 or >1.3). There were 33 deaths during a median follow-up of 12 months (range, 3 to 24 months). Age (per year) (hazard ratio, 1.026; p = 0.014) and abnormal ABI (hazard ratio, 3.664; p = 0.001) were independently related to mortality in a multiple regression analysis. Conclusions: An easy and inexpensive technique to measure ABI was tested and shown to be significant in predicting mortality. Both low and high ABI were associated with mortality in incident patients on hemodialysis. This technique allows nephrologists to identify high-risk patients and provides the opportunity for early intervention that could alter the natural progression in this population.
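The simplified measurement reduces to a ratio and a dichotomization. A minimal sketch follows, assuming ankle and brachial systolic pressures taken simultaneously as described; the function names and example values are ours.

```python
# Sketch of the simplified oscillometric ABI protocol from the abstract:
# ABI = ankle systolic / brachial systolic, dichotomized as
# normal (0.9-1.3) vs. abnormal (<0.9 or >1.3).

def ankle_brachial_index(ankle_systolic: float, brachial_systolic: float) -> float:
    return ankle_systolic / brachial_systolic

def abi_category(abi: float) -> str:
    return "normal" if 0.9 <= abi <= 1.3 else "abnormal"

# Example: ankle 110 mmHg, arm 140 mmHg -> ABI ~0.79, "abnormal" (higher risk)
print(abi_category(ankle_brachial_index(110, 140)))
```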
Abstract:
Background: We evaluated whether the advantages conferred by renal transplantation encompass all individuals or whether they favor specific groups of patients. Methods: One thousand and fifty-eight patients on the transplant waiting list and 270 renal transplant recipients were studied. End points were the composite incidence of CV events and death. Patients were followed from the date of placement on the list until transplantation, CV event, or death (dialysis patients), or from the date of transplantation until CV event, return to dialysis, or death (transplant patients). Results: Younger patients with no comorbidities had a lower incidence of CV events and death independently of the treatment modality (log-rank p = 0.0001). Renal transplantation was associated with a better prognosis only in high-risk patients (p = 0.003). Conclusions: Age and comorbidities influenced the prevalence of CV complications and death independently of the treatment modality. A positive effect of renal transplantation was documented only in high-risk patients. These findings suggest that age and comorbidities should be considered an indication for early transplantation, even considering that, as a group, such patients have shorter survival than low-risk individuals.
Abstract:
Background: Patients with acute coronary syndromes and a history of stroke or transient ischemic attack (TIA) have an increased rate of recurrent cardiac events and intracranial hemorrhages. Methods and Results: We evaluated the treatment effects of ticagrelor versus clopidogrel in patients with acute coronary syndrome with and without a history of prior stroke or TIA in the PLATelet inhibition and patient Outcomes (PLATO) trial. Of the 18,624 randomized patients, 1,152 (6.2%) had a history of stroke or TIA. Such patients had higher rates of myocardial infarction (11.5% versus 6.0%), death (10.5% versus 4.9%), stroke (3.4% versus 1.2%), and intracranial bleeding (0.8% versus 0.2%) than patients without prior stroke or TIA. Among patients with a history of stroke or TIA, the reduction in the primary composite outcome and total mortality at 1 year with ticagrelor versus clopidogrel was consistent with the overall trial results: 19.0% versus 20.8% (hazard ratio, 0.87; 95% confidence interval, 0.66-1.13; interaction P=0.84) and 7.9% versus 13.0% (hazard ratio, 0.62; 95% confidence interval, 0.42-0.91). Overall PLATO-defined bleeding rates were similar: 14.6% versus 14.9% (hazard ratio, 0.99; 95% confidence interval, 0.71-1.37), and intracranial bleeding occurred infrequently (4 versus 4 cases, respectively). Conclusions: Patients with acute coronary syndrome and a prior history of ischemic stroke or TIA had higher rates of clinical outcomes than patients without prior stroke or TIA. However, the efficacy and bleeding results of ticagrelor in these high-risk patients were consistent with the overall trial population, with a favorable clinical net benefit and associated impact on mortality.
Abstract:
The supraclavicular island flap has been widely used in head and neck reconstruction, providing an alternative to traditional techniques such as regional or free flaps, mainly because of its thin skin island and reliable vascularity. Head and neck patients who require large reconstructions usually present poor clinical and healing conditions. An early experience using this flap for late-stage head and neck tumour treatment is reported. Forty-seven supraclavicular artery flaps were used to treat head and neck oncologic defects after cutaneous, intraoral and pharyngeal tumour resections. Dissection time, complications, and donor and reconstructed area outcomes were assessed. The mean time for harvesting the flap was 50 min for the senior author. All donor sites were closed primarily. Three cases of laryngopharyngectomy reconstruction developed a small, controlled salivary leak that resolved with conservative measures. Small or no strictures were detected on radiologic swallowing examinations, and all patients regained normal swallowing function. Five patients developed donor site dehiscence; these wounds were treated with regular dressings until healing was complete. There were four distal flap necroses in this series, which were debrided and closed primarily. The supraclavicular flap is a pliable option for head and neck oncologic reconstruction in late-stage patients. High-risk patients and modified radical neck dissection are not contraindications to its use. The absence of the need to isolate the pedicle allows quick and reliable harvesting. The arc of rotation at the base of the neck provides adequate length for pharyngeal and oral lining reconstruction and for the middle and superior thirds of the face.
Abstract:
About 5-10% of breast and ovarian carcinomas are hereditary, and most of these result from germline mutations in the BRCA1 and BRCA2 genes. In women of Ashkenazi Jewish ancestry, up to 30% of breast and ovarian carcinomas may be attributable to mutations in these genes, in which three founder mutations, c.68_69del (185delAG) and c.5266dup (5382insC) in BRCA1 and c.5946del (6174delT) in BRCA2, are commonly encountered. Some authors have suggested that screening for founder mutations should be undertaken in all Brazilian women with breast cancer. Thus, the goal of this study was to determine the prevalence of three founder mutations, commonly identified in Ashkenazi individuals, in a sample of non-Ashkenazi cancer-affected Brazilian women with clearly defined risk factors for hereditary breast and ovarian cancer (HBOC) syndrome. Among 137 unrelated Brazilian women from HBOC families, the BRCA1 c.5266dup mutation was identified in seven individuals (5%). This prevalence is similar to that encountered in non-Ashkenazi HBOC families in other populations. However, among patients with bilateral breast cancer, the frequency of c.5266dup was significantly higher than in patients with unilateral breast tumors (12.1% vs 1.2%, p = 0.023). The BRCA1 c.68_69del and BRCA2 c.5946del mutations did not occur in this sample. We conclude that screening non-Ashkenazi breast cancer-affected women from the ethnically heterogeneous Brazilian population for BRCA1 c.68_69del and BRCA2 c.5946del is not justified, and that screening for BRCA1 c.5266dup should be considered in high-risk patients, given its prevalence as a single mutation. In high-risk patients, a negative screening result should always be followed by comprehensive BRCA gene testing. The significantly higher frequency of BRCA1 c.5266dup in women with bilateral breast cancer, as well as the existence of other as yet unidentified founder mutations in this population, should be further assessed in a larger, well-characterized high-risk cohort.
Abstract:
Introduction: The survival of patients admitted to an emergency department is determined by the severity of the acute illness and the quality of care provided. The high number and wide spectrum of severity of illness of admitted patients make an immediate assessment of all patients unrealistic. The aim of this study was to evaluate a scoring system based on physiological parameters readily available immediately after admission to an emergency department (ED) for the purpose of identifying at-risk patients. Methods: This prospective observational cohort study included 4,388 consecutive adult patients admitted via the ED of a 960-bed tertiary referral hospital over a period of six months. The occurrence of each of seven potential vital sign abnormalities (threat to airway, abnormal respiratory rate, oxygen saturation, systolic blood pressure, heart rate, low Glasgow Coma Scale and seizures) was recorded and added up to generate the vital sign score (VSS). VSSinitial was defined as the VSS in the first 15 minutes after admission, and VSSmax as the maximum VSS throughout the stay in the ED. The occurrence of single vital sign abnormalities in the first 15 minutes, VSSinitial and VSSmax were evaluated as potential predictors of hospital mortality. Results: Logistic regression analysis identified all evaluated single vital sign abnormalities except seizures and abnormal respiratory rate as independent predictors of hospital mortality. Increasing VSSinitial and VSSmax were significantly correlated with hospital mortality (odds ratio (OR) 2.80, 95% confidence interval (CI) 2.50 to 3.14, P < 0.0001 for VSSinitial; OR 2.36, 95% CI 2.15 to 2.60, P < 0.0001 for VSSmax). The predictive power of the VSS was highest if collected in the first 15 minutes after ED admission (log-rank chi-square 468.1, P < 0.0001 for VSSinitial; log-rank chi-square 361.5, P < 0.0001 for VSSmax). Conclusions: Vital sign abnormalities and the VSS collected in the first minutes after ED admission can identify patients at risk of an unfavourable outcome.
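The VSS itself is just a count of abnormal findings, so it can be sketched directly. The cut-offs defining each abnormality are not given in the abstract; in the illustrative sketch below they are assumed to be decided upstream and passed in as booleans.

```python
# Sketch of the vital sign score (VSS): one point per abnormality present,
# so VSS ranges 0-7. VSSinitial uses flags from the first 15 minutes after
# admission; VSSmax uses the worst flags during the whole ED stay.

def vital_sign_score(threat_to_airway: bool,
                     abnormal_respiratory_rate: bool,
                     abnormal_oxygen_saturation: bool,
                     abnormal_systolic_bp: bool,
                     abnormal_heart_rate: bool,
                     low_glasgow_coma_scale: bool,
                     seizures: bool) -> int:
    flags = [threat_to_airway, abnormal_respiratory_rate,
             abnormal_oxygen_saturation, abnormal_systolic_bp,
             abnormal_heart_rate, low_glasgow_coma_scale, seizures]
    return sum(flags)
```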
Abstract:
Background: The Geneva Prognostic Score (GPS), the Pulmonary Embolism Severity Index (PESI), and its simplified version (sPESI) are well-known clinical prognostic scores for pulmonary embolism (PE). Objectives: To compare the prognostic performance of these scores in elderly patients with PE. Patients/Methods: In a multicenter Swiss cohort of elderly patients with venous thromboembolism, we prospectively studied 449 patients aged ≥65 years with symptomatic PE. The outcome was 30-day overall mortality. We dichotomized patients as low- vs. higher-risk on all three scores using the following thresholds: GPS scores ≤2 vs. >2, PESI risk classes I-II vs. III-V, and sPESI scores 0 vs. ≥1. We compared 30-day mortality in low- vs. higher-risk patients and the areas under the receiver operating characteristic (ROC) curves. Results: Overall, 3.8% of patients (17/449) died within 30 days. The GPS classified a greater proportion of patients as low-risk (92% [413/449]) than the PESI (36.3% [163/449]) and the sPESI (39.6% [178/449]) (P<0.001 for each comparison). Low-risk patients based on the sPESI had a mortality of 0% (95% confidence interval [CI] 0-2.1%), compared to 0.6% (95% CI 0-3.4%) for low-risk patients based on the PESI and 3.4% (95% CI 1.9-5.6%) for low-risk patients based on the GPS. The areas under the ROC curves were 0.77 (95% CI 0.72-0.81), 0.76 (95% CI 0.72-0.80), and 0.71 (95% CI 0.66-0.75), respectively (P=0.47). Conclusions: In this cohort of elderly patients with PE, the GPS identified a higher proportion of patients as low-risk, but the PESI and sPESI were more accurate in predicting mortality.
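The sPESI dichotomization used here (low risk = score 0, higher risk = score ≥1) is simple enough to state directly. The sketch below follows the published simplified PESI items (one point each); treat it as illustrative and verify against the original score before any real use.

```python
# Sketch of the simplified PESI (sPESI) and the low- vs. higher-risk
# dichotomization from the abstract. Items follow the published score:
# one point each, low risk = total of 0.

def spesi(age_years: int, has_cancer: bool,
          chronic_cardiopulmonary_disease: bool,
          heart_rate_bpm: int, systolic_bp_mmhg: int,
          sao2_percent: float) -> int:
    return sum([
        age_years > 80,
        has_cancer,
        chronic_cardiopulmonary_disease,
        heart_rate_bpm >= 110,
        systolic_bp_mmhg < 100,
        sao2_percent < 90,
    ])

def spesi_risk_group(score: int) -> str:
    return "low" if score == 0 else "higher"
```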
Abstract:
OBJECTIVES: To assess the safety of and cardiopulmonary adaptation to high altitude exposure among patients with coronary artery disease. METHODS: 22 patients (20 men and 2 women), mean age 57 (SD 7) years, underwent a maximal, symptom-limited exercise stress test in Bern, Switzerland (540 m) and after a rapid ascent to the Jungfraujoch (3454 m). The study population comprised 15 patients after ST elevation myocardial infarction and 7 after non-ST elevation myocardial infarction, 12 (SD 4) months after the acute event. All patients had been revascularised either by percutaneous coronary angioplasty (n = 15) or by coronary artery bypass surgery (n = 7). Ejection fraction was 60 (SD 8)%. Beta-blocking agents were withheld for five days before exercise testing. RESULTS: At 3454 m, peak oxygen uptake decreased by 19% (p < 0.001), maximum work capacity by 15% (p < 0.001) and exercise time by 16% (p < 0.001); heart rate, ventilation and lactate were significantly higher at every level of exercise except maximum exertion. No ECG signs of myocardial ischaemia or significant arrhythmias were noted. CONCLUSIONS: Although oxygen demand and lactate concentrations are higher during exercise at high altitude, a rapid ascent and submaximal exercise can be considered safe at an altitude of 3454 m for low-risk patients six months after revascularisation for an acute coronary event, given a normal exercise stress test at low altitude.
Abstract:
BACKGROUND: The epidemiology of liver disease in patients admitted to emergency rooms is largely unknown. The current study aimed to measure the prevalence of viral hepatitis B and C infection and pathological laboratory values of liver disease in such a population, and to study factors associated with these measurements. METHODS: Cross-sectional study of patients admitted to the emergency room of a university hospital, with no formal exclusion criteria. Determination of anti-HBc, anti-HCV, transferrin saturation and alanine aminotransferase, plus answers to a study-specific questionnaire. RESULTS: The study included 5,036 patients, representing a 14.9% sample of the target population during the study period. Prevalence of anti-HBc and anti-HCV was 6.7% (95% CI 6.0% to 7.4%) and 2.7% (2.3% to 3.2%), respectively. Factors independently associated with positive anti-HBc were intravenous drug abuse (OR 18.3; 11.3 to 29.7), foreign country of birth (3.4; 2.6 to 4.4), non-white ethnicity (2.7; 1.9 to 3.8) and age ≥60 (2.0; 1.5 to 2.8). Positive anti-HCV was associated with intravenous drug abuse (78.9; 43.4 to 143.6), blood transfusion (1.7; 1.1 to 2.8) and abdominal pain (2.7; 1.5 to 4.8). 75% of all participants were not vaccinated against hepatitis B or did not know their vaccination status. Among anti-HCV-positive patients, only 49% knew about their infection, and 51% reported regular alcohol consumption. Transferrin saturation was elevated in 3.3% and was associated with fatigue (prevalence ratio 1.9; 1.2 to 2.8). CONCLUSION: Emergency rooms should be considered as targets for public health programs that encourage vaccination, patient education and screening of high-risk patients for liver disease, with subsequent referral for treatment if indicated.
Abstract:
Historically, patients with high-risk prostate cancer were considered poor candidates for radical prostatectomy (RP) due to the likelihood of positive pelvic lymph nodes and decreased long-term survival. Although there is still no consensus on the optimal therapy for this group of patients, there is increasing evidence that surgery could play a role. Cancer-specific survival (CSS) rates after RP for locally advanced disease at 10-year follow-up range from 29 to 72%, depending on tumor differentiation. The role of pelvic lymph node dissection (PLND) in prostate cancer remains controversial. Nonetheless, extended PLND (ePLND) should be performed in conjunction with RP, as extended lymph node dissection in lieu of standard PLND may increase staging accuracy, influence decision making with respect to adjuvant therapy, and possibly impact outcome. High-risk patients with organ-confined prostate cancer and low-volume (micro)metastatic disease may profit most from this approach.
Abstract:
BACKGROUND: Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. METHODS: On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk-group-specific probabilities for the presence of drug resistance mutations (upper estimate). RESULTS: Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had ≥1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence became smaller in later years. CONCLUSIONS: Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
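The two estimators can be written down directly from the abstract's definitions: the lower estimate is a simple proportion, the upper estimate a frequency-weighted average of risk-group probabilities. A minimal sketch follows; variable names and the example numbers are ours, not the study's.

```python
# Sketch of the abstract's two prevalence estimators (not the authors' code).
# Lower estimate: fraction of patients with virologic failure or confirmed
# resistance. Upper estimate: frequency-weighted average of risk-group-
# specific probabilities of harboring resistance mutations.

def lower_estimate(n_confirmed_or_failing: int, n_total: int) -> float:
    return n_confirmed_or_failing / n_total

def upper_estimate(group_sizes: list[int], group_probabilities: list[float]) -> float:
    total = sum(group_sizes)
    return sum(n * p for n, p in zip(group_sizes, group_probabilities)) / total

# Made-up example: risk groups of 100/300/600 patients with resistance
# probabilities 0.9/0.5/0.1 give an upper estimate of 0.30 (30%).
print(upper_estimate([100, 300, 600], [0.9, 0.5, 0.1]))
```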
Abstract:
BACKGROUND: There is ongoing debate on the optimal drug-eluting stent (DES) in diabetic patients with coronary artery disease. Biodegradable polymer drug-eluting stents (BP-DES) may potentially improve clinical outcomes in these high-risk patients. We sought to compare long-term outcomes in patients with diabetes treated with biodegradable polymer DES vs. durable polymer sirolimus-eluting stents (SES). METHODS: We pooled individual patient-level data from 3 randomized clinical trials (ISAR-TEST 3, ISAR-TEST 4 and LEADERS) comparing biodegradable polymer DES with durable polymer SES. Clinical outcomes out to 4 years were assessed. The primary end point was the composite of cardiac death, myocardial infarction and target-lesion revascularization. Secondary end points were target-lesion revascularization and definite or probable stent thrombosis. RESULTS: Of 1094 patients with diabetes included in the present analysis, 657 received biodegradable polymer DES and 437 durable polymer SES. At 4 years, the incidence of the primary end point was similar with BP-DES versus SES (hazard ratio=0.95, 95% CI=0.74-1.21, P=0.67). Target-lesion revascularization was also comparable between the groups (hazard ratio=0.89, 95% CI=0.65-1.22, P=0.47). Definite or probable stent thrombosis was significantly reduced among patients treated with BP-DES (hazard ratio=0.52, 95% CI=0.28-0.96, P=0.04), a difference driven by significantly lower stent thrombosis rates with BP-DES between 1 and 4 years (hazard ratio=0.15, 95% CI=0.03-0.70, P=0.02). CONCLUSIONS: In patients with diabetes, biodegradable polymer DES, compared to durable polymer SES, were associated with comparable overall clinical outcomes during follow-up to 4 years. Rates of stent thrombosis were significantly lower with BP-DES.
Abstract:
OBJECTIVE: The objective of this study was to evaluate the impact of newer therapies on the highest-risk patients with congenital diaphragmatic hernia (CDH): those with agenesis of the diaphragm. SUMMARY BACKGROUND DATA: CDH remains a significant cause of neonatal mortality. Many novel therapeutic interventions have been used in these infants. Children with large defects or agenesis of the diaphragm have the highest mortality and morbidity. METHODS: Twenty centers from 5 countries collected data prospectively on all liveborn infants with CDH over a 10-year period. The treatment and outcomes of these patients were examined. Patients were followed until death or hospital discharge. RESULTS: A total of 1,569 patients with CDH were seen between January 1995 and December 2004 in the 20 centers. A total of 218 patients (14%) had diaphragmatic agenesis and underwent repair. Overall survival for all patients was 68%, while survival was 54% in patients with agenesis. When patients with diaphragmatic agenesis from the first 2 years were compared with similar patients from the last 2 years, there was significantly less use of ECMO (75% vs. 52%) and increased use of inhaled nitric oxide (iNO) (30% vs. 80%). There was a trend toward improved survival in patients with agenesis, from 47% in the first 2 years to 59% in the last 2 years. Survivors with diaphragmatic agenesis had prolonged hospital stays compared with patients without agenesis (median, 68 vs. 30 days). For the last 2 years of the study, 36% of patients with agenesis were discharged on tube feedings and 22% on oxygen therapy. CONCLUSIONS: There has been a change in the management of infants with CDH, with less frequent use of ECMO and greater use of iNO in high-risk patients, with a potential improvement in survival. However, the mortality, hospital length of stay, and morbidity in agenesis patients remain significant.