934 results for logistic regression analysis
Abstract:
INTRODUCTION HIV-infected pregnant women are very likely to engage in HIV medical care to prevent transmission of HIV to their newborn. After delivery, however, childcare and competing commitments might lead to disengagement from HIV care. The aim of this study was to quantify loss to follow-up (LTFU) from HIV care after delivery and to identify risk factors for LTFU. METHODS We used data on 719 pregnancies within the Swiss HIV Cohort Study from 1996 to 2012 with information on follow-up visits available. Two LTFU events were defined: no clinical visit for >180 days and no visit for >360 days in the year after delivery. Logistic regression analysis was used to identify risk factors for an LTFU event after delivery. RESULTS Median maternal age at delivery was 32 years (IQR 28-36); 357 (49%) women were black, 280 (39%) white, 56 (8%) Asian, and 4% of other ethnicities. One hundred and seven (15%) women reported a history of injecting drug use (IDU). The majority (524, 73%) of women received their HIV diagnosis before pregnancy; most of those (413, 79%) had lived with diagnosed HIV for longer than three years, and two-thirds (342, 65%) were already on antiretroviral therapy (ART) at the time of conception. Of the 181 women diagnosed during pregnancy by a screening test, 80 (44%) were diagnosed in the first trimester, 67 (37%) in the second, and 34 (19%) in the third. Of the 357 (69%) women who had been seen in HIV medical care during the three months before conception, 93% achieved an undetectable HIV viral load (VL) at delivery. Of the 62 (12%) women whose last medical visit was more than six months before conception, only 72% achieved an undetectable VL (p=0.001). Overall, 247 (34%) women were LTFU for >180 days in the year after delivery and 86 (12%) were LTFU for >360 days, with 43 (50%) of those women returning. Being LTFU for >180 days was significantly associated with a history of IDU (aOR 1.73, 95% CI 1.09-2.77, p=0.021) and with not achieving an undetectable VL at delivery (aOR 1.79, 95% CI 1.03-3.11, p=0.040) after adjusting for maternal age, ethnicity, time of HIV diagnosis and being on ART at conception. CONCLUSIONS Women with a history of IDU and women with a detectable VL at delivery were more likely to be LTFU after delivery. This is of concern for their own health, as well as for the risk to sexual partners and subsequent pregnancies. Further strategies should be developed to enhance retention in medical care beyond pregnancy.
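The adjusted odds ratios above come from a multivariable logistic regression of the LTFU indicator on the risk factors of interest plus the listed adjustment covariates. A minimal sketch of such a model in Python with statsmodels; the file name and column names are illustrative assumptions, not the study's actual variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset: one row per pregnancy.
df = pd.read_csv("pregnancies.csv")

# LTFU >180 days modelled on IDU history and detectable VL at delivery,
# adjusted for maternal age, ethnicity, time of diagnosis and ART at conception.
model = smf.logit(
    "ltfu_180 ~ idu + detectable_vl + age + C(ethnicity)"
    " + dx_before_pregnancy + art_at_conception",
    data=df,
).fit()

# Adjusted odds ratios with 95% CIs, the quantities reported in the abstract.
summary = pd.DataFrame({
    "aOR": np.exp(model.params),
    "CI 2.5%": np.exp(model.conf_int()[0]),
    "CI 97.5%": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(summary)
```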
Abstract:
OBJECTIVES Severe neurological deficit (ND) due to acute aortic dissection type A (AADA) has been considered a contraindication for surgery because of the poor prognosis. Recently, a more aggressive indication for surgery despite neurological symptoms has shown acceptable postoperative clinical results. The aim of this study was to evaluate early and mid-term outcomes of patients with AADA presenting with acute ND. METHODS Data from 53 patients with new-onset ND who underwent surgical repair for AADA between 2005 and 2012 at our institution were retrospectively reviewed. ND was defined as focal motor or sensory deficit, hemiplegia, paraplegia, convulsions or coma. Neurological symptoms were evaluated preoperatively using the Glasgow Coma Scale (GCS) and modified Rankin Scale (mRS), and at discharge as well as 3-6 months postoperatively using the mRS and the National Institutes of Health Stroke Scale. Involvement of the carotid arteries was assessed on pre- and postoperative computed tomography. Logistic regression analysis was performed to detect predictive factors for recovery from ND. RESULTS Of the 53 patients, 29 (54.7%) showed complete recovery from focal ND at follow-up. Neurological symptoms persisted in 24 (45.3%) patients, of whom 8 (33%) died without neurological assessment at follow-up. Between the two groups (patients with recovery and those with persisting ND), there was no significant difference in the duration of hypothermic circulatory arrest (28 ± 14 vs 36 ± 20 min) or in severely reduced consciousness (GCS <8). Multivariate analysis showed a significant between-group difference in the preoperative mRS (P < 0.007). A high preoperative mRS was associated with persistence of neurological symptoms (P < 0.02). Cardiovascular risk factors, age and involvement of supra-aortic branches were not predictive of persistence of ND. CONCLUSION More than half of our patients recovered completely from ND due to AADA after surgery. The severity of clinical symptoms had predictive value. Patients suffering from AADA and presenting with ND before surgery should not be excluded from emergency surgery.
Abstract:
AIM MRI and PET with 18F-fluoro-ethyl-tyrosine (FET) have been increasingly used to evaluate patients with gliomas. Our purpose was to assess the additive value of MR spectroscopy (MRS), diffusion imaging and dynamic FET-PET for glioma grading. PATIENTS, METHODS 38 patients (aged 42 ± 15 years, F/M ratio 0.46) with untreated, histologically proven brain gliomas were included. All underwent conventional MRI, MRS, diffusion sequences and FET-PET within 3 ± 4 weeks. The performance of the tumour FET time-activity curve, early-to-middle SUVmax ratio, choline/creatine ratio and ADC histogram distribution pattern for glioma grading was assessed against histology. Combinations of these parameters and the respective odds were also evaluated. RESULTS The tumour time-activity curve reached the best accuracy (67%) when taken alone to distinguish between low- and high-grade gliomas, followed by ADC histogram analysis (65%). Combining the time-activity curve and ADC histogram analysis improved sensitivity from 67% to 86% and specificity from 63-67% to 100% (p < 0.008). On multivariate logistic regression analysis, however, a negative slope of the tumour FET time-activity curve remained the best predictor of high-grade glioma (odds 7.6, SE 6.8, p = 0.022). CONCLUSION The combination of dynamic FET-PET and diffusion MRI reached good performance for glioma grading. The use of FET-PET/MR may be highly relevant in the initial assessment of primary brain tumours.
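The gain from combining markers can be seen by computing sensitivity and specificity for each marker alone and for a simple combination rule against the histological reference. A toy sketch; all data are invented for illustration, and the abstract does not specify the exact combination rule used:

```python
import numpy as np

# Toy binary calls per patient, 1 = marker suggests high grade (invented data).
tac_pos = np.array([1, 0, 1, 1, 0, 1, 0, 0])  # negative FET time-activity-curve slope
adc_pos = np.array([0, 1, 1, 1, 0, 0, 0, 1])  # suspicious ADC histogram pattern
grade   = np.array([1, 1, 1, 1, 0, 0, 0, 0])  # histology: 1 = high grade

def sens_spec(test, truth):
    sens = (test[truth == 1] == 1).mean()  # true positives / all high grade
    spec = (test[truth == 0] == 0).mean()  # true negatives / all low grade
    return sens, spec

# An "either marker positive" rule typically raises sensitivity.
either = ((tac_pos == 1) | (adc_pos == 1)).astype(int)
print(sens_spec(tac_pos, grade), sens_spec(adc_pos, grade), sens_spec(either, grade))
```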
Abstract:
The association between helmet use during alpine skiing and the incidence and severity of head injuries was analyzed. All patients admitted to a level 1 trauma center for traumatic brain injuries (TBIs) sustained in skiing accidents during the 2000-2001 and 2010-2011 seasons were eligible. The primary outcome was the association between helmet use and severity of TBI measured by Glasgow Coma Scale (GCS), computed tomography (CT) results, and necessity of neurosurgical intervention. Of 1362 patients injured during alpine skiing, 245 (18%) sustained TBI and were included. TBI was fatal in 3%. Head injury was minor in 76% (GCS 13-15), moderate in 6%, and severe in 14%. The number and percentage of TBI patients showed no significant trend over the investigated seasons. Forty-five percent of the 245 patients had pathological CT findings, and 26% of these required neurosurgical intervention. Helmet use increased from 0% in 2000-2001 to 71% in 2010-2011 (p<0.001). The main analysis, comparing TBI in patients with or without a helmet, showed an adjusted odds ratio (OR) of 1.44 (p=0.430) for suffering moderate-to-severe head injury in helmet users. Analyses comparing off-piste to on-slope skiers revealed a significantly increased OR among off-piste skiers of 7.62 (p=0.004) for sustaining a TBI requiring surgical intervention. Despite increases in helmet use, we found no decrease in severe TBI among alpine skiers. Logistic regression analysis showed no significant difference in TBI with regard to helmet use, but an increased risk for off-piste skiers. The limited protection of helmets and the dangers of skiing off-piste should be targeted by prevention programs.
Abstract:
OBJECTIVE The aim of this study was to explore the risk of incident gout in patients with type 2 diabetes mellitus (T2DM) in association with diabetes duration, diabetes severity and antidiabetic drug treatment. METHODS We conducted a case-control study in patients with T2DM using the UK-based Clinical Practice Research Datalink (CPRD). We identified case patients aged ≥18 years with an incident diagnosis of gout between 1990 and 2012 and matched one gout-free control patient to each case patient. We used conditional logistic regression analysis to calculate adjusted ORs (adj. ORs) with 95% CIs and adjusted our analyses for important potential confounders. RESULTS The study encompassed 7536 T2DM cases with a first-time diagnosis of gout. Compared to a diabetes duration of <1 year, prolonged diabetes duration (1-3, 3-6, 7-9 and ≥10 years) was associated with decreased adj. ORs of 0.91 (95% CI 0.79 to 1.04), 0.76 (95% CI 0.67 to 0.86), 0.70 (95% CI 0.61 to 0.86) and 0.58 (95% CI 0.51 to 0.66), respectively. Compared to a reference A1C level of <7%, the risk estimates for increasing A1C levels (7.0-7.9, 8.0-8.9 and ≥9%) steadily decreased, with adj. ORs of 0.79 (95% CI 0.72 to 0.86), 0.63 (95% CI 0.55 to 0.72) and 0.46 (95% CI 0.40 to 0.53), respectively. Use of insulin, metformin or sulfonylureas was not associated with an altered risk of incident gout. CONCLUSIONS Increased A1C levels, but not use of antidiabetic drugs, were associated with a decreased risk of incident gout among patients with T2DM.
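For 1:1 matched case-control data like this, conditional logistic regression stratifies on the matched pair rather than fitting an ordinary logit, so each case is compared with its own control. A sketch with statsmodels' ConditionalLogit; the dataset and column names are assumptions:

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Hypothetical file: one row per subject, match_id links each case to its control.
df = pd.read_csv("gout_case_control.csv")

y = df["gout"]  # 1 = incident gout case, 0 = matched control
X = pd.get_dummies(
    df[["duration_cat", "a1c_cat", "insulin", "metformin", "sulfonylurea"]],
    drop_first=True,  # reference categories: duration <1 year, A1C <7%
).astype(float)

# Stratifying on the matched pair conditions out pair-level confounding.
model = ConditionalLogit(y, X, groups=df["match_id"]).fit()
print(np.exp(model.params))  # adjusted odds ratios
```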
Abstract:
QUESTIONS UNDER STUDY: To describe patient characteristics and risk factors for death among Swiss trauma patients in the Trauma Audit and Research Network (TARN). METHODS: Descriptive analysis of trauma patients (≥16 years) admitted to a level I trauma centre in Switzerland (September 1, 2009 to August 31, 2010) and entered into TARN. Multivariable logistic regression analysis was used to identify predictors of 30-day mortality. RESULTS: Of 458 patients, 71% were male. The median age was 50.5 years (interquartile range [IQR] 32.2-67.7), the median Injury Severity Score (ISS) was 14 (IQR 9-20) and the median Glasgow Coma Score (GCS) was 15 (IQR 14-15). The ISS was >15 in 47% of patients, and 14% had an ISS >25. A total of 17 patients (3.7%) died within 30 days of trauma; all deaths occurred in patients with ISS >15. Most injuries were due to falls <2 m (35%) or road traffic accidents (29%). Injuries to the head (39%) were followed by injuries to the lower limbs (33%), spine (28%) and chest (27%). The time of admission peaked between 12:00 and 22:00, with a second peak between 00:00 and 02:00. A total of 64% of patients were admitted directly to our trauma centre. The median time to CT was 30 min (IQR 18-54 min). In the multivariable regression analysis, the predictors of mortality were older age, higher ISS and lower GCS. CONCLUSIONS: Characteristics of Swiss trauma patients derived from TARN were described for the first time, providing a detailed overview of the institutional trauma population. Based on these results, patient management and hospital resources (e.g. triage of patients, time to CT, staffing during night shifts) could be evaluated as a further step.
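With age, ISS and GCS as the retained predictors, the fitted logistic model can also return an absolute 30-day mortality risk for a new admission. A sketch under assumed variable names, not the actual TARN export:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical TARN extract: died_30d (0/1), age, iss, gcs.
df = pd.read_csv("tarn_trauma.csv")

model = smf.logit("died_30d ~ age + iss + gcs", data=df).fit()

# Predicted 30-day mortality probability for a hypothetical new admission.
new_patient = pd.DataFrame({"age": [65], "iss": [25], "gcs": [9]})
print(model.predict(new_patient))
```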
Abstract:
BACKGROUND Early identification of patients at risk of developing persistent low back pain (LBP) is crucial. OBJECTIVE The aim of this study was to identify, in patients with a new episode of LBP, the time point at which those at risk of developing persistent LBP can best be identified. METHODS Prospective cohort study of 315 patients presenting to a health practitioner with a first episode of acute LBP. The primary outcome measure was functional limitation. Patients were assessed at baseline, three, six and twelve weeks, and six months, with factors of maladaptive cognition examined as potential predictors. Multivariate logistic regression analysis was performed for all time points. RESULTS The best time point to predict the development of persistent LBP at six months was the twelve-week follow-up (sensitivity 78%; overall predictive value 90%). Cognitions assessed at the first visit to a health practitioner were not predictive. CONCLUSIONS Maladaptive cognitions at twelve weeks appear to be suitable predictors of a transition from acute to persistent LBP. As early as three weeks after patients present to a health practitioner with acute LBP, cognitions might influence the development of persistent LBP. Therefore, cognitive-behavioral interventions should be considered as early adjuvant LBP treatment in patients at risk of developing persistent LBP.
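Sensitivity and overall predictive value as quoted here follow directly from cross-tabulating predicted against observed persistence. A toy computation with invented data:

```python
import numpy as np

# Toy 12-week predictions vs six-month outcome, 1 = persistent LBP (invented).
pred   = np.array([1, 1, 0, 0, 1, 0, 0, 0, 1, 0])
actual = np.array([1, 1, 1, 0, 1, 0, 0, 0, 0, 0])

tp = np.sum((pred == 1) & (actual == 1))
fn = np.sum((pred == 0) & (actual == 1))
sensitivity = tp / (tp + fn)        # share of persistent cases caught
overall = np.mean(pred == actual)   # overall predictive value (accuracy)
print(sensitivity, overall)
```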
Abstract:
BACKGROUND Acute mesenteric ischemia (AMI) is an emergency with a mortality rate of up to 50%. Detecting AMI continues to be a major challenge. This study assessed the correlation of repeated preoperative serum lactate measurements with bowel necrosis and aimed to identify risk factors for a lethal outcome in patients with AMI. METHODS A retrospective study of 91 patients with clinically and pathologically confirmed AMI from January 2006 to December 2012 was performed. RESULTS The in-hospital mortality rate was 42.9%. Two hundred and nine preoperative lactate measurements were analyzed (2.3 ± 1.1 values per patient). Within six hours prior to surgery, the mean serum lactate level was significantly higher (4.97 ± 4.21 vs. 3.24 ± 3.05 mmol/L, p = 0.006) and the mean pH significantly lower (7.28 ± 0.12 vs. 7.37 ± 0.08, p = 0.001) compared to >6 h before surgery. Thirty-four patients had at least two lactate measurements within 24 h prior to surgery. In this subgroup, 17 patients (50%) exhibited an increase and 17 patients (50%) a decrease in lactate levels. Forward logistic regression analysis showed that the length of necrotic bowel and the highest lactate value within 24 h prior to surgery were independent risk factors for mortality (r² = 0.329). CONCLUSION The value of serial lactate and pH measurements for predicting the length of necrotic bowel is very limited. Length of necrotic bowel and lactate values are independent risk factors for mortality.
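Forward logistic regression adds candidate predictors one at a time, keeping the one that improves the model most until no candidate meets the entry criterion. statsmodels has no built-in stepwise routine, so here is a sketch of the greedy loop with a p-value entry criterion; the file and variable names are assumptions:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: died (0/1) plus candidate predictors.
df = pd.read_csv("ami_patients.csv")

def forward_select(df, outcome, candidates, alpha=0.05):
    """Greedy forward selection: repeatedly add the candidate with the
    smallest p-value until none enters below alpha."""
    candidates = list(candidates)
    selected = []
    while candidates:
        pvals = {}
        for c in candidates:
            formula = f"{outcome} ~ " + " + ".join(selected + [c])
            pvals[c] = smf.logit(formula, data=df).fit(disp=0).pvalues[c]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        candidates.remove(best)
    return selected

print(forward_select(df, "died", ["necrotic_length", "max_lactate_24h", "age", "ph"]))
```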
Abstract:
BACKGROUND Closed reduction and pinning is the accepted treatment of choice for dislocated supracondylar humeral fractures (SCHFs) in children. Rates of open reduction, complications and outcome are reported to depend on the delay to surgery. We investigated whether the delay to surgery influenced the incidence of open reduction, complications and outcome of surgical treatment of SCHFs in the authors' institution. METHODS Three hundred and forty-one children with 343 supracondylar humeral fractures (Gartland II: 144; Gartland III: 199) who underwent surgery between 2000 and 2009 were retrospectively analysed. The group consisted of 194 males and 149 females. The average age was 6.3 years and the mean follow-up was 6.2 months. The time interval between trauma and surgical intervention was determined using our institutional database. Clinical and radiographic data were collected for each group. The influence of treatment delay on rates of open reduction, complications and outcome was calculated using logistic regression analysis. Furthermore, patients were grouped into four delay groups (<6 h, n = 166; 6-12 h, n = 95; 12-24 h, n = 68; >24 h, n = 14) and the aforementioned variables were compared among these groups. RESULTS The incidence of open procedures in 343 supracondylar humeral fractures was 2.6%. Complication rates were similar to the literature (10.8%), consisting primarily of transient neurological impairments (9.0%), all of which were fully reversible with conservative treatment. Poor outcome was seen in 1.7% of the patients. Delay of surgical treatment had no influence on rates of open surgery (p = 0.662), complications (p = 0.365) or poor outcome (p = 0.942). CONCLUSIONS In this retrospective study, delay of treatment of SCHFs did not significantly influence the incidence of open reduction, complications or outcome. Therefore, in SCHFs with sufficient blood perfusion and nerve function, elective treatment is reasonable to avoid surgical interventions in the middle of the night, which are stressful and wearing both for patients and for surgeons. LEVEL OF EVIDENCE III (retrospective comparative study).
Abstract:
OBJECTIVES The aim of this study was to quantify loss to follow-up (LTFU) in HIV care after delivery, to identify risk factors for LTFU, and to examine implications for HIV disease progression and subsequent pregnancies. METHODS We used data on pregnancies within the Swiss HIV Cohort Study from 1996 to 2011. A delayed clinical visit was defined as no visit for >180 days and LTFU as no visit for >365 days after delivery. Logistic regression analysis was used to identify risk factors for LTFU. RESULTS A total of 695 pregnancies in 580 women were included in the study, of which 115 (17%) were subsequent pregnancies. Median maternal age was 32 years (IQR 28-36 years) and 104 (15%) women reported a history of injecting drug use (IDU). Overall, 233 of 695 (34%) women had a delayed visit in the year after delivery and 84 (12%) women were lost to follow-up. Being lost to follow-up was significantly associated with a history of IDU [adjusted odds ratio (aOR) 2.79; 95% confidence interval (CI) 1.32-5.88; P = 0.007] and not achieving an undetectable HIV viral load (VL) at delivery (aOR 2.42; 95% CI 1.21-4.85; P = 0.017) after adjusting for maternal age, ethnicity and being on antiretroviral therapy (ART) at conception. Forty-three of 84 (55%) women returned to care after LTFU. Of those with an available CD4 count, half (20 of 41) had a CD4 count <350 cells/μL and 15% (6 of 41) a CD4 count <200 cells/μL at their return. CONCLUSIONS A history of IDU and a detectable HIV VL at delivery were associated with LTFU. Effective strategies are warranted to retain women in care beyond pregnancy and to avoid CD4 cell count decline. ART continuation should be advised, especially if a subsequent pregnancy is planned.
Abstract:
Background: Access to hepatitis B viral load (VL) testing is poor in sub-Saharan Africa (SSA) due to economic and logistical reasons. Objectives: To demonstrate the feasibility of testing dried blood spots (DBS) for hepatitis B virus (HBV) VL in a laboratory in Lusaka, Zambia, and to compare HBV VLs between DBS and plasma samples. Study design: Paired plasma and DBS samples from HIV-HBV co-infected Zambian adults were analyzed for HBV VL using the COBAS AmpliPrep/COBAS TaqMan HBV test (Version 2.0) and for HBV genotype by direct sequencing. We used Bland-Altman analysis to compare VLs between sample types and by genotype. Logistic regression analysis was conducted to assess the probability of an undetectable DBS result by plasma VL. Results: Among 68 participants, median age was 34 years, 61.8% were men, and median plasma HBV VL was 3.98 log IU/ml (interquartile range, 2.04-5.95). Among sequenced viruses, 28 were genotype A1 and 27 were genotype E. Bland-Altman plots suggested strong agreement between DBS and plasma VLs. DBS VLs were on average 1.59 log IU/ml lower than plasma, with 95% limits of agreement of -2.40 to -0.83 log IU/ml. At a plasma VL ≥2,000 IU/ml, the probability of an undetectable DBS result was 1.8% (95% CI: 0.5-6.6). At a plasma VL ≥20,000 IU/ml this probability fell to 0.2% (95% CI: 0.03-1.7).
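The bias and limits of agreement quoted above are the core of a Bland-Altman analysis: the mean of the paired differences, and that mean ±1.96 standard deviations of the differences. A minimal sketch with invented log viral loads:

```python
import numpy as np

# Invented paired log10 HBV viral loads (IU/ml) for illustration.
plasma = np.array([4.1, 5.2, 3.0, 6.1, 2.5])
dbs    = np.array([2.6, 3.5, 1.4, 4.6, 0.9])

diff = dbs - plasma
bias = diff.mean()  # mean difference (DBS minus plasma)
loa = bias + np.array([-1.96, 1.96]) * diff.std(ddof=1)  # 95% limits of agreement
print(bias, loa)
```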
Abstract:
Catheter ablation of complex fractionated atrial electrograms (CFAE), also known as defragmentation ablation, may be considered for the treatment of persistent atrial fibrillation (AF) beyond pulmonary vein isolation (PVI). Concomitant antiarrhythmic drug (AAD) therapy is common, but the relevance of AAD administration and its optimal timing during ablation remain unclear. We therefore investigated the use and timing of AADs during defragmentation ablation and their possible implications for AF termination and ablation success in a large cohort of patients. We retrospectively included 200 consecutive patients (age 61 ± 12 years, LA diameter 47 ± 8 mm) with persistent AF (episode duration 47 ± 72 weeks) who underwent de novo ablation including CFAE ablation. In all patients, PVI was performed prior to CFAE ablation. The use and timing of AADs were recorded. Follow-up consisted of Holter ECGs and clinical visits. Termination of AF was achieved in 132 patients (66%). Intraprocedural AADs were administered in 168 of 200 patients (84%), 45 ± 27 min after completion of PVI. Amiodarone was used in the majority of patients (160/168). The timing of AAD administration was predicted by the atrial fibrillation cycle length (AFCL). At follow-up, 88 patients (46%) were free from atrial arrhythmia. Multivariate logistic regression analysis revealed that administration of AAD early after PVI, LA size, duration of AF history, sex and AFCL were predictors of AF termination. The administration of AAD and its timing were not predictive of outcome, and age was the sole independent predictor of AF recurrence. The administration of AAD during ablation was common in this large cohort of patients with persistent AF. The choice to administer AAD therapy and its timing during ablation were influenced by AFCL, and these factors did not significantly influence the moderate single-procedure success rate in this retrospective analysis.
Abstract:
PURPOSE This study aimed to assess the cement leakage rate and the filling pattern in patients treated with vertebroplasty, kyphoplasty and stentoplasty with and without a newly developed lavage technique. STUDY DESIGN Retrospective clinical case-control study. METHODS A newly developed bipedicular lavage technique prior to cement application was used in 64 patients (45.1%) with 116 vertebrae (the "lavage" group). A conventional bipedicular cement injection technique was used in 78 patients (54.9%) with 99 levels ("controls"). The outcome measures were filling patterns and leakage rates. RESULTS The overall leakage rate (venous, cortical defect, intradiscal) was 37.9% in the lavage group and 83.8% in the control group (p < 0.001). Venous leakage (lavage 12.9% vs. controls 31.3%; p = 0.001) and cortical defect leakage (lavage 17.2% vs. controls 63.3%; p < 0.001) were significantly lower in the lavage group, whereas intradiscal leakage was similar in both groups (lavage 12.1% vs. controls 15.2%; p = 0.51). For venous leakage, multivariate logistic regression analysis showed lavage to be the only independent predictor: lavage was associated with 0.33 times the likelihood of leakage compared with controls (95% CI 0.16-0.65; p = 0.001). CONCLUSIONS Vertebral body lavage prior to cement augmentation is a safe technique to reduce cement leakage in a clinical setting and has the potential to prevent pulmonary fat embolism. Moreover, a better filling pattern can be achieved.
Abstract:
BACKGROUND Inferolateral early repolarization (ER) is highly prevalent and is associated with idiopathic ventricular fibrillation (VF). OBJECTIVE The purpose of this study was to evaluate the potential role of T-wave parameters in differentiating between malignant and benign ER. METHODS We compared the ECGs of patients with ER and VF (n = 92) with those of control subjects with asymptomatic ER (n = 247). We assessed J-wave amplitude, QTc interval, T-wave/R-wave (T/R) ratio in leads II and V5, and the presence of low-amplitude T waves (T-wave amplitude <0.1 mV and <10% of R-wave amplitude in lead I, II, or V4-V6). RESULTS Compared to controls, the VF group had longer QTc intervals (388 ms vs 377 ms, P = .001), higher J-wave amplitudes (0.23 mV vs 0.17 mV, P < .001), a higher prevalence of low-amplitude T waves (29% vs 3%, P < .001), and a lower T/R ratio (0.18 vs 0.30, P < .001). Logistic regression analysis demonstrated that QTc interval (odds ratio [OR] per 10 ms: 1.15, 95% confidence interval [CI] 1.02-1.30), maximal J-wave amplitude (OR per 0.1 mV: 1.68, 95% CI 1.23-2.31), lower T/R ratio (OR per 0.1 unit: 0.62, 95% CI 0.47-0.81), presence of low-amplitude T waves (OR 3.53, 95% CI 1.26-9.88), and presence of J waves in the inferior leads (OR 2.58, 95% CI 1.18-5.65) were associated with malignant ER. CONCLUSION Patients with malignant ER have a higher prevalence of low-amplitude T waves, a lower T/R ratio (lead II or V5), and a longer QTc interval. The combination of these parameters with J-wave amplitude and the distribution of J waves may allow improved identification of malignant ER.
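Odds ratios "per 10 ms" or "per 0.1 mV" are obtained by rescaling the continuous predictors before fitting, so each coefficient corresponds to a clinically meaningful increment. A sketch under assumed column names, not the study's actual dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical ECG dataset: vf (1 = malignant ER), plus measured parameters.
df = pd.read_csv("er_ecg.csv")

# Rescale so one model unit matches the increments quoted in the abstract.
df["qtc_10ms"] = df["qtc_ms"] / 10        # per 10 ms QTc
df["jwave_01mv"] = df["jwave_mv"] / 0.1   # per 0.1 mV J-wave amplitude
df["tr_01"] = df["tr_ratio"] / 0.1        # per 0.1 unit T/R ratio

model = smf.logit(
    "vf ~ qtc_10ms + jwave_01mv + tr_01 + low_t_waves + inferior_j",
    data=df,
).fit()
print(np.exp(model.params))  # odds ratios per stated increment
```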
Abstract:
OBJECTIVE The aim of this study was to elucidate the relationship between the echogenicity of carotid artery plaques and the following risk factors: circulating oxLDL, hsCRP, the metabolic syndrome (MetS), and several traditional cardiovascular (CV) risk factors. MATERIAL AND METHODS A cross-sectional population-based study of 513 sixty-one-year-old men. The levels of circulating oxLDL were determined in plasma samples by sandwich ELISA using a specific murine monoclonal antibody (mAb-4E6). High-sensitivity CRP was measured in plasma by ELISA. Plaque occurrence, size and echogenicity were evaluated from B-mode ultrasound registrations of the carotid arteries. Plaque echogenicity was assessed on a four-grade classification scale. RESULTS A higher frequency of echolucent carotid plaques was observed with increasing levels of oxLDL and systolic blood pressure (p = 0.008 and p = 0.041, respectively). Subjects with the MetS had a significantly higher frequency of echogenic plaques than subjects without the MetS (p = 0.009). In a multiple logistic regression analysis, oxLDL was independently associated with echolucent carotid plaques. CONCLUSIONS The occurrence of echolucent carotid plaques was associated with oxLDL and systolic blood pressure, and oxLDL was associated with echolucent carotid plaques independently of systolic blood pressure.