50 results for "Adjusted Average Time to Signal"


Relevance:

100.00%

Publisher:

Abstract:

Xu and colleagues evaluated the impact of increasing mean arterial blood pressure levels through norepinephrine administration on systemic hemodynamics, tissue perfusion, and sublingual microcirculation of septic shock patients with chronic hypertension. The authors concluded that, although increasing arterial blood pressure improved sublingual microcirculation parameters, no concomitant improvement in systemic tissue perfusion indicators was found. Here, we discuss why resuscitation targets may need to be individualized, taking into account the patient's baseline condition, and present directions for future research in this field.


PURPOSE: The purpose was to study the emergency management of patients with suspected meningitis to identify potential areas for improvement. METHODS: All patients who underwent cerebrospinal fluid puncture at the emergency department of the University Hospital of Bern from January 31, 2004, to October 30, 2008, were included, for a total of 396 patients. For each patient, we analyzed the sequence and timing of the following management steps: first contact with medical staff, administration of the first antibiotic dose, lumbar puncture (LP), head imaging, and blood cultures. The results were analyzed in relation to clinical characteristics and the referral diagnosis on admission. RESULTS: Of the 396 patients analyzed, 15 (3.7%) had a discharge diagnosis of bacterial meningitis, 119 (30%) had nonbacterial meningitis, and 262 (66.3%) had no evidence of meningitis. Suspicion of meningitis led to earlier antibiotic therapy than suspicion of an acute cerebral event or nonacute cerebral event (P < .0001). In patients with bacterial meningitis, the average time to antibiotics was 136 minutes, with a range of 0 to 340 minutes. Most patients (60.1%) had brain imaging studies performed before LP. On the other hand, half of the patients with a referral diagnosis of meningitis (50%) received antibiotics before performance of an LP. CONCLUSIONS: Few patients with suspected meningitis received antimicrobial therapy within the first 30 minutes after arrival, but most patients with pneumococcal meningitis and typical symptoms were treated early; patients with bacterial meningitis who received treatment late had complex medical histories or atypical presentations.


BACKGROUND: Medial ankle joint pain with localized cartilage degeneration due to medial joint overload in varus malalignment of the hindfoot lends itself to treatment by lateral closing wedge supramalleolar osteotomy. METHODS: From 1998 to 2003, nine patients between the ages of 21 and 59 years were operated on. The etiology of the malalignment and degeneration was posttraumatic in eight and childhood osteomyelitis in one. Preoperative and postoperative standing radiographs were analyzed to determine the correction of the deformity and the grade of degeneration. Function and pain were assessed using the American Orthopaedic Foot and Ankle Society (AOFAS) Ankle-Hindfoot Scale. The average followup was 56 (range 15 to 88) months. RESULTS: The average time to osseous union was 10 +/- 3.31 weeks. There were no operative or postoperative complications. The average AOFAS score improved from 48 +/- 16.0 preoperatively to 74 +/- 11.7 postoperatively (p<0.004). The average pain subscore improved from 16 +/- 8.8 to 30 +/- 7.1 (p<0.008). The average tibial-ankle surface angle improved from 6.9 +/- 3.8 degrees of varus preoperatively to 0.6 +/- 1.9 degrees of valgus postoperatively (p<0.004). In the sagittal plane, the tibial-lateral-surface angle remained unchanged. At the final followup, two patients showed progression of radiographic ankle arthrosis grades: in one patient from grade 0 to I, and in the other from grade II to III, with subsequent ankle arthrodesis required 16 months after the index procedure. Seven patients returned to their previous work. CONCLUSIONS: Lateral supramalleolar closing wedge osteotomy was an easy and safe procedure, effectively correcting hindfoot malalignment, relieving pain, restoring function, and halting progression of the degeneration in the short to mid term in seven of nine patients.


Abstract Background: The aim of this study was to examine mechanical, microbiologic, and morphologic changes of the appendiceal rim to assess whether it is appropriate to dissect the appendix with the ultrasound-activated scalpel (UAS) during laparoscopic appendectomy. Materials and Methods: After laparoscopic resection of the appendix using conventional Roeder slings, we investigated 50 appendiceal rims with an in vitro procedure. The overall time of dissection of the mesoappendix with the UAS was noted. Following removal, the appendix was dissected in vitro with the UAS 1 cm from the resection rim. Seal-burst pressures were recorded. Bacterial cultures of the UAS-resected rim were compared with those of the scissors-resected rim. Tissue changes were quantified histologically with hematoxylin and eosin (HE) stains. Results: The average time to dissect the mesoappendix was 228 seconds (range 25-900). Bacterial culture growth was lower in the UAS-resected probes (7 versus 36 positive probes; p < 0.01). HE-stained tissues revealed mean histologic changes in the lamina propria muscularis externa of 2 mm depth. The seal-burst pressure of the appendiceal lumen had a mean of 420 mbar. Seal-burst pressures and depths of histologic changes did not depend on the stage of appendicitis investigated, gender, or age group. Seal-burst pressure levels were not related to different depths of tissue changes (p = 0.64). Conclusions: The UAS is a rapid instrument for laparoscopic appendectomy and appears to be safe with respect to stability, sterility, and tissue changes. It avoids complex, time-consuming instrument-change manoeuvres and current transmission, which may induce intra- and postoperative complications. Our results suggest that keeping a safety margin of at least 5 mm from the bowel would be sufficient to avoid thermal damage.


OBJECTIVE: In search of an optimal compression therapy for venous leg ulcers, a systematic review and meta-analysis was performed of randomized controlled trials (RCT) comparing compression systems based on stockings (MCS) with various bandages. METHODS: RCT were retrieved from six sources and reviewed independently. The primary endpoint, completion of healing within a defined time frame, and the secondary endpoints, time to healing and pain, were entered into a meta-analysis using the tools of the Cochrane Collaboration. Additional subjective endpoints were summarized. RESULTS: Eight RCT (published 1985-2008) fulfilled the predefined criteria. Data presentation was adequate and showed moderate heterogeneity. The studies included 692 patients (21-178/study, mean age 61 years, 56% women). A total of 688 ulcerated legs were analyzed; the ulcers had been present for 1 week to 9 years and measured 1 to 210 cm². The observation period ranged from 12 to 78 weeks. Patient and ulcer characteristics were evenly distributed in three studies, favored the stocking groups in four, and the bandage group in one. Data on the pressure exerted by stockings and bandages were reported in seven and two studies, amounting to 31-56 and 27-49 mm Hg, respectively. The proportion of ulcers healed was greater with stockings than with bandages (62.7% vs 46.6%; P < .00001). The average time to healing (seven studies, 535 patients) was 3 weeks shorter with stockings (P = .0002). In no study did bandages perform better than MCS. Pain was assessed in three studies (219 patients), revealing an important advantage of stockings (P < .0001). Other subjective parameters and issues of nursing revealed an advantage of MCS as well. CONCLUSIONS: Leg compression with stockings is clearly better than compression with bandages, has a positive impact on pain, and is easier to use.
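The headline comparison (62.7% vs 46.6% of ulcers healed) can be sanity-checked with a pooled two-proportion z-test. The sketch below assumes, purely for illustration, an even split of the 688 legs between the stocking and bandage arms, since the abstract does not report the group sizes:

```python
import math

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Pooled two-proportion z-statistic for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical equal arms of 344 legs each (true group sizes not given).
z = two_proportion_z(0.627, 344, 0.466, 344)  # → about 4.24
```

A z-statistic above roughly 4 corresponds to a two-sided p-value well below .0001, consistent with the P < .00001 reported by the meta-analysis.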


BACKGROUND: We sought to characterize the impact that hepatitis C virus (HCV) infection has on CD4 cells during the first 48 weeks of antiretroviral therapy (ART) in previously ART-naive human immunodeficiency virus (HIV)-infected patients. METHODS: The HIV/AIDS Drug Treatment Programme at the British Columbia Centre for Excellence in HIV/AIDS distributes all ART in this Canadian province. Eligible individuals were those whose first-ever ART included 2 nucleoside reverse transcriptase inhibitors and either a protease inhibitor or a nonnucleoside reverse transcriptase inhibitor and who had a documented positive result for HCV antibody testing. Outcomes were binary events (time to an increase of > or = 75 CD4 cells/mm3 or an increase of > or = 10% in the percentage of CD4 cells in the total T cell population [CD4 cell fraction]) and continuous repeated measures. Statistical analyses used parametric and nonparametric methods, including multivariate mixed-effects linear regression analysis and Cox proportional hazards analysis. RESULTS: Of 1186 eligible patients, 606 (51%) were positive and 580 (49%) were negative for HCV antibodies. HCV antibody-positive patients were slower to have an absolute (P<.001) and a fraction (P = .02) CD4 cell event. In adjusted Cox proportional hazards analysis (controlling for age, sex, baseline absolute CD4 cell count, baseline pVL, type of ART initiated, AIDS diagnosis at baseline, adherence to ART regimen, and number of CD4 cell measurements), HCV antibody-positive patients were less likely to have an absolute CD4 cell event (adjusted hazard ratio [AHR], 0.84 [95% confidence interval [CI], 0.72-0.98]) and somewhat less likely to have a CD4 cell fraction event (AHR, 0.89 [95% CI, 0.70-1.14]) than HCV antibody-negative patients. 
In multivariate mixed-effects linear regression analysis, HCV antibody-negative patients had increases of an average of 75 cells in the absolute CD4 cell count and 4.4% in the CD4 cell fraction, compared with 20 cells and 1.1% in HCV antibody-positive patients, during the first 48 weeks of ART, after adjustment for time-updated pVL, number of CD4 cell measurements, and other factors. CONCLUSION: HCV antibody-positive HIV-infected patients may have an altered immunologic response to ART.


BACKGROUND The use of combination antiretroviral therapy (cART), comprising three antiretroviral medications from at least two classes of drugs, is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment, regardless of immunologic thresholds or clinical condition, for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, or CD4 counts ≤ 750 cells/mm³, or per cent CD4 ≤ 25%). This Cochrane review summarizes the currently available evidence regarding the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies which followed children from enrolment to start of cART and on cART.
DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial and both compared initiation of cART regardless of clinical-immunological conditions with deferred initiation until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included. The adjusted hazard ratios for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16). 
AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials in support of either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues such as the retention in care of children in ART programmes in resource-limited settings will need to be considered when formulating WHO 2013 recommendations.
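The relative risks quoted in the review (e.g. RR 0.95; 95% CI 0.24 to 3.73) follow the standard log-scale construction for a CI around a risk ratio. A minimal sketch, using illustrative event counts rather than the review's actual tables (which are not given here):

```python
import math

def relative_risk_ci(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    """Relative risk with an approximate 95% CI computed on the log scale."""
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) for two independent binomial samples.
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Illustrative counts only: 10/100 events vs 5/100 events.
rr, lo, hi = relative_risk_ci(10, 100, 5, 100)
```

With zero events in one arm (as for deaths in the deferred group), this formula breaks down, which is why reviews typically apply a continuity correction in that case.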


BACKGROUND: In high-income countries, viral load is routinely measured to detect failure of antiretroviral therapy (ART) and guide switching to second-line ART. Viral load monitoring is not generally available in resource-limited settings. We examined switching from nonnucleoside reverse transcriptase inhibitor (NNRTI)-based first-line regimens to protease inhibitor-based regimens in Africa, South America and Asia. DESIGN AND METHODS: Multicohort study of 17 ART programmes. All sites monitored CD4 cell count and had access to second-line ART and 10 sites monitored viral load. We compared times to switching, CD4 cell counts at switching and obtained adjusted hazard ratios for switching (aHRs) with 95% confidence intervals (CIs) from random-effects Weibull models. RESULTS: A total of 20 113 patients, including 6369 (31.7%) patients from 10 programmes with access to viral load monitoring, were analysed; 576 patients (2.9%) switched. Low CD4 cell counts at ART initiation were associated with switching in all programmes. Median time to switching was 16.3 months [interquartile range (IQR) 10.1-26.6] in programmes with viral load monitoring and 21.8 months (IQR 14.0-21.8) in programmes without viral load monitoring (P < 0.001). Median CD4 cell counts at switching were 161 cells/microl (IQR 77-265) in programmes with viral load monitoring and 102 cells/microl (44-181) in programmes without viral load monitoring (P < 0.001). Switching was more common in programmes with viral load monitoring during months 7-18 after starting ART (aHR 1.38; 95% CI 0.97-1.98), similar during months 19-30 (aHR 0.97; 95% CI 0.58-1.60) and less common during months 31-42 (aHR 0.29; 95% CI 0.11-0.79). CONCLUSION: In resource-limited settings, switching to second-line regimens tends to occur earlier and at higher CD4 cell counts in ART programmes with viral load monitoring compared with programmes without viral load monitoring.


OBJECT: Ultrasound may be a reliable but simpler alternative to intraoperative MR imaging (iMR imaging) for tumor resection control. However, its reliability in the detection of tumor remnants has not been definitively proven. The aim of this study was to compare high-field iMR imaging (1.5 T) and high-resolution 2D ultrasound in terms of tumor resection control. METHODS: A prospective comparative study of 26 consecutive patients was performed. The following parameters were compared: the existence of tumor remnants after presumed radical removal and the quality of the images. Tumor remnants were categorized as detectable with both imaging modalities or visible with only 1 modality. RESULTS: Tumor remnants were detected in 21 cases (80.8%) with iMR imaging. All large remnants were demonstrated with both modalities, and their image quality was good. Two-dimensional ultrasound was not as effective in detecting remnants <1 cm. Two remnants detected with iMR imaging were missed by ultrasound. In 2 cases, suspicious signals visible only on ultrasound images were misinterpreted as remnants but turned out to be a blood clot and peritumoral parenchyma. The average time for acquisition of an ultrasound image was 2 minutes, whereas that for an iMR image was approximately 10 minutes. Neither modality resulted in any procedure-related complications or morbidity. CONCLUSIONS: Intraoperative MR imaging is more precise in detecting small tumor remnants than 2D ultrasound. Nevertheless, the latter may be used as a less expensive and less time-consuming alternative that provides almost real-time feedback. Its accuracy is highest in cases of more confined, deeply located remnants; in cases of more superficially located remnants, its role is more limited.


BACKGROUND Diagnosing supraventricular arrhythmias by conventional long-term ECG can be cumbersome because of poor P-wave visibility. Esophageal long-term electrocardiography (eECG) has an excellent sensitivity for atrial signals and may overcome this limitation. However, the optimal lead insertion depth (OLID) is not known. METHODS We registered eECGs at different lead insertion depths in 27 patients and analyzed 199,716 atrial complexes with respect to signal amplitude and slope. Correlation and regression analyses were used to find a criterion for OLID. RESULTS Atrial signal amplitudes and slopes depend significantly on lead insertion depth. OLID correlates with body height (Spearman r = 0.71) and can be estimated as OLID [cm] = 0.25 × body height [cm] − 7 cm. At this insertion depth, we recorded the largest esophageal atrial signal amplitudes (1.27 ± 0.86 mV), which were much larger than those of conventional surface lead II (0.19 ± 0.10 mV, p < 0.0001). CONCLUSION The OLID depends on body height and can be calculated by a simple regression formula.
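The regression formula reported above lends itself to a one-line helper. The sketch below simply encodes OLID [cm] = 0.25 × body height [cm] − 7 cm; the function name is our own, not from the paper:

```python
def olid_cm(body_height_cm: float) -> float:
    """Estimated optimal esophageal lead insertion depth (OLID), per the
    reported regression: OLID [cm] = 0.25 * body height [cm] - 7 cm."""
    return 0.25 * body_height_cm - 7.0

# A 180 cm patient would get an estimated insertion depth of 38.0 cm.
depth = olid_cm(180)
```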


BACKGROUND In Chopart-level amputations, the heel often deviates into equinus and varus when, due to the lack of healthy anterior soft tissue, rebalancing tendon transfers to the talar head are not possible. Consequently, anterior and lateral wound dehiscence and ulceration may occur, requiring higher-level amputation to achieve wound closure, with considerable loss of function for the patients. METHODS Twenty-four consecutive patients (15 diabetes, 6 trauma, and 3 tumor) had Chopart's amputation and simultaneous or delayed additional ankle dorsiflexion arthrodesis to allow for tension-free wound closure or soft tissue reconstruction, or to treat secondary recurrent ulcerations. Percutaneous Achilles tendon lengthening and subtalar arthrodesis were added as needed. Wound healing problems, time to fusion and to full weight-bearing in the prosthesis, prosthesis-related complications, and ambulatory status were assessed. Satisfaction and function were evaluated by the AmpuPro score and the validated Prosthesis Evaluation Questionnaire scale. RESULTS Five patients had successful soft tissue healing and fusions but died of their underlying disease 2 to 46 months after the operation. Two diabetic patients required a transtibial amputation. The other 17 patients were followed for 27 months (range, 13-63). The average age of the 4 women and 13 men was 53.9 years (range, 16-87). Postoperative complications included minor wound healing problems in 8 patients, wound breakdown requiring revision in 4, phantom pain in 3, residual equinus in 1, and adjacent scar carcinoma in 1 patient. The time to full weight-bearing in the prosthesis ranged from 6 to 24 weeks (mean 10). The mean AmpuPro score was 107 points (of 120), and the mean Prosthesis Evaluation Questionnaire scale was 147 points (of 200). No complications occurred with the prosthesis. Twelve patients lost 1 to 2 mobility classes (mean 0.9). The arthrodeses all healed within 2.5 months (range, 1.5 to 5 months).
CONCLUSION Adding an ankle arthrodesis to a Chopart's amputation either immediately or in a delayed fashion to treat anterior soft tissue complications was a successful salvage in most patients at this amputation level. It enabled the patients to preserve the advantages of a full-length limb with terminal weight-bearing. LEVEL OF EVIDENCE Level IV, retrospective case series.


BACKGROUND Epidemiological studies show that elevated levels of particulate matter in ambient air are highly correlated with respiratory and cardiovascular diseases. Atmospheric particles originate from a large number of sources and have a highly complex and variable composition. An assessment of their potential health risks and the identification of the most toxic particle sources would require a large number of investigations. For ethical and economic reasons, it is desirable to reduce the number of in vivo studies and to develop suitable in vitro systems for the investigation of cell-particle interactions. METHODS We present the design of a new particle deposition chamber in which aerosol particles are deposited onto cell cultures out of a continuous air flow. The chamber allows for the simultaneous exposure of 12 cell cultures. RESULTS Physiological conditions within the deposition chamber can be sustained constantly at 36-37°C and 90-95% relative humidity. Particle deposition within the chamber, and especially on the cell cultures, was determined in detail, showing that during a deposition time of 2 hours, 8.4% (24% relative standard deviation) of particles with a mean diameter of 50 nm [mass median diameter of 100 nm (geometric standard deviation 1.7)] are deposited on the cell cultures, which is equal to 24-34% of all charged particles. The average well-to-well variability of particles deposited simultaneously in the 12 cell cultures during an experiment is 15.6% (24.7% relative standard deviation). CONCLUSIONS This particle deposition chamber is a new in vitro system for investigating realistic cell-particle interactions at physiological conditions, minimizing stress on the cell cultures other than from the deposited particles. Detailed knowledge of the particle deposition characteristics on the cell cultures allows reliable dose-response relationships to be evaluated. The compact and portable design of the deposition chamber allows for measurements at any particle source of interest.
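The well-to-well variability figure above is a mean plus a relative standard deviation across the 12 simultaneously exposed cultures. A minimal sketch of that calculation, using hypothetical per-well deposited fractions since the raw per-well data are not given here:

```python
import statistics

def well_to_well_rsd(deposits: list[float]) -> tuple[float, float]:
    """Mean deposited fraction and relative standard deviation (%) across wells."""
    mean = statistics.mean(deposits)
    rsd_pct = 100 * statistics.stdev(deposits) / mean
    return mean, rsd_pct

# Hypothetical deposited fractions (%) for the 12 cell cultures of one run.
wells = [8.1, 7.6, 9.0, 8.8, 7.9, 8.4, 8.6, 8.2, 7.7, 9.1, 8.5, 8.0]
mean, rsd_pct = well_to_well_rsd(wells)
```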


We investigated the association between exposure to radio-frequency electromagnetic fields (RF-EMFs) from broadcast transmitters and childhood cancer. First, we conducted a time-to-event analysis including children under age 16 years living in Switzerland on December 5, 2000. Follow-up lasted until December 31, 2008. Second, all children living in Switzerland for some time between 1985 and 2008 were included in an incidence density cohort. RF-EMF exposure from broadcast transmitters was modeled. Based on 997 cancer cases, adjusted hazard ratios in the time-to-event analysis for the highest exposure category (>0.2 V/m) as compared with the reference category (<0.05 V/m) were 1.03 (95% confidence interval (CI): 0.74, 1.43) for all cancers, 0.55 (95% CI: 0.26, 1.19) for childhood leukemia, and 1.68 (95% CI: 0.98, 2.91) for childhood central nervous system (CNS) tumors. Results of the incidence density analysis, based on 4,246 cancer cases, were similar for all types of cancer and leukemia but did not indicate a CNS tumor risk (incidence rate ratio = 1.03, 95% CI: 0.73, 1.46). This large census-based cohort study did not suggest an association between predicted RF-EMF exposure from broadcasting and childhood leukemia. Results for CNS tumors were less consistent, but the most comprehensive analysis did not suggest an association.
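An incidence rate ratio like the 1.03 reported for CNS tumors in the incidence density analysis is a ratio of incidence densities (cases per person-year in the exposed group over cases per person-year in the reference group). A sketch with hypothetical counts and person-years, not the study's actual data:

```python
def incidence_rate_ratio(cases_exposed: int, py_exposed: float,
                         cases_ref: int, py_ref: float) -> float:
    """Incidence rate ratio: exposed-group rate divided by reference-group rate."""
    return (cases_exposed / py_exposed) / (cases_ref / py_ref)

# Hypothetical figures: 30 cases over 29,000 person-years in the exposed
# group vs 950 cases over 950,000 person-years in the reference group.
irr = incidence_rate_ratio(30, 29_000, 950, 950_000)  # → about 1.03
```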