27 results for Time Optimal
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Objective: To compare clinical outcomes after laparoscopic cholecystectomy (LC) for acute cholecystitis performed at various time-points after hospital admission. Background: Symptomatic gallstones represent an important public health problem, with LC the treatment of choice. LC is increasingly offered for acute cholecystitis; however, the optimal time-point for LC in this setting remains a matter of debate. Methods: Analysis was based on the prospective database of the Swiss Association of Laparoscopic and Thoracoscopic Surgery and included patients undergoing emergency LC for acute cholecystitis between 1995 and 2006, grouped according to the time-point of LC after hospital admission (admission day (d0), d1, d2, d3, d4/5, d ≥6). Linear and generalized linear regression models assessed the effect of timing of LC on intra- and postoperative complications, conversion and reoperation rates, and length of postoperative hospital stay. Results: Of 4113 patients, 52.8% were female and the median age was 59.8 years. Delaying LC resulted in significantly higher conversion rates (from 11.9% at d0 to 27.9% at d ≥6 days after admission, P < 0.001), surgical postoperative complications (5.7% to 13%, P < 0.001) and reoperation rates (0.9% to 3%, P = 0.007), with a significantly longer postoperative hospital stay (P < 0.001). Conclusions: Delaying LC for acute cholecystitis has no advantages; it results in significantly higher conversion and reoperation rates, more postoperative complications and a longer postoperative hospital stay. This investigation, one of the largest in the literature, provides compelling evidence that acute cholecystitis merits surgery within 48 hours of hospital admission if the impact on the patient and the health care system is to be minimized.
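As an illustration only of the kind of generalized linear model the Methods describe, the sketch below fits a logistic regression of conversion on delay group with the admission-day group as reference. The data frame, group labels and use of the statsmodels library are assumptions made for the example; this is not the study's analysis code or data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level data: delay group and conversion outcome (0/1).
rng = np.random.default_rng(0)
groups = np.repeat(["d0", "d1", "d2", "d3", "d4_5", "d6plus"], 60)
risk = np.repeat([0.12, 0.14, 0.17, 0.20, 0.24, 0.28], 60)  # made-up conversion risks
df = pd.DataFrame({"delay_group": groups, "converted": rng.binomial(1, risk)})

# Logistic regression (a generalized linear model) with d0 as the reference level.
model = smf.logit("converted ~ C(delay_group, Treatment(reference='d0'))", data=df).fit()
print(np.exp(model.params))  # odds ratios of conversion relative to surgery on the admission day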
Is there an optimal scan time for 6-[F-18]fluoro-L-DOPA PET in pheochromocytomas and paragangliomas?
Abstract:
To define the appropriate scan time for fluorine-18-labeled dihydroxyphenylalanine (F-18 DOPA) PET in oncological imaging of pheochromocytomas and paragangliomas.
Abstract:
BACKGROUND The use of combination antiretroviral therapy (cART) comprising three antiretroviral medications from at least two classes of drugs is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment regardless of immunologic thresholds or the clinical condition for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage or CD4 counts ≤ 750 cells/mm(3) or per cent CD4 ≤ 25%). This Cochrane review assesses the currently available evidence regarding the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies that followed children from enrolment to the start of cART and on cART. DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including the incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random-effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial, and both compared immediate initiation of cART regardless of clinical or immunological condition with initiation deferred until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and in Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups, respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included.
The adjusted hazard ratio for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16). AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials in support of either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues, such as the retention in care of children in ART programmes in resource-limited settings, will need to be considered when formulating the WHO 2013 recommendations.
Abstract:
The precise timing of events in the brain has consequences for intracellular processes, synaptic plasticity, integration and network behaviour. Pyramidal neurons, the most widespread excitatory neurons of the neocortex, have multiple spike initiation zones, which interact via dendritic and somatic spikes actively propagating in all directions within the dendritic tree. For these neurons, therefore, both the location and timing of synaptic inputs are critical. The time window within which the backpropagating action potential can influence dendritic spike generation has been extensively studied in layer 5 neocortical pyramidal neurons of rat somatosensory cortex. Here, we re-examine this coincidence detection window for pyramidal cell types across the rat somatosensory cortex in layers 2/3, 5 and 6. We find that the time window for optimal interaction is widest and shifted in layer 5 pyramidal neurons relative to cells in layers 6 and 2/3. Inputs arriving at the same times and locations will therefore differentially affect spike-timing-dependent processes in the different classes of pyramidal neurons.
Abstract:
Objective: We compare the prognostic strength of the lymph node ratio (LNR), the number of positive lymph nodes (+LNs) and the number of collected lymph nodes (LNcoll) using a time-dependent analysis in colorectal cancer patients stratified by mismatch repair (MMR) status. Method: 580 stage III-IV patients were included. Multivariable Cox regression analysis and time-dependent receiver operating characteristic (tROC) curve analysis were performed. The area under the curve (AUC) over time was compared for the three features. Results were validated on a second cohort of 105 stage III-IV patients. Results: The AUC for the LNR was 0.71 and outperformed +LNs and LNcoll by 10-15% in both MMR-proficient and MMR-deficient cancers. LNR and +LNs were both significant (p<0.0001) in multivariable analysis, but the effect was considerably stronger for the LNR [LNR: HR=5.18 (95% CI: 3.5-7.6); +LNs: HR=1.06 (95% CI: 1.04-1.08)]. Similar results were obtained for patients with >12 LNcoll. An optimal cut-off score of LNR=0.231 was validated on the second cohort (p<0.001). Conclusion: The LNR outperforms +LNs and LNcoll even in patients with >12 LNcoll. Its clinical value is not confounded by MMR status. A cut-off score of 0.231 may best stratify patients into prognostic subgroups and could be a basis for future prospective analyses of the LNR.
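As a purely illustrative sketch of how the lymph node ratio is derived and entered into a multivariable Cox model, the following code uses the lifelines library on a hypothetical data frame; the column names, simulated data and threshold handling are assumptions for the example, not the study's cohort or analysis code.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: the LNR is the number of positive lymph nodes divided
# by the number of collected lymph nodes.
rng = np.random.default_rng(1)
n = 200
collected = rng.integers(8, 40, size=n)               # lymph nodes collected (LNcoll)
positive = rng.binomial(collected, 0.2)                # positive lymph nodes (+LNs)
lnr = positive / collected                             # lymph node ratio (LNR)
df = pd.DataFrame({
    "LNR": lnr,
    "pos_LN": positive,
    "time": rng.exponential(60 / (1 + 4 * lnr)),       # follow-up time in months (simulated)
    "event": rng.binomial(1, 0.6, size=n),             # death/recurrence indicator (simulated)
})

# Multivariable Cox regression with LNR and positive node count as covariates.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                                    # hazard ratios for LNR and pos_LN

# Stratification by a fixed cut-off, analogous to the validated LNR threshold of 0.231.
df["high_LNR"] = (df["LNR"] > 0.231).astype(int)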
Abstract:
Endocrine treatments have been used in breast cancer since 1896, when Beatson reported on the results of oophorectomy for advanced breast cancer. In the second half of the last century, different endocrine-based compounds were developed and, in this review, the role of the selective estrogen receptor modulators (SERMs) and selective estrogen receptor down regulators (SERDs) in the postmenopausal setting are discussed. Tamoxifen is the most investigated and most widely used representative of these agents, and has been introduced in the advanced disease, in the neoadjuvant and adjuvant setting, and for the prevention of the disease. Its role has been challenged in recent years by the introduction of third-generation aromatase inhibitors that have proven higher activities than tamoxifen with different toxicity patterns. Several other SERMs have been investigated, but none have been clearly superior to tamoxifen. SERDs act as pure estrogen antagonists and should compare favourably to tamoxifen. For the time being, they have been used in the treatment of advanced breast cancers and their role in other settings still needs investigation. The increased use of aromatase inhibitors as first-line endocrine therapy has resulted in new discussions regarding the role that tamoxifen and other SERMs or SERDs may play in breast cancer. The sequencing of endocrine therapies in hormone-sensitive breast cancer remains a very important research issue.
Abstract:
The optimal temporal window for intravenous (IV) computed tomography (CT) cholangiography was prospectively determined. Fifteen volunteers (eight women, seven men; mean age, 38 years) underwent dynamic CT cholangiography. Two unenhanced images were acquired at the porta hepatis. Starting 5 min after initiation of the IV contrast infusion (20 ml iodipamide meglumine 52%), 15 pairs of images were obtained at 5-min intervals. Attenuation of the extrahepatic bile duct (EBD) and the liver parenchyma was measured. Two readers graded visualization of the higher-order biliary branches. The first biliary opacification in the EBD occurred between 15 and 25 min (mean, 22.3 min +/- 3.2) after initiation of the contrast agent. Biliary attenuation plateaued between the 35- and 75-min time points. Maximum hepatic parenchymal enhancement was 18.5 HU +/- 2.7. Twelve subjects demonstrated poor or non-visualization of higher-order biliary branches; three showed good or excellent visualization. Body weight correlated significantly with both biliary attenuation and visualization of the higher-order biliary branches (P<0.05). For peak enhancement of the biliary tree, CT cholangiography should be performed no earlier than 35 min after initiation of the IV infusion. For a fixed contrast dose, superior visualization of the biliary system is achieved in subjects with lower body weight.
Abstract:
OBJECTIVE: To investigate predictors of continued HIV RNA viral load suppression in individuals switched to abacavir (ABC), lamivudine (3TC) and zidovudine (ZDV) after successful previous treatment with a protease inhibitor- or non-nucleoside reverse transcriptase inhibitor-based combination antiretroviral therapy. DESIGN AND METHODS: An observational cohort study, which included individuals in the Swiss HIV Cohort Study switching to ABC/3TC/ZDV following successful suppression of viral load. The primary endpoint was time to treatment failure, defined as the first of the following events: two consecutive viral load measurements > 400 copies/ml under ABC/3TC/ZDV, one viral load measurement > 400 copies/ml with subsequent discontinuation of ABC/3TC/ZDV within 3 months, AIDS, or death. RESULTS: We included 495 individuals; 47 experienced treatment failure in 1459 person-years of follow-up [rate = 3.22 events/100 person-years; 95% confidence interval (95% CI), 2.30-4.14]. Of all failures, 62% occurred in the first year after switching to ABC/3TC/ZDV. In a Cox regression analysis, treatment failure was independently associated with earlier exposure to nucleoside reverse transcriptase inhibitor (NRTI) mono or dual therapy [hazard ratio (HR), 8.02; 95% CI, 4.19-15.35] and low CD4 cell count at the time of the switch (HR, 0.66; 95% CI, 0.51-0.87 per +100 cells/microl up to 500 cells/microl). In patients without earlier exposure to mono or dual therapy, AIDS prior to the switch to simplified maintenance therapy was an additional risk factor. CONCLUSIONS: The failure rate was low in patients with suppressed viral load who switched to ABC/3TC/ZDV. Patients with earlier exposure to mono or dual NRTI therapy, a low CD4 cell count at the time of the switch, or prior AIDS are at increased risk of treatment failure, limiting the use of ABC/3TC/ZDV in these patient groups.
Abstract:
OBJECTIVE: In search of an optimal compression therapy for venous leg ulcers, a systematic review and meta-analysis was performed of randomized controlled trials (RCTs) comparing compression systems based on stockings (MCS) with various bandages. METHODS: RCTs were retrieved from six sources and reviewed independently. The primary endpoint, completion of healing within a defined time frame, and the secondary endpoints, time to healing and pain, were entered into a meta-analysis using the tools of the Cochrane Collaboration. Additional subjective endpoints were summarized. RESULTS: Eight RCTs (published 1985-2008) fulfilled the predefined criteria. Data presentation was adequate and showed moderate heterogeneity. The studies included 692 patients (21-178/study, mean age 61 years, 56% women). A total of 688 ulcerated legs were analyzed, present for 1 week to 9 years and measuring 1 to 210 cm(2). The observation period ranged from 12 to 78 weeks. Patient and ulcer characteristics were evenly distributed in three studies, favored the stocking groups in four, and favored the bandage group in one. Data on the pressure exerted by stockings and bandages were reported in seven and two studies, amounting to 31-56 and 27-49 mm Hg, respectively. The proportion of ulcers healed was greater with stockings than with bandages (62.7% vs 46.6%; P < .00001). The average time to healing (seven studies, 535 patients) was 3 weeks shorter with stockings (P = .0002). In no study did bandages perform better than MCS. Pain was assessed in three studies (219 patients), revealing an important advantage of stockings (P < .0001). Other subjective parameters and issues of nursing also revealed an advantage of MCS. CONCLUSIONS: Leg compression with stockings is clearly better than compression with bandages, has a positive impact on pain, and is easier to use.
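To illustrate the random-effects pooling step that such a meta-analysis involves, the sketch below pools risk ratios of healing with a DerSimonian-Laird estimate in NumPy. The per-study event counts are hypothetical, and the code is not the review's actual analysis or the Cochrane Collaboration software.

import numpy as np

# Hypothetical healed/total counts in the stocking and bandage arms of four trials.
stock = np.array([[30, 45], [50, 80], [20, 30], [40, 70]], dtype=float)
band  = np.array([[22, 44], [38, 82], [14, 29], [30, 72]], dtype=float)

rr = (stock[:, 0] / stock[:, 1]) / (band[:, 0] / band[:, 1])             # per-study risk ratios
y = np.log(rr)                                                           # log risk ratios
v = (1/stock[:, 0] - 1/stock[:, 1]) + (1/band[:, 0] - 1/band[:, 1])      # variances of log RR

w = 1 / v                                                                # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)                     # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DerSimonian-Laird
w_re = 1 / (v + tau2)                                                    # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))

print("pooled RR:", np.exp(pooled))
print("95% CI:", np.exp(pooled - 1.96 * se), "-", np.exp(pooled + 1.96 * se))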
Abstract:
Background Cardiac arrests are handled by teams rather than by individual health-care workers. Recent investigations demonstrate that adherence to CPR guidelines can be less than optimal, that deviations from treatment algorithms are associated with lower survival rates, and that deficits in performance are associated with shortcomings in the process of team-building. The aim of this study was to explore and quantify the effects of ad-hoc team-building on adherence to the algorithms of CPR among two types of physicians who play an important role as first responders during CPR: general practitioners and hospital physicians. Methods To unmask the effects of team-building, this prospective randomised study compared the performance of preformed teams, i.e. teams that had undergone their process of team-building prior to the onset of a cardiac arrest, with that of teams that had to form ad hoc during the cardiac arrest. 50 teams consisting of three general practitioners each and 50 teams consisting of three hospital physicians each were randomised to two different versions of a simulated witnessed cardiac arrest: the arrest occurred either in the presence of only one physician while the remaining two physicians were summoned to help ("ad-hoc"), or it occurred in the presence of all three physicians ("preformed"). All scenarios were videotaped and performance was analysed post hoc by two independent observers. Results Compared to preformed teams, ad-hoc forming teams had less hands-on time during the first 180 seconds of the arrest (93 ± 37 vs. 124 ± 33 sec, P < 0.0001), delayed their first defibrillation (67 ± 42 vs. 107 ± 46 sec, P < 0.0001), and made fewer leadership statements (15 ± 5 vs. 21 ± 6, P < 0.0001). Conclusion Hands-on time and time to defibrillation, two performance markers of CPR with proven relevance for medical outcome, are negatively affected by shortcomings in the process of ad-hoc team-building, particularly deficits in leadership. Team-building thus has to be regarded as an additional task imposed on teams forming ad hoc during CPR. All physicians should be aware that early structuring of one's own team is a prerequisite for the timely and effective execution of CPR.
Abstract:
The execution of a project requires resources that are generally scarce. Classical approaches to resource allocation assume that the usage of these resources by an individual project activity is constant during the execution of that activity; in practice, however, the project manager may vary resource usage over time within prescribed bounds. This variation gives rise to the project scheduling problem which consists in allocating the scarce resources to the project activities over time such that the project duration is minimized, the total number of resource units allocated equals the prescribed work content of each activity, and various work-content-related constraints are met. We formulate this problem for the first time as a mixed-integer linear program. Our computational results for a standard test set from the literature indicate that this model outperforms the state-of-the-art solution methods for this problem.
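As a rough illustration of a time-indexed mixed-integer linear formulation of this kind of problem, the sketch below models a toy work-content-constrained scheduling instance with the PuLP library. The activity data, usage bounds and solver choice are assumptions, and the paper's precedence and work-content continuity constraints are omitted; this is not the authors' model.

# Toy time-indexed MILP: allocate a scarce resource to activities over time so
# that each activity's prescribed work content is completed within per-period
# usage bounds and the project duration (makespan) is minimised.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

T = range(8)                    # planning periods
work = {"A": 6, "B": 4}         # work content (resource-units x periods) per activity
lo, up, capacity = 1, 3, 4      # per-period usage bounds and shared resource capacity

m = LpProblem("work_content_scheduling", LpMinimize)
x = {(j, t): LpVariable(f"x_{j}_{t}", cat=LpBinary) for j in work for t in T}   # activity j active in period t
r = {(j, t): LpVariable(f"r_{j}_{t}", lowBound=0) for j in work for t in T}     # resource units allocated
cmax = LpVariable("makespan", lowBound=0)

m += cmax                                              # minimise the project duration
for j, w in work.items():
    m += lpSum(r[j, t] for t in T) == w                # the full work content must be allocated
    for t in T:
        m += r[j, t] >= lo * x[j, t]                   # usage bounds apply while the activity is active
        m += r[j, t] <= up * x[j, t]
        m += cmax >= (t + 1) * x[j, t]                 # makespan covers the last active period
for t in T:
    m += lpSum(r[j, t] for j in work) <= capacity      # scarce shared resource

m.solve(PULP_CBC_CMD(msg=False))
print("minimum project duration:", cmax.value())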
Abstract:
OBJECTIVE Standard stroke CT protocols start with non-enhanced CT (NECT), followed by perfusion CT (PCT), and end with CTA. We aimed to evaluate the influence of the sequence of PCT and CTA on quantitative perfusion parameters, venous contrast enhancement and examination time, in order to save critical time within the therapeutic window in stroke patients. METHODS AND MATERIALS Stroke CT data sets of 85 patients, 47 with CTA before PCT (group A) and 38 with CTA after PCT (group B), were retrospectively analyzed by two experienced neuroradiologists. Parameter maps of cerebral blood flow, cerebral blood volume, time to peak and mean transit time, as well as arterial and venous contrast enhancement, were compared. RESULTS Both readers rated the contrast of brain-supplying arteries as equal in both groups (p=0.55 intracranial; p=0.73 extracranial). Quantitative perfusion parameters did not differ significantly between the groups (all p>0.18), while the extent of venous superimposition of the ICA was rated higher in group B (p=0.04). The time to complete the diagnostic CT examination was significantly shorter for group A (p<0.01). CONCLUSION Performing CTA directly after NECT has no significant effect on PCT parameters, avoids venous preloading in CTA, and significantly shortens the examination time.
Abstract:
The paper analyzes how to comply with an emission constraint, which restricts the use of an established energy technique, given the two options to save energy and to invest in two alternative energy techniques. These techniques differ in their deterioration rates and the investment lags of the corresponding capital stocks. Thus, the paper takes a medium-term perspective on climate change mitigation, where the time horizon is too short for technological change to occur, but long enough for capital stocks to accumulate and deteriorate. It is shown that, in general, only one of the two alternative techniques prevails in the stationary state, although, both techniques might be utilized during the transition phase. Hence, while in a static economy only one technique is efficient, this is not necessarily true in a dynamic economy.
Abstract:
OBJECTIVES The aim of this phantom study was to minimize the radiation dose by finding the best combination of low tube current and low tube voltage that would result in accurate volume measurements, compared to standard CT imaging, without significantly decreasing the sensitivity for detecting lung nodules, both with and without the assistance of CAD. METHODS An anthropomorphic chest phantom containing artificial solid and ground-glass nodules (GGNs, 5-12 mm) was examined with a 64-row multi-detector CT scanner at three tube currents of 100, 50 and 25 mAs in combination with three tube voltages of 120, 100 and 80 kVp. This resulted in eight different protocols that were then compared to the standard CT protocol (100 mAs/120 kVp). For each protocol, at least 127 different nodules were scanned in 21-25 phantoms. The nodules were analyzed in two separate sessions by three independent, blinded radiologists and by computer-aided detection (CAD) software. RESULTS The mean sensitivity of the radiologists for identifying solid lung nodules on standard CT was 89.7% ± 4.9%. The sensitivity was not significantly impaired when the tube current and voltage were lowered at the same time, except at the lowest exposure level of 25 mAs/80 kVp [80.6% ± 4.3% (p = 0.031)]. Compared to standard CT, the sensitivity for detecting GGNs was significantly lower at all dose levels when the voltage was 80 kVp; this result was independent of the tube current. CAD significantly increased the radiologists' sensitivity for detecting solid nodules at all dose levels (by 5-11%). No significant volume measurement errors (VMEs) were documented for the radiologists or the CAD software at any dose level. CONCLUSIONS Our results suggest that a CT protocol with 25 mAs and 100 kVp is optimal for detecting solid and ground-glass nodules in lung cancer screening. The use of CAD software is highly recommended at all dose levels.