28 results for Optimal frame-level timing estimator
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The aim of this study was to assess the potential of monoenergetic computed tomography (CT) images to reduce beam hardening artifacts from dental restorations in comparison to standard CT images on post-mortem dental CT (PMCT). Thirty human decedents (15 male, 58 ± 22 years) with dental restorations were examined using standard single-energy CT (SECT) and dual-energy CT (DECT). DECT data were used to generate monoenergetic CT images, reflecting the X-ray attenuation at energy levels of 64, 69, and 88 keV, and at an individually adjusted optimal energy level called OPTkeV. Artifact reduction and image quality of SECT and monoenergetic CT were assessed objectively and subjectively by two blinded readers. Subjectively, beam hardening artifacts decreased visibly in 28/30 cases after monoenergetic CT reconstruction. Inter- and intra-reader agreement was good (κ = 0.72 and κ = 0.73, respectively). Beam hardening artifacts decreased significantly with increasing monoenergies (repeated-measures ANOVA, p < 0.001). Artifact reduction was greatest on monoenergetic CT images at OPTkeV. Mean OPTkeV was 108 ± 17 keV. OPTkeV yielded the lowest difference between CT numbers of streak artifacts and reference tissues (-163 HU). Monoenergetic CT reconstructions significantly reduce beam hardening artifacts from dental restorations and improve the image quality of post-mortem dental CT.
Abstract:
OBJECTIVE Standard stroke CT protocols start with non-enhanced CT (NECT), followed by perfusion CT (PCT), and end with CT angiography (CTA). We aimed to evaluate the influence of the sequence of PCT and CTA on quantitative perfusion parameters, venous contrast enhancement, and examination time, in order to save critical time in the therapeutic window in stroke patients. METHODS AND MATERIALS Stroke CT data sets of 85 patients, 47 with CTA before PCT (group A) and 38 with CTA after PCT (group B), were retrospectively analyzed by two experienced neuroradiologists. Parameter maps of cerebral blood flow, cerebral blood volume, time to peak, and mean transit time, as well as arterial and venous contrast enhancement, were compared. RESULTS Both readers rated contrast of brain-supplying arteries to be equal in both groups (p=0.55 intracranial, p=0.73 extracranial). Quantitative perfusion parameters did not significantly differ between the groups (all p>0.18), while the extent of venous superimposition of the ICA was rated higher in group B (p=0.04). The time to complete the diagnostic CT examination was significantly shorter for group A (p<0.01). CONCLUSION Performing CTA directly after NECT has no significant effect on PCT parameters and avoids venous preloading in CTA, while examination times are significantly shorter.
Abstract:
BACKGROUND Surgical site infections are the most common hospital-acquired infections among surgical patients. The administration of surgical antimicrobial prophylaxis reduces the risk of surgical site infections. The optimal timing of this procedure is still a matter of debate. While most studies suggest that it should be given as close to the incision time as possible, others conclude that this may be too late for optimal prevention of surgical site infections. A large observational study suggests that surgical antimicrobial prophylaxis should be administered 74 to 30 minutes before surgery. The aim of this article is to report the design and protocol of a randomized controlled trial investigating the optimal timing of surgical antimicrobial prophylaxis. METHODS/DESIGN In this bi-center randomized controlled trial conducted at two tertiary referral centers in Switzerland, we plan to include 5,000 patients undergoing general, oncologic, vascular, and orthopedic trauma procedures. Patients are randomized in a 1:1 ratio into two groups: one receiving surgical antimicrobial prophylaxis in the anesthesia room (75 to 30 minutes before incision) and the other receiving surgical antimicrobial prophylaxis in the operating room (less than 30 minutes before incision). We expect a significantly lower rate of surgical site infections with surgical antimicrobial prophylaxis administered more than 30 minutes before the scheduled incision. The primary outcome is the occurrence of surgical site infections during a 30-day follow-up period (one year with an implant in place). Assuming a 5% surgical site infection risk with administration of surgical antimicrobial prophylaxis in the operating room, the planned sample size has 80% power to detect a relative risk reduction of 33% for surgical site infections when administering surgical antimicrobial prophylaxis in the anesthesia room (with a two-sided type I error of 5%). We expect the study to be completed within three years. DISCUSSION The results of this randomized controlled trial will have an important impact on current international guidelines for infection control strategies in the hospital. Moreover, the results of this randomized controlled trial are of significant interest for patient safety and healthcare economics. TRIAL REGISTRATION This trial is registered on ClinicalTrials.gov under the identifier NCT01790529.
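As a rough check of the power assumption quoted in this protocol (80% power to detect a 33% relative risk reduction from a 5% baseline risk, with 2,500 patients per arm and a two-sided 5% type I error), the sketch below uses a standard normal approximation for comparing two independent proportions. The function name and approximation are illustrative assumptions; this is not the trial statisticians' actual calculation.

```python
# Back-of-the-envelope power check (normal approximation, two proportions).
from scipy.stats import norm

def power_two_proportions(p1, p2, n_per_group, alpha=0.05):
    """Approximate power of a two-sided test comparing two proportions."""
    p_bar = (p1 + p2) / 2
    se0 = (2 * p_bar * (1 - p_bar) / n_per_group) ** 0.5              # SE under H0
    se1 = (p1 * (1 - p1) / n_per_group +
           p2 * (1 - p2) / n_per_group) ** 0.5                        # SE under H1
    z_alpha = norm.ppf(1 - alpha / 2)
    z = (abs(p1 - p2) - z_alpha * se0) / se1
    return norm.cdf(z)

# 5% baseline SSI risk, 33% relative risk reduction, 2,500 patients per arm
print(power_two_proportions(0.05, 0.05 * (1 - 0.33), 2500))           # roughly 0.83
```

Under these assumptions the approximation yields a power slightly above 80%, consistent with the sample size stated in the protocol.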
Abstract:
Many attempts have already been made to detect exomoons around transiting exoplanets, but the first confirmed discovery is still pending. The experience gathered so far allows us to better optimize future space telescopes for this challenge already during the development phase. In this paper we focus on the forthcoming CHaracterising ExOPlanet Satellite (CHEOPS), describing an optimized decision algorithm with step-by-step evaluation, and calculating the number of transits required for an exomoon detection for various planet-moon configurations observable by CHEOPS. We explore the most efficient way for such an observation to minimize the cost in observing time. Our study is based on observations of the photocentric transit timing variation (PTV) in simulated CHEOPS data, but the recipe does not depend on the actual detection method and can be substituted with, e.g., the photodynamical method for later applications. Using state-of-the-art simulations of CHEOPS data, we analyzed transit observation sets for different star-planet-moon configurations and performed a bootstrap analysis to determine their detection statistics. We found that the detection limit is around an Earth-sized moon. In the case of favorable spatial configurations, systems with at least a large moon and a Neptune-sized planet, an 80% detection chance requires at least 5-6 transit observations on average. There is also a nonzero chance for smaller moons, but the detection statistics deteriorate rapidly and the number of necessary transit measurements increases quickly. After the CoRoT and Kepler spacecraft, CHEOPS will be the next dedicated space telescope to observe exoplanetary transits and characterize systems with known Doppler planets. Although it has a smaller aperture than Kepler (the ratio of the mirror diameters is about 1/3) and is mounted with a CCD similar to Kepler's, it will observe brighter stars and operate with a larger sampling rate; therefore, the detection limit for an exomoon can be the same or better, which will make CHEOPS a competitive instrument in the quest for exomoons.
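To make the resampling idea concrete, the toy sketch below estimates how often a moon-induced timing scatter would be recovered from a given number of transits, as a Monte Carlo stand-in for the bootstrap-style detection statistics described in the abstract. The PTV amplitude, timing noise, and the chi-square detection criterion are illustrative assumptions, not the authors' CHEOPS simulation pipeline.

```python
# Toy detection-statistics estimate: how often does the timing scatter from a
# hypothetical moon exceed the pure-noise expectation for N observed transits?
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)

def detection_probability(n_transits, ptv_amplitude, timing_noise,
                          n_trials=10_000, alpha=0.0027):
    """Fraction of simulated observation sets whose timing variance exceeds
    the pure-noise chi-square threshold (roughly a 3-sigma criterion)."""
    threshold = chi2.ppf(1 - alpha, df=n_transits)
    detected = 0
    for _ in range(n_trials):
        phases = rng.uniform(0, 2 * np.pi, n_transits)
        residuals = (ptv_amplitude * np.sin(phases)
                     + rng.normal(0, timing_noise, n_transits))
        statistic = np.sum((residuals / timing_noise) ** 2)
        detected += statistic > threshold
    return detected / n_trials

# Detection chance versus number of transits (units of amplitude/noise arbitrary)
for n in (3, 4, 5, 6, 8):
    print(n, detection_probability(n, ptv_amplitude=30.0, timing_noise=20.0))
```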
Abstract:
Objective: To compare clinical outcomes after laparoscopic cholecystectomy (LC) for acute cholecystitis performed at various time-points after hospital admission. Background: Symptomatic gallstones represent an important public health problem, with LC the treatment of choice. LC is increasingly offered for acute cholecystitis; however, the optimal time-point for LC in this setting remains a matter of debate. Methods: The analysis was based on the prospective database of the Swiss Association of Laparoscopic and Thoracoscopic Surgery and included patients undergoing emergency LC for acute cholecystitis between 1995 and 2006, grouped according to the time-point of LC after hospital admission (admission day (d0), d1, d2, d3, d4/5, d ≥6). Linear and generalized linear regression models assessed the effect of the timing of LC on intra- and postoperative complications, conversion and reoperation rates, and length of postoperative hospital stay. Results: Of 4113 patients, 52.8% were female; median age was 59.8 years. Delaying LC resulted in significantly higher conversion rates (from 11.9% at d0 to 27.9% at d ≥6 after admission, P < 0.001), surgical postoperative complications (5.7% to 13%, P < 0.001), and reoperation rates (0.9% to 3%, P = 0.007), with a significantly longer postoperative hospital stay (P < 0.001). Conclusions: Delaying LC for acute cholecystitis has no advantages, resulting in significantly increased conversion and reoperation rates, more postoperative complications, and a longer postoperative hospital stay. This investigation, one of the largest in the literature, provides compelling evidence that acute cholecystitis merits surgery within 48 hours of hospital admission if the impact on the patient and the health care system is to be minimized.
Abstract:
The objective of this study was to determine the optimal time interval for a repeated Chlamydia trachomatis (chlamydia) test.
Abstract:
Decompressive craniectomy (DC) due to intractably elevated intracranial pressure mandates later cranioplasty (CP). However, the optimal timing of CP remains controversial. We therefore analyzed our prospectively conducted database concerning the timing of CP and associated post-operative complications. From October 1999 to August 2011, 280 cranioplasty procedures were performed at the authors' institution. Patients were stratified into two groups according to the time from DC to cranioplasty (early, ≤2 months, and late, >2 months). Patient characteristics, timing of CP, and CP-related complications were analyzed. Overall CP was performed early in 19% and late in 81%. The overall complication rate was 16.4%. Complications after CP included epidural or subdural hematoma (6%), wound healing disturbance (5.7%), abscess (1.4%), hygroma (1.1%), cerebrospinal fluid fistula (1.1%), and other (1.1%). Patients who underwent early CP suffered significantly more often from complications compared to patients who underwent late CP (25.9% versus 14.2%; p=0.04). Patients with ventriculoperitoneal (VP) shunt had a significantly higher rate of complications after CP compared to patients without VP shunt (p=0.007). On multivariate analysis, early CP, the presence of a VP shunt, and intracerebral hemorrhage as underlying pathology for DC, were significant predictors of post-operative complications after CP. We provide detailed data on surgical timing and complications for cranioplasty after DC. The present data suggest that patients who undergo late CP might benefit from a lower complication rate. This might influence future surgical decision making regarding optimal timing of cranioplasty.
Abstract:
BACKGROUND: The deletion of three adjacent nucleotides in an exon may cause the lack of a single amino acid, while the protein sequence remains otherwise unchanged. Only one such in-frame deletion is known in the two RH genes, represented by the RHCE allele ceBP expressing a "very weak e antigen." STUDY DESIGN AND METHODS: Blood donor samples were recognized because of discrepant results of D phenotyping. Six samples came from Switzerland and one from Northern Germany. The molecular structures were determined by genomic DNA nucleotide sequencing of RHD. RESULTS: Two different variant D antigens were explained by RHD alleles harboring one in-frame triplet deletion each. Both single-amino-acid deletions led to partial D phenotypes with weak D antigen expression. Because of their D category V-like phenotypes, the RHD(Arg229del) allele was dubbed DVL-1 and the RHD(Lys235del) allele DVL-2. These in-frame triplet deletions are located in GAGAA or GAAGA repeats of the RHD exon 5. CONCLUSION: Partial D may be caused by a single-amino-acid deletion in RhD. The altered RhD protein segments in DVL types are adjacent to the extracellular loop 4, which constitutes one of the most immunogenic parts of the D antigen. These RhD protein segments are also altered in all DV, which may explain the similarity in phenotype. At the nucleotide level, the triplet deletions may have resulted from replication slippage. A total of nine amino acid positions in an Rhesus protein may be affected by this mechanism.
Abstract:
Early prenatal diagnosis and in utero therapy of certain fetal diseases have the potential to reduce fetal morbidity and mortality. The intrauterine transplantation of stem cells provides in some instances a therapeutic option before definitive organ failure occurs. Clinical experiences show that certain diseases, such as immune deficiencies or inborn errors of metabolism, can be successfully treated using stem cells derived from bone marrow. However, a remaining problem is the low level of engraftment that can be achieved. Efforts are made in animal models to optimise the graft and study the recipient's microenvironment to increase long-term engraftment levels. Our experiments in mice show similar early homing of allogeneic and xenogeneic stem cells and reasonable early engraftment of allogeneic murine fetal liver cells (17.1% donor cells in peripheral blood 4 weeks after transplantation), whereas xenogeneic HSC are rapidly diminished due to missing self-renewal and low differentiation capacities in the host's microenvironment. Allogeneic murine fetal liver cells have very good long-term engraftment (49.9% donor cells in peripheral blood 16 weeks after transplantation). Compared to the rodents, the sheep model has the advantage of body size and gestation comparable to the human fetus. Here, ultrasound-guided injection techniques significantly decreased fetal loss rates. In contrast to the murine in utero model, the repopulation capacities of allogeneic ovine fetal liver cells are lower (0.112% donor cells in peripheral blood 3 weeks after transplantation). The effect of MHC on engraftment levels seems to be marginal, since no differences could be observed between autologous and allogeneic transplantation (0.117% donor cells vs 0.112% donor cells in peripheral blood 1 to 2 weeks after transplantation). Further research is needed to study optimal timing and graft composition as well as immunological aspects of in utero transplantation.
Abstract:
Adaptation does not necessarily lead to traits which are optimal for the population. This is because selection is often the strongest at the individual or gene level. The evolution of selfishness can lead to a 'tragedy of the commons', where traits such as aggression or social cheating reduce population size and may lead to extinction. This suggests that species-level selection will result whenever species differ in the incentive to be selfish. We explore this idea in a simple model that combines individual-level selection with ecology in two interacting species. Our model is not influenced by kin or trait-group selection. We find that individual selection in combination with competitive exclusion greatly increases the likelihood that selfish species go extinct. A simple example of this would be a vertebrate species that invests heavily into squabbles over breeding sites, which is then excluded by a species that invests more into direct reproduction. A multispecies simulation shows that these extinctions result in communities containing species that are much less selfish. Our results suggest that species-level selection and community dynamics play an important role in regulating the intensity of conflicts in natural populations.
Abstract:
OBJECTIVE: To obtain precise information on the optimal time window for surgical antimicrobial prophylaxis. SUMMARY BACKGROUND DATA: Although perioperative antimicrobial prophylaxis is a well-established strategy for reducing the risk of surgical site infections (SSI), the optimal timing for this procedure has yet to be precisely determined. Under today's recommendations, antibiotics may be administered within the final 2 hours before skin incision, ideally as close to incision time as possible. METHODS: In this prospective observational cohort study at Basel University Hospital we analyzed the incidence of SSI by the timing of antimicrobial prophylaxis in a consecutive series of 3836 surgical procedures. Surgical wounds and resulting infections were assessed according to Centers for Disease Control and Prevention standards. Antimicrobial prophylaxis consisted of a single-shot administration of 1.5 g of cefuroxime (plus 500 mg of metronidazole in colorectal surgery). RESULTS: The overall SSI rate was 4.7% (180 of 3836). In 49% of all procedures antimicrobial prophylaxis was administered within the final half hour before incision. Multivariable logistic regression analyses showed a significant increase in the odds of SSI when antimicrobial prophylaxis was administered less than 30 minutes before incision (crude odds ratio = 2.01; adjusted odds ratio = 1.95; 95% confidence interval, 1.4-2.8; P < 0.001) or 120 to 60 minutes before incision (crude odds ratio = 1.75; adjusted odds ratio = 1.74; 95% confidence interval, 1.0-2.9; P = 0.035), as compared with the reference interval of 59 to 30 minutes before incision. CONCLUSIONS: When cefuroxime is used as a prophylactic antibiotic, administration 59 to 30 minutes before incision is more effective than administration during the last half hour.
Abstract:
OBJECTIVE: In search of an optimal compression therapy for venous leg ulcers, a systematic review and meta-analysis was performed of randomized controlled trials (RCT) comparing compression systems based on stockings (MCS) with divers bandages. METHODS: RCT were retrieved from six sources and reviewed independently. The primary endpoint, completion of healing within a defined time frame, and the secondary endpoints, time to healing and pain, were entered into a meta-analysis using the tools of the Cochrane Collaboration. Additional subjective endpoints were summarized. RESULTS: Eight RCT (published 1985-2008) fulfilled the predefined criteria. Data presentation was adequate and showed moderate heterogeneity. The studies included 692 patients (21-178 per study, mean age 61 years, 56% women). A total of 688 ulcerated legs were analyzed, present for 1 week to 9 years and measuring 1 to 210 cm². The observation period ranged from 12 to 78 weeks. Patient and ulcer characteristics were evenly distributed in three studies, favored the stocking groups in four, and favored the bandage group in one. Data on the pressure exerted by stockings and bandages were reported in seven and two studies, amounting to 31-56 and 27-49 mm Hg, respectively. The proportion of ulcers healed was greater with stockings than with bandages (62.7% vs 46.6%; P < .00001). The average time to healing (seven studies, 535 patients) was 3 weeks shorter with stockings (P = .0002). In no study did bandages perform better than MCS. Pain was assessed in three studies (219 patients), revealing an important advantage of stockings (P < .0001). Other subjective parameters and issues of nursing also revealed an advantage of MCS. CONCLUSIONS: Leg compression with stockings is clearly better than compression with bandages, has a positive impact on pain, and is easier to use.
Abstract:
BACKGROUND Timing is critical for efficient hepatitis A vaccination in high-endemicity areas, as high levels of maternal IgG antibodies against the hepatitis A virus (HAV) present in the first year of life may impede the vaccine response. OBJECTIVES To describe the kinetics of the decline of anti-HAV maternal antibodies, and to estimate the time of complete loss of maternal antibodies in infants in León, Nicaragua, a region in which almost all mothers are anti-HAV seropositive. METHODS We collected cord blood samples from 99 healthy newborns together with 49 corresponding maternal blood samples, as well as further blood samples at 2 and 7 months of age. Anti-HAV IgG antibody levels were measured by enzyme immunoassay (EIA). We predicted the time when antibodies would fall below 10 mIU/ml, the presumed lowest level of seroprotection. RESULTS Seroprevalence was 100% at birth (GMC 8392 mIU/ml); maternal and cord blood antibody concentrations were similar. The maternal antibody levels of the infants decreased exponentially with age, and the half-life of the maternal antibody was estimated to be 40 days. The relationship between the antibody concentration at birth and the time until full waning was described as: critical age (months) = 3.355 + 1.969 × log10(antibody level at birth). The survival model estimated that loss of passive immunity will have occurred in 95% of infants by the age of 13.2 months. CONCLUSIONS Complete waning of maternal anti-HAV antibodies may take until early in the second year of life. The formula derived here, relating maternal or cord blood antibody concentrations to the age at which passive immunity is lost, may be used to determine the optimal age for childhood HAV vaccination.
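A worked example of the waning formula quoted in this abstract is sketched below: it maps the anti-HAV antibody concentration at birth (in mIU/ml) to the age at which passive immunity is expected to drop below the 10 mIU/ml threshold. The geometric mean concentration of 8392 mIU/ml comes from the abstract; the second input value is purely illustrative.

```python
# Predicted age at loss of maternal anti-HAV antibodies, per the abstract's formula.
import math

def critical_age_months(antibody_at_birth_miu_ml):
    """Age (months) at which maternal anti-HAV antibodies are predicted to
    fall below the presumed seroprotection threshold of 10 mIU/ml."""
    return 3.355 + 1.969 * math.log10(antibody_at_birth_miu_ml)

print(critical_age_months(8392))   # ~11.1 months for the cohort's GMC at birth
print(critical_age_months(1000))   # ~9.3 months for a lower (illustrative) starting titer
```

For the cohort's geometric mean concentration the formula predicts waning around 11 months of age, consistent with the survival-model estimate that 95% of infants lose passive immunity by 13.2 months.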
Abstract:
OBJECTIVES Evidence is increasing that cognitive failure may be used to screen for drivers at risk. Until now, most studies have relied on driving learners. This exploratory pilot study examines self-reported cognitive failure in driving beginners and errors during real driving as observed by driving instructors. METHODS Forty-two driving learners of 14 driving instructors filled out a work-related cognitive failure questionnaire. Driving instructors observed driving errors during the next driving lesson. In a multiple linear regression analysis, driving errors were regressed on cognitive failure, with the number of driving lessons controlled as an estimator of driving experience. RESULTS Higher cognitive failure predicted more driving errors (p < .01) when age, gender, and driving experience were controlled for in the analysis. CONCLUSIONS Cognitive failure was significantly associated with observed driving errors. Systematic research on cognitive failure in driving beginners is recommended.
Abstract:
Wireless Mesh Networks (WMNs) are increasingly deployed to enable thousands of users to share, create, and access live video streaming with different characteristics and content, such as video surveillance and football matches. In this context, there is a need for new mechanisms for assessing the quality level of videos, because operators are seeking to control their delivery process and optimize their network resources while increasing user satisfaction. However, the development of in-service and non-intrusive Quality of Experience assessment schemes for real-time Internet videos with different complexity and motion levels, Group of Pictures (GOP) lengths, and characteristics remains a significant challenge. To address this issue, this article proposes a non-intrusive parametric real-time video quality estimator, called MultiQoE, that correlates wireless network impairments, video characteristics, and users' perception into a predicted Mean Opinion Score (MOS). An instance of MultiQoE was implemented in WMNs, and performance evaluation results demonstrate the efficiency and accuracy of MultiQoE in predicting the user's perception of live video streaming services when compared to subjective, objective, and well-known parametric solutions.
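To illustrate the general shape of a parametric, non-intrusive MOS predictor of the kind described in this abstract, a minimal sketch is shown below. The inputs, coefficients, and functional form are invented for illustration; they are not the published MultiQoE model.

```python
# Illustrative parametric MOS predictor: map network impairments and video
# characteristics directly to a 1..5 Mean Opinion Score, without decoding the video.
from dataclasses import dataclass

@dataclass
class VideoProfile:
    motion_level: float   # 0 (static) .. 1 (high motion), assumed known per content class
    gop_length: int       # Group of Pictures length

def predict_mos(packet_loss: float, jitter_ms: float, profile: VideoProfile) -> float:
    """Predict a 1..5 MOS from packet loss ratio, jitter, and video characteristics."""
    # Losses are assumed to hurt more for high-motion content and longer GOPs
    # (error propagation until the next intra frame); coefficients are arbitrary.
    loss_penalty = 12.0 * packet_loss * (0.5 + profile.motion_level) * (profile.gop_length / 30)
    jitter_penalty = 0.02 * jitter_ms
    mos = 4.5 - loss_penalty - jitter_penalty
    return max(1.0, min(5.0, mos))

# Example: 2% loss and 15 ms jitter on a high-motion, GOP-30 stream
print(predict_mos(packet_loss=0.02, jitter_ms=15, profile=VideoProfile(0.8, 30)))
```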