925 results for Optimal allocation of voltage regulators and capacitor
Abstract:
In order to identify optimal therapy for children with bacterial pneumonia, Pakistan's ARI Program, in collaboration with the National Institute of Health (NIH), Islamabad, undertook a national surveillance of antimicrobial resistance in S. pneumoniae and H. influenzae. The project was carried out at selected urban and peripheral sites in 6 different regions of Pakistan in 1991–92. Nasopharyngeal (NP) specimens and blood cultures were obtained from children with pneumonia diagnosed in the outpatient clinics of participating facilities. Organisms were isolated by local hospital laboratories and sent to the NIH for confirmation, serotyping and antimicrobial susceptibility testing. The aims of the study were: (i) to determine the antimicrobial resistance patterns of S. pneumoniae and H. influenzae in children aged 2–59 months; (ii) to determine the ability of selected laboratories to identify and effectively transport isolates of S. pneumoniae and H. influenzae cultured from nasopharyngeal and blood specimens; (iii) to validate the comparability of resistance patterns for nasopharyngeal and blood isolates of S. pneumoniae and H. influenzae from children with pneumonia; and (iv) to examine the effect of drug resistance and laboratory error on the cost of effectively treating children with ARI. A total of 1293 children with ARI were included in the study: 969 (75%) from urban areas and 324 (25%) from rural parts of the country. Of the 1293 children, 786 (61%) were male and 507 (39%) were female. The resistance rates of S. pneumoniae to various antibiotics among urban children with ARI were: TMP/SMX (62%); chloramphenicol (23%); penicillin (5%); tetracycline (16%); and ampicillin/amoxicillin (0%). The rates of resistance of H. influenzae were higher than those of S. pneumoniae: TMP/SMX (85%); chloramphenicol (62%); penicillin (59%); ampicillin/amoxicillin (46%); and tetracycline (100%). Rates of resistance to each antimicrobial agent were similar among isolates from rural children. Of a total of 614 specimens tested for antimicrobial susceptibility, 432 (70.4%) were resistant to TMP/SMX and 93 (15.2%) were resistant to antimicrobial agents other than TMP/SMX, viz. ampicillin/amoxicillin, chloramphenicol, penicillin, and tetracycline. The sensitivity and positive predictive value of peripheral laboratories for H. influenzae were 99% and 65%, respectively. Similarly, the sensitivity and positive predictive value of peripheral laboratory tests compared with the gold standard, i.e. the NIH laboratory, for S. pneumoniae were 99% and 54%, respectively. The sensitivity and positive predictive value of nasopharyngeal specimens compared with blood cultures (gold standard), isolated by the peripheral laboratories, were 88% and 11% for H. influenzae and 92% and 39% for S. pneumoniae, respectively. (Abstract shortened by UMI.)
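The laboratory validation figures above follow from the standard 2x2 comparison against a gold standard. Below is a minimal sketch of that calculation; the counts are hypothetical placeholders chosen only to reproduce the reported 99% sensitivity and 54% PPV for S. pneumoniae, not the study's raw data.

```python
# Minimal sketch of how sensitivity and positive predictive value (PPV) are
# derived from a 2x2 comparison against a gold standard (here, the NIH
# laboratory). The counts below are illustrative placeholders, not the
# study's raw data.

def sensitivity_ppv(true_pos: int, false_pos: int, false_neg: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); PPV = TP / (TP + FP)."""
    sensitivity = true_pos / (true_pos + false_neg)
    ppv = true_pos / (true_pos + false_pos)
    return sensitivity, ppv

# Example: hypothetical peripheral-laboratory counts for S. pneumoniae.
sens, ppv = sensitivity_ppv(true_pos=108, false_pos=92, false_neg=1)
print(f"sensitivity = {sens:.0%}, PPV = {ppv:.0%}")  # ~99% and ~54%
```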
Abstract:
BACKGROUND Bouveret's syndrome causes gastric outlet obstruction when a gallstone becomes impacted in the duodenum or stomach via a bilioenteric fistula. It is a rare condition that causes significant morbidity and mortality and often occurs in elderly patients with significant comorbidities. Individual diagnostic and treatment strategies are required for optimal management and outcome. The purpose of this paper is to develop a surgical strategy for optimized individual treatment of Bouveret's syndrome based on the available literature and motivated by our own experience. CASE PRESENTATION Two cases of Bouveret's syndrome are presented with individual management and restrictive surgical approaches tailored to the condition of the patients and the intraoperative findings. CONCLUSIONS Improved diagnostics and restrictive, individualized surgical approaches have been shown to lower the mortality rates of Bouveret's syndrome. For an optimal outcome in the individual patient, the medical and perioperative management and the timing of surgery are tailored to the patient's condition. A CT scan is most often required to secure the diagnosis. The surgical approach includes enterolithotomy alone or in combination with simultaneous or subsequent cholecystectomy and fistula repair. Lower overall morbidity and mortality favor restrictive surgical approaches. The surgical strategy is adapted to the intraoperative findings and to the risk of secondary complications weighed against the age and comorbidities of the patient.
Abstract:
BACKGROUND Surgical site infections are the most common hospital-acquired infections among surgical patients. The administration of surgical antimicrobial prophylaxis reduces the risk of surgical site infections. The optimal timing of this procedure is still a matter of debate. While most studies suggest that it should be given as close to the incision time as possible, others conclude that this may be too late for optimal prevention of surgical site infections. A large observational study suggests that surgical antimicrobial prophylaxis should be administered 74 to 30 minutes before surgery. The aim of this article is to report the design and protocol of a randomized controlled trial investigating the optimal timing of surgical antimicrobial prophylaxis. METHODS/DESIGN In this bi-center randomized controlled trial conducted at two tertiary referral centers in Switzerland, we plan to include 5,000 patients undergoing general, oncologic, vascular and orthopedic trauma procedures. Patients are randomized in a 1:1 ratio into two groups: one receiving surgical antimicrobial prophylaxis in the anesthesia room (75 to 30 minutes before incision) and the other receiving surgical antimicrobial prophylaxis in the operating room (less than 30 minutes before incision). We expect a significantly lower rate of surgical site infections with surgical antimicrobial prophylaxis administered more than 30 minutes before the scheduled incision. The primary outcome is the occurrence of surgical site infections during a 30-day follow-up period (one year with an implant in place). Assuming a 5% surgical site infection risk with administration of surgical antimicrobial prophylaxis in the operating room, the planned sample size has 80% power to detect a relative risk reduction for surgical site infections of 33% when administering surgical antimicrobial prophylaxis in the anesthesia room (with a two-sided type I error of 5%). We expect the study to be completed within three years. DISCUSSION The results of this randomized controlled trial will have an important impact on current international guidelines for infection control strategies in the hospital. Moreover, the results of this randomized controlled trial are of significant interest for patient safety and healthcare economics. TRIAL REGISTRATION This trial is registered on ClinicalTrials.gov under the identifier NCT01790529.
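The power statement above corresponds to the textbook sample-size formula for comparing two proportions. A rough sketch of that calculation is shown below, assuming a 5% baseline risk, a 33% relative risk reduction, 80% power and a two-sided alpha of 5%; this is an illustration of the standard formula, not the trial's own statistical code.

```python
# Sketch of the standard two-proportion sample-size calculation behind the
# protocol's power statement. Illustrative only; not the trial's own code.
from scipy.stats import norm

def n_per_group(p1: float, rrr: float, alpha: float = 0.05, power: float = 0.80) -> float:
    p2 = p1 * (1 - rrr)                 # event risk in the intervention group
    p_bar = (p1 + p2) / 2               # pooled risk under the null hypothesis
    z_a = norm.ppf(1 - alpha / 2)       # two-sided critical value
    z_b = norm.ppf(power)
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return num / (p1 - p2) ** 2

n = n_per_group(p1=0.05, rrr=0.33)
print(round(n), "patients per group,", 2 * round(n), "in total")  # roughly 2,300 / 4,600
```

The planned enrolment of 5,000 patients is consistent with this order of magnitude once allowance is made for drop-outs and protocol deviations.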
Abstract:
The goal of the current investigation was to compare two monitoring processes (judgments of learning [JOLs] and confidence judgments [CJs]) and their corresponding control processes (allocation of study time and selection of answers to maximize accuracy, respectively) in 5- to 7-year-old children (N=101). Children learned the meaning of Japanese characters and provided JOLs after a study phase and CJs after a memory test. They were given the opportunity to control their learning in self-paced study phases, and to control their accuracy by placing correct answers into a treasure chest and incorrect answers into a trash can. All three age groups gave significantly higher CJs for correct than for incorrect answers, with no age-related differences in the magnitude of this effect, suggesting robust metacognitive monitoring skills in children as young as 5. Furthermore, a link between JOLs and study time was found in the 6- and 7-year-olds, such that children spent more time studying items with low JOLs compared to items with high JOLs. Also, 6- and 7-year-olds but not 5-year-olds spent more time studying difficult items compared to easier items. Moreover, age-related improvements were found in children's use of CJs to guide their selection of answers: although children as young as 5 placed their most confident answers in the treasure chest and least confident answers in the trash can, this pattern was more robust in older children. Overall, results support the view that some metacognitive judgments may be acted upon with greater ease than others among young children.
Abstract:
BACKGROUND Vitamin D deficiency is prevalent in HIV-infected individuals, and vitamin D supplementation is proposed as part of standard care. This study aimed to characterize the kinetics of 25(OH)D in a cohort of HIV-infected individuals of European ancestry, to better define the influence of genetic and non-genetic factors on 25(OH)D levels. These data were used to optimize vitamin D supplementation in order to reach therapeutic targets. METHODS 1,397 25(OH)D plasma levels and relevant clinical information were collected in 664 participants during routine medical follow-up visits. Participants were genotyped for 7 SNPs in 4 genes known to be associated with 25(OH)D levels. 25(OH)D concentrations were analyzed using a population pharmacokinetic approach. The percentage of individuals with 25(OH)D concentrations within the recommended range of 20–40 ng/ml during 12 months of follow-up was evaluated by simulation for several dosage regimens. RESULTS A one-compartment model with linear absorption and elimination was used to describe 25(OH)D pharmacokinetics, while integrating endogenous baseline plasma concentrations. Covariate analyses confirmed the effect of seasonality, body mass index, smoking habits, the analytical method, darunavir/r and the genetic variant in GC (rs2282679) on 25(OH)D concentrations. 11% of the interindividual variability in 25(OH)D levels was explained by seasonality and other non-genetic covariates and 1% by genetics. The optimal supplementation for severely vitamin D-deficient patients was 300,000 IU twice per year. CONCLUSIONS This analysis identified factors associated with 25(OH)D plasma levels in HIV-infected individuals. Improvements to the dosage regimen and timing of vitamin D supplementation are proposed based on these results.
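For readers unfamiliar with the structural model named above, the sketch below writes out a generic one-compartment model with first-order (linear) absorption and elimination superimposed on an endogenous baseline concentration. All parameter values and units are arbitrary placeholders, not estimates from this analysis.

```python
# Generic one-compartment, first-order absorption/elimination model with an
# endogenous baseline (Bateman equation plus a constant). Placeholder values.
import numpy as np

def conc_25ohd(t_h: np.ndarray, dose: float, ka: float, ke: float,
               v: float, baseline: float, f: float = 1.0) -> np.ndarray:
    """Concentration-time profile after a single oral dose, on top of the
    endogenous baseline level."""
    bateman = (f * dose * ka) / (v * (ka - ke)) * (np.exp(-ke * t_h) - np.exp(-ka * t_h))
    return baseline + bateman

t = np.linspace(0, 24 * 90, 200)  # 90 days, in hours
profile = conc_25ohd(t, dose=300_000, ka=0.2, ke=0.001, v=5_000, baseline=15.0)
```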
Abstract:
Ion channel proteins are regulated by different types of posttranslational modifications. The focus of this review is the regulation of voltage-gated sodium channels (Navs) by ubiquitylation. The amiloride-sensitive epithelial sodium channel (ENaC) was the first ion channel shown to be regulated by ubiquitylation. This modification results from the binding of ubiquitin ligases of the Nedd4 family to a protein-protein interaction domain, known as the PY motif, in the ENaC subunits. Many of the Navs have similar PY motifs, which have been demonstrated to be targets of Nedd4-dependent ubiquitylation, tagging them for internalization from the cell surface. The role of Nedd4-dependent regulation of Nav membrane density in physiology and disease remains poorly understood. Two recent studies have provided evidence that Nedd4-2 is downregulated in dorsal root ganglion (DRG) neurons in both rat and mouse models of nerve injury-induced neuropathic pain. Using two different mouse models, one with a specific knockout of Nedd4-2 in sensory neurons and another in which Nedd4-2 was overexpressed with the use of viral vectors, it was demonstrated that the neuropathy-linked neuronal hyperexcitability was the result of Nav1.7 and Nav1.8 overexpression due to Nedd4-2 downregulation. These studies provided the first in vivo evidence of the role of Nedd4-2-dependent regulation of Nav channels in a disease state. This ubiquitylation pathway may be involved in the development of symptoms and diseases linked to Nav-dependent hyperexcitability, such as pain, cardiac arrhythmias, epilepsy, migraine, and myotonias.
Abstract:
BACKGROUND AND AIM So far there is little evidence from randomised clinical trials (RCTs) or systematic reviews on the preferred or best number of implants to be used for the support of a fixed prosthesis in the edentulous maxilla or mandible, and no consensus has been reached. Therefore, we reviewed articles published in the past 30 years that reported on treatment outcomes for implant-supported fixed prostheses, including survival of implants and survival of prostheses after a minimum observation period of 1 year. MATERIAL AND METHODS MEDLINE and EMBASE were searched to identify eligible studies. Short- and long-term clinical studies with prospective and retrospective designs were included to determine whether relevant information could be obtained on the number of implants in relation to the prosthetic technique. Articles reporting on implant placement combined with advanced surgical techniques such as sinus floor elevation (SFE) or extensive grafting were excluded. Two reviewers extracted the data independently. RESULTS The primary search was narrowed down to 222 articles. Of these, 29 studies comprising 26 datasets fulfilled the inclusion criteria. The number of planned and placed implants was available from all studies. With two exceptions, no RCTs were found, and these two studies did not compare different numbers of implants per prosthesis. Eight studies were retrospective; all the others were prospective. Fourteen studies calculated cumulative survival rates for 5 or more years. From these data, the average survival rate was between 90% and 100%. The analysis of the selected articles revealed a clear tendency to plan 4 to 6 implants per prosthesis. For supporting a cross-arch fixed prosthesis in the maxilla, the variation is slightly greater. CONCLUSIONS In spite of a dispersion of results, similar outcomes are reported with regard to survival and number of implants per jaw. Since the 1990s, it has been shown that there is no need to place as many implants as possible in the available jawbone. The overwhelming majority of articles dealing with standard surgical procedures to rehabilitate edentulous jaws use 4 to 6 implants.
Abstract:
The primary isolation of a Mycobacterium sp. of the Mycobacterium tuberculosis complex from an infected animal provides a definitive diagnosis of tuberculosis. However, as Mycobacterium bovis and Mycobacterium caprae are difficult to isolate, particularly from animals in the early stages of disease, success is dependent on the optimal performance of all aspects of the bacteriological process, from the initial choice of tissue samples at post-mortem examination or of clinical samples, to the type of media and conditions used to cultivate the microorganism. Each step has its own performance characteristics, which contribute to the sensitivity and specificity of the procedure, and may need to be optimized in order to achieve the gold standard diagnosis. Having isolated the slow-growing mycobacteria, species identification and fine-resolution strain typing are key to understanding the epidemiology of the disease and to devising strategies to limit transmission of infection. New technologies have emerged that can now even discriminate different isolates from the same animal. In this review, we highlight the key factors that contribute to the accuracy of bacteriological diagnosis of M. bovis and M. caprae, and describe the development of advanced genotyping techniques that are increasingly used in diagnostic laboratories to support detailed epidemiological investigations.
Abstract:
Accumulating recent evidence has identified the ribosome as a binding target for numerous small and long non-protein-coding RNAs (ncRNAs) in organisms from all 3 domains of life. It therefore appears that ribosome-associated ncRNAs (rancRNAs) are a prevalent, yet poorly understood class of cellular transcripts. Since rancRNAs are associated with arguably the most central enzyme of the cell, it seems plausible to propose a role in translation control. Indeed, the first experimental evidence on small rancRNAs has been presented, linking ribosome association with fine-tuning of the rate of protein biosynthesis in a stress-dependent manner.
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20% or 40% of patients in seven cohorts of patients starting ART in South Africa, and plotted cut-offs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia and the Asia-Pacific. FINDINGS 31,450 adult patients were included in the derivation and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African, from 64% to 93% in the Zambian and from 73% to 96% in the Asia-Pacific cohorts. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia and from 37% to 71% in the Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia and from 0.77 to 0.92 in the Asia-Pacific. INTERPRETATION CD4-based risk charts with optimal cut-offs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
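The targeted-testing idea behind the risk charts can be made concrete with a small simulation: rank patients by a modeled probability of virologic failure, refer the top 10%, 20% or 40% for VL testing, and compute the sensitivity and PPV of that rule. The sketch below uses simulated data, not the study cohorts, and is only meant to illustrate the evaluation logic.

```python
# Illustrative simulation of targeted VL testing guided by a modeled risk of
# virologic failure. Data are simulated placeholders, not the cohorts' data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
risk = rng.beta(2, 8, size=n)            # modeled probability of failure
failed = rng.random(n) < risk            # simulated true outcome

for frac in (0.10, 0.20, 0.40):
    cutoff = np.quantile(risk, 1 - frac) # risk-chart cut-off for this budget
    tested = risk >= cutoff
    tp = np.sum(tested & failed)
    sens = tp / failed.sum()
    ppv = tp / tested.sum()
    print(f"test top {frac:.0%}: sensitivity {sens:.0%}, PPV {ppv:.0%}")
```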
Abstract:
In this work, we propose a novel network coding enabled NDN architecture for the delivery of scalable video. Our scheme utilizes network coding in order to address the problem that arises in the original NDN protocol, where optimal use of the bandwidth and caching resources necessitates coordination of the forwarding decisions. To optimize the performance of the proposed network coding based NDN protocol and render it appropriate for the transmission of scalable video, we devise a novel rate allocation algorithm that decides on the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects maximizes the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for the transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to optimal.
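For context, the sketch below shows a generic, textbook Bloom filter of the kind the abstract refers to for compactly recording which Interest messages and Data objects a node has seen. It is not the paper's implementation; the bit-array size, hash count and example names are placeholders.

```python
# Generic Bloom filter: a compact probabilistic set with no false negatives
# and a tunable false-positive rate. Placeholder sizes and example names.
import hashlib

class BloomFilter:
    def __init__(self, num_bits: int = 1 << 16, num_hashes: int = 4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item: str):
        # Derive k bit positions from salted SHA-256 digests of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

seen = BloomFilter()
seen.add("/video/layer0/segment42")        # e.g. an Interest name
print("/video/layer0/segment42" in seen)   # True (no false negatives)
print("/video/layer1/segment7" in seen)    # False with high probability
```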
Abstract:
Small non-protein-coding RNA (ncRNA) molecules are major contributors to regulatory networks, controlling gene expression in a highly efficient manner. Most of the recently discovered regulatory ncRNAs acting on translation target the mRNA rather than the ribosome (e.g. miRNAs, siRNAs, antisense RNAs). To address the question of whether ncRNA regulators exist that are capable of modulating the rate of protein production by directly interacting with the ribosome, we have analyzed the small ncRNA interactomes of ribosomes. Deep-sequencing analyses revealed thousands of putative rancRNAs in various model organisms (1,2). For a subset of these ncRNA candidates we have gathered experimental evidence that they associate with ribosomes in a stress-dependent manner and fine-tune the rate of protein biosynthesis (3,4). Many of the investigated rancRNAs appear to be processing products of larger functional RNAs, such as tRNAs (2,3), mRNAs (3), or snoRNAs (2). Post-transcriptional cleavage of RNA to generate smaller fragments is a widespread mechanism that enlarges the structural and functional complexity of cellular RNomes. Our data disclose the ribosome as a target for small regulatory RNAs. rancRNAs are found in all domains of life and represent a prevalent but so far largely unexplored class of regulatory molecules (5). Ongoing work in our lab has provided first insights into rancRNA processing and the mechanism of this emerging class of translation regulators.
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in 7 cohorts of patients starting ART in South Africa, and plotted cutoffs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. RESULTS In total, 31,450 adult patients were included in the derivation and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African cohort, from 64% to 93% in the Zambian cohort, and from 73% to 96% in the Asia-Pacific cohort. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in Asia-Pacific. CONCLUSIONS CD4-based risk charts with optimal cutoffs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
Abstract:
PURPOSE OF REVIEW Fever and neutropenia is the most common complication in the treatment of childhood cancer. This review summarizes recent publications that focus on improving the management of this condition as well as those that seek to optimize translational research efforts. RECENT FINDINGS A number of clinical decision rules are available to assist in the identification of low-risk fever and neutropenia; however, few have undergone external validation and formal impact analysis. Emerging evidence suggests that acute fever and neutropenia management strategies should include time-to-antibiotic recommendations, and quality improvement initiatives have focused on eliminating barriers to early antibiotic administration. Despite reported increases in antimicrobial resistance, few studies have focused on the prediction, prevention, and optimal treatment of these infections, and their effect on risk stratification remains unknown. A consensus guideline for paediatric fever and neutropenia research is now available and may help reduce some of the heterogeneity between studies that has previously limited the translation of evidence into clinical practice. SUMMARY Risk stratification is recommended for children with cancer and fever and neutropenia. Further research is required to quantify the overall impact of this approach and to refine exactly which children will benefit from early antibiotic administration, as well as from modifications to empiric regimens to cover antibiotic-resistant organisms.