849 results for RM extended algorithm


Relevance:

20.00%

Publisher:

Abstract:

The therapeutic efficacy of BAL9141 (formerly Ro 63-9141), a novel cephalosporin with broad in vitro activity that also has activity against methicillin-resistant Staphylococcus aureus (MRSA), was investigated in rats with experimental endocarditis. The test organisms were homogeneously methicillin-resistant S. aureus strain COL transformed with the penicillinase-encoding plasmid pI524 (COL Bla+) and the homogeneously methicillin-resistant, penicillinase-producing isolate P8-Hom, selected by serial exposure of parent strain P8 to methicillin. The MICs of BAL9141 for these organisms (2 mg/liter) were low, and BAL9141 was bactericidal in time-kill curve studies after 24 h of exposure to two, four, or eight times the MIC. Rats with experimental endocarditis were treated in a three-arm study with a continuous infusion of BAL5788 (formerly Ro 65-5788), a carbamate prodrug of BAL9141, or with amoxicillin-clavulanate or vancomycin. The rats were administered BAL9141 to obtain steady-state target levels of 20, 10, and 5 mg per liter, or were administered either 1.2 g of amoxicillin-clavulanate (ratio 5:1) every 6 h or 1 g of vancomycin every 12 h at changing flow rates to simulate the pharmacokinetics produced in humans by intermittent intravenous treatment. Treatment was started 12 h after bacterial challenge and lasted for 3 days. BAL9141 was successful in the treatment of experimental endocarditis due to either MRSA isolate COL Bla+ or MRSA isolate P8-Hom at the three targeted steady-state concentrations and sterilized >90% of cardiac vegetations (P < 0.005 versus controls; P < 0.05 versus amoxicillin-clavulanate and vancomycin treatment groups). These promising in vivo results with BAL9141 correlated with the high affinity of the drug for PBP 2a and its stability to penicillinase hydrolysis observed in vitro.

Relevance:

20.00%

Publisher:

Abstract:

In this era of mature highway systems, a new set of problems faces the highway engineer. The existing infrastructure has aged to or past its original design life. In many cases, increased commercial traffic is creating the need for additional load-carrying capacity, causing state highway engineers to consider new alternatives for rehabilitation of existing surfaces. Alternative surface materials, thicknesses, and methods of installation must be identified to meet the needs of individual pavements and budgets. With overlays being one of the most frequently used rehabilitation alternatives, it is important to learn more about the limitations and potential performance of thin bonded portland cement overlays and subsequent rehabilitation. The Iowa ultra-thin project demonstrated the application of thin portland cement concrete overlays as a rehabilitation technique. It combined the variables of base preparation, overlay thickness, slab size, and fiber enhancement into a series of test sections over a 7.2-mile length. This report identifies the performance of the overlays in terms of deflection reduction, reduced cracking, and improved bonding between the portland cement concrete (PCC) and asphalt cement concrete (ACC) base layers. The original research project was designed to evaluate the variables over a 5-year period. A second project provided the opportunity to test overlay rehabilitation techniques and continue measurement of the original overlay performance for 5 additional years. All performance indicators showed exceptional performance over the 10-year evaluation period for each of the variable combinations considered. The report summarizes the research methods and results and identifies future research ideas to aid the pavement overlay designer in the successful implementation of ultra-thin portland cement concrete overlays as an alternative pavement rehabilitation technique.

Relevance:

20.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process that generates different alternative initial solutions of similar quality, attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve upon the top ILS-based metaheuristic simply by incorporating our biased randomization process, together with a high-quality pseudo-random number generator, into it.
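
A minimal Python sketch of such an ILS loop for the PFSP is given below. The function names, the geometric bias in the start procedure, and the adjacent-pair relocation used as perturbation are illustrative stand-ins, not the operators defined in the article.

import random

def makespan(sequence, proc_times):
    # proc_times[j][m]: processing time of job j on machine m
    n_machines = len(proc_times[0])
    completion = [0.0] * n_machines
    for job in sequence:
        completion[0] += proc_times[job][0]
        for m in range(1, n_machines):
            completion[m] = max(completion[m], completion[m - 1]) + proc_times[job][m]
    return completion[-1]

def biased_random_start(proc_times, rng):
    # jobs ordered by decreasing total processing time, then drawn with a
    # geometric bias toward the head of the list (illustrative bias only)
    order = sorted(range(len(proc_times)), key=lambda j: -sum(proc_times[j]))
    sequence = []
    while order:
        idx = 0
        while idx < len(order) - 1 and rng.random() < 0.5:
            idx += 1
        sequence.append(order.pop(idx))
    return sequence

def local_search(sequence, proc_times):
    # first-improvement insertion neighbourhood
    best_cost = makespan(sequence, proc_times)
    improved = True
    while improved:
        improved = False
        for i in range(len(sequence)):
            for k in range(len(sequence)):
                if i == k:
                    continue
                cand = sequence[:i] + sequence[i + 1:]
                cand.insert(k, sequence[i])
                cost = makespan(cand, proc_times)
                if cost < best_cost:
                    sequence, best_cost, improved = cand, cost, True
                    break
            if improved:
                break
    return sequence, best_cost

def ils_sketch(proc_times, iterations=500, seed=0):
    rng = random.Random(seed)
    current, current_cost = local_search(biased_random_start(proc_times, rng), proc_times)
    best, best_cost = current, current_cost
    for _ in range(iterations):
        perturbed = current[:]
        i = rng.randrange(len(perturbed) - 1)     # relocate a random adjacent pair of jobs
        block = perturbed[i:i + 2]
        del perturbed[i:i + 2]
        k = rng.randrange(len(perturbed) + 1)
        perturbed[k:k] = block
        candidate, cand_cost = local_search(perturbed, proc_times)
        if cand_cost <= current_cost:             # accept only non-worsening solutions
            current, current_cost = candidate, cand_cost
            if cand_cost < best_cost:
                best, best_cost = candidate, cand_cost
    return best, best_cost

# Hypothetical 5-job, 3-machine instance: proc_times[job][machine].
proc_times = [[3, 5, 2], [4, 1, 3], [2, 6, 4], [5, 2, 1], [3, 3, 3]]
print(ils_sketch(proc_times))

Because the loop has no tunable acceptance temperature or restart schedule, independent runs started from different biased-randomized solutions can simply be launched in parallel and the best result kept, which is the parallelization argument made above.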

Relevance:

20.00%

Publisher:

Abstract:

Barrels are discrete cytoarchitectonic clusters of neurons located in layer IV of the somatosensory cortex of the mouse brain. Each barrel is related to a specific whisker on the mouse snout. The whisker-to-barrel pathway is a part of the somatosensory system that is intensively used to explore plasticity induced by sensory activation in the cerebral cortex.

Different recording methods exist to explore the cortical response induced by whisker deflection in the cortex of anesthetized mice. In this work, we used a method called Single-Unit Analysis, in which we recorded the extracellular electric signals of a single barrel neuron using a microelectrode. After recording, the signal was processed by discriminators to isolate specific neuronal waveform shapes (action potentials).

The objective of this thesis was to become familiar with barrel cortex recording during whisker deflection and its theoretical background, and to compare two different ways of discriminating and sorting the cortical signal: the Waveform Window Discriminator (WWD) and the Spike Shape Discriminator (SSD).

The WWD is an electronic module allowing the selection of a specific electric signal shape. A trigger and a window potential level are set manually. During measurements, every time the electric signal passes through the two levels a dot is generated on the time line. It was the method used in previous extracellular recording studies at the Département de Biologie Cellulaire et de Morphologie (DBCM) in Lausanne.

The SSD is a function provided by the signal analysis software Spike2 (Cambridge Electronic Design). The neuronal signal is discriminated by a complex algorithm allowing the creation of specific templates. Each of these templates is supposed to correspond to a cell response profile. The templates are saved as a number of points (62 in this study) and are set for each new cortical location. During measurements, every time the recorded cortical signal matches a defined fraction of the template points (60% in this study) a dot is generated on the time line. The advantage of the SSD is that multiple templates can be used during a single stimulation, allowing simultaneous recording of multiple signals.

There are different ways to represent the data after discrimination and sorting. The most commonly used in Single-Unit Analysis of the barrel cortex are the time between stimulation and the first cell response (the latency), the Response Magnitude (RM) after whisker deflection corrected for spontaneous activity, and the time distribution of neuronal spikes after whisker stimulation (Peri-Stimulus Time Histogram, PSTH).

The results show that the RMs and the latencies in layer IV were significantly different between the WWD- and SSD-discriminated signals. The temporal distribution of the latencies shows that the values obtained with the SSD were spread between 6 and 60 ms with no peak value, while the WWD data were all gathered around a peak of 11 ms (in agreement with previous studies). The scattered distribution of the latencies recorded with the SSD did not correspond to a cell response.

The SSD appears to be a powerful tool for signal sorting, but we did not succeed in using it for Single-Unit Analysis extracellular recordings. Further recordings with different SSD template settings and a larger sample size may help to show the utility of this tool in Single-Unit Analysis studies.
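
The template criterion described above (a spike is accepted when a given fraction of its sample points lie close to a stored template) can be sketched roughly as follows. The function names, the fixed amplitude tolerance, and the use of NumPy are assumptions for illustration; Spike2's actual template-matching algorithm is not reproduced here.

import numpy as np

def matches_template(snippet, template, tolerance, required_fraction=0.6):
    # True if enough sample points fall within +/- tolerance of the template,
    # e.g. 60% of a 62-point template as in this study
    within = np.abs(snippet - template) <= tolerance
    return within.mean() >= required_fraction

def sort_spikes(snippets, templates, tolerance):
    # assign each detected spike snippet to the first matching template (or None)
    labels = []
    for snippet in snippets:
        label = None
        for idx, template in enumerate(templates):
            if matches_template(snippet, template, tolerance):
                label = idx
                break
        labels.append(label)
    return labels

# Hypothetical usage: 62-point snippets, two stored templates.
rng = np.random.default_rng(0)
templates = [np.sin(np.linspace(0, np.pi, 62)), -np.sin(np.linspace(0, np.pi, 62))]
snippets = [templates[0] + 0.05 * rng.standard_normal(62)]
print(sort_spikes(snippets, templates, tolerance=0.2))   # -> [0]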

Relevance:

20.00%

Publisher:

Abstract:

AIM: To document the feasibility and report the results of dosing darbepoetin-alpha at extended intervals up to once monthly (QM) in a large dialysis patient population. MATERIAL: 175 adult patients treated, at 23 Swiss hemodialysis centres, with stable doses of any erythropoiesis-stimulating agent who were switched by their physicians to darbepoetin-alpha treatment at prolonged dosing intervals (every 2 weeks [Q2W] or QM). METHOD: Multicentre, prospective, observational study. Patients' hemoglobin (Hb) levels and other data were recorded 1 month before conversion (baseline) to an extended darbepoetin-alpha dosing interval, at the time of conversion, and once monthly thereafter up to the evaluation point (maximum of 12 months or until loss to follow-up). RESULTS: Data for 161 evaluable patients from 23 sites were included in the final analysis. At 1 month prior to conversion, 73% of these patients were receiving darbepoetin-alpha weekly (QW) and 27% of the patients biweekly (Q2W). After a mean follow-up of 9.5 months, 34% received a monthly (QM) dosing regimen, 52% of the patients were receiving darbepoetin-alpha Q2W, and 14% QW. The mean (SD) Hb concentration at baseline was 12.3 +/- 1.2 g/dl, compared to 11.9 +/- 1.2 g/dl at the evaluation point. The corresponding mean weekly darbepoetin-alpha dose was 44.3 +/- 33.4 microg at baseline and 37.7 +/- 30.8 microg at the evaluation point. CONCLUSIONS: Conversion to extended darbepoetin-alpha dosing intervals of up to QM, with maintenance of initial Hb concentrations, was successful for the majority of stable dialysis patients.

Relevance:

20.00%

Publisher:

Abstract:

This work presents preliminary data from a single-centre prospective cohort study that aims to compare the neurological damage associated with two cardiac interventions for the treatment of severe aortic stenosis. Specifically, it analyses the appearance of acute ischemic brain lesions detected by MRI after the two procedures and their possible association with changes in neurocognitive status during follow-up. The present report shows only the preliminary results of the brain MRI. The methods section also describes how the neurocognitive assessment was performed; however, the results of these assessments and their possible correlation with the brain MRI lesions have not yet been analysed and are therefore not presented.

Relevance:

20.00%

Publisher:

Abstract:

The standard one-machine scheduling problem consists of scheduling a set of jobs on one machine that can handle only one job at a time, minimizing the maximum lateness. Each job is available for processing at its release date, requires a known processing time, and, after processing has finished, is delivered after a certain time. There can also be precedence constraints between pairs of jobs, requiring that the first job be completed before the second job can start. An extension of this problem consists of assigning a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
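
The preemptive Longest Tail rule used as the building block above schedules, at every instant, the released and unfinished job with the largest delivery time (tail), preempting whenever a newly released job has a larger tail. The following Python fragment is a rough sketch of that rule for the basic problem without time-lags; the function name and data layout are assumptions, and it computes the objective max_j (C_j + q_j).

import heapq

def preemptive_longest_tail(jobs):
    # jobs: list of (release r_j, processing p_j, delivery/tail q_j)
    order = sorted(range(len(jobs)), key=lambda j: jobs[j][0])   # by release date
    remaining = [p for _, p, _ in jobs]
    ready = []                       # max-heap on the tail: entries are (-q_j, j)
    t = 0.0
    objective = 0.0
    i = 0
    finished = 0
    n = len(jobs)
    while finished < n:
        while i < n and jobs[order[i]][0] <= t:
            j = order[i]
            heapq.heappush(ready, (-jobs[j][2], j))
            i += 1
        if not ready:
            t = jobs[order[i]][0]    # machine idles until the next release
            continue
        neg_q, j = heapq.heappop(ready)
        next_release = jobs[order[i]][0] if i < n else float("inf")
        run = min(remaining[j], next_release - t)   # run until done or next release
        t += run
        remaining[j] -= run
        if remaining[j] <= 1e-12:
            finished += 1
            objective = max(objective, t + jobs[j][2])
        else:
            heapq.heappush(ready, (neg_q, j))       # preempted, back in the queue
    return objective

# Hypothetical instance: (release, processing, delivery) per job.
print(preemptive_longest_tail([(0, 4, 5), (1, 2, 8), (3, 3, 1)]))   # -> 11.0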

Relevance:

20.00%

Publisher:

Abstract:

This paper analyzes the nature of health care provider choice in the case of patient-initiated contacts, with special reference to a National Health Service setting, where monetary prices are zero and general practitioners act as gatekeepers to publicly financed specialized care. We focus our attention on the factors that may explain the continuously increasing use of hospital emergency visits as opposed to other provider alternatives. An extended version of a discrete choice model of demand for patient-initiated contacts is presented, allowing for individual and town residence size differences in perceived quality (preferences) between alternative providers and including travel and waiting time as non-monetary costs. Results of a nested multinomial logit model of provider choice are presented. Individual choice between alternatives considers, in a repeated nested structure, self-care, primary care, hospital and clinic emergency services. Welfare implications and income effects are analyzed by computing compensating variations, and by simulating the effects of user fees by levels of income. Results indicate that compensating variation per visit is higher than the direct marginal cost of emergency visits, and consequently, emergency visits do not appear as an inefficient alternative even for non-urgent conditions.
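
For reference, the choice probabilities of a standard nested multinomial logit take the following form (our notation; the paper's exact covariates and nesting structure are not reproduced here). With alternatives k grouped into nests m, deterministic utilities V_k, and inclusive-value parameters \lambda_m,

P(j) = P(m(j)) \, P(j \mid m(j)), \qquad
P(j \mid m) = \frac{e^{V_j/\lambda_m}}{\sum_{k \in m} e^{V_k/\lambda_m}}, \qquad
P(m) = \frac{e^{\lambda_m I_m}}{\sum_{n} e^{\lambda_n I_n}}, \qquad
I_m = \ln \sum_{k \in m} e^{V_k/\lambda_m}.

The compensating variation in such models is typically the log-sum measure: with \alpha the marginal utility of income and superscripts 0 and 1 denoting the situations before and after a change,

CV = \frac{1}{\alpha}\left[\ln \sum_{n} e^{\lambda_n I_n^{1}} - \ln \sum_{n} e^{\lambda_n I_n^{0}}\right].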

Relevance:

20.00%

Publisher:

Abstract:

In this paper we propose a Pyramidal Classification Algorithm which, together with an appropriate aggregation index, produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, the Minimum, and UPGMA. The results shown in this paper may help in choosing between the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
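
To make the three aggregation indices concrete, the classical dissimilarity update after merging two groups a and b can be sketched as below. These are the standard single-linkage, complete-linkage, and UPGMA rules; the pyramidal bookkeeping of the proposed algorithm (compatible orders, at most two predecessors per group) is not reproduced here.

def updated_dissimilarity(method, d_ax, d_bx, size_a, size_b):
    # dissimilarity between the merged group (a u b) and another group x
    if method == "minimum":        # single linkage
        return min(d_ax, d_bx)
    if method == "maximum":        # complete linkage
        return max(d_ax, d_bx)
    if method == "upgma":          # unweighted average linkage
        return (size_a * d_ax + size_b * d_bx) / (size_a + size_b)
    raise ValueError("unknown aggregation index")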

Relevance:

20.00%

Publisher:

Abstract:

Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a mere informative role, supplying analysts with formatted and summarized data that they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to that of an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance, and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent need to fix some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least there is no consensus on their validity.

How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model is the cause of performance, and to compare against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results is less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, and the design and setup of the experiment, and we outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
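
As a caricature of the kind of within-model simulation criticized above, the Python sketch below compares a first-come-first-served control against a protection-level control on a two-fare, single-leg instance. The demand model, fares, and policies are entirely invented for illustration and are not the controls tested in the experiment.

import numpy as np

def simulate_revenue(policy, protection=0, capacity=100, high_fare=300.0,
                     low_fare=100.0, mean_low=120, mean_high=40,
                     trials=2000, seed=0):
    # toy model: low-fare requests arrive before high-fare requests, both Poisson;
    # "fcfs" sells to anyone, "protect" reserves `protection` seats for the high fare
    rng = np.random.default_rng(seed)
    low = rng.poisson(mean_low, trials)
    high = rng.poisson(mean_high, trials)
    limit = capacity if policy == "fcfs" else max(capacity - protection, 0)
    sold_low = np.minimum(low, limit)
    sold_high = np.minimum(high, capacity - sold_low)
    return float(np.mean(sold_low * low_fare + sold_high * high_fare))

# Hypothetical comparison of an incumbent rule against a protection-level control.
print(simulate_revenue("fcfs"))
print(simulate_revenue("protect", protection=40))

The comparison only says which policy wins inside this invented demand model, which is precisely the "within-model" limitation that motivates the live experiment described in the paper.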

Relevance:

20.00%

Publisher:

Abstract:

We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
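
A minimal sketch of one standard ingredient of such procedures, a randomized exponentially weighted forecaster over a finite set of experts, is given below. The expert set, the learning rate, and the function name are illustrative assumptions and do not reproduce the paper's construction.

import math, random

def randomized_exp_weights(bits, experts, eta=0.5, seed=0):
    # predict each bit before seeing it: draw the 0/1 guess from the weighted
    # average of the experts' forecasts, then update the weights exponentially
    rng = random.Random(seed)
    weights = [1.0] * len(experts)
    mistakes = 0
    history = []
    for bit in bits:
        forecasts = [expert(history) for expert in experts]   # each in [0, 1]
        total = sum(weights)
        p_one = sum(w * f for w, f in zip(weights, forecasts)) / total
        guess = 1 if rng.random() < p_one else 0               # randomized prediction
        mistakes += (guess != bit)
        weights = [w * math.exp(-eta * abs(f - bit))
                   for w, f in zip(weights, forecasts)]
        history.append(bit)
    return mistakes

# Illustrative experts: "always 0", "always 1", and "repeat the last bit".
experts = [lambda h: 0.0, lambda h: 1.0, lambda h: float(h[-1]) if h else 0.5]
sequence = [1, 1, 0, 1, 1, 1, 0, 1] * 20
print(randomized_exp_weights(sequence, experts))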

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: To provide a global, up-to-date picture of the prevalence, treatment, and outcomes of Candida bloodstream infections in intensive care unit patients and compare Candida with bacterial bloodstream infection. DESIGN: A retrospective analysis of the Extended Prevalence of Infection in the ICU Study (EPIC II). Demographic, physiological, infection-related, and therapeutic data were collected. Patients were grouped as having Candida, Gram-positive, Gram-negative, or combined Candida/bacterial bloodstream infection. Outcome data were assessed at intensive care unit and hospital discharge. SETTING: EPIC II included 1265 intensive care units in 76 countries. PATIENTS: Patients in participating intensive care units on the study day. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Of the 14,414 patients in EPIC II, 99 patients had Candida bloodstream infections, for a prevalence of 6.9 per 1000 patients. Sixty-one patients had candidemia alone and 38 patients had combined bloodstream infections. Candida albicans (n = 70) was the predominant species. Primary therapy included monotherapy with fluconazole (n = 39), caspofungin (n = 16), and a polyene-based product (n = 12). Combination therapy was infrequently used (n = 10). Compared with patients with Gram-positive (n = 420) and Gram-negative (n = 264) bloodstream infections, patients with candidemia were more likely to have solid tumors (p < .05) and appeared to have been in an intensive care unit longer (14 days [range, 5-25 days], 8 days [range, 3-20 days], and 10 days [range, 2-23 days], respectively), but this difference was not statistically significant. Severity of illness and organ dysfunction scores were similar between groups. Patients with Candida bloodstream infections, compared with patients with Gram-positive and Gram-negative bloodstream infections, had the greatest crude intensive care unit mortality rates (42.6%, 25.3%, and 29.1%, respectively) and longer intensive care unit lengths of stay (median [interquartile range]) (33 days [18-44], 20 days [9-43], and 21 days [8-46], respectively); however, these differences were not statistically significant. CONCLUSION: Candidemia remains a significant problem in intensive care unit patients. In the EPIC II population, Candida albicans was the most common organism and fluconazole remained the predominant antifungal agent used. Candida bloodstream infections are associated with high intensive care unit and hospital mortality rates and resource use.