976 results for parallel admission algorithm
Abstract:
To improve long-term survival, prompt revascularization of the infarct-related artery should be performed in patients with acute myocardial infarction (AMI); therefore, a large proportion of these patients are hospitalized out of hours. The clinical effects of out-of-hours AMI management have already been questioned, with conflicting results. The purpose of this investigation was to compare the in-hospital outcome of patients admitted for AMI out of hours and during working hours. All patients with AMI included in the AMIS Plus Registry from January 1, 1997, to March 30, 2006, were analyzed. The working-hours group included patients admitted from 7 a.m. to 7 p.m. on weekdays, and the out-of-hours group included patients admitted from 7 p.m. to 7 a.m. on weekdays or on weekends. Major cardiac events were defined as cardiovascular death, reinfarction, and stroke. The study primary end points were in-hospital death and major adverse cardiac event (MACE) rates. A total of 12,480 patients met the inclusion criteria, with 52% admitted during normal working hours and 48% out of hours. Patients admitted during weekdays included more women (28.1% vs 26%; p = 0.009), older patients (65.5 +/- 13 vs 64.1 +/- 13 years; p = 0.0011), fewer current smokers (40.1% vs 43.5%; p <0.001), and fewer patients with a history of ischemic heart disease (31.5% vs 34.5%; p = 0.001). A significantly higher proportion of patients admitted out of hours had Killip class III or IV. No differences in in-hospital survival rates between the 2 groups (91.5% vs 91.2%; p = 0.633) or in MACE-free survival rates (both 88.5%; p = 1.000) were noted. In conclusion, the outcome of patients with AMI admitted out of hours was the same as that of those with a weekday admission. Among the predictors of in-hospital outcome, timing of admission had no significant influence on mortality or MACE incidence.
Abstract:
The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the cost in terms of calculation and storage requirements, they propose the use of the multinomial distribution function. This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements, which in turn makes a simple deconvolution process possible. Moreover, under certain conditions, additional improvements may be achieved.
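The abstract gives no code, so the following is only a hedged sketch of the idea under stated assumptions: with homogeneous on/off sources the per-class bandwidth distribution is a binomial (a special case of the multinomial) that can be computed directly, classes are aggregated by convolution, and a departing connection is removed by a direct deconvolution recursion. The class parameters, link capacity and on/off traffic model below are illustrative, not the authors' system.

```python
# Hedged sketch of a convolution-based CAC check (not the paper's exact method).
# Each class is modelled as N homogeneous on/off sources with activity probability p
# and peak rate r (bandwidth units); the numbers below are illustrative assumptions.
import numpy as np
from scipy.stats import binom

def class_distribution(n_sources, p_active, peak_rate, max_bw):
    """Distribution of the instantaneous bandwidth demanded by one class.
    With homogeneous sources this is binomial, so it is computed directly
    instead of by repeated convolution of single-source distributions."""
    dist = np.zeros(max_bw + 1)
    for k in range(n_sources + 1):
        bw = k * peak_rate
        if bw <= max_bw:
            dist[bw] = binom.pmf(k, n_sources, p_active)
    return dist

def convolve(dists, max_bw):
    """Aggregate occupancy distribution of several independent classes."""
    total = np.zeros(max_bw + 1)
    total[0] = 1.0
    for d in dists:
        total = np.convolve(total, d)[: max_bw + 1]
    return total

def deconvolve_one(total, p_active, peak_rate):
    """Remove a single on/off source from the aggregate (deconvolution):
    Q(x) = (1-p)*P(x) + p*P(x-r)  =>  P(x) = (Q(x) - p*P(x-r)) / (1-p)."""
    out = np.zeros_like(total)
    for x in range(len(total)):
        prev = out[x - peak_rate] if x >= peak_rate else 0.0
        out[x] = (total[x] - p_active * prev) / (1.0 - p_active)
    return out

# Illustrative link of 100 units and two classes; the mass convolved past the
# link capacity is an estimate of the overflow probability used for acceptance.
LINK = 100
d1 = class_distribution(n_sources=30, p_active=0.2, peak_rate=2, max_bw=LINK)
d2 = class_distribution(n_sources=10, p_active=0.5, peak_rate=5, max_bw=LINK)
agg = convolve([d1, d2], LINK)
print("P(demand > link) =", 1.0 - agg.sum())
after_departure = deconvolve_one(agg, p_active=0.5, peak_rate=5)
print("after one class-2 departure:", 1.0 - after_departure.sum())
```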
Abstract:
In Brazil, human and canine visceral leishmaniasis (CVL) caused by Leishmania infantum has undergone urbanisation since 1980, constituting a public health problem, and serological tests are the tools of choice for identifying infected dogs. Until recently, the Brazilian zoonoses control program recommended enzyme-linked immunosorbent assays (ELISA) and indirect immunofluorescence assays (IFA) as the screening and confirmatory methods, respectively, for the detection of canine infection. The purpose of this study was to estimate the accuracy of ELISA and IFA in parallel or serial combinations. The reference standard comprised the results of direct visualisation of parasites in histological sections, immunohistochemical testing, or isolation of the parasite in culture. Samples from 98 cases and 1,327 noncases were included. Individually, the tests presented sensitivities of 91.8% and 90.8%, and specificities of 83.4% and 53.4%, for the ELISA and IFA, respectively. When the tests were used in parallel combination, sensitivity reached 99.2%, while specificity dropped to 44.8%. When they were used in serial combination (ELISA followed by IFA), decreased sensitivity (83.3%) and increased specificity (92.5%) were observed. The serial testing approach improved specificity with a moderate loss in sensitivity. This strategy could partially fulfill the needs of public health and dog owners for a more accurate diagnosis of CVL.
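As a hedged worked example, the standard formulas for combining two diagnostic tests reproduce the reported figures closely if ELISA and IFA are assumed to be conditionally independent given infection status (an assumption the real data need not satisfy exactly):

```python
# Sketch: combining two tests under the conditional-independence assumption.
# Per-test sensitivity/specificity are the values reported in the abstract.

def parallel(se1, sp1, se2, sp2):
    """Positive if EITHER test is positive: sensitivity rises, specificity drops."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def serial(se1, sp1, se2, sp2):
    """Positive only if BOTH tests are positive: specificity rises, sensitivity drops."""
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

elisa = (0.918, 0.834)   # sensitivity, specificity
ifa   = (0.908, 0.534)

print("parallel:", parallel(*elisa, *ifa))  # ~ (0.992, 0.445), close to the reported 99.2% / 44.8%
print("serial:  ", serial(*elisa, *ifa))    # ~ (0.833, 0.923), close to the reported 83.3% / 92.5%
```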
Abstract:
We examined the spatial and temporal variation of species diversity and genetic diversity in a metacommunity comprising 16 species of freshwater gastropods. We monitored species abundance at five localities of the Ain river floodplain in southeastern France, over a period of four years. Using 190 AFLP loci, we monitored the genetic diversity of Radix balthica, one of the most abundant gastropod species of the metacommunity, twice during that period. An exceptionally intense drought occurred during the last two years and differentially affected the study sites. This allowed us to test the effect of natural disturbances on changes in both genetic and species diversity. Overall, local (alpha) diversity declined as reflected by lower values of gene diversity H(S) and evenness. In parallel, the among-sites (beta) diversity increased at both the genetic (F(ST)) and species (F(STC)) levels. These results suggest that disturbances can lead to similar changes in genetic and community structure through the combined effects of selective and neutral processes.
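For readers unfamiliar with the genetic measures, the following is a minimal illustrative sketch (not the study's AFLP pipeline) of within-site gene diversity H_S and among-site differentiation F_ST. The allele frequencies are hypothetical, and real dominant AFLP markers would require additional frequency-estimation corrections.

```python
# Illustrative computation of H_S and F_ST = (H_T - mean H_S) / H_T.
import numpy as np

def gene_diversity(freqs):
    """Expected heterozygosity per locus, 1 - sum_i p_i^2, averaged over loci.
    `freqs` has shape (n_loci, n_alleles)."""
    return float(np.mean(1.0 - np.sum(freqs ** 2, axis=1)))

def fst(site_freqs):
    """Nei-style F_ST from per-site allele frequencies (list of (n_loci, n_alleles))."""
    h_s = np.mean([gene_diversity(f) for f in site_freqs])
    pooled = np.mean(np.stack(site_freqs), axis=0)   # allele frequencies in the pooled population
    h_t = gene_diversity(pooled)
    return (h_t - h_s) / h_t

# Two hypothetical sites, three biallelic loci each.
site_a = np.array([[0.9, 0.1], [0.6, 0.4], [0.5, 0.5]])
site_b = np.array([[0.2, 0.8], [0.7, 0.3], [0.5, 0.5]])
print("H_S per site:", gene_diversity(site_a), gene_diversity(site_b))
print("F_ST:", fst([site_a, site_b]))
```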
Abstract:
This paper proposes a heuristic for the scheduling of capacity requests and the periodic assignment of radio resources in geostationary (GEO) satellite networks with a star topology, using the Demand Assigned Multiple Access (DAMA) protocol in the link layer, and Multi-Frequency Time Division Multiple Access (MF-TDMA) and Adaptive Coding and Modulation (ACM) in the physical layer.
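The abstract gives no algorithmic detail, so the sketch below is only a hedged illustration of the kind of periodic assignment involved: pending DAMA capacity requests are greedily packed into the (carrier, timeslot) cells of an MF-TDMA frame, with the number of slots per terminal scaled by an assumed ACM efficiency. The frame geometry and request values are made up.

```python
# Hedged sketch (not the paper's heuristic) of periodic DAMA assignment over an
# MF-TDMA frame: requests are sorted by priority and size and packed greedily.
from dataclasses import dataclass
import math

@dataclass
class Request:
    terminal: str
    bits: int               # capacity requested for the next frame
    acm_bits_per_slot: int  # bits one slot carries at this terminal's current MODCOD
    priority: int = 0

def assign_frame(requests, n_carriers, n_slots):
    """Return {terminal: [(carrier, slot), ...]} for one superframe."""
    free = [(c, s) for c in range(n_carriers) for s in range(n_slots)]
    plan = {}
    # Serve higher priority first, then larger demands (a simple greedy order).
    for req in sorted(requests, key=lambda r: (-r.priority, -r.bits)):
        needed = math.ceil(req.bits / req.acm_bits_per_slot)
        granted, free = free[:needed], free[needed:]
        plan[req.terminal] = granted        # may be a partial grant if the frame is full
    return plan

demands = [
    Request("T1", bits=12000, acm_bits_per_slot=1500, priority=1),
    Request("T2", bits=8000,  acm_bits_per_slot=1000),
    Request("T3", bits=20000, acm_bits_per_slot=2000),
]
print(assign_frame(demands, n_carriers=4, n_slots=8))
```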
Abstract:
INTRODUCTION. Patients admitted to the Intensive Care Unit (ICU) from general wards are more severely ill and have a higher mortality than those admitted from the emergency department, as reported [1]. The majority of them develop signs of instability (e.g. tachypnea, tachycardia, hypotension, decreased oxygen saturation and change in conscious state) several hours before ICU admission. Considering this fact, and that in-hospital cardiac arrests and unexpected deaths are usually preceded by warning signs, immediate on-site intervention by specialists may be effective. This gave an impulse to medical emergency team (MET) implementation, which has been shown to decrease cardiac arrest, morbidity and mortality in several hospitals. OBJECTIVES AND METHODS. In order to verify whether the same was true in our hospital and to determine whether there was a need for a MET, we prospectively collected all non-elective ICU admissions of already hospitalized patients (general wards) and of patients remaining more than 3 h in the emergency department (considered hospitalized). The instability criteria leading to a MET call correspond to those described in the literature. The delay between the development of one criterion and ICU admission was registered. RESULTS. During an observation period of 12 months, 321 patients meeting our MET criteria were admitted to the ICU. 88 patients came from the emergency department, 115 from the surgical and 113 from the medical ward. 65% were male. The median age was 65 years (range 17-89). The delay from MET criteria development to ICU admission was higher than 8 h in 155 patients, with a median delay of 32 h and a range of 8.4 h to 10 days. For the remaining 166 patients, an early MET criterion was present up to 8 h (median delay 3 h) before ICU admission. These results are quite concordant with the data reported in the literature (ref 1-8). 122 patients presented signs of sepsis or septic shock, 70 patients respiratory failure, and 58 patients a cardiac emergency. Cardiac arrests represented 5% of our patient collective. CONCLUSIONS. Similar to other observations, the majority of hospitalized patients admitted on an emergency basis to our ICU had warning signs lasting for several hours. More than half of them were unstable for more than 8 h. This shows there is ample time for early acute management by a dedicated and specialized team such as a MET. However, further studies are required to determine whether MET implementation can reduce in-hospital cardiac arrests and influence morbidity, length of stay and mortality.
Abstract:
An algorithm that optimizes and creates pairings for airline crews, subsequently implemented in Java.
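As an illustration only (the project itself is implemented in Java and its exact method is not described here), crew pairing is commonly cast as a set-covering problem. The greedy rule below, choosing the candidate pairing with the best cost per newly covered flight leg, is a simple stand-in with made-up legs, pairings and costs.

```python
# Hedged sketch: greedy set cover over candidate crew pairings.

def greedy_pairings(legs, candidates):
    """candidates: list of (pairing_id, set_of_legs, cost)."""
    uncovered, chosen = set(legs), []
    while uncovered:
        pid, covered, cost = min(
            (c for c in candidates if c[1] & uncovered),
            key=lambda c: c[2] / len(c[1] & uncovered),  # cost per newly covered leg
        )
        chosen.append(pid)
        uncovered -= covered
    return chosen

legs = {"L1", "L2", "L3", "L4"}
candidates = [
    ("P1", {"L1", "L2"}, 900),
    ("P2", {"L2", "L3"}, 700),
    ("P3", {"L3", "L4"}, 800),
    ("P4", {"L1", "L4"}, 1000),
]
print(greedy_pairings(legs, candidates))   # e.g. ['P2', 'P4']
```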
Abstract:
This project proposes an application for the automatic calibration of P-system models. To do so, a study of P-system models and of the procedure researchers follow to develop this type of model will first be carried out. A first serial solution to the problem will be developed and its weak points analysed. A parallel version will then be proposed that significantly improves the execution time while maintaining high efficiency and scalability.
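A minimal sketch of the kind of parallelisation proposed, under assumptions: candidate parameter sets are evaluated independently, so the serial calibration loop can be distributed over worker processes. The `simulate_p_system` function and the error measure are hypothetical placeholders for the real P-system model.

```python
# Hedged sketch: parallel evaluation of candidate calibrations with multiprocessing.
from multiprocessing import Pool
import random

def simulate_p_system(params):
    # Placeholder: run the P-system with these parameters and return its output trace.
    return [params["rate"] * t for t in range(10)]

def calibration_error(params, observed):
    predicted = simulate_p_system(params)
    return sum((p - o) ** 2 for p, o in zip(predicted, observed))

def calibrate(observed, n_candidates=64, workers=4):
    candidates = [{"rate": random.uniform(0.0, 2.0)} for _ in range(n_candidates)]
    with Pool(workers) as pool:
        errors = pool.starmap(calibration_error, [(c, observed) for c in candidates])
    return min(zip(candidates, errors), key=lambda ce: ce[1])

if __name__ == "__main__":
    observed = [0.7 * t for t in range(10)]
    best, err = calibrate(observed)
    print(best, err)
```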
Abstract:
We implement the GA-SEFS algorithm by Tsymbal and experimentally analyse its performance depending on the classifier algorithm used in the fitness function (NB, MNge, SMO). We also study the effect of adding to the fitness function a measure that controls the complexity of the base classifiers.
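The sketch below is not Tsymbal's exact GA-SEFS formulation; it only illustrates the kind of fitness function under study: a binary feature mask for one base classifier is scored by cross-validated accuracy minus a complexity penalty. The penalty weight, the "fraction of selected features" complexity proxy, and the use of scikit-learn's GaussianNB in place of the Weka classifiers (NB, MNge, SMO) are all assumptions.

```python
# Hedged sketch of a GA fitness function for ensemble feature selection.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB   # stand-in for NB; MNge/SMO are Weka classifiers

def fitness(mask, X, y, alpha=0.1):
    """mask: binary vector selecting feature columns for one base classifier."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:
        return 0.0
    accuracy = cross_val_score(GaussianNB(), X[:, selected], y, cv=5).mean()
    complexity = selected.size / X.shape[1]   # simple complexity proxy (assumed)
    return accuracy - alpha * complexity

# Toy usage with random data.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
mask = rng.integers(0, 2, size=10)
print("fitness:", fitness(mask, X, y))
```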
Abstract:
OBJECTIVE: Previous research suggested that proper blood pressure (BP) management in acute stroke may need to take into account the underlying etiology. METHODS: All patients with acute ischemic stroke registered in the ASTRAL registry between 2003 and 2009 were analyzed. Unfavorable outcome was defined as a modified Rankin Scale score >2. A local polynomial surface algorithm was used to assess the effect of baseline and 24- to 48-hour systolic BP (SBP) and mean arterial pressure (MAP) on outcome in patients with lacunar, atherosclerotic, and cardioembolic stroke. RESULTS: A total of 791 patients were included in the analysis. For lacunar and atherosclerotic strokes, there was no difference in the predicted probability of unfavorable outcome between patients with an admission BP of <140 mm Hg, 140-160 mm Hg, or >160 mm Hg (15.3% vs 12.1% vs 20.8%, respectively, for lacunar, p = 0.15; 41.0% vs 41.5% vs 45.5%, respectively, for atherosclerotic, p = 0.75), or between patients with BP increase vs decrease at 24-48 hours (18.7% vs 18.0%, respectively, for lacunar, p = 0.84; 43.4% vs 43.6%, respectively, for atherosclerotic, p = 0.88). For cardioembolic strokes, an increase of BP at 24-48 hours was associated with a higher probability of unfavorable outcome compared to BP reduction (53.4% vs 42.2%, respectively, p = 0.037). Also, the predicted probability of unfavorable outcome was significantly different between patients with an admission BP of <140 mm Hg, 140-160 mm Hg, and >160 mm Hg (34.8% vs 42.3% vs 52.4%, respectively, p < 0.01). CONCLUSIONS: This study provides evidence that BP management in acute stroke may have to be tailored with respect to the underlying etiopathogenetic mechanism.
Abstract:
BACKGROUND: Solexa/Illumina short-read ultra-high throughput DNA sequencing technology produces millions of short tags (up to 36 bases) by parallel sequencing-by-synthesis of DNA colonies. The processing and statistical analysis of such high-throughput data poses new challenges; currently a fair proportion of the tags are routinely discarded due to an inability to match them to a reference sequence, thereby reducing the effective throughput of the technology. RESULTS: We propose a novel base calling algorithm using model-based clustering and probability theory to identify ambiguous bases and code them with IUPAC symbols. We also select optimal sub-tags using a score based on information content to remove uncertain bases towards the ends of the reads. CONCLUSION: We show that the method improves genome coverage and number of usable tags as compared with Solexa's data processing pipeline by an average of 15%. An R package is provided which allows fast and accurate base calling of Solexa's fluorescence intensity files and the production of informative diagnostic plots.
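The published method works on fluorescence intensities via model-based clustering; the sketch below is a hedged simplification that starts from assumed per-cycle base probabilities, maps uncertain calls to IUPAC ambiguity codes, and trims the read to the sub-tag with the highest summed information content. The thresholds and scoring cutoff are assumptions, not the paper's parameters.

```python
# Hedged sketch of IUPAC calling and information-content-based sub-tag selection.
import math

# Two-base and fully ambiguous IUPAC codes; three-base codes omitted for brevity (default to N).
IUPAC = {frozenset("A"): "A", frozenset("C"): "C", frozenset("G"): "G", frozenset("T"): "T",
         frozenset("AG"): "R", frozenset("CT"): "Y", frozenset("GC"): "S", frozenset("AT"): "W",
         frozenset("GT"): "K", frozenset("AC"): "M", frozenset("ACGT"): "N"}

def call_base(probs, threshold=0.9):
    """probs: dict base -> probability for one cycle. Keep the most likely bases until
    their cumulative probability passes the threshold, then map the set to IUPAC."""
    kept, cum = [], 0.0
    for base, p in sorted(probs.items(), key=lambda bp: -bp[1]):
        kept.append(base)
        cum += p
        if cum >= threshold:
            break
    return IUPAC.get(frozenset(kept), "N")

def information_content(probs):
    """2 minus the Shannon entropy in bits: 2 for a certain call, 0 for a uniform one."""
    return 2.0 + sum(p * math.log2(p) for p in probs.values() if p > 0)

def best_subtag(read_probs, cutoff=1.0, min_len=12):
    """Contiguous sub-tag maximising summed (information - cutoff), so runs of
    uncertain cycles (typically towards the read end) are trimmed away."""
    scores = [information_content(p) - cutoff for p in read_probs]
    start, end = max(((i, j) for i in range(len(scores))
                      for j in range(i + min_len, len(scores) + 1)),
                     key=lambda ij: sum(scores[ij[0]:ij[1]]))
    return start, end

# Example cycles: one confident call, one ambiguous call coded as "R" (A or G).
print(call_base({"A": 0.97, "C": 0.01, "G": 0.01, "T": 0.01}))  # -> "A"
print(call_base({"A": 0.55, "G": 0.40, "C": 0.03, "T": 0.02}))  # -> "R"
```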
Abstract:
PURPOSE: Most RB1 mutations are unique and distributed throughout the RB1 gene. Their detection can be time-consuming, and the yield is especially low in cases of conservatively treated sporadic unilateral retinoblastoma (Rb) patients. In order to identify patients at true risk of developing Rb, and to reduce the number of unnecessary examinations under anesthesia in all other cases, we developed a universal, sensitive, efficient and cost-effective strategy based on intragenic haplotype analysis. METHODS: This algorithm allows the calculation of the a posteriori risk of developing Rb and takes into account (a) RB1 loss of heterozygosity in tumors, (b) the preferential paternal origin of new germline mutations, (c) the a priori risk derived from empirical data by Vogel, and (d) a disease penetrance of 90% in most cases. We report the occurrence of Rb in first-degree relatives of patients with sporadic Rb who visited the Jules Gonin Eye Hospital, Lausanne, Switzerland, from January 1994 to December 2006, compared to the number of new cases of Rb expected using our algorithm. RESULTS: A total of 134 families with sporadic Rb were enrolled; testing was performed in 570 individuals, and 99 patients younger than 4 years old were identified. We observed one new case of Rb. Using our algorithm, the cumulated total a posteriori risk of recurrence was 1.77. CONCLUSIONS: This is the first time that linkage analysis has been validated to monitor the risk of recurrence in sporadic Rb. This should be a useful tool in genetic counseling, especially when direct screening for RB1 mutations yields a negative result or is unavailable.
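The published algorithm is not reproduced here; the following is only a heavily simplified, hypothetical sketch of the Bayesian bookkeeping the abstract describes (updating the carrier probability for an unaffected parent, conditioning on haplotype sharing, applying ~90% penetrance). All numeric inputs are placeholders, not clinical values.

```python
# Hypothetical sketch only; NOT the published risk algorithm.

def sibling_recurrence_risk(prior_carrier_parent, penetrance=0.9,
                            haplotype_informative=True, shares_risk_haplotype=True):
    """A posteriori risk that an unaffected sibling develops Rb.

    prior_carrier_parent : assumed a priori probability that one parent carries a
                           germline RB1 mutation (e.g. an empirical Vogel-type value).
    penetrance           : probability that a carrier develops Rb (~90% per the abstract).
    """
    # The parent is unaffected, which lowers the probability that they are a carrier.
    p_carrier = (prior_carrier_parent * (1 - penetrance)) / (
        prior_carrier_parent * (1 - penetrance) + (1 - prior_carrier_parent))
    if haplotype_informative:
        if not shares_risk_haplotype:
            return 0.0                      # sibling did not inherit the at-risk haplotype
        return p_carrier * penetrance       # inherited it: at risk if the parent is a carrier
    return p_carrier * 0.5 * penetrance     # uninformative haplotypes: 50% transmission chance

# Illustrative call with a made-up prior.
print(sibling_recurrence_risk(prior_carrier_parent=0.01))
```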
Abstract:
Care for a patient with ulcerative colitis (UC) remains challenging despite the fact that morbidity and mortality rates have been considerably reduced during the last 30 years. The traditional management with intravenous corticosteroids was modified by the introduction of ciclosporin and infliximab. In this review, we focus on the treatment of patients with moderate to severe UC. Four typical clinical scenarios are defined and discussed in detail. The treatment recommendations are based on current literature, published guidelines and reviews, and were discussed at a consensus meeting of Swiss experts in the field. Comprehensive treatment algorithms were developed, aimed at daily clinical practice.
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed applications that run on top of P2P networks. Typical P2P applications are video streaming, file sharing, etc. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they operate. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while the user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm allowing us to obtain an approximated view of the system or of part of it. This approximated view includes the topology and the reliability of the components expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize the broadcast reliability. Here, the broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take into account the available memory at each process by limiting the view it has to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, which are based on a propagation overlay that tends towards the global tree overlay and adapts to some constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize the reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
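A hedged sketch of the core routing idea described above, not the thesis' actual protocols: a broadcast tree is built over an approximated view of the system so as to maximise path reliability (the product of link reliabilities) while respecting a per-node sending quota. The topology, link reliabilities and quotas are made up.

```python
# Illustrative maximum-reliability broadcast tree with a greedy fan-out quota.
import heapq

def max_reliability_tree(links, root, quota):
    """links: dict node -> {neighbour: link_reliability}; quota: max children per node.
    Returns (parent_of, reach_probability) for a broadcast started at `root`."""
    reach = {root: 1.0}
    parent = {root: None}
    children = {n: 0 for n in links}
    heap = [(-1.0, root)]
    while heap:
        neg_p, node = heapq.heappop(heap)
        if -neg_p < reach.get(node, 0.0):
            continue                              # stale queue entry
        for nxt, rel in links[node].items():
            if children[node] >= quota:
                break                             # sending quota of this node exhausted
            p = reach[node] * rel                 # reliability of the path via `node`
            if p > reach.get(nxt, 0.0):
                reach[nxt], parent[nxt] = p, node
                children[node] += 1               # quota applied greedily to tentative choices
                heapq.heappush(heap, (-p, nxt))
    return parent, reach

links = {
    "A": {"B": 0.99, "C": 0.90, "D": 0.60},
    "B": {"A": 0.99, "D": 0.95},
    "C": {"A": 0.90, "D": 0.80},
    "D": {"A": 0.60, "B": 0.95, "C": 0.80},
}
parent, reach = max_reliability_tree(links, root="A", quota=2)
print(parent)   # tree edges chosen
print(reach)    # estimated probability that each node receives the broadcast
```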