914 results for Saturated throughput


Relevance:

10.00%

Publisher:

Abstract:

The downlink scheduling problem in multi-queue multi-server systems under channel uncertainty is considered. Two policies that make allocations based on predicted channel states are proposed. The first is an extension of the well-known dynamic backpressure policy to the uncertain-channel case. The second is a variant that improves delay performance under light loads. The stability region of the system is characterised, and the first policy is argued to be throughput optimal. A recently proposed policy of Kar et al. [1] has lower complexity but is shown to be throughput suboptimal. Further, simulations demonstrate that both of our policies have better delay and backlog properties at light loads.
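
To make the allocation rule concrete, here is a minimal sketch of a backpressure-style allocation driven by predicted channel states. It illustrates only the general principle, not the authors' policy: the queue/server model, the rate predictions and all names are invented for the example.

```python
import random

# Backpressure-style allocation sketch for a multi-queue, multi-server downlink
# when only a channel *prediction* (expected service rate) is available.
def backpressure_allocate(queues, predicted_rates):
    """queues[q]: backlog of queue q; predicted_rates[s][q]: expected rate of server s to queue q."""
    allocation = {}
    for s, rates in enumerate(predicted_rates):
        # Weight each queue by backlog x predicted rate and serve the maximizer.
        weights = [queues[q] * rates[q] for q in range(len(queues))]
        best = max(range(len(queues)), key=lambda q: weights[q])
        allocation[s] = best if weights[best] > 0 else None  # idle if nothing to send
    return allocation

# Toy usage: 3 queues, 2 servers with random predicted rates.
queues = [5, 2, 7]
predicted_rates = [[random.random() for _ in queues] for _ in range(2)]
print(backpressure_allocate(queues, predicted_rates))
```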

Relevance:

10.00%

Publisher:

Abstract:

Background: Determination of the differential DNA methylation patterns of methylenetetrahydrofolate reductase (MTHFR) that are associated with differential MTHFR activity is important for understanding the pathogenesis of ischemic stroke. However, to date, no data are available on the differential DNA methylation profiles of Kelantanese Malays. We therefore developed a rapid and efficient serial pyrosequencing assay to determine the differential DNA methylation profiles of MTHFR, which helps to further our understanding of the pathogenesis of ischemic stroke. The developed assay also served as the validation platform for our previous computational epigenetic research on MTHFR. Methods: Polymerase chain reaction primers were designed and validated to specifically amplify the CpG (cytosine followed by guanine) A and B regions. Prior to epigenotyping 110 Kelantanese Malays, the serial pyrosequencing assays for the CpG A and B regions were validated using five validation controls. The mean values of the DNA methylation profiles of CpGs A and B were calculated. Results: The mean DNA methylation levels for CpGs A and B were 0.984 ± 0.582 and 2.456 ± 1.406, respectively. CpGs 8 and 20 showed the highest (5.581 ± 4.497) and the lowest (0.414 ± 2.814) levels of DNA methylation at single-base resolution. Conclusion: We have successfully developed and validated a pyrosequencing assay that is fast, yields high-quality pyrograms for DNA methylation analysis, and is therefore applicable to high-throughput studies. Using this newly developed assay, the MTHFR DNA methylation profiles of 110 Kelantanese Malays were successfully determined, and our computational epigenetic research on MTHFR was validated.

Relevance:

10.00%

Publisher:

Abstract:

Importance of the field: The shift in focus from ligand-based design approaches to target-based discovery over the last two to three decades has been a major milestone in drug discovery research. The field is currently witnessing another major paradigm shift, leaning towards holistic systems-based approaches rather than reductionist single-molecule-based methods. The effect of this new trend is likely to be felt strongly in terms of new strategies for therapeutic intervention, new targets (individually and in combination), and the design of specific and safer drugs. Computational modeling and simulation form important constituents of new-age biology because they are essential to comprehend the large-scale data generated by high-throughput experiments and to generate hypotheses, which are typically iterated with experimental validation. Areas covered in this review: This review focuses on the repertoire of systems-level computational approaches currently available for target identification. The review starts with a discussion on levels of abstraction of biological systems and describes the different modeling methodologies available for this purpose. It then focuses on how such modeling and simulation can be applied to drug target discovery. Finally, it discusses methods for studying other important issues, such as understanding targetability, identifying target combinations and predicting drug resistance, and considering them during the target identification stage itself. What the reader will gain: The reader will get an account of the various approaches for target discovery and the need for systems approaches, followed by an overview of the different modeling and simulation approaches that have been developed. An idea of the promise and limitations of the various approaches and perspectives for future development will also be obtained. Take home message: Systems thinking has now come of age, enabling a "bird's eye view" of the biological systems under study while allowing us to "zoom in", where necessary, for a detailed description of individual components. A number of different methods available for computational modeling and simulation of biological systems can be used effectively for drug target discovery.
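
As a deliberately tiny illustration of the kind of modeling and simulation discussed here, the sketch below integrates a two-species kinetic model and compares pathway output with and without inhibition of a hypothetical target enzyme. It is not taken from the review; all species, parameters and values are invented.

```python
from scipy.integrate import solve_ivp

# Toy kinetic model: substrate S is produced, converted to product P by a
# candidate target enzyme (rate k_cat), and both species are degraded.
def pathway(t, y, k_prod, k_cat, k_degS, k_degP):
    S, P = y
    dS = k_prod - (k_cat + k_degS) * S
    dP = k_cat * S - k_degP * P
    return [dS, dP]

def steady_product(k_cat):
    # Integrate long enough to reach (approximate) steady state.
    sol = solve_ivp(pathway, (0, 500), [0.0, 0.0], args=(1.0, k_cat, 0.5, 0.1), rtol=1e-8)
    return sol.y[1, -1]

baseline = steady_product(k_cat=1.0)    # normal target activity
knockdown = steady_product(k_cat=0.1)   # simulated 90% inhibition of the target
print(f"pathway output: baseline={baseline:.2f}, 90% target inhibition={knockdown:.2f}")
```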

Relevance:

10.00%

Publisher:

Abstract:

Lipid analysis is commonly performed by gas chromatography (GC) under laboratory conditions. Spectroscopic techniques, however, are non-destructive and can be implemented noninvasively in vivo. Excess fat (triglycerides) in visceral adipose tissue and liver is known to predispose to metabolic abnormalities, collectively known as the metabolic syndrome. Insulin resistance is the likely cause, with diets high in saturated fat known to impair insulin sensitivity. Tissue triglyceride composition has been used as a marker of dietary intake, but it can also be influenced by tissue-specific handling of fatty acids. Recent studies have shown that adipocyte insulin sensitivity correlates positively with their saturated fat content, contradicting the common view of dietary effects. A better understanding of the factors affecting tissue triglyceride composition is needed to provide further insights into tissue function in lipid metabolism. In this thesis, two spectroscopic techniques were developed for in vitro and in vivo analysis of tissue triglyceride composition. The in vitro studies (Study I) used infrared spectroscopy (FTIR), a fast and cost-effective analytical technique well suited to multivariate analysis. Infrared spectra are characterized by peak overlap, leading to poorly resolved absorbances and limited analytical performance. The in vivo studies (Studies II, III and IV) used proton magnetic resonance spectroscopy (1H-MRS), an established non-invasive clinical method for measuring metabolites in vivo. 1H-MRS has been limited in its ability to analyze triglyceride composition due to poorly resolved resonances. Using an attenuated total reflection accessory, we were able to obtain pure triglyceride infrared spectra from adipose tissue biopsies. Using multivariate curve resolution (MCR), we were able to resolve the overlapping double-bond absorbances of monounsaturated and polyunsaturated fat. MCR also resolved the isolated trans double bond and conjugated linoleic acids from an overlapping background absorbance. Using oil phantoms to study the effects of different fatty acid compositions on the echo time behaviour of triglycerides, it was concluded that the use of long echo times improved peak separation, with T2 weighting having a negligible impact. It was also discovered that the echo time behaviour of the methyl resonance of omega-3 fats differed from that of other fats due to characteristic J-coupling. This novel insight could be used to detect omega-3 fats in human adipose tissue in vivo at very long echo times (TE = 470 and 540 ms). A comparison of 1H-MRS of adipose tissue in vivo and GC of adipose tissue biopsies in humans showed that long-TE spectra resulted in improved peak fitting and better correlations with GC data. The study also showed that calculation of fatty acid fractions from 1H-MRS data is unreliable and should not be used. Omega-3 fatty acid content derived from long-TE in vivo spectra (TE = 540 ms) correlated with total omega-3 fatty acid concentration measured by GC. The long-TE protocol used for adipose tissue studies was subsequently extended to the analysis of liver fat composition. Respiratory triggering and a long TE resulted in spectra with the olefinic and tissue water resonances resolved. Conversion of the derived unsaturation to double-bond content per fatty acid showed that the results were in accordance with previously published gas chromatography data on liver fat composition.
In patients with metabolic syndrome, liver fat was found to be more saturated than subcutaneous or visceral adipose tissue fat. The higher saturation observed in liver fat may result from a higher rate of de novo lipogenesis in the liver than in adipose tissue. This thesis has introduced the first non-invasive method for determining adipose tissue omega-3 fatty acid content in humans in vivo. The methods introduced here have also shown that liver fat is more saturated than adipose tissue fat.
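
The following is a minimal sketch of multivariate curve resolution by alternating least squares on synthetic overlapping bands, illustrating the kind of decomposition used in Study I. It is not the thesis code: the band shapes, mixture concentrations and the simple clipping step used to enforce non-negativity are assumptions made for the example.

```python
import numpy as np

# Synthetic data: 20 "mixture spectra" built from two heavily overlapping Gaussian bands.
rng = np.random.default_rng(0)
x = np.linspace(0, 100, 500)
pure = np.vstack([np.exp(-(x - 45) ** 2 / 40),     # component 1
                  np.exp(-(x - 55) ** 2 / 40)])    # component 2 (overlaps component 1)
conc = rng.uniform(0.1, 1.0, size=(20, 2))         # invented mixture concentrations
D = conc @ pure + rng.normal(0, 0.005, (20, x.size))  # measured data matrix D = C S + noise

# MCR-ALS: alternate least-squares solves for spectra S and concentrations C,
# clipping negatives as a crude non-negativity constraint.
C = rng.uniform(size=(20, 2))
for _ in range(200):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)
    C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
print(f"relative residual after ALS: {residual:.3e}")
```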

Relevance:

10.00%

Publisher:

Abstract:

A loop heat pipe is a passive two-phase heat transport device that is gaining importance as part of spacecraft thermal control systems and also in other applications (such as avionics cooling and submarines). Hard fill of a loop heat pipe occurs when the compensation chamber is full of liquid. A theoretical study is undertaken to investigate the issues underlying the loop heat pipe hard-fill phenomenon. The results of the study suggest that the mass of charge and the presence of a bayonet have a significant impact on loop heat pipe operation. With a larger mass of charge, a loop heat pipe hard fills at a lower heat load. As the heat load increases, there is a steep rise in the loop heat pipe operating temperature. In a loop heat pipe with a saturated compensation chamber, and also in a hard-filled loop heat pipe without a bayonet, the temperature of the compensation chamber and that of the liquid core are nearly equal. When a loop heat pipe with a bayonet hard fills, the compensation chamber and evaporator core temperatures differ.

Relevance:

10.00%

Publisher:

Abstract:

We extend the modeling heuristic of Harsha et al. (2006, IEEE IWQoS '06, pp. 178-187) to evaluate the performance of an IEEE 802.11e infrastructure network carrying packet telephone calls, streaming video sessions and TCP-controlled file downloads, using Enhanced Distributed Channel Access (EDCA). We identify the time boundaries of activities on the channel (called channel slot boundaries) and derive a Markov Renewal Process (MRP) of the contending nodes on these epochs. This is achieved by using attempt probabilities of the contending nodes obtained from the saturation fixed-point analysis of Ramaiyan et al. (2005, Proceedings of ACM Sigmetrics '05; journal version accepted for publication in IEEE TON). Regenerative analysis on this MRP yields the desired steady-state performance measures. We then use the MRP model to develop an effective bandwidth approach for obtaining a bound on the size of the buffer required at the video queue of the AP, such that the streaming video packet loss probability is kept below 1%. The results obtained match well with simulations using the network simulator ns-2. We find that, with the default IEEE 802.11e EDCA parameters for access categories AC 1, AC 2 and AC 3, the voice call capacity decreases if even one streaming video session and one TCP file download are initiated by some wireless station. Subsequently, reducing the number of voice calls increases the video downlink stream throughput by 0.38 Mbps and the file download capacity by 0.14 Mbps for every voice call (for the 11 Mbps PHY). We find that a buffer size of 75 KB is sufficient to ensure that the video packet loss probability at the QAP is within 1%.
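
Once the Markov Renewal Process over channel slots is in hand, steady-state measures follow from the renewal-reward theorem, e.g. throughput = (Σ_i π_i b_i) / (Σ_i π_i τ_i) for slot types i with stationary probabilities π_i, mean durations τ_i and payloads b_i. The sketch below illustrates only this final step; the slot types and numbers are invented and are not the paper's model.

```python
# Renewal-reward throughput from per-slot-type statistics (illustrative values only).
slot_types = {
    # name: (stationary probability, mean duration in microseconds, payload bits carried)
    "idle":      (0.30,   20.0,     0),
    "collision": (0.10,  400.0,     0),
    "voice_tx":  (0.25,  350.0,  1600),
    "video_tx":  (0.20,  900.0, 12000),
    "tcp_tx":    (0.15, 1100.0, 12000),
}

mean_slot_us = sum(p * d for p, d, _ in slot_types.values())   # E[slot duration]
mean_bits = sum(p * b for p, _, b in slot_types.values())      # E[bits per slot]
throughput_mbps = mean_bits / mean_slot_us                     # bits per microsecond == Mbps
print(f"aggregate throughput ≈ {throughput_mbps:.2f} Mbps")
```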

Relevance:

10.00%

Publisher:

Abstract:

Bluetooth is a short-range radio technology operating in the unlicensed industrial-scientific-medical (ISM) band at 2.45 GHz. A piconet is basically a collection of slaves controlled by a master. A scatternet, on the other hand, is established by linking several piconets together in an ad hoc fashion to yield a global wireless ad hoc network. This paper proposes a scheduling policy that aims to achieve increased system throughput and reduced packet delays while providing reasonably good fairness among all traffic flows in Bluetooth piconets and scatternets. We propose a novel algorithm for scheduling slots to slaves in both piconets and scatternets using multi-layered parameterized policies. Our scheduling scheme works with real data and obtains an optimal feedback policy within prescribed parameterized classes of policies by using an efficient two-timescale simultaneous perturbation stochastic approximation (SPSA) algorithm. We show the convergence of our algorithm to an optimal multi-layered policy. We also propose novel polling schemes for intra- and inter-piconet scheduling that are seen to perform well. We present an extensive set of simulation results and performance comparisons with existing scheduling algorithms. Our results indicate that our proposed scheduling algorithm performs better overall, on a wide range of experiments, than the existing algorithms for both piconets (Das et al. in INFOCOM, pp. 591–600, 2001; Lapeyrie and Turletti in INFOCOM conference proceedings, San Francisco, US, 2003; Shreedhar and Varghese in SIGCOMM, pp. 231–242, 1995) and scatternets (Har-Shai et al. in OPNETWORK, 2002; Saha and Matsumot in AICT/ICIW, 2006; Tan and Guttag in The 27th annual IEEE conference on local computer networks (LCN), Tampa, 2002). Our studies also confirm that our proposed scheme achieves high throughput and low packet delays with reasonable fairness among all the connections.
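
For readers unfamiliar with SPSA, the sketch below shows the core of a (single-timescale) simultaneous perturbation gradient estimate on a toy cost function. The paper's algorithm is a two-timescale variant tailored to scheduling parameters; the cost function, step-size constants and parameter bounds used here are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_cost(theta):
    # Toy stand-in for a simulated mean delay as a function of scheduling parameters.
    return np.sum((theta - np.array([0.7, 0.2, 0.5])) ** 2) + 0.01 * rng.normal()

theta = np.array([0.1, 0.9, 0.3])
for n in range(1, 2001):
    a_n = 0.1 / n ** 0.602          # diminishing step size
    c_n = 0.1 / n ** 0.101          # diminishing perturbation size
    delta = rng.choice([-1.0, 1.0], size=theta.size)   # Bernoulli +/-1 perturbation
    # Two cost evaluations give a simultaneous-perturbation gradient estimate.
    g_hat = (noisy_cost(theta + c_n * delta) - noisy_cost(theta - c_n * delta)) / (2 * c_n * delta)
    theta = np.clip(theta - a_n * g_hat, 0.0, 1.0)      # keep parameters in [0, 1]

print("estimated optimal parameters:", np.round(theta, 3))
```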

Relevance:

10.00%

Publisher:

Abstract:

Run-time interoperability between different applications based on H.264/AVC is an emerging need in networked infotainment, where media delivery must match the desired resolution and quality of the end terminals. In this paper, we describe the architecture and design of a polymorphic ASIC to support this. The H.264 decoding flow is partitioned into modules such that the polymorphic ASIC meets the design goals of low power, low area, high flexibility, high throughput and fast interoperability between different profiles and levels of H.264. We demonstrate the idea with a multi-mode decoder that can decode baseline, main and high profile H.264 streams and can interoperate at run time across these profiles. The decoder is capable of processing frame sizes of up to 1024 × 768 at 30 fps. The design, synthesized with UMC 0.13 µm technology, occupies 250K gates and runs at 100 MHz.
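
As a rough sanity check of the stated throughput (not a figure from the paper), the cycle budget per macroblock implied by the quoted resolution, frame rate and clock can be computed as follows.

```python
# Back-of-the-envelope budget: cycles available per 16x16 macroblock at 1024x768, 30 fps, 100 MHz.
width, height, fps, clock_hz = 1024, 768, 30, 100_000_000
mb_per_frame = (width // 16) * (height // 16)
mb_per_second = mb_per_frame * fps
cycles_per_mb = clock_hz / mb_per_second
print(f"{mb_per_frame} macroblocks/frame, {mb_per_second} MB/s, "
      f"≈{cycles_per_mb:.0f} cycles available per macroblock")
```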

Relevance:

10.00%

Publisher:

Abstract:

Previous studies have shown that buffering packets in DRAM is a performance bottleneck. In order to understand the impediments in accessing the DRAM, we developed a detailed Petri net model of an IP forwarding application on the IXP2400 that models the different levels of the memory hierarchy. The cell-based interface used to receive and transmit packets in a network processor leads to some small-size DRAM accesses. Such narrow accesses to the DRAM expose the bank access latency, reducing the bandwidth that can be realized. With real traces, up to 30% of the accesses are smaller than the cell size, resulting in a 7.7% reduction in DRAM bandwidth. To overcome this problem, we propose buffering these small chunks of data in the on-chip scratchpad memory. This scheme also exploits a greater degree of parallelism between different levels of the memory hierarchy. Using real traces from the Internet, we show that the transmit rate can be improved by an average of 21% over the base scheme without the use of additional hardware. Further, the impact of different traffic patterns on the network processor resources is studied. Under real traffic conditions, we show that the data bus connecting the off-chip packet buffer to the micro-engines is the obstacle to achieving higher throughput.
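
The bandwidth penalty of sub-cell-size accesses can be illustrated with a simplified occupancy model in which every access, however small, consumes a full cell-sized burst. This is only a toy approximation (the paper's 7.7% figure comes from a detailed Petri net model including bank latencies); the cell size, peak bandwidth and trace below are invented.

```python
cell_bytes = 64          # assumed cell / burst granularity
peak_bw_gbps = 19.2      # assumed peak DRAM bandwidth (arbitrary for the example)

def effective_bandwidth(access_sizes_bytes):
    useful = sum(access_sizes_bytes)
    # Each access still ties up a whole number of cell-sized bursts on the channel.
    occupied = sum(cell_bytes * -(-size // cell_bytes) for size in access_sizes_bytes)
    return peak_bw_gbps * useful / occupied

trace = [64] * 70 + [20] * 30   # 30% of accesses smaller than the cell size
eff = effective_bandwidth(trace)
print(f"effective bandwidth ≈ {eff:.2f} Gb/s ({100 * (1 - eff / peak_bw_gbps):.1f}% loss)")
```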

Relevance:

10.00%

Publisher:

Abstract:

Bluetooth is an emerging standard for short-range, low-cost and low-power wireless networks. Its MAC is a generic polling-based protocol, where a central Bluetooth unit (master) determines channel access for all other nodes (slaves) in the network (piconet). An important problem in Bluetooth is the design of efficient scheduling protocols. This paper proposes a polling policy that aims to achieve increased system throughput and reduced packet delays while providing reasonably good fairness among all traffic flows in a Bluetooth piconet. We present an extensive set of simulation results and performance comparisons with two important existing algorithms. Our results indicate that our proposed scheduling algorithm outperforms the Round Robin scheduling algorithm by more than 40% in all cases tried. Our study also confirms that our proposed policy achieves higher throughput and lower packet delays with reasonable fairness among all the connections.

Relevance:

10.00%

Publisher:

Abstract:

Pricing is an effective tool to control congestion and achieve quality of service (QoS) provisioning for multiple differentiated levels of service. In this paper, we consider the problem of pricing for congestion control in the case of a network of nodes under a single service class and multiple queues, and present a multi-layered pricing scheme. We propose an algorithm for finding the optimal state-dependent price levels for the individual queues at each node. The pricing policy used depends on a weighted average queue length at each node. This helps in reducing frequent price variations and is in the spirit of the random early detection (RED) mechanism used in TCP/IP networks. Our numerical results show a considerable performance improvement with our scheme over a recently proposed related scheme, in terms of both throughput and delay. In particular, our approach exhibits a throughput improvement in the range of 34 to 69 percent over that scheme in all cases studied (over all routes).
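
A minimal sketch of the RED-style averaging that drives state-dependent price levels is shown below. It illustrates only the mechanism, not the paper's algorithm; the smoothing weight, thresholds and price levels are invented.

```python
WEIGHT = 0.02                          # smoothing weight, as in RED
THRESHOLDS = [10, 30, 60]              # average-queue boundaries between price levels
PRICE_LEVELS = [1.0, 1.5, 2.5, 4.0]    # one price per congestion band

avg_queue = 0.0

def price_for(instantaneous_queue):
    """Update the weighted average queue length and return the current price level."""
    global avg_queue
    avg_queue = (1 - WEIGHT) * avg_queue + WEIGHT * instantaneous_queue
    band = sum(avg_queue > t for t in THRESHOLDS)
    return PRICE_LEVELS[band]

for q in [5, 12, 40, 80, 80, 80, 20, 5]:   # toy queue-length samples
    p = price_for(q)
    print(f"queue={q:3d}  avg={avg_queue:6.2f}  price={p}")
```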

Relevance:

10.00%

Publisher:

Abstract:

Recent technical advances have enabled, for the first time, reliable in vitro culture of prostate cancer samples as prostate cancer organoids. This breakthrough opens up the significant possibility of high-throughput drug screening covering the spectrum of prostate cancer phenotypes seen clinically. These advances will enable precision medicine to become a reality, allowing patient samples to be screened for effective therapeutics ex vivo, with treatments tailored to the individual. This will hopefully lead to enhanced clinical outcomes, avoid morbidity due to ineffective therapies and improve the quality of life of men with advanced prostate cancer.

Relevance:

10.00%

Publisher:

Abstract:

Objective: We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Background: Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. Method: We examined threat detection in baggage screeners as a function of event rate (i.e., number of bags per minute) and time on task across 4 months. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Results: Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of these busy screening shifts only. Conclusion: Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Application: Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety.

Relevance:

10.00%

Publisher:

Abstract:

Pricing is an effective tool to control congestion and achieve quality of service (QoS) provisioning for multiple differentiated levels of service. In this paper, we consider the problem of pricing for congestion control in the case of a network of nodes under multiple service classes. Our work draws upon [1] and [2] in various ways. We use the Tirupati pricing scheme in conjunction with the stochastic approximation based adaptive pricing methodology for queue control (proposed in [1]) for minimizing network congestion. However, unlike the methodology of [1], where pricing for entire routes is directly considered, we consider prices for individual link-service grade tuples. Further, we adapt the methodology proposed in [2] for a single-node scenario to the case of a network of nodes, for evaluating performance in terms of price, revenue rate and disutility. We obtain considerable performance improvements using our approach over that in [1]. In particular, our approach exhibits a throughput improvement in the range of 54 to 80 percent in all cases studied (over all routes), while exhibiting a packet delay that is lower by 26 to 38 percent relative to the scheme in [1].
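
The flavour of a stochastic-approximation price update for a single link-service grade tuple is sketched below. It is not the scheme of [1] or [2]: the diminishing step-size rule, the queue-length target and the toy elastic demand model are assumptions made for the illustration.

```python
import random

TARGET_QUEUE = 20.0
price, queue = 1.0, 0.0

def arrivals(p):
    # Toy elastic demand: a higher price discourages traffic (assumption for the demo).
    return max(0.0, random.gauss(8.0 - 2.0 * p, 1.0))

for n in range(1, 5001):
    queue = max(0.0, queue + arrivals(price) - 5.0)          # service rate of 5 packets/step
    step = 1.0 / n ** 0.6                                     # diminishing step size
    price = max(0.0, price + step * (queue - TARGET_QUEUE))   # raise price when congested

print(f"adapted price ≈ {price:.2f}, final queue ≈ {queue:.1f}")
```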

Relevance:

10.00%

Publisher:

Abstract:

In the current era of high-throughput sequencing and structure determination, functional annotation has become a bottleneck in biomedical science. Here, we show that automated inference of molecular function using functional linkages among genes increases the accuracy of functional assignments by ≥8% and enriches functional descriptions in ≥34% of top assignments. Furthermore, the biochemical literature supports >80% of automated inferences for previously unannotated proteins. These results emphasize the benefit of incorporating functional linkages in protein annotation.
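
A toy "guilt-by-association" sketch of annotation transfer over a functional-linkage network is given below for illustration; it is not the authors' method, and the linkage graph and annotations are invented.

```python
from collections import Counter

linkages = {                        # toy functional-linkage network
    "geneA": ["geneB", "geneC", "geneD"],
    "geneE": ["geneB", "geneD"],
}
annotations = {                     # known molecular functions of some genes
    "geneB": "kinase",
    "geneC": "kinase",
    "geneD": "phosphatase",
}

def infer_function(gene):
    # Transfer the most common annotation among the gene's linked neighbours.
    votes = Counter(annotations[n] for n in linkages.get(gene, []) if n in annotations)
    return votes.most_common(1)[0][0] if votes else "unannotated"

for gene in ("geneA", "geneE"):
    print(gene, "->", infer_function(gene))
```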