964 results for Data Throughput


Relevance:

40.00%

Publisher:

Abstract:

This thesis studies several characteristics of multiplex networks; in particular, the analysis focuses on quantifying the differences between the layers of a multiplex. Dissimilarities are evaluated both by observing the connections of individual nodes in different layers and by estimating the different partitions of the layers. Several important measures for characterizing multiplexes are then introduced and used to construct community detection methods. The difference between the partitions of two layers is quantified using a mutual information measure. The use of the hypergeometric test for identifying over-represented nodes in a layer is also examined in depth, showing the effectiveness of the test as a function of layer similarity. These methods for characterizing the properties of multiplex networks are applied to real biological data. The data were collected by the DILGOM study with the goal of determining the genetic, transcriptomic, and metabolic implications of obesity and metabolic syndrome, and are used by the Mimomics project to determine relationships among different omics. In this thesis, the metabolic data are analyzed using a multiplex network approach to test for differences between the relationships of blood compounds in obese and normal-weight individuals.
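
For illustration, the hypergeometric test mentioned above can be sketched in a few lines of Python. This is our own minimal example (not the thesis code), treating a node's neighbours in one layer as "marked" elements drawn in the other layer; all names are illustrative.

```python
# Minimal sketch, not the thesis code: hypergeometric test for whether the
# neighbours a node has in one multiplex layer are over-represented among
# its neighbours in another layer.
from scipy.stats import hypergeom

def overlap_pvalue(n_nodes, deg_layer1, deg_layer2, shared):
    # P(X >= shared) when deg_layer2 neighbours are drawn uniformly from
    # n_nodes, of which deg_layer1 are the node's layer-1 neighbours.
    return hypergeom.sf(shared - 1, n_nodes, deg_layer1, deg_layer2)

# Example: 200 nodes, degree 20 in both layers, 12 shared neighbours
# (expected overlap under the null is only 2).
print(overlap_pvalue(200, 20, 20, 12))  # small p => significant overlap
```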

Relevance:

30.00%

Publisher:

Abstract:

Dedicated Short Range Communication (DSRC) is the emerging key technology supporting cooperative road safety systems within Intelligent Transportation Systems (ITS). The DSRC protocol stack includes a variety of standards, such as IEEE 802.11p and SAE J2735. The effectiveness of DSRC depends not only on the interoperable cooperation of these standards but also on the interoperability of DSRC devices from different manufacturers. To address the second constraint, the SAE defines a message set dictionary under the J2735 standard for the construction of device-independent messages. This paper focuses on the deficiencies of the SAE J2735 standard being developed for deployment in Vehicular Ad-hoc Networks (VANET). It discusses how a Basic Safety Message (BSM), the fundamental message type defined in SAE J2735, is constructed, sent, and received by safety communication platforms to provide a comprehensive device-independent solution for Cooperative ITS (C-ITS), giving insight into the technical knowledge behind the construction and exchange of BSMs within a VANET. A series of real-world DSRC data collection experiments was conducted. The results demonstrate that the reliability and throughput of DSRC depend strongly on the applications utilizing the medium. An active, application-dependent medium control measure, based on a novel message-dissemination frequency controller, is therefore introduced. This application-level message handler improves both the reliability of BSM transmission and reception and Application-layer error handling, which is vital to decentralized congestion control (DCC) mechanisms.
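
The paper's frequency controller itself is not reproduced here; as a rough, hypothetical sketch of the general idea, an application-level handler can throttle the BSM dissemination rate as the channel gets busy. All thresholds and rates below are illustrative assumptions.

```python
# Illustrative sketch only, not the paper's controller: back off the BSM
# rate when the measured channel busy ratio exceeds a target, and recover
# when the channel has headroom.
def next_bsm_rate_hz(channel_busy_ratio, current_rate_hz,
                     min_rate_hz=1.0, max_rate_hz=10.0,
                     target_busy=0.6, gain=5.0):
    adjusted = current_rate_hz - gain * (channel_busy_ratio - target_busy)
    return max(min_rate_hz, min(max_rate_hz, adjusted))

print(next_bsm_rate_hz(0.8, 10.0))  # congested channel -> lower rate (9.0)
print(next_bsm_rate_hz(0.2, 5.0))   # idle channel -> higher rate (7.0)
```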

Relevance:

30.00%

Publisher:

Abstract:

Post-transcriptional silencing of plant genes using anti-sense or co-suppression constructs usually results in only a modest proportion of silenced individuals. Recent work has demonstrated the potential for constructs encoding self-complementary 'hairpin' RNA (hpRNA) to silence genes efficiently. In this study we examine design rules for efficient gene silencing, in terms of both the proportion of independent transgenic plants showing silencing and the degree of silencing. hpRNA constructs containing sense/anti-sense arms ranging from 98 to 853 nt gave efficient silencing in a wide range of plant species, and inclusion of an intron in these constructs had a consistently enhancing effect. Intron-containing constructs (ihpRNA) generally gave 90-100% of independent transgenic plants showing silencing. The degree of silencing with these constructs was much greater than that obtained using either co-suppression or anti-sense constructs. We have made a generic vector, pHANNIBAL, that allows a simple, single PCR product from a gene of interest to be easily converted into a highly effective ihpRNA silencing construct. We have also created a high-throughput vector, pHELLSGATE, that should facilitate the cloning of gene libraries or large numbers of defined genes, such as those in EST collections, using an in vitro recombinase system. This system may facilitate the large-scale determination and discovery of plant gene functions in the same way that RNAi is being used to examine gene function in Caenorhabditis elegans.

Relevance:

30.00%

Publisher:

Abstract:

Engineered biphasic osteochondral tissues may have utility in cartilage defect repair. As bone-marrow-derived mesenchymal stem/stromal cells (MSC) have the capacity to form both bone-like and cartilage-like tissues, they are an ideal cell population for use in the manufacture of osteochondral tissues. Effective differentiation of MSC into bone-like and cartilage-like tissues requires two distinct medium formulations, which presents a challenge both in achieving initial MSC differentiation and in maintaining tissue stability when the unified osteochondral tissue is subsequently cultured in a single medium formulation. In this proof-of-principle study, we used an in-house fabricated microwell platform to manufacture thousands of micropellets, each formed from 166 MSC. We then characterized the development of bone-like and cartilage-like tissue in micropellets maintained for 8–14 days in sequential combinations of osteogenic or chondrogenic induction medium. When bone-like or cartilage-like micropellets were induced for only 8 days, they displayed significant phenotypic changes when the osteogenic or chondrogenic induction medium, respectively, was swapped. Based on these data, we developed an extended 14-day protocol for the pre-culture of bone-like and cartilage-like micropellets in their respective induction media. Unified osteochondral tissues were formed by layering 12,000 osteogenic micropellets and 12,000 chondrogenic micropellets into a biphasic structure; the assembled tissue was then cultured in chondrogenic induction medium for a further 8 days and characterized via histology. The micropellets amalgamated into a continuous structure with distinct bone-like and cartilage-like regions. This proof-of-concept study demonstrates the feasibility of micropellet assembly for the formation of osteochondral-like tissues for possible use in osteochondral defect repair.

Relevance:

30.00%

Publisher:

Abstract:

Background: With advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process, leading to increased throughput. To handle the large amounts of genotypic data generated efficiently and to help with quality control, there is a strong need for a software system that supports sample tracking and the capture and management of data at each step of the process. Such systems, while managing the workflow precisely, also encourage good laboratory practice by standardizing protocols and by recording and annotating data from every step of the workflow. Results: A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high-throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. It leads the user through each step of the process, from starting an experiment to storing the output of the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and that all the necessary data are captured. The application keeps track of DNA samples and generated data; data entry into the system is through forms for file uploads. The LIMS provides functions to trace any genotypic datum back to its electrophoresis gel file or sample source and to repeat experiments. It is presently used to capture high-throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusions: A laboratory information management system is available that has proved useful for managing microsatellite genotype data in a moderately high-throughput genotyping laboratory. The application, with source code, is freely available for academic users and can be downloaded from http://www.icrisat.org/bt-software-d-lims.htm
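
As a loose illustration of the traceability requirement (not the actual ICRISAT schema; all field names are invented), a minimal data model might link every auto-binned allele call back to its gel file and source DNA sample:

```python
# Hypothetical sketch of a LIMS-style data model in which each genotype
# call stays traceable to the electrophoresis gel and the source sample.
from dataclasses import dataclass

@dataclass
class DnaSample:
    sample_id: str
    source: str            # e.g. field plot or accession identifier

@dataclass
class GelRun:
    gel_file: str          # path to the electrophoresis gel image
    samples: list          # DnaSample objects loaded on this gel

@dataclass
class AlleleCall:
    sample: DnaSample
    marker: str            # e.g. an SSR marker name
    allele_bin: int        # auto-binned allele size
    gel: GelRun            # traceability link back to the raw data

call = AlleleCall(DnaSample("S001", "chickpea-plot-17"),
                  "SSR-042", 245,
                  GelRun("gels/run_042.tif", []))
print(call.gel.gel_file)   # trace any genotype back to its gel file
```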

Relevance:

30.00%

Publisher:

Abstract:

Background: Sorghum genome mapping based on DNA markers began in the early 1990s, and numerous genetic linkage maps of sorghum have been published in the last decade, based initially on RFLP markers, with more recent maps including AFLPs and SSRs and, very recently, Diversity Array Technology (DArT) markers. It is essential to integrate the rapidly growing body of genetic linkage data produced through DArT with the multiple genetic linkage maps for sorghum generated through other marker technologies. Here, we report on the colinearity of six independent sorghum component maps and on the integration of these component maps into a single reference resource that contains commonly utilized SSRs, AFLPs, and high-throughput DArT markers. Results: The six component maps were constructed using the MultiPoint software. The lengths of the resulting maps varied between 910 and 1528 cM. The order of the 498 markers that segregated in more than one population was highly consistent between the six individual mapping data sets. The framework consensus map was constructed using a "Neighbours" approach and contained 251 integrated bridge markers on the 10 sorghum chromosomes, spanning 1355.4 cM with an average density of one marker every 5.4 cM; these bridge markers were used for the projection of the remaining markers. In total, the sorghum consensus map consisted of 1997 markers mapped to 2029 unique loci (1190 DArT loci and 839 other loci), spanning 1603.5 cM with an average marker density of one marker per 0.79 cM. In addition, 35 multicopy markers were identified. On average, each chromosome on the consensus map contained 203 markers, of which 58.6% were DArT markers. Non-random patterns of DNA marker distribution were observed, with some clear marker-dense regions and some marker-poor regions. Conclusion: The final consensus map has allowed us to map a larger number of markers than is possible in any individual map, to obtain more complete coverage of the sorghum genome, and to fill a number of gaps on individual maps. In addition to the overall consistency of marker order across individual component maps, good agreement in overall distances between common marker pairs across the component maps was determined using a difference-ratio calculation. The consensus map can be used as a reference resource for genetic studies in different genetic backgrounds, in addition to providing a framework for transferring genetic information between different marker technologies and for integrating DArT markers with other genomic resources. DArT markers represent an affordable, high-throughput marker system with great utility in molecular breeding programs, especially in crops such as sorghum for which SNP arrays are not publicly available.
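
The reported marker densities follow directly from the quoted map lengths and marker counts; a quick arithmetic check:

```python
# Arithmetic check of the densities quoted above.
print(round(1355.4 / 251, 1))    # 5.4  -> one bridge marker every 5.4 cM
print(round(1603.5 / 2029, 2))   # 0.79 -> one locus every 0.79 cM
```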

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: To rapidly and efficiently screen potential biofuel feedstock candidates for quintessential traits, robust high-throughput analytical techniques must be developed and honed. Traditional methods of measuring the lignin syringyl/guaiacyl (S/G) ratio can be laborious, involve hazardous reagents, and/or be destructive. Vibrational spectroscopy can furnish high-throughput instrumentation without these limitations. Spectral data from mid-infrared, near-infrared, and Raman spectroscopy were combined with S/G ratios, obtained using pyrolysis molecular beam mass spectrometry, from 245 different eucalypt and Acacia trees across 17 species. Iterations of spectral processing allowed the assembly of robust predictive models using partial least squares (PLS). RESULTS: The PLS models were rigorously evaluated using three different randomly generated calibration and validation sets for each spectral processing approach. Root mean square errors of prediction for the validation sets were lowest for models built from Raman (0.13 to 0.16) and mid-infrared (0.13 to 0.15) spectral data, while near-infrared spectroscopy led to more erroneous predictions (0.18 to 0.21). Correlation coefficients (r) for the validation sets followed a similar pattern: Raman (0.89 to 0.91), mid-infrared (0.87 to 0.91), and near-infrared (0.79 to 0.82). These statistics indicate that Raman and mid-infrared spectroscopy led to the most accurate predictions of the S/G ratio in a diverse set of feedstocks. CONCLUSION: Eucalypts present an attractive option for biofuel and biochemical production. Given the assortment of over 900 different species of Eucalyptus and Corymbia, in addition to various species of Acacia, it is necessary to isolate those possessing ideal biofuel traits. This research demonstrates the validity of vibrational spectroscopy for efficiently partitioning potential biofuel feedstocks according to lignin S/G ratio, significantly reducing experiment and analysis time and expense while providing non-destructive, accurate, global predictive models encompassing a diverse array of feedstocks.
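
A PLS calibration of this kind is straightforward to sketch. The following is a generic example with synthetic stand-in data (not the authors' pipeline or spectral preprocessing), showing how RMSEP and r on a held-out validation set are obtained:

```python
# Minimal PLS calibration sketch; X and y below are synthetic stand-ins
# for preprocessed spectra and pyMBMS-measured S/G ratios.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
X = rng.normal(size=(245, 1200))        # 245 spectra, 1200 wavenumber bins
y = rng.uniform(1.5, 3.5, 245)          # S/G reference values

train, test = np.arange(180), np.arange(180, 245)
pls = PLSRegression(n_components=8).fit(X[train], y[train])
pred = pls.predict(X[test]).ravel()

rmsep = np.sqrt(np.mean((pred - y[test]) ** 2))  # validation-set RMSEP
r, _ = pearsonr(pred, y[test])                   # validation-set r
print(f"RMSEP={rmsep:.2f}, r={r:.2f}")
```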

Relevance:

30.00%

Publisher:

Abstract:

High-throughput techniques are necessary to efficiently screen potential lignocellulosic feedstocks for the production of renewable fuels, chemicals, and bio-based materials, reducing experimental time and expense while supplanting tedious, destructive methods. The ratio of lignin syringyl (S) to guaiacyl (G) monomers has been routinely quantified as a way to probe biomass recalcitrance. Mid-infrared and Raman spectroscopy have been shown to produce robust partial least squares models for the prediction of lignin S/G ratios in a diverse group of Acacia and eucalypt trees. The most accurate Raman model has now been used to predict the S/G ratio of 269 unknown Acacia and eucalypt feedstocks. This study demonstrates the application of a partial least squares model, built from Raman spectral data and lignin S/G ratios measured using pyrolysis/molecular beam mass spectrometry (pyMBMS), to the prediction of S/G ratios in an unknown data set. The predicted S/G ratios calculated by the model were averaged by plant species, and the means were not found to differ from the pyMBMS ratios within the 95% confidence interval. Pairwise comparisons within each data set were employed to assess statistical differences between biomass species. While some pairwise comparisons failed to differentiate between species, acacias in both data sets clearly displayed significant differences in S/G composition that distinguish them from eucalypts. This research shows the power of Raman spectroscopy to supplant tedious, destructive methods for evaluating the lignin S/G ratio of diverse plant biomass materials.
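
One common way to make the species-level comparison described above is a two-sample test of the predicted versus measured means; the following sketch uses synthetic numbers (not the study's data, and not necessarily the authors' exact statistical procedure):

```python
# Illustrative sketch: test whether the model-predicted mean S/G ratio of
# one species differs from its pyMBMS-measured mean at the 95% level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
predicted = rng.normal(2.4, 0.15, 20)   # Raman-PLS predictions, one species
measured  = rng.normal(2.4, 0.15, 20)   # pyMBMS reference values

t, p = stats.ttest_ind(predicted, measured)
print(f"p={p:.2f}")  # p > 0.05 -> means indistinguishable at 95% confidence
```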

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a highly sensitive genome-wide search method for recessive mutations. The method is suitable for distantly related samples divided into phenotype positives and negatives. High-throughput genotyping arrays are used to identify and compare homozygous regions between the cohorts. The method is demonstrated by comparing colorectal cancer patients against unaffected references, with the objective of finding homozygous regions and alleles that are more common in cancer patients. We have designed and implemented software tools to automate the data analysis from genotypes to lists of candidate genes and their properties. The programs follow a pipeline architecture that allows their integration with other programs, such as biological databases and copy-number analysis tools. This integration is crucial, as genome-wide analysis of cohort differences produces many candidate regions unrelated to the studied phenotype. CohortComparator is a genotype comparison tool that detects homozygous regions and compares their loci and allele constitutions between two sets of samples. The data are visualised in chromosome-specific graphs illustrating the homozygous regions and alleles of each sample. Genomic regions that may harbour recessive mutations are emphasised with different colours, and a scoring scheme is given for these regions. The detection of homozygous regions, the cohort comparisons, and the result annotations all rest on assumptions, many of which have been parameterized in our programs; the effects of these parameters and the appropriate scope of the methods have been evaluated. Samples with different resolutions can be balanced using genotype estimates of their haplotypes, allowing them to be used within the same study.
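
As a minimal sketch of the core operation (not CohortComparator itself; genotype encoding and the length threshold are illustrative), a run of homozygous calls in an ordered sequence of biallelic genotypes can be found with a simple scan:

```python
# Minimal sketch: detect runs of homozygous genotype calls ('AA'/'BB')
# of at least min_len consecutive markers; 'AB' is heterozygous.
def homozygous_regions(calls, min_len=50):
    """Yield (start, end) index pairs of homozygous runs >= min_len."""
    start = None
    for i, g in enumerate(calls + ["AB"]):      # sentinel closes the last run
        if g in ("AA", "BB"):
            start = i if start is None else start
        else:
            if start is not None and i - start >= min_len:
                yield (start, i)
            start = None

calls = ["AA"] * 60 + ["AB"] + ["BB"] * 40
print(list(homozygous_regions(calls, min_len=50)))  # [(0, 60)]
```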

Relevance:

30.00%

Publisher:

Abstract:

We compute the throughput obtained by a TCP connection in a UMTS environment. When data are downloaded to a mobile terminal, the packets of each TCP connection are stored in separate queues at the base station (node B). Moreover, fragmentation of TCP packets into Protocol Data Units (PDUs) and link-layer retransmission of PDUs can introduce significant delays at the node B queue. In such a scenario the existing models of TCP may not be sufficient. We therefore provide a new approximate TCP model and obtain new closed-form expressions for the mean window size. Using these, we obtain the throughput of a TCP connection, which matches simulations quite well.
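
The paper's closed-form expressions are not reproduced here; for context, the classical square-root approximation below is the style of existing model that node-B queueing and PDU retransmissions render insufficient:

```latex
% Classical approximation for TCP throughput (context only; the paper
% derives refined expressions accounting for node-B queueing and PDU
% retransmissions): p = packet loss probability, RTT = round-trip time.
B \approx \frac{1}{\mathit{RTT}} \sqrt{\frac{3}{2p}}
```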

Relevance:

30.00%

Publisher:

Abstract:

A link failure in the path of a virtual circuit in a packet data network will lead to premature disconnection of the circuit by the end-points, while a soft failure will result in degraded throughput over the virtual circuit. If these failures can be detected quickly and reliably, then appropriate rerouting strategies can automatically reroute the virtual circuits that use the failed facility. In this paper, we develop a methodology for analysing and designing failure detection schemes for digital facilities. Based on errored-second data, we develop a Markov model for the error and failure behaviour of a T1 trunk. The performance of a detection scheme is characterized by its false alarm probability and its detection delay. Using the Markov model, we analyse the performance of detection schemes that use physical-layer or link-layer information. The schemes rely on detecting the occurrence of severely errored seconds (SESs): a failure is declared when a counter, driven by the occurrence of SESs, reaches a certain threshold. For hard failures, the design problem reduces to a proper choice of the threshold at which failure is declared and of the connection-reattempt parameters of the virtual-circuit end-point session recovery procedures. For soft failures, the performance of a detection scheme depends, in addition, on how long and how frequent the error bursts are in a given failure mode. We also propose and analyse a novel Level 2 detection scheme that relies only on anomalies observable at Level 2, i.e. CRC failures and idle-fill flag errors. Our results suggest that Level 2 schemes that perform as well as Level 1 schemes are possible.
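
The counter-based detector described above has a simple shape; the following is an illustrative sketch with hypothetical parameter values (the paper designs the threshold and reattempt parameters from the Markov error model):

```python
# Illustrative sketch of an SES-driven failure detector: the counter
# increments on each severely errored second, leaks otherwise, and a
# failure is declared when it reaches the threshold.
def detect_failure(ses_per_second, threshold=8, leak=1):
    counter = 0
    for t, is_ses in enumerate(ses_per_second):
        counter = counter + 1 if is_ses else max(0, counter - leak)
        if counter >= threshold:
            return t          # time of declaration, i.e. detection delay
    return None               # no failure declared

burst = [False] * 5 + [True] * 10 + [False] * 5
print(detect_failure(burst))  # declares failure 8 SESs into the burst -> 12
```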

Relevance:

30.00%

Publisher:

Abstract:

We consider a system comprising a finite number of nodes, with infinite packet buffers, that use unslotted ALOHA with Code Division Multiple Access (CDMA) to share a channel for transmitting packetised data. We propose a simple model for packet transmission and retransmission at each node, and show that the saturation throughput in this model yields a sufficient condition for the stability of the packet buffers; we interpret this as the capacity of the access method. We calculate and compare the capacities of CDMA-ALOHA (with and without code sharing) and TDMA-ALOHA; we also consider carrier-sensing and collision-detection versions of these protocols. In each case, the saturation throughput can be obtained via analysis of a continuous-time Markov chain. Our results show how the saturation throughput degrades with code sharing. Finally, we present some simulation results for mean packet delay. Our work is motivated by optical CDMA, in which "chips" can be generated optically, and hence the achievable chip rate can exceed the achievable TDMA bit rate, which is limited by electronics. Code sharing may be useful in the optical CDMA context as it reduces the number of optical correlators at the receivers. Our throughput results help to quantify by how much the CDMA chip rate should exceed the TDMA bit rate for CDMA-ALOHA to yield better capacity than TDMA-ALOHA.
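
The saturation-throughput computation reduces to finding the stationary distribution of a continuous-time Markov chain; a generic numerical sketch (with a toy 3-state generator, not the paper's chain) is:

```python
# Generic sketch: solve pi Q = 0 with sum(pi) = 1 for a small CTMC.
# The generator Q below is a toy 3-state example (rows sum to zero).
import numpy as np

Q = np.array([[-2.0,  1.5,  0.5],
              [ 1.0, -3.0,  2.0],
              [ 0.5,  0.5, -1.0]])

# Stack the balance equations (Q^T pi = 0) with the normalization row.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # stationary distribution; throughput = sum over states of
           # pi[s] * successful-transmission rate in state s
```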

Relevance:

30.00%

Publisher:

Abstract:

A wireless Energy Harvesting Sensor (EHS) needs to send data packets arriving in its queue over a fading channel at the maximum possible throughput while ensuring acceptable packet delays. At the same time, it needs to ensure that energy neutrality is satisfied, i.e., the average energy drawn from the battery should equal the energy deposited in it minus the energy lost to battery inefficiency. In this work, a framework is developed under which a system designer can optimize the performance of an EHS node using power control based on the current channel state information, when the node employs a single modulation and coding scheme and the channel is Rayleigh fading. Optimal system parameters are derived for throughput-optimal, delay-optimal, and delay-constrained throughput-optimal policies that ensure energy neutrality. It is seen that a throughput-optimal (maximal) policy has unbounded packet delay, while a delay-optimal (minimal) policy achieves negligibly small throughput. Finally, the influence of the harvested energy profile on the performance of the EHS is illustrated through the example of solar energy harvesting.
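
As a rough illustration of the energy-neutrality constraint (not the paper's analytical solution; the policy, efficiency value, and rates below are hypothetical), any channel-state-based power rule can be scaled so that the average power drawn matches the harvested power net of battery losses:

```python
# Illustrative Monte Carlo sketch: enforce energy neutrality for a
# truncated channel-inversion power rule over Rayleigh fading.
import numpy as np

rng = np.random.default_rng(2)
gains = rng.exponential(1.0, 100_000)     # Rayleigh fading -> exp. power gain
harvest_rate, eta = 1.0, 0.8              # mean harvested power, battery eff.

threshold = 0.5                           # transmit only on good channels
power = np.where(gains > threshold, 1.0 / gains, 0.0)
power *= eta * harvest_rate / power.mean()  # mean drawn = eta * harvested

print(f"mean drawn power = {power.mean():.3f} (target {eta * harvest_rate})")
```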

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a new method of data handling for web servers, which we call Network Aware Buffering and Caching (NABC). NABC reduces data copies in the web server's data-sending path by doing three things: (1) laying out the data in main memory so that protocol processing can be done without data copies, (2) keeping a unified cache of data in the kernel and ensuring safe access to it by various processes and the kernel, and (3) passing only the necessary metadata between processes, so that the bulk-data handling time spent during IPC is reduced. We realize NABC by implementing a set of system calls and a user library; the end product is a set of APIs specifically designed for use by web servers. We port an in-house web server, SWEET, to the NABC APIs and evaluate performance using a range of simulated and real workloads. The results show a gain of 12% to 21% in throughput for static file serving and a 1.6- to 4-fold gain in throughput for lightweight dynamic content serving for a server using the NABC APIs over one using the standard UNIX APIs.
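
NABC's system calls are custom, but the copy-reduction idea can be illustrated with the standard POSIX sendfile(), which moves file data to a socket inside the kernel without bouncing it through user space. A minimal sketch (socket setup omitted; this is an analogue of the idea, not the NABC API):

```python
# Kernel-to-kernel file transfer with os.sendfile(): no user-space copy.
import os
import socket

def serve_static(client_sock: socket.socket, path: str) -> None:
    with open(path, "rb") as f:
        size = os.fstat(f.fileno()).st_size
        sent = 0
        while sent < size:
            n = os.sendfile(client_sock.fileno(), f.fileno(), sent,
                            size - sent)
            if n == 0:        # safeguard against a truncated file
                break
            sent += n
```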

Relevance:

30.00%

Publisher:

Abstract:

The poor performance of TCP over multi-hop wireless networks is well known. In this paper we explore to what extent network coding can help to improve the throughput of TCP-controlled bulk transfers over a chain-topology multi-hop wireless network. The nodes use a CSMA/CA mechanism, such as the IEEE 802.11 DCF, to perform distributed packet scheduling, and the reverse-flowing TCP ACKs are XORed with forward-flowing TCP data packets. We find that, without any modification to the MAC protocol, the gain from network coding is negligible: the inherent coordination problem of carrier-sensing-based random access in multi-hop wireless networks dominates the performance. We provide a theoretical analysis that yields a throughput bound with network coding. We then propose a distributed modification of the IEEE 802.11 DCF, based on tuning the back-off mechanism using a feedback approach. Simulation studies show that the proposed mechanism, when combined with network coding, improves the performance of a TCP session by more than 100%.
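
The XOR coding step itself is simple to illustrate; a minimal sketch (ours, not the paper's implementation, with naive padding to equal length):

```python
# XOR network coding at a relay: a forward TCP data packet and a reverse
# TCP ACK are combined into one transmission; a node already holding one
# of the two recovers the other with a second XOR.
def xor_packets(a: bytes, b: bytes) -> bytes:
    n = max(len(a), len(b))
    a, b = a.ljust(n, b"\0"), b.ljust(n, b"\0")
    return bytes(x ^ y for x, y in zip(a, b))

data, ack = b"TCP-DATA-SEGMENT", b"TCP-ACK"
coded = xor_packets(data, ack)       # one transmission instead of two
print(xor_packets(coded, ack))       # a node holding the ACK recovers the data
```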