810 results for algorithm design and analysis
Abstract:
Distributed network utility maximization (NUM) is receiving increasing interest for cross-layer optimization problems in multihop wireless networks. Traditional distributed NUM algorithms rely heavily on feedback information exchanged between network elements, such as traffic sources and routers. Because of distinctive features of multihop wireless networks, such as time-varying channels and dynamic network topology, this feedback information is usually inaccurate, which is a major obstacle to applying distributed NUM to wireless networks. The questions to be answered include whether a distributed NUM algorithm can converge with inaccurate feedback and how to design an effective distributed NUM algorithm for wireless networks. In this paper, we first use the infinitesimal perturbation analysis technique to provide an unbiased gradient estimate of the aggregate rate of traffic sources at the routers based on locally available information. On that basis, we propose a stochastic approximation algorithm to solve the distributed NUM problem with inaccurate feedback. We then prove that, under certain conditions, the proposed algorithm converges to the optimal solution of the distributed NUM problem with perfect feedback. The proposed algorithm is applied to the joint rate and medium access control problem for wireless networks. Numerical results demonstrate the convergence of the proposed algorithm. © 2013 John Wiley & Sons, Ltd.
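As a rough illustration of the idea behind this abstract, the sketch below runs a dual-based NUM iteration in which each link updates its price from a noisy but unbiased estimate of its aggregate rate, using Robbins-Monro diminishing step sizes. The topology, log utilities, and noise model are illustrative assumptions, not the paper's algorithm or its IPA estimator.

```python
# Minimal sketch (assumed setup, not the paper's method): stochastic-approximation
# price updates for NUM with noisy rate feedback.
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[1, 1, 0],      # routing matrix: R[l, s] = 1 if source s crosses link l
              [0, 1, 1]])
c = np.array([1.0, 2.0])      # link capacities
n_links, n_sources = R.shape
price = np.ones(n_links)      # dual variables (link prices)

for t in range(1, 5001):
    # Source update: for U_s(x) = log(x), the rate maximizing U_s(x) - x * q_s is 1/q_s,
    # where q_s is the sum of prices along the source's path.
    q = R.T @ price
    x = 1.0 / np.maximum(q, 1e-6)

    # Each link sees only a noisy, zero-mean-error estimate of its aggregate rate,
    # standing in for the unbiased gradient estimate described in the abstract.
    agg_rate_est = R @ x + rng.normal(0.0, 0.05, size=n_links)

    # Robbins-Monro step sizes: sum(alpha) = inf, sum(alpha^2) < inf.
    alpha = 1.0 / t
    price = np.maximum(price + alpha * (agg_rate_est - c), 0.0)

print("final rates:", np.round(x, 3), " link loads:", np.round(R @ x, 3))
```

With diminishing step sizes the price iterates average out the feedback noise, which is the intuition behind convergence results of this kind.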
Abstract:
SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g., gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on chromatin immunoprecipitation followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and previously unknown artifacts of the method. The distribution of sequence tags in the genome is not uniform, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some important biological properties of Nuclear Factor I (NFI) DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
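The thesis's own sampling procedure is not specified in this abstract; as a generic illustration of drawing an unbiased (uniform) subsample of sequence tags from a large dataset in one pass, a reservoir-sampling sketch is shown below. The file name and tag format are hypothetical.

```python
# Illustrative sketch only: uniform random subsampling of tags via reservoir sampling.
import random

def reservoir_sample(tag_iter, k, seed=42):
    """Return k tags drawn uniformly at random from an iterable of tags."""
    rng = random.Random(seed)
    reservoir = []
    for i, tag in enumerate(tag_iter):
        if i < k:
            reservoir.append(tag)
        else:
            j = rng.randint(0, i)      # each seen tag is kept with probability k/(i+1)
            if j < k:
                reservoir[j] = tag
    return reservoir

# Example use on a (hypothetical) BED-like tag file:
# with open("chipseq_tags.bed") as f:
#     subsample = reservoir_sample(f, 1000)
```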
Abstract:
In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from random assignment of individual units may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Applying traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates; however, such estimators are often inefficient compared with methods that incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993). Multilevel models, also known as random effects or random components models, can be used to account for the clustering of data by estimating higher level (group) as well as lower level (individual) variation. Designing a study in which the unit of observation is nested within higher level groupings requires determining sample sizes at each level. This study investigates the design and analysis of various sampling strategies for a 3-level repeated measures design and their effect on the parameter estimates when the outcome variable of interest follows a Poisson distribution. Results of this study suggest that second-order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first-order PQL and then second- and first-order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level-2 and level-3 variation is less than 0.10. However, as the higher level error variance increases, the MQL estimates become increasingly biased. If convergence of the estimation algorithm is not obtained by the PQL procedure and the higher level error variance is large, the estimates may be significantly biased; in this case, bias correction techniques such as bootstrapping should be considered as an alternative procedure. For larger sample sizes, structures with 20 or more units sampled at the levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level-1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; however, this criterion is no longer important when sample sizes are large. Reference: Neuhaus J (1993). "Estimation Efficiency and Tests of Covariate Effects with Clustered Binary Data." Biometrics, 49, 989–996.
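To make the 3-level design concrete, the sketch below simulates repeated Poisson counts (level 1) nested in subjects (level 2) nested in clusters (level 3), with normally distributed random intercepts at levels 2 and 3 and a log link. The sample sizes, fixed effects, and variance components are arbitrary assumptions, not the study's values.

```python
# Illustrative simulation of a 3-level Poisson random-intercept design.
import numpy as np

rng = np.random.default_rng(1)
n3, n2, n1 = 20, 10, 5          # clusters, subjects per cluster, occasions per subject
beta0, beta1 = 0.5, 0.3         # fixed intercept and slope (e.g., time trend)
sigma3, sigma2 = 0.3, 0.3       # SDs of level-3 and level-2 random intercepts

rows = []
for k in range(n3):
    u3 = rng.normal(0.0, sigma3)
    for j in range(n2):
        u2 = rng.normal(0.0, sigma2)
        for t in range(n1):
            mu = np.exp(beta0 + beta1 * t + u3 + u2)   # conditional Poisson mean
            rows.append((k, j, t, rng.poisson(mu)))

print(rows[:5])  # (cluster, subject, occasion, count) ready for a multilevel Poisson fit
```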
Abstract:
The purpose of this study was to investigate the effect of cement paste quality on concrete performance, particularly fresh properties, by changing the water-to-cementitious materials ratio (w/cm), the type and dosage of supplementary cementitious materials (SCMs), and the air-void system in binary and ternary mixtures. In this experimental program, a matrix of 54 mixtures was prepared with w/cm of 0.40 and 0.45; target air contents of 2%, 4%, and 8%; a fixed cementitious content of 600 pounds per cubic yard (pcy); and three types of SCMs incorporated at different dosages. The fine aggregate-to-total aggregate ratio was fixed at 0.42. Workability, rheology, air-void system, setting time, strength, Wenner probe surface resistivity, and shrinkage were determined. The effects of the paste variables on workability are more marked at the higher w/cm. Compressive strength is strongly influenced by paste quality, dominated by w/cm and air content. Surface resistivity is improved by the inclusion of Class F fly ash and slag cement, especially at later ages. Ternary mixtures performed in accordance with their ingredients. The data collected will be used to develop models that will be part of an innovative mix proportioning procedure.
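For context on how the quoted mixture parameters translate into a paste volume, here is a back-of-the-envelope absolute-volume check (not taken from the report). It assumes a specific gravity of about 3.0 for the blended cementitious materials and counts entrained air with the paste; both assumptions are illustrative.

```python
# Rough absolute-volume estimate of the paste-plus-air fraction for one mixture.
CEMENTITIOUS = 600.0     # lb per cubic yard (pcy), fixed in the study
W_CM = 0.40              # water-to-cementitious materials ratio
AIR = 0.04               # target air content (4%)
SG_CM = 3.0              # assumed specific gravity of the cementitious blend
UNIT_WT_WATER = 62.4     # lb/ft^3
CY = 27.0                # ft^3 per cubic yard

vol_cm = CEMENTITIOUS / (SG_CM * UNIT_WT_WATER)     # ~3.2 ft^3
vol_water = (CEMENTITIOUS * W_CM) / UNIT_WT_WATER   # ~3.8 ft^3
vol_air = AIR * CY                                  # ~1.1 ft^3
paste_fraction = (vol_cm + vol_water + vol_air) / CY
print(f"paste + air volume fraction is roughly {paste_fraction:.0%}")
```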
Abstract:
This guide specification and commentary for concrete pavements presents current state-of-the-art thinking with respect to materials and mixture selection, proportioning, and acceptance. The document takes into account the different environments, practices, and materials in use across the United States and allows optional inputs for local application. The following concrete pavement types are considered: jointed plain concrete pavement, the most commonly used pavement type, which may be doweled or non-doweled at transverse joints; and continuously reinforced concrete pavement, typically constructed without transverse joints and typically used for locations with high truck traffic loads and/or poor support conditions.
Abstract:
A guide specification and commentary have been prepared that lay out current state-of-the-art thinking with respect to materials and mixture selection, proportioning, and acceptance. These documents take into account the different environments, practices, and materials in use across the US and allow optional inputs for local application.
Abstract:
For years, specifications have focused on the water-to-cementitious materials ratio (w/cm) and strength of concrete, despite the majority of the volume of a concrete mixture consisting of aggregate. An aggregate distribution of roughly 60% coarse aggregate and 40% fine aggregate, regardless of gradation and aggregate availability, has been used as the norm for concrete pavement mixtures. Efforts to reduce cost and improve the sustainability of concrete mixtures have pushed owners to pay closer attention to mixtures with a well-graded aggregate particle distribution. In general, workability depends on many variables that are independent of gradation, such as paste volume and viscosity and aggregate shape and texture. A better understanding of how aggregate properties affect the workability of concrete is needed. The effects of aggregate characteristics on concrete properties, such as response to vibration, strength, and resistivity, were investigated using mixtures in which the paste content and w/cm were held constant. The results showed that different aggregate proportions, nominal maximum aggregate sizes, and combinations of different aggregates all had an impact on strength, slump, and Box Test performance.
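As a small illustration of what a "well-graded" combined distribution involves, the sketch below blends coarse and fine gradations into a single percent-passing curve for the traditional 60/40 split mentioned above. The sieve values and gradations are hypothetical placeholders.

```python
# Illustrative combined-gradation calculation for a 60% coarse / 40% fine blend.
sieves = ["1 in.", "3/4 in.", "1/2 in.", "3/8 in.", "No. 4", "No. 8", "No. 30", "No. 200"]
coarse = [100, 95, 60, 35, 5, 2, 1, 0.5]      # percent passing, coarse aggregate
fine   = [100, 100, 100, 100, 98, 85, 40, 2]  # percent passing, fine aggregate
blend  = (0.60, 0.40)                         # coarse fraction, fine fraction

combined = [blend[0] * c + blend[1] * f for c, f in zip(coarse, fine)]
for sieve, p in zip(sieves, combined):
    print(f"{sieve:>8}: {p:5.1f} % passing")
```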
Abstract:
Concrete will suffer frost damage when saturated and subjected to freezing temperatures. Frost-durable concrete can be produced if a specialized surfactant, also known as an air-entraining admixture (AEA), is added during mixing to stabilize microscopic air voids. Small and well-dispersed air voids are critical to producing frost-resistant concrete. Work completed by Klieger in 1952 established the minimum volume of air required to consistently ensure frost durability in a concrete mixture subjected to rapid freezing and thawing cycles; he suggested that frost durability was provided if 18 percent air was created in the paste. This remains the basis of current practice, despite the tests having been conducted on materials that are no longer available and with tests that differ from those in use today. Based on the data presented, it was found that a minimum air content of 3.5 percent in the concrete, or 11.0 percent in the paste, should yield concrete that is durable in ASTM C 666 testing with modern AEAs and low or no lignosulfonate water reducers (WRs). Limited data suggest that mixtures with a higher dosage of lignosulfonate will need about 1 percent more air in the concrete, or 3 percent more air in the paste, for the materials and procedures used. A spacing factor of 0.008 in. was still found to be necessary to provide frost durability for the mixtures investigated.
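The relationship between air measured on the concrete and air expressed as a fraction of the paste can be checked with simple arithmetic, as sketched below. The calculation assumes the paste (cementitious plus water) occupies about 28% of the concrete volume; the exact paste fraction depends on the mixture.

```python
# Quick arithmetic sketch: air as a percentage of (paste + air) volume.
def air_in_paste(air_in_concrete_pct, paste_fraction_pct=28.0):
    return 100.0 * air_in_concrete_pct / (paste_fraction_pct + air_in_concrete_pct)

print(air_in_paste(3.5))   # ~11%, consistent with the minimum noted above
print(air_in_paste(6.0))   # a typical specified air content gives roughly 18% in the paste
```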
Abstract:
Any transportation infrastructure system is inherently concerned with durability and performance. The proportioning and uniformity control of concrete mixtures are critical factors that directly affect the longevity and performance of portland cement concrete pavement systems. At present, the only means available to monitor the mix proportions of a given batch is to track the batch tickets created at the batch plant. However, this does not take into account potential errors in loading materials into storage silos, calibration errors, or the addition of water after dispatch. Therefore, there is a need for a rapid, cost-effective, and reliable field test that estimates the proportions of as-delivered concrete mixtures. In addition, performance-based specifications will be more easily implemented if there is a way to readily demonstrate whether a given batch is similar in proportions to those already accepted based on laboratory performance testing. The goal of the present research project is to investigate the potential use of a portable X-ray fluorescence (XRF) technique to assess the proportions of concrete mixtures as they are delivered. Tests were conducted on raw materials and on paste and mortar samples using a portable XRF device. There is a reasonable correlation between the actual and calculated mix proportions of the paste samples, but the data on mortar samples were less reliable.
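One generic way to back-calculate proportions from an XRF reading (not necessarily the project's procedure) is to treat the measured oxide profile as a linear blend of known ingredient "fingerprints" and solve for the blend by non-negative least squares, as sketched below. All oxide values are made-up placeholders.

```python
# Illustrative unmixing of an XRF oxide profile into ingredient proportions.
import numpy as np
from scipy.optimize import nnls

# Columns = ingredients (portland cement, fly ash); rows = oxides (CaO, SiO2, Al2O3, Fe2O3).
fingerprints = np.array([
    [63.0, 12.0],
    [21.0, 50.0],
    [ 5.0, 23.0],
    [ 3.0,  8.0],
])
measured = np.array([47.7, 29.7, 10.4, 4.5])   # hypothetical reading on a blended paste

weights, _ = nnls(fingerprints, measured)
proportions = weights / weights.sum()
print(f"estimated cement : fly ash split of about {proportions[0]:.2f} : {proportions[1]:.2f}")
```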
Abstract:
This literature review focuses on factors influencing drying shrinkage of concrete. Although the factors are normally interrelated, they can be categorized into three groups: paste quantity, paste quality, and other factors.
Abstract:
Because of the low workability of slipform concrete mixtures, the science of rheology is not strictly applicable to such concrete; the concept of rheological behavior, however, may still be useful. A novel workability test method, the Vibrating Kelly Ball (VKelly) test, was developed and evaluated to quantitatively assess the responsiveness of a dry concrete mixture to vibration, as is desired of a mixture suitable for slipform paving. The objectives for the test method are that it be cost-effective, portable, and repeatable while indicating the suitability of a mixture for slipform paving. The work to evaluate and refine the test was conducted in three phases: (1) assess whether the VKelly test can signal variations in laboratory mixtures with a range of materials and proportions; (2) run the VKelly test in the field at a number of construction sites; and (3) validate the VKelly test results using the Box Test developed at Oklahoma State University for slipform paving concrete. The data collected to date indicate that the VKelly test is suitable for assessing a mixture's response to vibration (workability) with low multiple-operator variability. A unique parameter, the VKelly Index, is introduced and defined; a mixture appears suitable for slipform paving when its VKelly Index falls in the range of 0.8 to 1.2 in./√s.
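The in./√s units suggest the VKelly Index is a slope of ball penetration plotted against the square root of vibration time; under that assumption, it could be computed from field readings as sketched below with hypothetical data. Consult the test method itself for the actual calculation.

```python
# Hedged sketch: least-squares slope of penetration vs sqrt(vibration time).
import numpy as np

time_s = np.array([2, 6, 12, 20, 30])                 # vibration time (s), hypothetical
penetration_in = np.array([1.5, 2.6, 3.6, 4.5, 5.5])  # Kelly ball penetration (in.)

slope, intercept = np.polyfit(np.sqrt(time_s), penetration_in, 1)
print(f"index of about {slope:.2f} in./sqrt(s)")      # slipform-suitable range: ~0.8-1.2
```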
Abstract:
Mixture proportioning is routinely a matter of using a recipe based on a previously produced concrete rather than adjusting the proportions based on the needs of the mixture and the locally available materials. As budgets grow tighter and increasing attention is paid to sustainability metrics, greater focus is being placed on making mixtures that are more efficient in their use of materials yet do not compromise engineering performance. Therefore, a performance-based mixture proportioning method is needed to provide the desired concrete properties for a given project specification. The proposed method should be user friendly, easy to apply in practice, and flexible in terms of allowing a wide range of material selections. The objective of this study is to further develop an innovative performance-based mixture proportioning method by analyzing the relationships between selected mix characteristics and their corresponding effects on tested properties. The proposed method will provide step-by-step instructions to guide the selection of the required aggregate and paste systems based on the performance requirements. Although the guidance provided in this report is primarily for concrete pavements, the same approach can be applied to other concrete applications as well.
Abstract:
The stability of air bubbles in fresh concrete can have a profound influence on the potential durability of the system, because excessive losses during placement and consolidation can compromise the ability of the mixture to resist freezing and thawing. The stability of air-void systems developed by some air-entraining admixtures (AEAs) can be affected by the presence of some polycarboxylate-based water-reducing admixtures (WRAs). The foam drainage test provides a means of measuring the potential stability of air bubbles in a paste. A barrier to acceptance of the test has been the limited investigation of its correlation with field performance. The work reported here was a limited exercise seeking to observe the stability of a range of currently available AEA/WRA combinations in the foam drainage test and then to take the best- and worst-performing combinations and observe their stability in concrete mixtures in the lab. Based on the data collected, the foam drainage test appears to identify stable combinations of AEA and WRA.
Abstract:
Concrete durability may be considered the ability to maintain serviceability over the design life without significant deterioration, and it is generally a direct function of the mixture's permeability. Therefore, reducing permeability will improve the potential durability of a given mixture and, in turn, improve the serviceability and longevity of the structure. Given the importance of this property, engineers often look for methods that can decrease permeability. One approach is to add chemical compounds known as integral waterproofing admixtures or permeability-reducing admixtures, which help fill and block capillary pores in the paste. Currently, there are no standard approaches in the US to evaluate the effectiveness of permeability-reducing admixtures or to compare different products. A review of manufacturers' data sheets shows that a wide range of test methods has been used, and rarely are the same tests used on more than one product. This study investigated the fresh and hardened properties of mixtures containing commercially available hydrophilic and hydrophobic permeability-reducing admixtures. The aim was to develop a standard test protocol that would help owners, engineers, and specifiers compare different products and evaluate their effects on concrete mixtures that may be exposed to hydrostatic or non-hydrostatic pressure. In this experimental program, 11 concrete mixtures were prepared with a fixed water-to-cement ratio and cement content: one plain mixture as a reference, 5 mixtures using the recommended dosage of the different permeability-reducing admixtures, and 5 mixtures using double the recommended dosage. Slump, air content, setting time, compressive and flexural strength, shrinkage, and durability-indicating tests, including electrical resistivity, rapid chloride penetration, air permeability, permeable voids, and sorptivity, were conducted at various ages. The data are presented and recommendations for a testing protocol are provided.