993 results for pacs: mathematical techniques
Investigation into Improved Pavement Curing Materials and Techniques: Part 2 - Phase III, March 2003
Abstract:
Appropriate curing is important for concrete to attain its designed properties. This research was conducted to evaluate the effects of different curing materials and methods on pavement properties. At present, a sprayed curing compound is a commonly used method for pavement and other concrete structure construction. Three curing compounds were selected for testing. Two different application rates were employed for the white-pigmented liquid curing compounds. The concrete properties of temperature, moisture content, conductivity, and permeability were examined at several test locations. It was found in this project that the concrete properties varied with depth. Of the tests conducted (maturity, sorptivity, permeability, and conductivity), conductivity appears to be the best method for evaluating curing effects in the field and shows potential for field application. The results indicated that currently approved curing materials in Iowa, when spread uniformly in a single or double application, provide adequate curing protection and meet the goals of the Iowa Department of Transportation. Experimental curing methods can be compared to this method through the use of conductivity testing to determine their suitability for field application.
Abstract:
Concrete curing is closely related to cement hydration, microstructure development, and concrete performance. Application of a liquid membrane-forming curing compound is among the most widely used curing methods for concrete pavements and bridge decks. Curing compounds are economical, easy to apply, and maintenance free. However, limited research has been done to investigate the effectiveness of different curing compounds and their application technologies, and no reliable standard testing method is available to evaluate the effectiveness of curing, especially for concrete cured in the field. The present research investigates the effects of curing compound materials and application technologies on concrete properties, especially the properties of surface concrete. This report presents a literature review of curing technology, with an emphasis on curing compounds, and the experimental results from the first part of this research, the lab investigation. In the lab investigation, three curing compounds were selected and applied to mortar specimens at three different times after casting. Two application methods, single- and double-layer application, were employed. Moisture content, conductivity, sorptivity, and degree of hydration were measured at different depths of the specimens. Flexural and compressive strength of the specimens were also tested. Statistical analysis was conducted to examine the relationships between these material properties. The results indicate that application of a curing compound significantly increased the moisture content and degree of cement hydration and reduced the sorptivity of the near-surface concrete. For given concrete materials and mix proportions, the optimal application time of a curing compound depended primarily on the weather conditions. If a sufficient amount of a high-efficiency-index curing compound was uniformly applied, no double-layer application was necessary. Among all the test methods applied, the sorptivity test was the most sensitive, providing a good indication of the subtle changes in the microstructure of the near-surface concrete caused by different curing materials and application methods. Sorptivity is closely related to moisture content and degree of hydration. The results establish a baseline for, and provide insight into, the further development of testing procedures for evaluating curing compounds in the field. Recommendations are provided for further field study.
Abstract:
Standards for the construction of full-depth patches in portland cement concrete pavement usually require replacement of all deteriorated base material with crushed stone, up to the bottom of the existing pavement layer. In an effort to reduce patch construction time and costs, the Iowa Department of Transportation and the Department of Civil, Construction and Environmental Engineering at Iowa State University studied the use of extra concrete depth as an option for base construction. This report compares the impact of additional concrete patching depth on the rate of strength gain, the potential for early opening to traffic, patching costs, and long-term patch performance. It also compares these characteristics for early-set and standard concrete mixes. The results have the potential to change the method of portland cement concrete pavement patch construction in Iowa.
Abstract:
The Baldwin effect can be observed if phenotypic learning influences the evolutionary fitness of individuals, which can in turn accelerate or decelerate evolutionary change. Evidence for both learning-induced acceleration and deceleration can be found in the literature. Although the results for both outcomes were supported by specific mathematical or simulation models, no general predictions have been achieved so far. Here we propose a general framework to predict whether evolution benefits from learning or not. It is formulated in terms of the gain function, which quantifies the proportional change of fitness due to learning depending on the genotype value. With an inductive proof we show that a positive gain-function derivative implies that learning accelerates evolution, and a negative one implies deceleration under the condition that the population is distributed on a monotonic part of the fitness landscape. We show that the gain-function framework explains the results of several specific simulation models. We also use the gain-function framework to shed some light on the results of a recent biological experiment with fruit flies.
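The abstract states the gain-function criterion only verbally. As a minimal sketch (the symbols W, W_L, and g below are illustrative assumptions, not notation taken from the paper), the idea can be written as:

```latex
% Hypothetical formalization of the gain-function criterion (illustrative only).
% W(x)   : fitness of genotype value x without learning
% W_L(x) : fitness of genotype value x with phenotypic learning
\[
  g(x) \;=\; \frac{W_L(x)}{W(x)}
\]
% g quantifies the proportional change of fitness due to learning.
% For a population on a monotonic part of the fitness landscape, the stated prediction is
\[
  g'(x) > 0 \;\Rightarrow\; \text{learning accelerates evolution},
  \qquad
  g'(x) < 0 \;\Rightarrow\; \text{learning decelerates evolution}.
\]
```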
Abstract:
Pavement settlement in and around utility cuts is a common problem, resulting in uneven pavement surfaces, annoyance to drivers, and, ultimately, further maintenance. A survey of municipal authorities and field and laboratory investigations were conducted to identify the factors contributing to the settlement of utility cut restorations in pavement sections. Survey responses were received from seven cities across Iowa and indicate that utility cut restorations often last less than two years. Observations made during site inspections showed that backfill material varies from one city to another, backfill lift thickness often exceeds 12 inches, and the backfill material is often placed at bulking moisture contents with no quality control/quality assurance. Laboratory investigation of the backfill materials indicates that, at the field moisture contents encountered, the backfill materials have collapse potentials of up to 35%. Falling weight deflectometer (FWD) deflection data and elevation measurements indicate that the maximum pavement deflection occurs in the area around the utility cut restoration. The FWD data indicate a zone of influence around the perimeter of the restoration extending two to three feet beyond the trench perimeter. The research team proposes moisture control, the use of granular fill compacted to 65% relative density, and removal and recompaction of the native material near the ground surface around the trench. Test sections with geogrid reinforcement were also incorporated. The performance of the inspected and proposed utility cuts needs to be monitored for at least two more years.
Abstract:
PURPOSE: To compare examination time with radiologist time and to measure radiation dose of computed tomographic (CT) fluoroscopy, conventional CT, and conventional fluoroscopy as guiding modalities for shoulder CT arthrography. MATERIALS AND METHODS: Glenohumeral injection of contrast material for CT arthrography was performed in 64 consecutive patients (mean age, 32 years; age range, 16-74 years) and was guided with CT fluoroscopy (n = 28), conventional CT (n = 14), or conventional fluoroscopy (n = 22). Room times (arthrography, room change, CT, and total examination times) and radiologist times (time the radiologist spent in the fluoroscopy or CT room) were measured. One-way analysis of variance and Bonferroni-Dunn post hoc tests were performed for comparison of mean times. Mean effective radiation dose was calculated for each method with examination data, phantom measurements, and standard software. RESULTS: Mean total examination time was 28.0 minutes for CT fluoroscopy, 28.6 minutes for conventional CT, and 29.4 minutes for conventional fluoroscopy; mean radiologist time was 9.9 minutes, 10.5 minutes, and 9.0 minutes, respectively. These differences were not statistically significant. Mean effective radiation dose was 0.0015 mSv for conventional fluoroscopy (mean, nine sections), 0.22 mSv for CT fluoroscopy (120 kV; 50 mA; mean, 15 sections), and 0.96 mSv for conventional CT (140 kV; 240 mA; mean, six sections). Effective radiation dose can be reduced to 0.18 mSv for conventional CT by changing imaging parameters to 120 kV and 100 mA. Mean effective radiation dose of the diagnostic CT arthrographic examination (140 kV; 240 mA; mean, 25 sections) was 2.4 mSv. CONCLUSION: CT fluoroscopy and conventional CT are valuable alternative modalities for glenohumeral CT arthrography, as examination and radiologist times are not significantly different. CT guidance requires a greater radiation dose than does conventional fluoroscopy, but with adequate parameters CT guidance constitutes approximately 8% of the radiation dose.
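As a quick check of the closing figure (assuming, as the reported doses suggest, that the 8% compares the optimized CT-guidance dose with the dose of the diagnostic CT arthrographic examination):

```latex
\[
  \frac{0.18\ \text{mSv (CT guidance at 120 kV, 100 mA)}}{2.4\ \text{mSv (diagnostic CT arthrography)}}
  \;\approx\; 0.075 \;\approx\; 8\%.
\]
```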
Abstract:
Ligands and receptors of the TNF superfamily are therapeutically relevant targets in a wide range of human diseases. This chapter describes assays based on ELISA, immunoprecipitation, FACS, and reporter cell lines to monitor interactions of tagged receptors and ligands in both soluble and membrane-bound forms using unified detection techniques. A reporter cell assay that is sensitive to ligand oligomerization can identify ligands with high probability of being active on endogenous receptors. Several assays are also suitable to measure the activity of agonist or antagonist antibodies, or to detect interactions with proteoglycans. Finally, self-interaction of membrane-bound receptors can be evidenced using a FRET-based assay. This panel of methods provides a large degree of flexibility to address questions related to the specificity, activation, or inhibition of TNF-TNF receptor interactions in independent assay systems, but does not substitute for further tests in physiologically relevant conditions.
Abstract:
Objective: To evaluate the safety of performing the traditional and protected techniques for collecting tracheal aspirate and to identify qualitative and quantitative agreement of the microbiological culture results between the techniques. Method: Clinical, prospective, comparative, single-blind study. The sample was composed of 54 patients >18 years of age, undergoing invasive mechanical ventilation for a period of ≥48 hours and with suspected ventilator-associated pneumonia. The two techniques were performed on the same patient, one immediately after the other, in random order, according to randomization by specialized software. Results: No significant events (oxygen desaturation, hemodynamic instability, or tracheobronchial hemorrhage) occurred (p<0.05) and, although there were differences in some strains, there was qualitative and quantitative agreement between the techniques (p<0.001). Conclusion: Use of the protected technique provided no advantage over the traditional one, and execution of both techniques was safe for the patient.
Abstract:
Histoire discursive du « cinéma-vérité ». Techniques, controverses, historiographie (1960-1970) traces the history of the success and disgrace of the "cinéma vérité" label in France, which, between 1960 (when Edgar Morin published his programmatic essay "Pour un nouveau 'cinéma vérité'" in France Observateur) and 1964-65 (when the notion began to lose its popularity), served as the banner of a cinematic movement that was supposed to renew the relationship between cinema and reality. Some twenty films, such as Chronique d'un été by Jean Rouch and Edgar Morin, Primary by Richard Leacock and Robert Drew, Les Inconnus de la terre and Regard sur la folie by Mario Ruspoli, Hitler, connais pas by Bertrand Blier, Le Chemin de la mauvaise route by Jean Herman, Le Joli Mai by Chris Marker, La Punition by Jean Rouch, and Pour la Suite du monde by Michel Brault and Pierre Perrault, claimed this label or were associated with it by the French press, which devoted hundreds of articles to it. Indeed, the theatrical release of these "truth films" sparked virulent controversies in France, questioning the ethics of projects in which the people filmed were expected to reveal an intimate truth in front of the camera, the artistic status of these works, and the lack of a marked political commitment on the part of the "cinéastes-vérité" toward the issues raised by their protagonists (for example the Algerian War, French youth, international politics). The hypothesis underlying this research is that the film production claiming the "cinéma-vérité" label is characterized by a close correlation between film and discourse about film. On the one hand, because the first half of the decade was marked by numerous encounters between the "cinéma vérité" filmmakers, critics, and the builders of lightweight cameras and synchronous tape recorders, encounters that helped accentuate and publicize the dissensions within the movement. On the other hand, because a distinctive feature of many projects is the inclusion of meta-discursive sequences in which participants, directors, or experts debate the success of the shoot. This work shows that the movement's success between 1960 and 1964-65 did not come about despite strong polemics; on the contrary, many feature films incorporated the controversy within themselves, questioning, on a symbolic level, the abolition of the filter between the film and its spectator. If the films belonging to the "cinéma vérité" movement grant a large place to confrontation, it is because "truth" is conceived as a dialectical process that emerges in a dynamic of exchanges (between the filmmakers of this movement, between the protagonists, between the film and its audience). The internal and public quarrels that punctuated these years are part of the "cinéma-vérité" apparatus and justify writing the history of this cinematic movement through the discourses it provoked within French cinephile culture.
Abstract:
Fungal symbionts commonly occur in plants, influencing host growth, physiology, and ecology (Carlile et al., 2001). However, while whole-plant growth responses to biotrophic fungi are readily demonstrated, it has been much more difficult to identify and detect the physiological mechanisms responsible. Previous work on the clonal grass Glyceria striata has revealed that the systemic fungal endophyte Epichloë glyceriae has a positive effect on the clonal growth of its host (Pan & Clay, 2002; 2003). The latest study from these authors, in this issue (pp. 467-475), now suggests that increased carbon movement in hosts infected by E. glyceriae may function as one mechanism by which endophytic fungi could increase plant growth. Given the widespread distribution of both clonal plants and symbiotic fungi, this research will have implications for our understanding of the ecology and evolution of fungus-plant associations in natural communities.
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combining statistical information, as in Bayesian updating or in the combination of likelihood and robust M-estimation functions, amounts to simple additions/perturbations in A2(P_prior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to the finite-dimensional subspaces formed by their posteriors in the dual information space A2(P_prior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
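For readers coming from the statistical side, the compositional-data notions referred to above (Aitchison distance, centered log-ratio transform, perturbation) have standard definitions on the simplex; the following is background material only, not the paper's A2(P) construction itself:

```latex
% Standard Aitchison geometry on the D-part simplex (background definitions).
% Centered log-ratio (clr) transform of a composition x = (x_1, ..., x_D):
\[
  \operatorname{clr}(x) = \left( \ln\frac{x_1}{g(x)}, \dots, \ln\frac{x_D}{g(x)} \right),
  \qquad
  g(x) = \Bigl( \prod_{i=1}^{D} x_i \Bigr)^{1/D}.
\]
% Aitchison inner product and distance:
\[
  \langle x, y \rangle_A = \sum_{i=1}^{D} \operatorname{clr}(x)_i \, \operatorname{clr}(y)_i,
  \qquad
  d_A(x, y) = \lVert \operatorname{clr}(x) - \operatorname{clr}(y) \rVert_2 .
\]
% Perturbation, the simplex analogue of addition; it mirrors Bayesian updating,
% where the posterior is proportional to prior times likelihood, i.e. log-densities
% add up to normalization:
\[
  (x \oplus y)_i = \frac{x_i \, y_i}{\sum_{j=1}^{D} x_j \, y_j}, \qquad i = 1, \dots, D.
\]
```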
Abstract:
We start with a generalization of the well-known three-door problem: the n-door problem. The solution of this new problem leads us to a beautiful representation system for real numbers in (0,1] as alternating series, known in the literature as Pierce expansions. A closer look at Pierce expansions takes us to some metrical properties of sets defined through the Pierce expansions of their elements. Finally, these metrical properties enable us to present 'strange' sets, similar to the classical Cantor set.
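The abstract does not spell out how the Pierce expansion of a number is obtained; the sketch below uses the standard greedy recursion for Pierce expansions (illustrative code, not taken from the paper):

```python
from fractions import Fraction

def pierce_digits(x, max_terms=10):
    """Return Pierce-expansion digits a_1 < a_2 < ... of x in (0, 1], so that
    x = 1/a_1 - 1/(a_1*a_2) + 1/(a_1*a_2*a_3) - ...  (alternating series).

    Standard recursion: a_n = floor(1/x_{n-1}),  x_n = 1 - a_n * x_{n-1}.
    """
    assert 0 < x <= 1
    x = Fraction(x)            # exact arithmetic avoids floating-point drift
    digits = []
    while x > 0 and len(digits) < max_terms:
        a = int(1 / x)         # floor of the reciprocal (exact for Fractions)
        digits.append(a)
        x = 1 - a * x          # remainder feeding the next digit
    return digits

if __name__ == "__main__":
    print(pierce_digits(Fraction(2, 3)))    # [1, 3]:  2/3 = 1/1 - 1/(1*3)
    print(pierce_digits(Fraction(23, 37)))  # a longer, strictly increasing digit list
```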
Abstract:
This paper includes the derivations of the main expressions in the paper "The Daily Market for Funds in Europe: Has Something Changed With the EMU?" by G. Pérez Quirós and H. Rodríguez Mendizábal.