982 results for Separation methods
Abstract:
The State of Iowa currently has approximately 69,000 miles of unpaved secondary roads. Due to the low traffic count on these unpaved roads, paving with asphalt or Portland cement concrete is not economical. Therefore, to reduce dust production, dust suppressants have been used for decades. This study was conducted to evaluate the effectiveness of several widely used dust suppressants through quantitative field testing on two of Iowa’s most widely used secondary road surface treatments: crushed limestone rock and alluvial sand/gravel. The commercially available dust suppressants included lignin sulfonate, calcium chloride, and soybean oil soapstock. These suppressants were applied to 1000 ft test sections on four unpaved roads in Story County, Iowa. To duplicate field conditions, the suppressants were applied as a surface spray once in early June and again in late August or early September. The four unpaved roads included two with crushed limestone rock and two with alluvial sand/gravel surface treatments, as well as high and low traffic counts. The effectiveness of the dust suppressants was evaluated by comparing the dust produced on treated and untreated test sections. Dust collection was scheduled for 1, 2, 4, 6, and 8 weeks after each application, for a total testing period of 16 weeks. Results of a cost analysis between annual dust suppressant application and biennial aggregate replacement indicated that the cost of the dust suppressant, its transportation, and application were relatively high when compared to that of the aggregate types. Therefore, biennial aggregate replacement is considered more economical than annual dust suppressant application, although annual dust suppressant application reduced the cost of road maintenance by 75%. Results of the dust collection indicated that the lignin sulfonate suppressant outperformed calcium chloride and soybean oil soapstock on all four unpaved roads; the effect of the suppressants on the alluvial sand/gravel surface treatment was less than that on the crushed limestone rock; the residual effects of all the products remained reasonably good after blading; and the combination of alluvial sand/gravel surface treatment and high traffic count caused dust reduction to decrease dramatically.
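As a rough illustration of the kind of cost comparison described in this abstract, the sketch below annualizes the two maintenance strategies under purely hypothetical unit costs (the dollar figures are placeholders, not values from the study; only the 75% maintenance-reduction figure comes from the abstract):

```python
# Hypothetical annualized cost comparison per mile of unpaved road.
# All unit costs are illustrative placeholders, not values from the study.

suppressant_cost_per_mile = 4000.0   # product + transportation + two applications per year
maintenance_cost_untreated = 1200.0  # annual blading/maintenance without suppressant
maintenance_reduction = 0.75         # the abstract reports a 75% reduction with suppressant

aggregate_replacement_cost = 3000.0  # biennial aggregate replacement, per mile
annual_aggregate_cost = aggregate_replacement_cost / 2  # spread over two years

annual_with_suppressant = suppressant_cost_per_mile + maintenance_cost_untreated * (1 - maintenance_reduction)
annual_with_aggregate = annual_aggregate_cost + maintenance_cost_untreated

print(f"Annual cost with suppressant:            ${annual_with_suppressant:,.0f}/mile")
print(f"Annual cost with aggregate replacement:  ${annual_with_aggregate:,.0f}/mile")
```

With placeholder costs of this order, aggregate replacement comes out cheaper despite the 75% maintenance saving, which is the direction of the study's conclusion.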
Abstract:
The purpose of this research was to summarize existing nondestructive test methods that have the potential to be used to detect materials-related distress (MRD) in concrete pavements. The various nondestructive test methods were then subjected to selection criteria that helped to reduce the size of the list so that specific techniques could be investigated in more detail. The main test methods that were determined to be applicable to this study included two stress-wave propagation techniques (impact-echo and spectral analysis of surface waves techniques), infrared thermography, ground penetrating radar (GPR), and visual inspection. The GPR technique was selected for a preliminary round of “proof of concept” trials. GPR surveys were carried out over a variety of portland cement concrete pavements for this study using two different systems. One of the systems was a state-of-the-art GPR system that allowed data to be collected at highway speeds. The other system was a less sophisticated system that was commercially available. Surveys conducted with both sets of equipment have produced test results capable of identifying subsurface distress in two of the three sites that exhibited internal cracking due to MRD. Both systems failed to detect distress in a single pavement that exhibited extensive cracking. Both systems correctly indicated that the control pavement exhibited negligible evidence of distress. The initial positive results presented here indicate that a more thorough study (incorporating refinements to the system, data collection, and analysis) is needed. Improvements in the results will be dependent upon defining the optimum number and arrangement of GPR antennas to detect the most common problems in Iowa pavements. In addition, refining high-frequency antenna response characteristics will be a crucial step toward providing an optimum GPR system for detecting materials-related distress.
Abstract:
This study investigates the harmonisation of analytical results as an alternative to the more restrictive approach of harmonising analytical methods, which is currently recommended to enable the exchange of information in support of the fight against illicit drug trafficking. The main goal of this study is to demonstrate that a common database can be fed by a range of different analytical methods, whatever the differences in analytical parameters between them. For this purpose, a methodology was developed that makes it possible to estimate, and even optimise, the similarity of results coming from different analytical methods. In particular, the possibility of introducing chemical profiles obtained with Fast GC-FID into a GC-MS database is studied in this paper. Using this methodology, the similarity of results coming from different analytical methods can be objectively assessed, and the practical utility of sharing a database across these methods can be evaluated, depending on the profiling purpose (evidential vs. operational perspective). This methodology can be regarded as a relevant approach to feeding a database from different analytical methods, and it puts in doubt the necessity of analysing all illicit drug seizures in a single laboratory or of implementing analytical methods harmonisation in each participating laboratory.
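The abstract does not state which similarity measure was used; as a minimal sketch of the general idea, the snippet below compares two normalized peak-area profiles of the same seizure, as might be produced by GC-MS and Fast GC-FID, using Pearson correlation and cosine similarity (the peak areas are invented):

```python
import numpy as np

# Hypothetical normalized peak areas for the same seizure analysed by two methods
# (e.g., GC-MS vs. Fast GC-FID); values are illustrative only.
profile_gcms = np.array([0.32, 0.11, 0.25, 0.08, 0.24])
profile_fastgc = np.array([0.30, 0.13, 0.26, 0.07, 0.24])

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

pearson_r = float(np.corrcoef(profile_gcms, profile_fastgc)[0, 1])
print(f"cosine similarity:   {cosine_similarity(profile_gcms, profile_fastgc):.4f}")
print(f"Pearson correlation: {pearson_r:.4f}")
```

High similarity between profiles of linked seizures, and low similarity between unlinked ones, is what would justify feeding both methods' results into a shared database.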
Abstract:
We evaluated 25 protocol variants of 14 independent computational methods for exon identification, transcript reconstruction and expression-level quantification from RNA-seq data. Our results show that most algorithms are able to identify discrete transcript components with high success rates but that assembly of complete isoform structures poses a major challenge even when all constituent elements are identified. Expression-level estimates also varied widely across methods, even when based on similar transcript models. Consequently, the complexity of higher eukaryotic genomes imposes severe limitations on transcript recall and splice product discrimination that are likely to remain limiting factors for the analysis of current-generation RNA-seq data.
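For context, one common way such tools report expression levels is transcripts per million (TPM); the sketch below shows the standard computation from read counts and effective transcript lengths (the counts and lengths are invented, and the evaluated methods may report other measures such as FPKM):

```python
# Transcripts per million (TPM) from raw counts and effective transcript lengths.
# Counts and lengths below are invented example values.
counts = {"tx1": 500, "tx2": 1200, "tx3": 80}
eff_length = {"tx1": 1500, "tx2": 3000, "tx3": 800}

rate = {tx: counts[tx] / eff_length[tx] for tx in counts}  # reads per base
scale = sum(rate.values())
tpm = {tx: rate[tx] / scale * 1e6 for tx in rate}          # normalize to 1e6

for tx, value in tpm.items():
    print(f"{tx}: {value:,.1f} TPM")
```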
Abstract:
Aim Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study. Location World-wide. Methods Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years. Results Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) in the timing of dispersal events, which in Bayes-DIVA sometimes conflict with palaeogeographical information, and (2) in the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimations influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node. Main conclusions By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that are outside the extant species ranges, owing to dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.
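As a rough illustration of what a stratified palaeogeographical model looks like in practice, the sketch below encodes dispersal-rate multipliers between areas for two time slices (the study uses four slices; the areas, multiplier values, and slice boundaries here are hypothetical, not those used in the analysis):

```python
import numpy as np

# Hypothetical dispersal multipliers between four areas (rows/columns in the order of `areas`)
# for two time slices; 1.0 = well connected, 0.1 = poorly connected, 0.01 = effectively isolated.
areas = ["A", "B", "C", "D"]

dispersal_110_to_70_Ma = np.array([
    [1.00, 1.00, 0.10, 0.01],
    [1.00, 1.00, 0.10, 0.01],
    [0.10, 0.10, 1.00, 0.10],
    [0.01, 0.01, 0.10, 1.00],
])

dispersal_70_to_30_Ma = np.array([
    [1.00, 0.10, 0.50, 0.10],
    [0.10, 1.00, 0.10, 0.50],
    [0.50, 0.10, 1.00, 0.10],
    [0.10, 0.50, 0.10, 1.00],
])

time_slices = {(110, 70): dispersal_110_to_70_Ma, (70, 30): dispersal_70_to_30_Ma}
print(time_slices[(110, 70)])
```

In a DEC analysis, the baseline dispersal rate between two areas is multiplied by the entry for the time slice spanning a given branch segment, which is how changing continental configuration constrains the reconstruction.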
Abstract:
Deterioration in portland cement concrete (PCC) pavements can occur due to distresses caused by a combination of traffic loads and weather conditions. Hot mix asphalt (HMA) overlay is the most commonly used rehabilitation technique for such deteriorated PCC pavements. However, the performance of these HMA overlaid pavements is hindered by the occurrence of reflective cracking, resulting in a significant reduction of pavement serviceability. Various fractured slab techniques, including rubblization, crack and seat, and break and seat, are used to minimize reflective cracking by reducing the slab action. However, the design of structural overlay thickness for cracked and seated and rubblized pavements is difficult because the resulting structure is neither a “true” rigid pavement nor a “true” flexible pavement. Existing design methodologies use empirical procedures based on the AASHO Road Test conducted in 1961. However, the AASHO Road Test did not employ any fractured slab technique, and there are numerous limitations associated with extrapolating its results to HMA overlay thickness design for fractured PCC pavements. The main objective of this project is to develop a mechanistic-empirical (ME) design approach for HMA overlay thickness design for fractured PCC pavements. In this design procedure, failure criteria such as the tensile strain at the bottom of the HMA layer and the vertical compressive strain on the surface of the subgrade are used to consider HMA fatigue and subgrade rutting, respectively. The developed ME design system is also implemented in a Visual Basic computer program. A partial validation of the design method with reference to an instrumented trial project (IA-141, Polk County) in Iowa is provided in this report. Tensile strain values at the bottom of the HMA layer collected from the FWD testing at this project site are in agreement with the results obtained from the developed computer program.
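The abstract names the two failure criteria but not the specific transfer functions; a minimal sketch using the widely cited Asphalt Institute forms is shown below (the coefficients are the commonly published general-purpose ones and may well differ from those calibrated in the report; the example strains and modulus are invented):

```python
# Illustrative HMA fatigue and subgrade rutting transfer functions
# (Asphalt Institute forms; coefficients are the commonly published ones,
# not necessarily those used in the Iowa ME design program).

def fatigue_repetitions(tensile_strain, hma_modulus_psi):
    """Allowable load repetitions to fatigue cracking, from the horizontal
    tensile strain at the bottom of the HMA layer and the HMA modulus (psi)."""
    return 0.0796 * tensile_strain ** -3.291 * hma_modulus_psi ** -0.854

def rutting_repetitions(vertical_strain):
    """Allowable load repetitions to subgrade rutting, from the vertical
    compressive strain at the top of the subgrade."""
    return 1.365e-9 * vertical_strain ** -4.477

# Example inputs (invented): 120 microstrain tensile, 350 microstrain compressive,
# HMA modulus of 500,000 psi.
print(f"N_fatigue = {fatigue_repetitions(120e-6, 500_000):.3e}")
print(f"N_rutting = {rutting_repetitions(350e-6):.3e}")
```

The governing overlay thickness is then the one for which the expected traffic does not exceed the smaller of the two allowable repetition counts.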
Abstract:
The members of the Iowa Concrete Paving Association, the National Concrete Pavement Technology Center Research Committee, and the Iowa Highway Research Board commissioned a study to examine alternative ways of developing transverse joints in portland cement concrete pavements. The present study investigated six separate variations of vertical metal strips placed above and below the dowels in conventional baskets. In addition, the study investigated existing patented assemblies and a new assembly developed in Spain and used in Australia. The metal assemblies were placed in a new pavement and allowed to stay in place for 30 days before the Iowa Department of Transportation staff terminated the test by directing the contractor to saw and seal the joints. This report describes the design, construction, testing, and conclusions of the project.
Abstract:
RATIONALE The choice of containers for storage of aqueous samples between their collection, transport and water hydrogen (²H) and oxygen (¹⁸O) stable isotope analysis is a topic of concern for a wide range of fields in environmental, geological, biomedical, food, and forensic sciences. The transport and separation of water molecules during water vapor or liquid uptake by sorption or solution and the diffusive transport of water molecules through organic polymer material by permeation or pervaporation may entail an isotopic fractionation. An experiment was conducted to evaluate the extent of such fractionation. METHODS Sixteen bottle-like containers of eleven different organic polymers, including low and high density polyethylene (LDPE and HDPE), polypropylene (PP), polycarbonate (PC), polyethylene terephthalate (PET), and perfluoroalkoxy-Teflon (PFA), of different wall thickness and size were completely filled with the same mineral water and stored for 659 days under the same conditions of temperature and humidity. Particular care was exercised to keep the bottles tightly closed and prevent loss of water vapor through the seals. RESULTS Changes of up to +5 parts per thousand for δ²H values and +2.0 parts per thousand for δ¹⁸O values were measured for water after more than 1 year of storage within a plastic container, with the magnitude of change depending mainly on the type of organic polymer, wall thickness, and container size. The most important variations were measured for the PET and PC bottles. Waters stored in glass bottles with Polyseal™ cone-lined PP screw caps and thick-walled HDPE or PFA containers with linerless screw caps having an integrally molded inner sealing ring preserved their original δ²H and δ¹⁸O values. The carbon, hydrogen, and oxygen stable isotope compositions of the organic polymeric materials were also determined. CONCLUSIONS The results of this study clearly show that for precise and accurate measurements of the water stable isotope composition in aqueous solutions, rigorous sampling and storage procedures are needed both for laboratory standards and for unknown samples. Copyright © 2012 John Wiley & Sons, Ltd.
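For readers unfamiliar with the notation, the δ²H and δ¹⁸O values reported above are deviations of a sample's isotope ratio from that of a reference standard, expressed in parts per thousand; a minimal sketch of the standard calculation (the sample ratio below is invented, while the VSMOW reference ratio is the commonly tabulated value):

```python
# Delta notation for stable isotopes, expressed in parts per thousand (per mil):
# delta = (R_sample / R_standard - 1) * 1000
def delta_permil(ratio_sample, ratio_standard):
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

R_VSMOW_18O = 2.0052e-3   # commonly tabulated 18O/16O ratio of the VSMOW standard
R_sample = 2.0012e-3      # invented sample ratio for illustration
print(f"d18O = {delta_permil(R_sample, R_VSMOW_18O):+.1f} per mil")
```

A storage-induced change of +2 per mil in δ¹⁸O therefore corresponds to a shift of only a few parts in ten thousand of the absolute isotope ratio, which is why container choice matters for precise work.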
Abstract:
A procedure for the dynamic generation of 1,6-hexamethylene diisocyanate (HDI) aerosol atmospheres of 70 µg m⁻³ (0.01 ppm) to 1.75 mg m⁻³ (0.25 ppm), based on the precise control of the evaporation of pure liquid HDI and subsequent dilution with air, was developed. The apparatus consisted of a home-made glass nebulizer coupled with a separation stage to exclude non-respirable droplets (greater than 10 µm). The aerosol concentrations were achieved by passing air through the nebulizer at 1.5-4.5 L min⁻¹ to dynamically generate 0.01-0.25 ppm of diisocyanate in an experimental chamber of 8.55 m³. The distribution of the liquid aerosol was established with an optical counter, and the diisocyanate concentration was determined from samples collected in impingers by a high-pressure liquid chromatographic method. The atmospheres generated were suitable for the evaluation both of full-scale sampling procedures and of analytical methods: at 140 µg m⁻³ (0.02 ppm) they remained stable for 15-min provocation tests in clinical asthma, as verified by breathing-zone sampling of exposed patients.
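The equivalences quoted above (0.01 ppm ≈ 70 µg m⁻³ and 0.25 ppm ≈ 1.75 mg m⁻³) follow from the usual ideal-gas conversion between volume mixing ratio and mass concentration; a quick check, assuming 25 °C and 1 atm (molar volume ≈ 24.45 L mol⁻¹) and an HDI molar mass of about 168.2 g mol⁻¹:

```python
# Convert a vapour mixing ratio in ppm to a mass concentration in mg/m3:
#   mg_per_m3 = ppm * molar_mass / molar_volume   (at 25 degC, 1 atm)
MOLAR_MASS_HDI = 168.2   # g/mol, 1,6-hexamethylene diisocyanate
MOLAR_VOLUME = 24.45     # L/mol at 25 degC and 1 atm

def ppm_to_mg_per_m3(ppm, molar_mass=MOLAR_MASS_HDI, molar_volume=MOLAR_VOLUME):
    return ppm * molar_mass / molar_volume

for ppm in (0.01, 0.02, 0.25):
    print(f"{ppm} ppm HDI ~= {ppm_to_mg_per_m3(ppm):.3f} mg/m3")
```

This reproduces roughly 0.069, 0.138, and 1.72 mg m⁻³, consistent with the rounded figures in the abstract.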
Abstract:
Proteomics has come a long way from the initial qualitative analysis of proteins present in a given sample at a given time ("cataloguing") to large-scale characterization of proteomes, their interactions and dynamic behavior. Originally enabled by breakthroughs in protein separation and visualization (by two-dimensional gels) and protein identification (by mass spectrometry), the discipline now encompasses a large body of protein and peptide separation, labeling, detection and sequencing tools supported by computational data processing. The decisive mass spectrometric developments and the most recent instrumentation are briefly mentioned, accompanied by a short review of gel and chromatographic techniques for protein/peptide separation, depletion and enrichment. Special emphasis is placed on quantification techniques: gel-based and label-free techniques are briefly discussed, whereas stable-isotope coding and internal peptide standards are extensively reviewed. Another special chapter is dedicated to software and computing tools for proteomic data processing and validation. A short assessment of the status quo and recommendations for future developments round off this journey through quantitative proteomics.
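As a minimal illustration of the stable-isotope-coding approach mentioned above, relative quantification typically reduces to the ratio of "heavy" to "light" peptide signal intensities; the snippet below computes such ratios for a few hypothetical peptides (the sequences and intensities are invented):

```python
import math

# Hypothetical (light, heavy) peptide intensity pairs from a stable-isotope-labeled
# experiment (e.g., SILAC-style); values are illustrative only.
peptide_intensities = {
    "PEPTIDER":   (1.8e6, 2.1e6),
    "LVNELTEFAK": (3.2e5, 1.6e5),
    "QTALVELVK":  (9.0e5, 9.3e5),
}

for peptide, (light, heavy) in peptide_intensities.items():
    ratio = heavy / light
    print(f"{peptide}: H/L = {ratio:.2f} (log2 = {math.log2(ratio):+.2f})")
```

Internal peptide standards work analogously, except that the heavy species is a spiked-in synthetic peptide of known amount, allowing absolute rather than relative quantification.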
Abstract:
Highway noise is one of the most pressing of the surface characteristics issues facing the concrete paving industry. This is particularly true in urban areas, where not only is there a higher population density near major thoroughfares, but also a greater volume of commuter traffic (Sandberg and Ejsmont 2002; van Keulen 2004). To help address this issue, the National Concrete Pavement Technology Center (CP Tech Center) at Iowa State University (ISU), the Federal Highway Administration (FHWA), the American Concrete Pavement Association (ACPA), and other organizations have partnered to conduct a multi-part, seven-year Concrete Pavement Surface Characteristics Project. This document contains the results of Part 1, Task 2, of the ISU-FHWA project, addressing the noise issue by evaluating conventional and innovative concrete pavement noise reduction methods. The first objective of this task was to determine what, if any, concrete surface textures currently constructed in the United States or Europe were considered quiet, had long-term friction characteristics, could be consistently built, and were cost-effective. Any specifications of such concrete textures would be included in this report. The second objective was to determine whether any promising new concrete pavement surfaces to control tire-pavement noise and friction were in the development stage and, if so, what further research was necessary. The final objective was to identify measurement techniques used in the evaluation.
Abstract:
Integrative review (IR) has an international reputation in nursing research and evidence-based practice. This IR aimed to identify and analyze the concepts and methods recommended for undertaking IR in nursing. Nine information resources, including electronic databases and grey literature, were searched, and seventeen studies were included. The results indicate that the primary studies were mostly from the USA; that a review may have several research questions or hypotheses and may include primary studies from different theoretical and methodological approaches; and that this type of review can go beyond the analysis and synthesis of findings from primary studies, allowing other research dimensions to be explored and offering potential for the development of new theories and new research problems. Conclusion: IR is understood as a very complex type of review, and it is expected to be conducted using standardized and systematic methods to ensure the rigor required of scientific research and, therefore, the legitimacy of the established evidence.