949 results for Materials and the technique
Abstract:
The retrograde suppression of synaptic transmission by the endocannabinoid sn-2-arachidonoylglycerol (2-AG) is mediated by cannabinoid CB1 receptors and requires the elevation of intracellular Ca(2+) and the activation of specific 2-AG-synthesizing enzymes (i.e., DAGLα). However, the anatomical organization of the neuronal substrates that express 2-AG/CB1 signaling system-related molecules together with selective Ca(2+)-binding proteins (CaBPs) is still unknown. For this purpose, we used double-label immunofluorescence and confocal laser scanning microscopy to characterize the expression of the 2-AG/CB1 signaling system (CB1 receptor, DAGLα, MAGL, and FAAH) and the CaBPs calbindin D28k, calretinin, and parvalbumin in the rat hippocampus. CB1, DAGLα, and MAGL labeling was mainly localized in fibers and neuropil, which were differentially organized depending on the hippocampal CaBP-expressing cells. CB1(+) fiber terminals localized in all hippocampal principal cell layers were tightly attached to calbindin(+) cells (granular and pyramidal neurons) and to calretinin(+) and parvalbumin(+) interneurons. DAGLα neuropil labeling was selectively found surrounding calbindin(+) principal cells in the dentate gyrus and CA1, and around the calretinin(+) and parvalbumin(+) interneurons in the pyramidal cell layers of the CA1/3 fields. MAGL(+) terminals were observed only around CA1 calbindin(+) pyramidal cells, CA1/3 calretinin(+) interneurons, and CA3 parvalbumin(+) interneurons localized in the pyramidal cell layers. Interestingly, calbindin(+) pyramidal cells expressed FAAH specifically in the CA1 field. The anatomically related neuronal substrates identified here, which express the 2-AG/CB1 signaling system and selective CaBPs, should be considered when analyzing cannabinoid signaling associated with hippocampal functions.
Abstract:
An adaptation technique based on the synoptic atmospheric circulation to forecast local precipitation, namely the analogue method, has been implemented for the western Swiss Alps. During the calibration procedure, relevance maps were established for the geopotential height data. These maps highlight the locations where the synoptic circulation was found to be of interest for precipitation forecasting at two rain gauge stations (Binn and Les Marécottes), both located in the alpine Rhône catchment at a distance of about 100 km from each other. These two stations are sensitive to different atmospheric circulations. We have observed that the most relevant data for the analogue method can be found where specific atmospheric circulation patterns appear concomitantly with heavy precipitation events. These skillful regions are coherent with the atmospheric flows illustrated, for example, by means of the back trajectories of air masses. Indeed, the circulation recurrently diverges from the climatology during days with strong precipitation on the southern part of the alpine Rhône catchment. We have found that of the 152 days with precipitation amounts above 50 mm at the Binn station, only 3 did not show a trajectory with a southerly flow, meaning that such a circulation was present for 98% of the events. The time evolution of the relevance maps confirms that the atmospheric circulation variables have significantly better forecasting skill close to the precipitation period, and that it seems pointless for the analogue method to consider circulation information days before a precipitation event as a primary predictor. Even though the occurrence of some critical circulation patterns leading to heavy precipitation events can be detected by precursors at remote locations and 1 week ahead (Grazzini, 2007; Martius et al., 2008), time extrapolation by the analogue method seems to be rather poor.
This would suggest, in accordance with previous studies (Obled et al., 2002; Bontron and Obled, 2005), that time extrapolation should be left to the Global Circulation Model, which can provide the atmospheric variables used by the adaptation method.
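The core of the analogue method described above can be sketched in a few lines: rank archive days by the similarity of their circulation fields to the forecast day, then use the precipitation observed on the closest analogues as an empirical predictive distribution. The RMSE similarity criterion, the quantiles reported, and the data layout below are illustrative assumptions, not the calibrated setup of the study.

```python
import numpy as np

def analogue_forecast(target_field, archive_fields, archive_precip, n_analogues=30):
    """Empirical precipitation forecast from the most similar past circulation days.

    target_field   : (ny, nx) geopotential height field for the forecast day
    archive_fields : (n_days, ny, nx) historical geopotential height fields
    archive_precip : (n_days,) precipitation observed at the target rain gauge
    """
    # Rank archive days by similarity of the synoptic circulation; RMSE over
    # the relevant spatial window is one simple choice of criterion.
    diffs = (archive_fields - target_field).reshape(len(archive_fields), -1)
    rmse = np.sqrt((diffs ** 2).mean(axis=1))
    best = np.argsort(rmse)[:n_analogues]
    # Precipitation on the analogue days forms an empirical predictive
    # distribution; report its 20/50/80% quantiles.
    return np.quantile(archive_precip[best], [0.2, 0.5, 0.8])
```

In the actual method the "relevance maps" of the abstract would determine over which grid points the similarity is computed; here the whole field is used for simplicity.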
Investigation into Improved Pavement Curing Materials and Techniques: Part 2 - Phase III, March 2003
Abstract:
Appropriate curing is important for concrete to attain its designed properties. This research was conducted to evaluate the effects of different curing materials and methods on pavement properties. At present, a sprayed curing compound is a commonly used method for pavement and other concrete structure construction. Three curing compounds were selected for testing. Two different application rates were employed for the white-pigmented liquid curing compounds. The concrete properties of temperature, moisture content, conductivity, and permeability were examined at several test locations. In this project, it was found that the concrete properties varied with depth. Of the tests conducted (maturity, sorptivity, permeability, and conductivity), conductivity appears to be the best method for evaluating curing effects in the field and holds potential for field application. The results indicated that currently approved curing materials in Iowa, when spread uniformly in a single or double application, provide adequate curing protection and meet the goals of the Iowa Department of Transportation. Experimental curing methods can be compared to this method through the use of conductivity testing to determine their suitability for field application.
Abstract:
Concrete curing is closely related to cement hydration, microstructure development, and concrete performance. Application of a liquid membrane-forming curing compound is among the most widely used curing methods for concrete pavements and bridge decks. Curing compounds are economical, easy to apply, and maintenance free. However, limited research has been done to investigate the effectiveness of different curing compounds and their application technologies. No reliable standard testing method is available to evaluate the effectiveness of curing, especially of field concrete curing. The present research investigates the effects of curing compound materials and application technologies on concrete properties, especially the properties of surface concrete. This report presents a literature review of curing technology, with an emphasis on curing compounds, and the experimental results from the first part of this research, the laboratory investigation. In the lab investigation, three curing compounds were selected and applied to mortar specimens at three different times after casting. Two application methods, single- and double-layer application, were employed. Moisture content, conductivity, sorptivity, and degree of hydration were measured at different depths of the specimens. Flexural and compressive strength of the specimens were also tested. Statistical analysis was conducted to examine the relationships between these material properties. The research results indicate that application of a curing compound significantly increased the moisture content and degree of cement hydration and reduced the sorptivity of the near-surface concrete. For given concrete materials and mix proportions, the optimal application time of curing compounds depended primarily upon weather conditions. If a sufficient amount of a high-efficiency-index curing compound was uniformly applied, no double-layer application was necessary.
Among all the test methods applied, the sorptivity test was the most sensitive, providing a good indication of the subtle changes in the microstructure of the near-surface concrete caused by different curing materials and application methods. Sorptivity measurements are closely related to moisture content and degree of hydration. The research results have established a baseline for, and provided insight into, the further development of testing procedures for the evaluation of curing compounds in the field. Recommendations are provided for further field study.
Abstract:
Severe environmental conditions, coupled with the routine use of deicing chemicals and increasing traffic volume, tend to place extreme demands on portland cement concrete (PCC) pavements. In most instances, engineers have been able to specify and build PCC pavements that met these challenges. However, there have also been reports of premature deterioration that could not be specifically attributed to a single cause. Modern concrete mixtures have evolved to become very complex chemical systems. The complexity can be attributed to both the number of ingredients used in any given mixture and the various types and sources of the ingredients supplied to any given project. Local environmental conditions can also influence the outcome of paving projects. This research project investigated important variables that impact the homogeneity and rheology of concrete mixtures. The project consisted of a field study and a laboratory study. The field study collected information from six different projects in Iowa. The information that was collected during the field study documented cementitious material properties, plastic concrete properties, and hardened concrete properties. The laboratory study was used to develop baseline mixture variability information for the field study. It also investigated plastic concrete properties using various new devices to evaluate rheology and mixing efficiency. In addition, the lab study evaluated a strategy for the optimization of mortar and concrete mixtures containing supplementary cementitious materials. The results of the field studies indicated that the quality management concrete (QMC) mixtures being placed in the state generally exhibited good uniformity and good to excellent workability. Hardened concrete properties (compressive strength and hardened air content) were also satisfactory. 
The uniformity of the raw cementitious materials used on the projects could not be monitored as closely as the investigators desired; however, the information that was gathered indicated that the bulk chemical composition of most material streams was reasonably uniform. Specific mineral phases in the cementitious materials were less uniform than the bulk chemical composition. The results of the laboratory study indicated that ternary mixtures show significant promise for improving the performance of concrete mixtures. The lab study also verified the results of prior projects indicating that bassanite is typically the major sulfate phase present in Iowa cements. This causes the cements to exhibit premature stiffening problems (false set) in laboratory testing. Fly ash helps to reduce the impact of premature stiffening because it behaves like a low-range water reducer in most instances. The premature stiffening problem can also be alleviated by increasing the water–cement ratio of the mixture and providing a remix cycle for the mixture.
Abstract:
INTRODUCTION: The presence of a pre-existing narrow spinal canal may have an important place in the etiopathogenesis of lumbar spinal stenosis. Consequently, the study of the development of the spinal canal is crucial. The first goal of this work is to conduct a comprehensive literature search and to give an essential view of the development of the spinal canal and the factors on which it depends, as studied to date. The second goal is to offer some considerations and hypothesize new leads for clinically useful research. MATERIALS AND METHODS: A bibliographical search was executed using different search engines: PubMed, Google Scholar, Ovid, and Web of Science. Free sources and those available from the University of Lausanne (UNIL) and the Centre Hospitalier Universitaire Vaudois (CHUV) were used. At the end of the bibliographic search, 114 references had been found, 85 of which were free access; 41 are cited in this work. Most of the references found are in English or in French. RESULTS AND DISCUSSION: The spinal canal is principally bounded by the vertebrae, which have a mesodermal origin. The nervous (ectodermal) tissue significantly influences the growth of the canal. The most important structure participating in spinal canal growth is the neurocentral synchondrosis along almost the entire vertebral column. The fusion of the posterior hemi-arches seems to be less important for canal size. Growth is not homogeneous but depends on the vertebral level: timing, rate, and growth potential differ by region. In the lumbar segment especially, there is a craniocaudal tendency that entails greater post-natal catch-up growth for the distal vertebrae. The trefoil shape of the L5 canal is the consequence of a sagittal growth deficiency. The spinal canal shares some developmental characteristics with different structures and systems, especially with the central nervous system, which may be a consequence of their common embryological origin.
It is supposed that not all the related structures would be affected by a growth impairment, because of their different catch-up potentials. Studies have found that narrower spinal canals might be related to cardiovascular and gastrointestinal symptoms, lower thymic function, lower bone mineral content, dental hypoplasia, and Harris lines. Anthropometric correlations found at birth disappear during the pediatric age. All factors that can affect bone and nervous growth might be relevant. Genetic predispositions are the only factors that can never be changed, but their real impact remains to be ascertained. During the antenatal period, all the elements determining a good supply of blood and oxygen may influence vertebral canal development, for example smoking during pregnancy. Diet is a crucial factor with an impact on both antenatal and postnatal growth; protein intake is the only proven dietary relationship found in the bibliographic research of this work. The mechanical effects due to changes in locomotion are unknown. Socioeconomic situation has an impact on several influencing factors, and it is difficult to study owing to numerous biases. CONCLUSIONS: Correct growth of the spinal canal is evidently relevant to preventing non-degenerative stenotic conditions, but a "congenital" narrower canal may also aggravate degenerative stenosis. This concerns specific groups of patients. If the size of the canal is strongly involved in the pathogenesis of common back pain, a hypothetical measure to prevent developmental impairments could have a non-negligible impact on society. It would be interesting to study further the dietary requirements for good spinal canal development. Understanding the relationship between nervous tissues and vertebrae might be useful in identifying what is needed for ideal development. The importance of genetics and the post-natal influence of upright standing on canal growth remain unsolved questions.
All these tracks may serve a double purpose: knowing whether it is possible to decrease the incidence of a narrower spinal canal and, consequently, finding possible preventive measures. The development of the vertebral canal is a complex subject that ranges over a wide variety of fields. Knowledge of this subject is an indispensable tool for understanding and hypothesizing the influencing factors that might lead to stenotic conditions. Unfortunately, a lack of information makes it difficult to have a complete and satisfactory interdisciplinary vision.
Abstract:
PURPOSE: The Gastro-Intestinal Working Party of the EORTC Radiation Oncology Group (GIWP-ROG) developed guidelines for target volume definition in neoadjuvant radiation of adenocarcinomas of the gastroesophageal junction (GEJ) and the stomach. METHODS AND MATERIALS: Guidelines for the definition of the clinical target volume (CTV) are based on a systematic literature review of the location and frequency of local recurrences and lymph node involvement in adenocarcinomas of the GEJ and the stomach; for this purpose, MEDLINE was searched up to August 2008. Guidelines concerning prescription, planning, and treatment delivery are based on a consensus among the members of the GIWP-ROG. RESULTS: To support a curative resection of GEJ and gastric cancer, an individualized preoperative treatment volume based on tumour location has to include the primary tumour and the draining regional lymph node area. We therefore recommend using the 2nd English Edition of the Japanese Classification of Gastric Carcinoma of the Japanese Gastric Cancer Association, which developed the concept of assigning tumours of the GEJ and the stomach to anatomically defined sub-sites, each corresponding to a distinct lymphatic spread pattern. CONCLUSION: The GIWP-ROG defined guidelines for preoperative irradiation of adenocarcinomas of the GEJ and the stomach to reduce variability in the framework of future clinical trials.
Abstract:
Two methods were evaluated for scaling a set of semivariograms into a unified function for kriging estimation of field-measured properties. Scaling is performed using the sample variances and the sills of the individual semivariograms as scale factors. Theoretical developments show that the kriging weights are independent of the scaling factor, which appears simply as a constant multiplying both sides of the kriging equations. The scaling techniques were applied to four sets of semivariograms representing spatial scales of 30 x 30 m to 600 x 900 km. The experimental semivariograms in each set successfully coalesced into a single curve when scaled by the variances and sills of the individual semivariograms. To evaluate the scaling techniques, kriged estimates derived from scaled semivariogram models were compared with those derived from unscaled models. Differences in kriged estimates on the order of 5% were found in the cases where the scaling technique did not succeed in coalescing the individual semivariograms, which also means that the spatial variability of these properties differs. The proposed scaling techniques enhance the interpretation of semivariograms when a variety of measurements are made at the same location. They also reduce computational time for kriging estimation, because the kriging weights need to be calculated for only one variable; the weights remain unchanged for all other variables in the data set whose semivariograms are scaled.
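The invariance of the kriging weights under sill or variance scaling can be checked directly: multiplying the semivariogram values on both sides of the ordinary kriging system by the same constant leaves the solved weights unchanged. A minimal sketch of the ordinary kriging system (the solver below is generic; any particular semivariogram model used to exercise it is an illustrative assumption):

```python
import numpy as np

def ordinary_kriging_weights(G, g0):
    """Solve the ordinary kriging system.

    G  : (n, n) semivariogram values between the n sample points
    g0 : (n,)   semivariogram values between each sample and the target point
    Returns the n kriging weights (the Lagrange multiplier is dropped).
    """
    n = len(g0)
    A = np.ones((n + 1, n + 1))   # augmented system; the last row and column
    A[:n, :n] = G                 # enforce that the weights sum to one
    A[-1, -1] = 0.0
    b = np.append(g0, 1.0)
    w = np.linalg.solve(A, b)
    return w[:n]
```

Scaling G and g0 by a sill s multiplies both sides of the first n equations by s and leaves the unit-sum constraint untouched, so the weights are identical; this is the property the abstract exploits to krige many variables with one set of weights.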
Abstract:
The corpus callosum (CC) plays a crucial role in interhemispheric communication. It has been shown that CC formation relies on guidepost cells located in the midline region, which include glutamatergic and GABAergic neurons as well as glial cells. However, the origin of these guidepost GABAergic neurons and their precise function in callosal axon pathfinding remain to be investigated. Here, we show that two distinct GABAergic neuronal subpopulations converge toward the midline prior to the arrival of callosal axons. Using in vivo and ex vivo fate mapping, we show that CC GABAergic neurons originate in the caudal and medial ganglionic eminences (CGE and MGE) but not in the lateral ganglionic eminence (LGE). Time-lapse imaging on organotypic slices and in vivo analyses further revealed that CC GABAergic neurons contribute to the normal navigation of callosal axons. The use of Nkx2.1 knockout (KO) mice confirmed a role for these neurons in maintaining the proper behavior of callosal axons while they grow through the CC. Indeed, using in vitro transplantation assays, we demonstrated that both MGE- and CGE-derived GABAergic neurons exert an attractive activity on callosal axons. Furthermore, by combining a sensitive RT-PCR technique with in situ hybridization, we demonstrate that CC neurons express multiple short- and long-range guidance cues. This study strongly suggests that MGE- and CGE-derived interneurons may guide CC axons through multiple guidance mechanisms and signaling pathways. © 2013 Wiley Periodicals, Inc. Develop Neurobiol 73: 647-672, 2013.
Abstract:
General Summary: Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity enhancing. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much, and how fast, did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize the results in Google Earth. A short summary of each of the five chapters is provided below.
The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide, and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, for which we have no a priori expectations about the signs of these effects. Popular fears about the trade effects of differences in environmental regulations might therefore be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First, we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labor that vary across countries, periods, and manufacturing sectors. We then use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique, and composition effects). We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes.
Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. In a first experiment, we construct a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (ignoring price effects) to compute a static first-order trade effect. This effect increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, the effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical emission levels are obtained by reallocating labour across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that actual emissions are 90% below the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings of this chapter accord with those of chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", is a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension.
This allows us to write present productivity formally as a function of past productivity and of other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic, and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions. First, it provides new estimates of orders of magnitude for the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it tracks, in a geometrically rigorous way, the path of the world's economic center of gravity.
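The physical center-of-mass computation mentioned for the last chapter can be sketched as follows: convert each city's latitude and longitude to a 3-D Cartesian unit vector, take the weighted mean, and project the result back onto the sphere. This is one standard construction, not necessarily the exact procedure of the thesis, and the weights (city GDP, population) are whatever the user supplies.

```python
import numpy as np

def center_of_gravity(lat, lon, weight):
    """Weighted center of mass of points on the Earth's surface.

    lat, lon : coordinates in degrees
    weight   : e.g. city GDP (economic) or population (demographic)
    Returns (latitude, longitude) of the projected center, in degrees.
    """
    lat, lon = np.radians(lat), np.radians(lon)
    # Cartesian unit vectors on the sphere
    xyz = np.column_stack((np.cos(lat) * np.cos(lon),
                           np.cos(lat) * np.sin(lon),
                           np.sin(lat)))
    m = np.average(xyz, axis=0, weights=weight)
    x, y, z = m / np.linalg.norm(m)  # project the mean back onto the sphere
    return np.degrees(np.arcsin(z)), np.degrees(np.arctan2(y, x))
```

Working in 3-D rather than averaging raw coordinates avoids the distortions of a flat-map average (e.g. points at longitudes 179° and -179° correctly average near the date line, not near 0°).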
Abstract:
Nanotechnology has been heralded as a "revolution" in science for two reasons: first, because of its revolutionary view of the way in which chemicals and elements, such as gold and silver, behave, compared to the traditional scientific understanding of their properties; second, because of the impact of these new discoveries, as applied to commerce, which can transform daily life through consumer products ranging from sun tan lotions and cosmetics, food packaging, and paints and coatings for cars, housing, and fabrics, to medicine and thousands of industrial processes. Beneficial consumer use of nanotechnologies, already in the stream of commerce, improves coatings on inks and paints in everything from food packaging to cars. Additionally, "nanomedicine" offers the promise of diagnosis and treatment at the molecular level in order to detect and treat presymptomatic disease, or to rebuild neurons in Alzheimer's and Parkinson's disease. There is a possibility that severe complications such as stroke or heart attack may be avoided by means of prophylactic treatment of people at risk, and bone regeneration may keep many people active who never expected rehabilitation. Miniaturisation of diagnostic equipment can also reduce the amount of sampling material required for testing and medical surveillance. Miraculous developments that sound like science fiction to the people who eagerly anticipate these medical products, combined with the emerging commercial impact of nanotechnology applications to consumer products, will reshape civil society permanently. Thus, everyone within the jurisdiction of the Council of Europe is an end-user of nanotechnology, even without realising that nanotechnology has touched daily life.
Abstract:
The National Academies has stressed the need to develop quantifiable measures for methods that are currently qualitative in nature, such as the examination of fingerprints. Current protocols and procedures for performing these examinations rely heavily on a succession of subjective decisions, from the initial acceptance of evidence for probative value to the final assessment of forensic results. This project studied the concept of sufficiency associated with the decisions made by latent print examiners at the end of the various phases of the examination process. During this 2-year effort, a web-based interface was designed to capture the observations of 146 latent print examiners and trainees on 15 pairs of latent/control prints. Two main findings resulted from the study. First, the concept of sufficiency is driven mainly by the number of, and spatial relationships between, the minutiae observed on the latent and control prints; the data indicate that demographics (training, certification, years of experience) and non-minutiae-based features (such as level 3 features) do not play a major role in examiners' decisions. Second, significant variability was observed in the detection and interpretation of friction ridge features at all levels of detail, as well as for factors that have the potential to influence the examination process, such as degradation, distortion, or the influence of the background and the development technique.
Abstract:
(from the journal abstract) Scientific interest in the concept of alliance has been maintained and stimulated by repeated findings that a strong alliance is associated with a facilitative treatment process and favourable treatment outcome. However, because the alliance is not in itself a therapeutic technique, these findings have not brought about significant improvements in clinical practice. An essential issue in modern psychotherapy research concerns the relation between common factors, which are known to explain great variance in empirical results, and the specific therapeutic techniques that are the primary basis of clinical training and practice. This pilot study explored sequences of therapist interventions over four sessions of brief psychodynamic investigation. It aims to determine whether patterns of interventions can be found during brief psychodynamic investigation and whether these patterns can be associated with differences in the therapeutic alliance. Therapist interventions were coded using the Psychodynamic Intervention Rating Scale (PIRS), which classifies each therapist utterance into one of nine categories of interpretive interventions (defence interpretation, transference interpretation), supportive interventions (question, clarification, association, reflection, supportive strategy), or interventions about the therapeutic frame (work-enhancing statement, contractual arrangement). Data analysis was done using lag sequential analysis, a statistical procedure that identifies contingent relationships in time among a large number of behaviours. The sample comprises N = 20 therapist-patient dyads assigned to three groups: (1) a high and stable alliance profile, (2) a low and stable alliance profile, and (3) an improving alliance profile. Results suggest that therapists most often have one single intention when interacting with patients.
Large sequences of questions, associations, and clarifications were found, indicating that if a therapist asks a question, clarifies, or associates, there is a significant probability that he will continue doing so. A single-theme sequence involving frame interventions was also observed. These sequences were found in all three alliance groups. One exception was found for mixed sequences of interpretations and supportive interventions: the simultaneous use of these two interventions was associated with a high or an improving alliance over the course of treatment, but not with a low and stable alliance, where only single-theme sequences of interpretations were found. In other words, in this last group, therapists were either supportive or interpretive, whereas with a high or improving alliance, interpretations were always given along with supportive interventions. This finding provides evidence that examining therapist interpretations individually can yield only incomplete findings; how interpretations are given is important for alliance building. It also suggests that therapists should carefully dose their interpretations and be supportive when necessary in order to build a strong therapeutic alliance. From a research point of view, studying technical interventions requires looking into dynamic variables such as dosage, the supportive quality of an intervention, and timing. (PsycINFO Database Record (c) 2005 APA, all rights reserved)
Abstract:
This report covers the construction in 1961 of the soil-cement base and related pavement structure on Iowa 37 from Soldier to Dunlap (F-861(6); Crawford, Harrison, and Monona counties). The report also contains an account of the experimental work performed on the same road under research project HR-75.
Abstract:
The Falling Weight Deflectometer (FWD) has become the "standard" for deflection testing of pavements. Iowa has used a Road Rater since 1976 to obtain deflection information. A correlation between the Road Rater and the FWD was needed if Iowa was to continue with the Road Rater. Comparative deflection testing was done using a Road Rater Model 400 and a Dynatest 8000 FWD on 26 pavement sections. The SHRP contractor, Braun Intertec Pavement, Inc., provided the FWD testing. The r^2 values for the linear correlations ranged from 0.90 to 0.99 for the different pavement types and sensor locations.
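An r^2 value like those reported comes from an ordinary least-squares fit of one device's readings against the other's. A minimal sketch of that computation (the pairing of readings and any units are assumptions; the source does not give the raw data):

```python
import numpy as np

def deflection_correlation(road_rater, fwd):
    """Least-squares line fwd ≈ a * road_rater + b and its r^2 value.

    road_rater, fwd : paired deflection readings from the two devices,
    e.g. taken at the same sensor location on the same pavement section.
    """
    rr = np.asarray(road_rater, dtype=float)
    fw = np.asarray(fwd, dtype=float)
    a, b = np.polyfit(rr, fw, 1)              # slope and intercept of the fit
    ss_res = ((fw - (a * rr + b)) ** 2).sum() # residual sum of squares
    ss_tot = ((fw - fw.mean()) ** 2).sum()    # total sum of squares
    return a, b, 1.0 - ss_res / ss_tot        # r^2 = 1 - SSres/SStot
```

In practice one fit would be computed per pavement type and sensor location, which is how a range of r^2 values (0.90 to 0.99) arises from a single test program.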