999 results for Seismic test
Abstract:
With the use of supplementary cementing materials (SCMs) in concrete mixtures, salt scaling tests such as ASTM C672 have been found to be overly aggressive and do not correlate well with field scaling performance. This is thought to be because, at high replacement levels, SCM mixtures can take longer to set and to develop their properties; neither factor is taken into account in the standard laboratory finishing and curing procedures. These variables were therefore studied, along with a modified scaling test based on the Quebec BNQ scaling test, which had shown promise in other research. The experimental work focused on the evaluation of three scaling resistance tests: the ASTM C672 test with normal curing, the ASTM C672 test with an accelerated curing regime used by VDOT for ASTM C1202 rapid chloride permeability tests (now included as an option in ASTM C1202), and several variations on the proposed draft ASTM WK9367 deicer scaling resistance test, based on the Quebec Ministry of Transportation BNQ test method, evaluated for concretes containing varying amounts of slag cement. A total of 16 concrete mixtures were studied using both high-alkali and low-alkali cement with Grade 100 and Grade 120 slag at 0, 20, 35 and 50 percent slag replacement by mass of total cementing materials. Vinsol resin was used as the primary air entrainer, and Micro Air® was used in two replicate mixes for comparison. Based on the results of this study, a draft alternative test method to ASTM C672 is proposed.
Abstract:
Understanding and anticipating biological invasions can focus either on traits that favour species invasiveness or on features of the receiving communities, habitats or landscapes that promote their invasibility. Here, we address invasibility at the regional scale, testing whether some habitats and landscapes are more invasible than others by fitting models that relate alien plant species richness to various environmental predictors. We use a multi-model information-theoretic approach to assess invasibility by modelling spatial and ecological patterns of alien invasion in landscape mosaics and testing competing hypotheses of environmental factors that may control invasibility. Because invasibility may be mediated by particular characteristics of invasiveness, we classified alien species according to their C-S-R plant strategies. We illustrate this approach with a set of 86 alien species in Northern Portugal. We first focus on predictors influencing species richness and expressing invasibility and then evaluate whether distinct plant strategies respond to the same or different groups of environmental predictors. We confirmed climate as a primary determinant of alien invasions and as a primary environmental gradient determining landscape invasibility. The effects of secondary gradients were detected only when the area was sub-sampled according to predictions based on the primary gradient. Then, multiple predictor types influenced patterns of alien species richness, with some types (landscape composition, topography and fire regime) prevailing over others. Alien species richness responded most strongly to extreme land management regimes, suggesting that intermediate disturbance induces biotic resistance by favouring native species richness. Land-use intensification facilitated alien invasion, whereas conservation areas hosted few invaders, highlighting the importance of ecosystem stability in preventing invasions. Plants with different strategies exhibited different responses to environmental gradients, particularly when the variations of the primary gradient were narrowed by sub-sampling. Such differential responses of plant strategies suggest using distinct control and eradication approaches for different areas and alien plant groups.
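As a rough illustration of the multi-model information-theoretic step described above, the sketch below fits a few candidate Poisson regressions of alien species richness on different groups of environmental predictors and ranks them by Akaike weights. The predictor names, candidate sets and data are invented placeholders, not the study's variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
data = pd.DataFrame({
    "richness": rng.poisson(5, n),       # alien species richness per landscape cell
    "climate": rng.normal(size=n),       # e.g. a temperature/precipitation axis
    "topography": rng.normal(size=n),    # e.g. elevation or slope
    "fire": rng.normal(size=n),          # fire regime descriptor
    "landuse": rng.normal(size=n),       # land-management intensity
})

# Competing hypotheses, each expressed as a candidate predictor set.
candidates = {
    "climate only": "richness ~ climate",
    "climate + topography": "richness ~ climate + topography",
    "climate + fire": "richness ~ climate + fire",
    "full model": "richness ~ climate + topography + fire + landuse",
}

# Fit each candidate as a Poisson GLM and rank the hypotheses by Akaike weight.
aic = pd.Series({name: smf.glm(f, data, family=sm.families.Poisson()).fit().aic
                 for name, f in candidates.items()})
delta = aic - aic.min()
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()
print(pd.DataFrame({"AIC": aic, "dAIC": delta, "weight": weights}).sort_values("AIC"))
```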
Abstract:
This study proposes a new concept for upscaling local information on failure surfaces derived from geophysical data, in order to develop the spatial information and quickly estimate the magnitude and intensity of a landslide. A new vision of seismic interpretation on landslides is also demonstrated by taking into account basic geomorphic information with a numerical method based on the Sloping Local Base Level (SLBL). The SLBL is a generalization of the base level defined in geomorphology applied to landslides, and allows the calculation of the potential geometry of the landslide failure surface. This approach was applied to a large-scale landslide formed mainly in gypsum and situated in a former glacial valley along the Rhone within the Western European Alps. Previous studies identified the existence of two sliding surfaces that may continue below the level of the valley. In this study, seismic refraction-reflection surveys were carried out to verify the existence of these failure surfaces. The analysis of the seismic data provides a four-layer model in which three velocity layers (<1000 m s⁻¹, 1500 m s⁻¹ and 3000 m s⁻¹) are interpreted as the mobilized mass at different levels of weathering and compaction. The highest-velocity layer (>4000 m s⁻¹), with a maximum depth of ~58 m, is interpreted as the stable anhydrite bedrock. Two failure surfaces were interpreted from the seismic surveys: an upper one and a much deeper one (about 25 and 50 m deep, respectively). The depth of the upper failure surface deduced from geophysics differs slightly from the results obtained using the SLBL, and the depth of the deeper failure surface calculated with the SLBL method is underestimated in comparison with the geophysical interpretations. Optimal results were therefore obtained by including the seismic data in the SLBL calculations according to the geomorphic limits of the landslide (maximal volume of mobilized mass = 7.5 × 10⁶ m³).
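The SLBL calculation referred to above can be illustrated with a minimal one-dimensional sketch: each node of a topographic profile is iteratively lowered to the mean of its neighbours (minus an optional tolerance) until convergence, giving a smooth candidate failure surface whose difference from the topography approximates the mobilized mass. The profile, spacing and tolerance below are invented for illustration and are not the study's data or exact implementation.

```python
import numpy as np

def slbl_1d(z, tolerance=0.0, max_iter=10000, eps=1e-6):
    """Iteratively compute an SLBL-type surface for a 1-D elevation profile z."""
    s = z.astype(float).copy()
    for _ in range(max_iter):
        target = 0.5 * (s[:-2] + s[2:]) - tolerance   # neighbour mean at each interior node
        lowered = np.minimum(s[1:-1], target)          # only lower, never raise; endpoints stay fixed
        change = np.max(np.abs(s[1:-1] - lowered))
        s[1:-1] = lowered
        if change < eps:
            break
    return s

# Toy valley-flank profile (elevations in m along a section, 10 m node spacing).
z = np.array([900, 880, 865, 855, 850, 848, 845, 830, 810, 800], dtype=float)
surface = slbl_1d(z, tolerance=0.1)
dx = 10.0                                  # horizontal node spacing (m)
area = np.sum(z - surface) * dx            # cross-sectional area of the mobilized mass (m^2)
print(surface.round(1), f"area ≈ {area:.0f} m²")
```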
Abstract:
Seismic methods used in the study of snow avalanches may be employed to detect and characterize landslides and other mass movements, using standard spectrogram/sonogram analysis. For snow avalanches, the spectrogram for a station that is approached by a sliding mass exhibits a triangular time/frequency signature due to an increase over time in the higher-frequency constituents. Recognition of this characteristic footprint in a spectrogram suggests a useful metric for identifying other mass-movement events such as landslides. The 1 June 2005 slide at Laguna Beach, California, is examined using data obtained from the Caltech/USGS Regional Seismic Network. This event exhibits the same general spectrogram features observed in studies of Alpine snow avalanches. We propose that these features arise from the systematic relative increase in high-frequency energy reaching a seismometer in the path of a mass slide as the distance to the source decreases: a path effect, since high frequencies are less attenuated along shorter source-receiver paths. Entrainment of material in the course of the slide may also contribute to the triangular time/frequency signature by increasing the energy involved in the process; in this case the contribution would be a source effect. By applying this commonly observed characteristic to routine monitoring algorithms, along with custom adjustments for local site effects, we seek to contribute to the improvement of automatic detection and monitoring methods for landslides and other mass movements.
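A hedged sketch of the spectrogram step described above, applied to a synthetic signal rather than to Caltech/USGS network data: the triangular time/frequency signature shows up as energy whose upper frequency limit rises with time, and a crude detector can test for that upward trend. The sampling rate, chirp parameters and 90th-percentile heuristic are illustrative assumptions.

```python
import numpy as np
from scipy import signal

fs = 100.0                                  # sampling rate (Hz), typical of regional stations
t = np.arange(0, 60, 1 / fs)                # 60 s record
rng = np.random.default_rng(1)

# Toy "approaching source": energy sweeping upward from ~2 Hz to ~30 Hz plus noise,
# mimicking reduced attenuation of high frequencies over shortening paths.
x = signal.chirp(t, f0=2, t1=60, f1=30) * np.hanning(t.size) + 0.2 * rng.normal(size=t.size)

f, tt, Sxx = signal.spectrogram(x, fs=fs, nperseg=256, noverlap=192)

# Crude check for the triangular footprint: track the frequency below which 90%
# of each time slice's power lies and test for a sustained upward trend.
cum = np.cumsum(Sxx, axis=0)
f90 = f[np.argmax(cum >= 0.9 * cum[-1], axis=0)]
trend = np.polyfit(tt, f90, 1)[0]
print(f"upper-envelope frequency trend ≈ {trend:.2f} Hz/s (positive suggests an approaching source)")
```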
Abstract:
After a rockfall event, a typical post-event survey includes qualitative volume estimation, trajectory mapping and determination of detachment zones; quantitative measurements, however, are not usually made. Additional quantitative information could be useful in determining the spatial occurrence of rockfall events and in quantifying their size. Seismic measurements are suitable for detection purposes since they are non-invasive and relatively inexpensive, and seismic techniques could provide important information on rockfall size and on the location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona recorded the seismic data generated by an artificially triggered rockfall at the Montserrat massif (near Barcelona, Spain), carried out in order to purge a slope. Two three-component seismic stations were deployed in the area, about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated at 75 m³ from laser scanner data. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted the ground at different locations. The blocks fell onto a terrace 120 m below the release zone, and the impact generated a small continuous mass movement composed of a mixture of rocks, sand and dust that ran down the slope and impacted the road 60 m below. Time, time-frequency and particle motion analyses of the seismic records were performed, together with an estimation of the seismic energy. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion plot analysis shows that locating the rock impact using two stations is feasible; (4) the feasibility and validity of seismic methods for the detection, localization and size determination of rockfall events are confirmed.
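One step mentioned above, locating an impact from the particle motion recorded at two stations, can be sketched as follows: the dominant horizontal polarization at each station is estimated by principal-component analysis of the north and east traces, and the two bearings are intersected. The synthetic traces, station geometry and polarization angles are placeholders, not the Montserrat data, and the 180° ambiguity of particle motion is handled only crudely here by folding bearings to 0-180°.

```python
import numpy as np

def back_azimuth(north, east):
    """Azimuth (degrees from north) of the dominant horizontal particle motion."""
    cov = np.cov(np.vstack([north, east]))
    eigvals, eigvecs = np.linalg.eigh(cov)
    n, e = eigvecs[:, np.argmax(eigvals)]          # principal axis (180° ambiguous)
    return np.degrees(np.arctan2(e, n)) % 180.0

def intersect(p1, az1, p2, az2):
    """Intersect two bearings (degrees) from points p1, p2 given as (x_east, y_north)."""
    d1 = np.array([np.sin(np.radians(az1)), np.cos(np.radians(az1))])
    d2 = np.array([np.sin(np.radians(az2)), np.cos(np.radians(az2))])
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.array(p2, float) - np.array(p1, float))
    return np.array(p1, float) + t[0] * d1

rng = np.random.default_rng(2)
sig = rng.normal(size=500)
# Station 1 sees motion polarized along ~60°, station 2 along ~120° (synthetic).
n1, e1 = sig * np.cos(np.radians(60)), sig * np.sin(np.radians(60))
n2, e2 = sig * np.cos(np.radians(120)), sig * np.sin(np.radians(120))
az1, az2 = back_azimuth(n1, e1), back_azimuth(n2, e2)
print("bearings (deg):", round(az1), round(az2))
print("impact estimate (m, east/north):", intersect((0, 0), az1, (200, 0), az2).round(1))
```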
Abstract:
Early detection of neural-tube defects is possible by determining alpha-fetoprotein (AFP) in maternal serum. 16,685 pregnant women were observed. Three methods for determining the "normal" range are compared. The first, already used in similar studies, makes use of a constant multiple of the median. The other two make use of robust estimates of location and scale. Their comparison shows the value of the robust methods in reducing interlaboratory variability.
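A minimal sketch of the comparison described above, with invented data: a "normal" range defined as a constant multiple of the median versus one derived from robust estimates of location and scale. Here the median and the MAD stand in for the robust estimators, and the 2.5 MoM cut-off and simulated AFP values are illustrative assumptions, not the study's figures.

```python
import numpy as np

rng = np.random.default_rng(3)
afp = rng.lognormal(mean=3.0, sigma=0.4, size=1000)    # synthetic AFP values (arbitrary units)

# Method 1: multiple-of-the-median (MoM) cut-off, as in earlier studies.
mom_cutoff = 2.5 * np.median(afp)

# Method 2: robust location/scale; the MAD is scaled by 1.4826 so that it is
# consistent with the standard deviation under normality.
med = np.median(afp)
mad = 1.4826 * np.median(np.abs(afp - med))
robust_upper = med + 3 * mad

print(f"MoM upper limit:    {mom_cutoff:.1f}")
print(f"Robust upper limit: {robust_upper:.1f}")
```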
Abstract:
The aim of this study is to answer the following three questions: (1) Is the MAST, in its French translation, applicable to the population of an internal medicine department of a university hospital in French-speaking Switzerland? (2) Does the MAST yield results consistent with the clinical diagnosis on the one hand, and with results reported in the literature on the other? (3) How can two critical cut-off values of the test be defined and chosen so as to optimize the use of the MAST in the planned comparative study? APPENDIX: literal French translation of the "Michigan Alcoholism Screening Test" (MAST); etc.
Abstract:
Several methods and algorithms have recently been proposed that allow for the systematic evaluation of simple neuron models from intracellular or extracellular recordings. Models built in this way generate good quantitative predictions of the future activity of neurons under temporally structured current injection. It is, however, difficult to compare the advantages of the various models and algorithms, since each model is designed for a different set of data. Here, we report on one of the first attempts to establish a benchmark test that permits a systematic comparison of methods and performances in predicting the activity of rat cortical pyramidal neurons. We present early submissions to the benchmark test and discuss implications for the design of future tests and of simple neuron models.
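Such a benchmark needs a way to score how well a model's predicted spike times match the recorded ones; a generic approach, sketched below with invented spike times, is to count coincidences within a small tolerance window. This is not necessarily the exact metric used by the benchmark described above.

```python
import numpy as np

def coincidences(pred, obs, delta=0.004):
    """Number of observed spikes with a predicted spike within ±delta seconds."""
    pred = np.sort(np.asarray(pred, float))
    hits = 0
    for t in obs:
        i = np.searchsorted(pred, t)
        neighbours = pred[max(i - 1, 0): i + 1]        # closest predicted spikes on either side
        if neighbours.size and np.min(np.abs(neighbours - t)) <= delta:
            hits += 1
    return hits

obs = [0.012, 0.050, 0.101, 0.180, 0.260]              # recorded spike times (s), invented
pred = [0.013, 0.055, 0.099, 0.240, 0.262]             # model-predicted spike times (s), invented
n_coinc = coincidences(pred, obs)
print(f"{n_coinc}/{len(obs)} observed spikes predicted within ±4 ms")
```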
Abstract:
The objective of the investigation was the development of a test that would readily identify the potential of an aggregate to cause D-cracking because of its susceptibility to critical saturation. A Press-Ur-Meter was modified by replacing the air chamber with a one-inch diameter plastic tube calibrated in milliliters. It was concluded that the pore index was sufficiently reliable to determine the D-cracking potential of limestone aggregates in all but a few cases where marginal results were obtained. Consistently poor or good results were always in agreement with established service records or concrete durability testing. In those instances where marginal results are obtained, the results of concrete durability testing should be considered when making the final determination of the D-cracking susceptibility of the aggregate in question. The following applications of the pore index test have been recommended for consideration: concrete durability testing should be discontinued in the evaluation of new aggregate sources with pore index results between 0 and 20 (Class 2 durability) and over 35 (Class 1 durability); composite aggregates with intermediate pore index results of 20-35 should have each stone type tested separately, to facilitate the possible removal of low-durability stone from the production process; and additional investigation should be made to evaluate the possibility of using the test to monitor and upgrade the acceptance of aggregate from sources associated with D-cracking.
Abstract:
The compressive strength of concrete is an important factor in the design of concrete structures and pavements. To assure the quality of the concrete placed at the project, concrete compressive cylinders are made at the jobsite. These cylinders undergo a destructive test to determine their compressive strength. However, it is frequently desirable to determine the compressive strength of the concrete actually in place in the structure or pavement, and for this a nondestructive test is required. A nondestructive test of concrete compressive strength should be economical, easily performed by field personnel, and capable of producing accurate, reproducible results. It should also be capable of detecting the extent of poor concrete in a pavement or structure caused by improper handling or placement, or by variations in mixing or materials.
Abstract:
Cardiovascular risk assessment might be improved with the addition of emerging new tests derived from atherosclerosis imaging, laboratory tests or functional tests. This article reviews relative risks, odds ratios, receiver operating characteristic curves, post-test risk calculations based on likelihood ratios, the net reclassification improvement and the integrated discrimination improvement. These measures serve to determine whether a new test has added clinical value on top of conventional risk testing and how this can be verified statistically. Two clinically meaningful examples illustrate the novel approaches. This article serves both as a review and as groundwork for new guidelines on cardiovascular risk prediction, taking emerging tests into account, to be proposed in the future by members of the 'Taskforce on Vascular Risk Prediction' under the auspices of the Working Group 'Swiss Atherosclerosis' of the Swiss Society of Cardiology.
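The post-test risk calculation based on likelihood ratios mentioned above can be sketched in a few lines: the pre-test probability is converted to odds, multiplied by the test's likelihood ratio, and converted back to a probability. The pre-test risk and likelihood ratios below are illustrative numbers, not values endorsed by the article or the taskforce.

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Bayes' theorem in odds form: post-test odds = pre-test odds x likelihood ratio."""
    pre_odds = pretest_prob / (1 - pretest_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Example (illustrative): intermediate 10-year risk of 15%, test with LR+ = 3 and LR- = 0.4.
for lr, label in [(3.0, "positive"), (0.4, "negative")]:
    p = posttest_probability(0.15, lr)
    print(f"{label} result: post-test risk ≈ {p:.1%}")
```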