926 results for precision experiment
Abstract:
The concept of metacontingency was taught to undergraduate Psychology students using a "game" simulation originally proposed by Vichi, Andery and Glenn (2009). Twenty-five students, distributed into three groups, were exposed to six experimental sessions in which they had to make bets and divide the amounts gained. The three groups competed against each other for photocopy quotas. Two contingencies alternated over the sessions. Under Contingency B, the group would win points only if, in the previous round, each member had received the same amount of points, whereas under Contingency A winning was contingent on an unequal distribution of the points. We observed that proportional divisions predominated independently of the contingency in effect. The manipulation of cultural consequences (winning or losing points) produced consistent modifications in two response categories: 1) choices of the amount bet in each round, and 2) divisions of the points among group members. Controlling relations between cultural consequences and the behavior of dividing were statistically significant in one of the groups, whereas in the other two groups controlling relations were observed only under Contingency B. A review of the reinforcement criteria used in the original experiment is suggested.
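The alternating winning rule described above can be sketched as a minimal simulation. The function name and point values below are illustrative only, not taken from the original study:

```python
def group_wins(contingency, previous_division):
    """Return True if the group earns points under the stated rule.

    contingency: 'B' requires an equal split in the previous round;
    'A' requires an unequal split.
    previous_division: points each member received in the previous round.
    """
    equal_split = len(set(previous_division)) == 1
    return equal_split if contingency == 'B' else not equal_split

# Under Contingency B an equal division wins; under A it loses.
print(group_wins('B', [10, 10, 10]))  # True
print(group_wins('A', [10, 10, 10]))  # False
```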
Abstract:
A measurement of the multi-strange Xi- and Omega- baryons and their antiparticles by the ALICE experiment at the CERN Large Hadron Collider (LHC) is presented for inelastic proton-proton collisions at a centre-of-mass energy of 7 TeV. The transverse momentum (pT) distributions were studied at mid-rapidity (|y| < 0.5) in the range 0.6 < pT < 8.5 GeV/c for Xi- and anti-Xi+ baryons, and in the range 0.8 < pT < 5 GeV/c for Omega- and anti-Omega+. Baryons and antibaryons were measured as separate particles, and we find that the baryon-to-antibaryon ratio of both particle species is consistent with unity over the entire range of the measurement. The statistical precision of the current data has allowed us to measure a difference between the mean pT of Xi- (anti-Xi+) and Omega- (anti-Omega+). Particle yields, mean pT, and the spectra in the intermediate pT range are not well described by the PYTHIA Perugia 2011 tune of the Monte Carlo event generator, which has been tuned to reproduce the early LHC data. The discrepancy is largest for Omega- (anti-Omega+). This PYTHIA tune approaches the pT spectra of Xi- and anti-Xi+ baryons for pT < 0.85 GeV/c and describes the Xi- and anti-Xi+ spectra for pT > 6.0 GeV/c. We also illustrate the difference between the experimental data and the model by comparing the corresponding ratios of (Omega- + anti-Omega+)/(Xi- + anti-Xi+) as a function of transverse mass. (C) 2012 CERN. Published by Elsevier B.V. All rights reserved.
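For readers unfamiliar with the variable used in the final comparison, transverse mass is defined as m_T = sqrt(pT^2 + m^2). The sketch below uses approximate PDG baryon masses and is purely illustrative, not part of the ALICE analysis:

```python
import math

# Approximate baryon masses in GeV/c^2 (illustrative PDG values)
M_XI = 1.32171      # Xi-
M_OMEGA = 1.67245   # Omega-

def transverse_mass(pt, mass):
    """m_T = sqrt(pT^2 + m^2) for a given transverse momentum and mass."""
    return math.sqrt(pt ** 2 + mass ** 2)

# At the same pT, the heavier Omega- has the larger transverse mass.
print(transverse_mass(2.0, M_XI) < transverse_mass(2.0, M_OMEGA))  # True
```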
Abstract:
The continued growth of large cities is producing increasing volumes of urban sewage sludge. Disposing of this waste without damaging the environment requires careful management. The application of large quantities of biosolids (treated sewage sludge) to agricultural lands for many years may result in the excessive accumulation of nutrients like phosphorus (P) and thereby raise the risk of eutrophication in nearby water bodies. We evaluated the fractionation of P in samples of an Oxisol collected as part of a field experiment in which biosolids were added at three rates to a maize (Zea mays L.) plantation over four consecutive years. The biosolids treatments were equivalent to one, two and four times the recommended N rate for maize crops. In a fourth treatment, mineral fertilizer was applied at the rate recommended for maize. Inorganic P forms were extracted with ammonium chloride to remove soluble and loosely bound P; P bound to aluminum oxide (P-Al) was extracted with ammonium fluoride; P bound to iron oxide (P-Fe) was extracted with sodium hydroxide; and P bound to calcium (P-Ca) was extracted with sulfuric acid. Organic P was calculated as the difference between total P and inorganic P. The predominant fraction of P was P-Fe, followed by P-Al and P-Ca. P fractions were positively correlated with the amounts of P applied, except for P-Ca. The low values of P-Ca were due to the advanced weathering processes to which the Oxisol has been subjected, under which forms of P-Ca are converted to P-Fe and P-Al. Fertilization with P via biosolids increased P availability for maize plants even when a large portion of P was converted to more stable forms. Phosphorus content in maize leaves and grains was positively correlated with P fractions in soils.
From these results it can be concluded that the application of biosolids in highly weathered tropical clayey soils for many years, even above the recommended rate based on N requirements for maize, tends to be less potentially hazardous to the environment than in less weathered sandy soils, because the non-readily available P fractions predominate after the addition of biosolids. (C) 2012 Elsevier B.V. All rights reserved.
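The bookkeeping behind the fractionation scheme is simple: organic P is obtained by difference between total P and the summed inorganic fractions. A minimal sketch, with entirely hypothetical concentrations (mg/kg):

```python
# Hypothetical sequential-extraction results, mg/kg (illustrative only).
extracted = {
    'soluble': 2.0,    # ammonium chloride extract
    'P-Al': 55.0,      # ammonium fluoride extract
    'P-Fe': 120.0,     # sodium hydroxide extract
    'P-Ca': 15.0,      # sulfuric acid extract
}
total_p = 260.0

inorganic_p = sum(extracted.values())
organic_p = total_p - inorganic_p  # organic P by difference, as in the study
print(inorganic_p, organic_p)
```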
Abstract:
The Carr-Purcell pulse sequence, with a low refocusing flip angle, produces echoes midway between refocusing pulses that decay to a minimum value dependent on T2*. When the refocusing flip angle was pi/2 (CP90) and tau > T2*, the signal after the minimum value increased to reach a steady-state free precession (SSFP) regime, composed of a free induction decay signal after each pulse and an echo before the next pulse. When tau < T2*, the signal increased from the minimum value to the steady-state regime with a time constant T* = 2T1T2/(T1 + T2), identical to the time constant observed in the SSFP sequence known as continuous wave free precession (CWFP). The steady-state amplitude, given by M_CP90 = M0T2/(T1 + T2), was identical to that of CWFP. Therefore, this sequence was named CP-CWFP, because it is a Carr-Purcell sequence that produces results similar to CWFP. However, CP-CWFP is a better sequence for measuring the longitudinal and transverse relaxation times in a single scan when the sample exhibits T1 similar to T2. Therefore, this sequence can be a useful method in time-domain NMR and can be widely used in the agriculture, food and petrochemical industries, because those samples tend to have similar relaxation times in low magnetic fields. (C) 2011 Elsevier Inc. All rights reserved.
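The two relations quoted in the abstract can be written directly in code. This is a minimal sketch of the formulas only, not the acquisition or fitting software:

```python
def cwfp_time_constant(t1, t2):
    """Apparent time constant of the approach to steady state:
    T* = 2*T1*T2 / (T1 + T2)."""
    return 2 * t1 * t2 / (t1 + t2)

def cwfp_amplitude(m0, t1, t2):
    """Steady-state amplitude: M = M0*T2 / (T1 + T2)."""
    return m0 * t2 / (t1 + t2)

# When T1 == T2 (common at low field), T* equals T1 and the amplitude is M0/2.
print(round(cwfp_time_constant(0.1, 0.1), 6))   # 0.1
print(round(cwfp_amplitude(1.0, 0.1, 0.1), 6))  # 0.5
```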
Abstract:
Background: The expression of the glucocorticoid receptor (GR) seems to be a key mechanism in the regulation of glucocorticoid (GC) sensitivity and is potentially involved in cases of GC resistance or hypersensitivity. The aim of this study is to describe a method for quantitation of GR alpha isoform (GRα) expression using real-time PCR (qrt-PCR), with analytical capabilities to monitor patients, offering standard-curve reproducibility as well as intra- and inter-assay precision. Results: Standard curves were constructed by employing standardized Jurkat cell culture procedures, both for GRα and for BCR (breakpoint cluster region) as a normalizing gene. We evaluated standard curves using five different sets of cell culture passages, RNA extraction, reverse transcription, and qrt-PCR quantification. Intra-assay precision was evaluated using 12 replicates of each gene, for 2 patients, in a single experiment. Inter-assay precision was evaluated in 8 experiments, using duplicate tests of each gene for two patients. Standard curves were reproducible, with a CV (coefficient of variation) of less than 11% and Pearson correlation coefficients above 0.990 for most comparisons. Intra-assay and inter-assay CVs were 2% and 7%, respectively. Conclusion: This is the first method for quantitation of GRα expression with technical characteristics that permit patient monitoring in a fast, simple and robust way.
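Intra- and inter-assay precision of the kind reported here is conventionally expressed as a coefficient of variation. A minimal sketch with hypothetical replicate expression values:

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = 100 * sample standard deviation / mean,
    the usual figure of merit for assay precision."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical normalized expression values from replicate measurements
replicates = [1.02, 0.98, 1.05, 0.95]
print(round(coefficient_of_variation(replicates), 1))  # 4.4
```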
Abstract:
Variable rate sprinklers (VRS) have been developed to promote localized water application in irrigated areas. In precision irrigation, VRS that permit fine control of flow adjustment and, at the same time, provide satisfactory radial distribution profiles for various pressures and flow rates are really necessary. The objective of this work was to evaluate the performance and radial distribution profiles of a developed VRS which varies the nozzle cross-sectional area by moving a pin in or out using a stepper motor. Field tests were performed under different conditions of service pressure, rotation angles imposed on the pin and flow rate, which resulted in maximal water throw radii ranging from 7.30 to 10.38 m. In the experiments in which the service pressure remained constant, the maximal throw radius varied from 7.96 to 8.91 m. Averages of repetitions performed under conditions without wind, or with winds below 1.3 m s-1, were used. The VRS with the four-stream deflector resulted in a greater water application throw radius compared to the six-stream deflector. However, the six-stream deflector produced greater precipitation intensities, as well as better distribution. Thus, selection of the deflector to be utilized should be based on project requirements, respecting the difference in the obtained results. With a small opening of the nozzle, the VRS produced small water droplets that visually showed applicability for foliar chemigation. Regarding the comparison between the estimated and observed flow rates, the stepper motor produced excellent results.
Abstract:
The objective of this study was to evaluate the accuracy, precision and robustness of two methods of obtaining silage samples, in comparison with extraction of liquor by manual screw-press. Wet brewery residue, alone or combined with soybean hulls and citrus pulp, was ensiled in laboratory silos. Liquor was extracted by a manual screw-press and a 2-mL aliquot was fixed with 0.4 mL formic acid. Two 10-g silage samples from each silo were diluted in 20 mL deionized water or 17% formic acid solution (alternative methods). Aliquots obtained by the three methods were used to determine the silage contents of fermentation end-products. The accuracy of the alternative methods was evaluated by comparing the mean bias of estimates obtained by manual screw-press and by the alternative methods, whereas precision was assessed by the root mean square prediction error and the residual error. Robustness was determined by studying the interaction between bias and chemical components, pH, in vitro dry matter digestibility (IVDMD) and buffer capacity. The 17% formic acid method was more accurate for estimating acetic, butyric and lactic acids, although it resulted in slight overestimates of propionic acid and underestimates of ethanol. The deionized water method overestimated acetic and propionic acids and slightly underestimated ethanol. The 17% formic acid method was more precise than deionized water for estimating all organic acids and ethanol. The robustness of each method with respect to variation in silage chemical composition, IVDMD and pH depends on the fermentation end-product under evaluation. The robustness of the alternative methods seems to be critical for the determination of lactic acid and ethanol contents.
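The accuracy and precision statistics named above, mean bias and root mean square prediction error, can be sketched as follows. The silage values are hypothetical, for illustration only:

```python
import math

def mean_bias(reference, alternative):
    """Mean of (alternative - reference): accuracy of the alternative method."""
    return sum(a - r for r, a in zip(reference, alternative)) / len(reference)

def rmspe(reference, alternative):
    """Root mean square prediction error: overall precision of the method."""
    return math.sqrt(sum((a - r) ** 2
                         for r, a in zip(reference, alternative)) / len(reference))

# Hypothetical lactic acid contents: screw-press reference vs. alternative method
ref = [4.0, 5.2, 3.8]
alt = [4.1, 5.0, 3.9]
print(mean_bias(ref, alt), rmspe(ref, alt))
```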
Abstract:
This work investigated the effects of frequency and precision of feedback on the learning of a dual motor task. One hundred and twenty adults were randomly assigned to six groups with different knowledge of results (KR) frequency (100%, 66% or 33%) and precision (specific or general) levels. In the stabilization phase, participants performed the dual task (a combination of linear positioning and manual force control) with the provision of KR. Ten non-KR adaptation trials were then performed for the same task, but with the introduction of an opposing electromagnetic traction force. The analysis showed a significant main effect for frequency of KR. The participants who received KR in 66% of the stabilization trials showed better adaptation performance than those who received it in 100% or 33%. This finding reinforces that there is an optimal level of information, neither too high nor too low, for motor learning to be effective.
Abstract:
The Amazon basin is a region of constant scientific interest due to its environmental importance and the global relevance of its biodiversity and climate. The seasonal variations in water volume are one example of the topics studied nowadays. In general, the variations in river levels depend primarily on the climate and on the physical characteristics of the corresponding basins. The main factor which influences the water level in the Amazon basin is the intensive rainfall over this region, a consequence of the humidity of the tropical climate. Unfortunately, the Amazon basin is an area lacking water level information, due to difficulties of access for local operations. The purpose of this study is to compare and evaluate the Equivalent Water Height (Ewh) from the GRACE (Gravity Recovery And Climate Experiment) mission, in order to study the connection between water loading and vertical variations of the crust due to the hydrologic cycle. To achieve this goal, the Ewh is compared with in-situ information from limnimeters. For the analysis, we computed the correlation coefficients, phase and amplitude of the GRACE Ewh solutions and the in-situ data, as well as the timing of periods of drought in different parts of the basin. The results indicated that vertical variations of the lithosphere due to water mass loading can reach 5 to 7 cm per year in the sedimentary and flooded areas of the region, where water level variations can reach 8 to 10 m.
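The correlation analysis between GRACE Ewh and limnimeter series reduces to Pearson's r on paired time series. A self-contained sketch, with hypothetical monthly values:

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical monthly GRACE Ewh (cm) vs. limnimeter water level (m)
ewh = [10, 25, 40, 30, 15, 5]
level = [2.0, 4.5, 7.8, 6.0, 3.1, 1.2]
print(round(pearson_r(ewh, level), 3))
```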
Abstract:
Cirrus clouds are an interesting topic in atmospheric research due to their behavior and their effect on the Earth's radiation budget. They can affect the atmospheric radiation budget by reflecting the incoming solar radiation and absorbing the outgoing terrestrial radiation. This cloud type is also involved in the dehydration of the upper troposphere and lower stratosphere. It is therefore of interest to increase ground-based measurements of this type of cloud. During November and December 2012, as part of the CHUVA-SUL campaign, lidar measurements were conducted in Santa Maria, Rio Grande do Sul. The system installed at the Santa Maria site (29.8 °S; 53.7 °W, 100 m asl) was a single elastic-backscatter lidar operating at a wavelength of 532 nm. Cirrus clouds were detected in the lidar measurements on several days. Four days with the presence of cirrus clouds are shown in the present study. These days, 7, 8, 19 and 28 November 2012, were selected due to the persistence of cirrus clouds over many hours. The raw lidar signals and the inverted backscatter coefficient profiles were analyzed for the selected days. Base and top heights were obtained by analysis of the raw signal and the backscatter coefficient. Extinction coefficient profiles were obtained by assuming a lidar ratio. Cirrus cloud optical depth (COD) values were calculated from the integration of the extinction coefficient between the base and top altitudes of the cirrus clouds.
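The COD calculation described in the last sentence is a numerical integration of the extinction coefficient over the cloud layer. A minimal sketch using the trapezoidal rule; the profile values below are hypothetical:

```python
def cirrus_optical_depth(heights_m, extinction_per_m):
    """COD = integral of the extinction coefficient between cloud base and top,
    approximated here with the trapezoidal rule."""
    cod = 0.0
    for i in range(1, len(heights_m)):
        dz = heights_m[i] - heights_m[i - 1]
        cod += 0.5 * (extinction_per_m[i] + extinction_per_m[i - 1]) * dz
    return cod

heights = [9000, 9500, 10000, 10500]   # m, cloud base to top (hypothetical)
alpha = [0.0, 4e-5, 4e-5, 0.0]         # m^-1, extinction coefficient profile
print(round(cirrus_optical_depth(heights, alpha), 4))  # 0.04
```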
Abstract:
Biomass burning represents one of the largest sources of particulate matter to the atmosphere, resulting in a significant perturbation to the Earth's radiative balance coupled with serious negative impacts on public health. Globally, biomass burning aerosols are thought to exert a small warming effect of 0.03 W m-2; however, the uncertainty is 4 times greater than the central estimate. On regional scales, the impact is substantially greater, particularly in areas such as the Amazon Basin, where large, intense and frequent burning occurs on an annual basis for several months (usually from August to October). Furthermore, a growing number of people live within the Amazon region, which means that they are subject to the deleterious health effects of exposure to substantial volumes of polluted air. Initial results from the South American Biomass Burning Analysis (SAMBBA) field experiment, which took place during September and October 2012 over Brazil, are presented here. A suite of instrumentation was flown on-board the UK Facility for Airborne Atmospheric Measurement (FAAM) BAe-146 research aircraft and was supported by ground-based measurements, with extensive measurements made in Porto Velho, Rondonia. The aircraft sampled a range of conditions, with sampling of fresh biomass burning plumes, regional haze and elevated biomass burning layers within the free troposphere. The physical, chemical and optical properties of the aerosols across the region will be characterized in order to establish the impact of biomass burning on regional air quality, weather and climate.
Abstract:
This thesis follows a substantial contribution to the realization of the CMS computing system, which can be seen as a relevant part of the experiment itself. A physics analysis completes the road from Monte Carlo production and analysis tools realization to the final physics study, which is the actual goal of the experiment. The physics topic of this thesis is the study of the fully hadronic decay of tt events in the CMS experiment. A multi-jet trigger has been provided to fix a reasonable starting point, reducing the multi-jet sample to the nominal trigger rate. An offline selection has been provided to improve the S/B ratio, and b-tagging is applied to provide a further S/B improvement. The selection is applied to the background sample and to the samples generated at different top quark masses. The top quark mass candidate is reconstructed for all those samples using a kinematic fitter. The resulting distributions are used to build p.d.f.'s, interpolating them with a continuous arbitrary curve. These curves are used to perform the top mass measurement through a likelihood comparison.
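The final step, comparing likelihoods of the reconstructed mass distribution against templates built at different generated top masses, can be illustrated with toy Gaussian p.d.f.'s. All template points, widths and candidate values below are hypothetical, not from the analysis:

```python
import math

def neg_log_likelihood(candidates, mu, sigma=15.0):
    """-ln L for a Gaussian template of mean mu and width sigma (GeV)."""
    return sum(0.5 * ((x - mu) / sigma) ** 2
               + math.log(sigma * math.sqrt(2 * math.pi))
               for x in candidates)

masses = [170.0, 172.5, 175.0, 177.5]             # template mass points (toy)
candidates = [171.2, 173.8, 169.5, 174.9, 172.1]  # reconstructed masses (toy)

# Pick the template mass that minimizes -ln L over the candidate sample.
best = min(masses, key=lambda m: neg_log_likelihood(candidates, m))
print(best)  # 172.5
```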
Abstract:
Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying its compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise definition of the hypocentral parameters represents the first step in discriminating whether a given seismic event is natural or not. In case a specific event is deemed suspicious by the majority of the State Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is supposed to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test. In fact, high quality seismological systems are thought to be capable of detecting and locating very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first one, known as the double difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method are characterized by a high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For the two above-mentioned techniques, we have used cross-correlation among digital waveforms in order to minimize the errors linked with incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary tests on the reliability of our location techniques based on simulations, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using the data recorded by the IMS. At the beginning, the algorithm was applied to the differences among the original arrival times of the P phases, so the cross-correlation was not used. We found that the relevant geometrical spreading, noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, assumed as our reference), was considerably reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events for which we can suppose a real closeness among the hypocenters, belonging to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times have been removed, or at least reduced.
The introduction of the cross-correlation did not bring evident improvements to our results: the two sets of locations (without and with the application of the cross-correlation technique) are very similar to each other. This suggests that the use of the cross-correlation did not substantially improve the precision of the manual pickings. Probably the pickings reported by the IDC are good enough to make the random picking error less important than the systematic error on travel times. As a further explanation for the poor quality of the results given by the cross-correlation, it should be remarked that the events included in our data set do not generally have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller) and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to the cross-correlation, we performed a signal interpolation in order to improve the time resolution. The algorithm so developed was applied to the data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed out the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in bad SNR conditions). Another remarkable point of our procedure is that its application does not demand a long time to process the data, so the user can immediately check the results. During a field survey, this feature will make possible a quasi-real-time check, allowing the immediate optimization of the array geometry if so suggested by the results at an early stage.
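The core of the cross-correlation step, estimating the relative delay between two waveforms as the lag that maximizes their correlation, can be sketched as follows. The discrete signals and function names are toy examples, not the thesis code:

```python
def best_lag(a, b, max_lag):
    """Return the lag (in samples) of b relative to a that maximizes
    the unnormalized cross-correlation."""
    def corr(lag):
        return sum(a[i] * b[i + lag] for i in range(len(a))
                   if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=corr)

pulse = [0, 0, 1, 2, 1, 0, 0, 0]
shifted = [0, 0, 0, 0, 1, 2, 1, 0]   # same pulse delayed by 2 samples
print(best_lag(pulse, shifted, 4))   # 2
```

In practice one would correlate segments around the same seismic phase and interpolate near the peak for sub-sample time resolution, as the abstract notes.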
Abstract:
Precision horticulture and spatial analysis applied to orchards are a growing and evolving part of precision agriculture technology. The aim of this discipline is to reduce production costs by monitoring and analysing orchard-derived information to improve crop performance in an environmentally sound manner. Georeferencing and geostatistical analysis, coupled with point-specific data mining, make it possible to devise and implement management decisions tailored to the single orchard. Potential applications range from the opportunity to verify in real time, along the season, the effectiveness of cultural practices in achieving the production targets in terms of fruit size, number, yield and, in the near future, fruit quality traits. These data will impact not only the pre-harvest stage; their effect will extend to the post-harvest sector of the fruit chain. Chapter 1 provides an updated overview of precision horticulture, while in Chapter 2 a preliminary spatial statistical analysis of the variability in apple orchards, before and after manual thinning, is provided; an interpretation of this variability and of how it can be managed to maximize orchard performance is offered. In Chapter 3 a stratification of spatial data into management classes, to interpret and manage spatial variation in the orchard, is undertaken. An inverse model approach is also applied to verify whether crop production explains environmental variation. In Chapter 4 an integration of the techniques adopted before is presented, offering a new key for reading the information gathered within the field. The overall goal of this Dissertation was to probe the feasibility, desirability and effectiveness of a precision approach to fruit growing, following the lines of other areas of agriculture that already adopt this management tool. As existing applications of precision horticulture have already shown, crop specificity is an important factor to be accounted for.
This work focused on apple because of its importance in the area where the work was carried out, and worldwide.
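The stratification of spatial data into management classes mentioned for Chapter 3 can be illustrated with a simple tercile split of georeferenced per-tree measurements. All values and class labels below are hypothetical:

```python
import statistics

# Hypothetical per-tree fruit counts from a georeferenced orchard survey,
# stratified into low/medium/high management classes by terciles.
counts = [80, 95, 110, 120, 135, 150, 160, 175, 190]
q = statistics.quantiles(counts, n=3)  # two tercile cut points

def management_class(value):
    if value <= q[0]:
        return 'low'
    return 'medium' if value <= q[1] else 'high'

print([management_class(c) for c in counts])
```

Real applications would interpolate such classes over the orchard map (e.g. with kriging) before tailoring inputs to each zone; this sketch shows only the classification step.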