940 results for Detector alignment and calibration methods (lasers, sources, particle-beams)


Relevance:

100.00%

Publisher:

Abstract:

SUMMARY The aim of this work was to compare, from a parasitological (Cryptosporidium spp. and Giardia duodenalis), bacteriological (total and thermotolerant coliforms) and physicochemical perspective, water sources used for drinking and for the irrigation of vegetables intended to be sold for human consumption. From January 2010 to May 2011, samples of different water sources from vegetable-producing properties were collected: 100 liters for parasitological analysis, 200 mL for bacteriological analysis, and five liters for physicochemical analysis. Water samples were vacuum-filtered with a kit containing a 1.2 µm cellulose acetate membrane filter (Millipore®, Barueri, SP, Brazil). The material retained on the membrane was mechanically extracted and analyzed by direct immunofluorescence (Merifluor® kit). Of the 20 rural properties investigated, 10 had artesian wells (40 samples), 10 had common wells (40 samples), and one had a mine (four samples), the latter contaminated by Cryptosporidium spp. Among samples from artesian wells, 90 to 130 meters deep, 42.5% were positive for total coliforms and 5.0% had abnormal coloration. Among samples from common wells, 14 to 37 meters deep, 87.5% were contaminated with total coliforms, 82.5% were positive for thermotolerant coliforms, and 12.5% had color abnormalities. We did not detect Giardia spp. or Cryptosporidium spp. in artesian or common wells. The use of artesian or common wells is an important step in controlling the spread of zoonoses, particularly Cryptosporidium spp. and Giardia spp., as is the use of artesian wells for coliform control, in the local production of vegetables to be marketed.

As an introduction to a series of articles focused on exploring particular tools and methods that bring together digital technology and historical research, this paper aims mainly to highlight and discuss to what extent those methodological approaches can improve the analytical and interpretative capabilities available to historians. At a moment when the digital world presents us with an ever-increasing variety of tools to extract, analyse and visualize large amounts of text, we thought it relevant to bring the digital closer to the broad historical academic community. Rather than repeating the idea of a digital revolution in historical research, recurrent in the literature since the 1980s, the aim was to show the validity and usefulness of digital tools and methods as another set of highly relevant instruments that historians should consider. To this end, several case studies were used, combining the exploration of specific themes of historical knowledge with the development or discussion of digital methodologies, in order to highlight changes and challenges that, in our opinion, are already affecting historians' work, such as a greater focus on interdisciplinarity and collaborative work, and the need for the communication of historical knowledge to become more interactive.

Within the civil engineering field, the Finite Element Method has acquired significant importance, since numerical simulations are employed in a broad range of applications encompassing the design, analysis and prediction of the structural behaviour of constructions and infrastructures. Nevertheless, these mathematical simulations can only be useful if the mechanical properties of the materials, the boundary conditions and any damage are properly modelled. Therefore, not only is experimental data (static and/or dynamic tests) required to provide reference parameters, but also robust calibration methods able to model damage or other special structural conditions. The present paper addresses the model calibration of a footbridge tested with static loads and ambient vibrations. Damage assessment was also carried out based on a hybrid numerical procedure, which combines discrete damage functions with sets of piecewise linear damage functions. Results from the model calibration show that the model reproduces the experimental behaviour of the bridge with good accuracy.
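As a highly reduced illustration of the calibration idea described above (not the paper's hybrid damage-function procedure), the sketch below tunes a single hypothetical elastic modulus so that a one-parameter beam model reproduces a "measured" natural frequency. All numbers, and the choice of a cantilever formula, are illustrative assumptions:

```python
import math

def cantilever_freq_hz(E, I, rho_A, L):
    """First natural frequency of an Euler-Bernoulli cantilever, in Hz."""
    lam = 1.8751  # first-mode eigenvalue of the cantilever
    return (lam ** 2 / (2 * math.pi)) * math.sqrt(E * I / (rho_A * L ** 4))

def calibrate_E(f_measured, I, rho_A, L, candidates):
    """Pick the elastic modulus whose predicted frequency best matches the test."""
    return min(candidates,
               key=lambda E: (cantilever_freq_hz(E, I, rho_A, L) - f_measured) ** 2)

# Hypothetical numbers (illustrative only, not data from the paper).
I, rho_A, L = 0.05, 2.0e3, 10.0          # m^4, kg/m, m
f_meas = 1.9                             # Hz, "measured" first frequency
grid = [e * 1e9 for e in range(1, 51)]   # candidate moduli in Pa
E_best = calibrate_E(f_meas, I, rho_A, L, grid)
```

Real model updating works the same way in principle, but over many parameters and many measured static and modal quantities at once.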

A search for a heavy, CP-odd Higgs boson, A, decaying into a Z boson and a 125 GeV Higgs boson, h, with the ATLAS detector at the LHC is presented. The search uses proton–proton collision data at a centre-of-mass energy of 8 TeV corresponding to an integrated luminosity of 20.3 fb−1. Decays of the CP-even h boson to ττ or bb pairs with the Z boson decaying to electron or muon pairs are considered, as well as h→bb decays with the Z boson decaying to neutrinos. No evidence for the production of an A boson in these channels is found, and the 95% confidence level upper limits derived for σ(gg→A)×BR(A→Zh)×BR(h→ff̄) are 0.098–0.013 pb for f=τ and 0.57–0.014 pb for f=b in the range mA = 220–1000 GeV. The results are combined and interpreted in the context of two-Higgs-doublet models.

The increasing interest in greener, biological methods of synthesis has led to the development of non-toxic and comparatively more bioactive nanoparticles. Unlike physical and chemical methods of nanoparticle synthesis, microbial synthesis in general, and mycosynthesis in particular, is cost-effective and environmentally friendly. However, several aspects, such as the rate of synthesis, monodispersity and downstream processing, still need to be improved. Many fungal-based mechanisms have been proposed for the formation of silver nanoparticles (AgNPs), mainly those involving nitrate reductase, which has been detected in the filtered fungal cultures used for AgNP production. It is generally accepted that nitrate reductase is the main enzyme responsible for the reduction of Ag ions in the formation of AgNPs; however, this generally accepted mechanism of fungal AgNP production is not fully understood. In order to elucidate the molecules participating in the mechanistic formation of metal nanoparticles, the current study focuses on the enzymes and other organic compounds involved in the biosynthesis of AgNPs. The use of the free fungal mycelium of both Stereum hirsutum and Fusarium oxysporum will be assessed. To identify mutants defective in the nitrate reductase structural gene niaD, fungal cultures of S. hirsutum and F. oxysporum will be selected by chlorate resistance. In addition, to verify whether each compound identified as a key molecule influences the production of nanoparticles, an in vitro assay using different nitrogen sources will be developed. Subsequently, fungal extracellular enzymes will be measured and a further in vitro assay will be performed. Finally, nanoparticle formation and characterization will be evaluated by UV-visible spectroscopy, transmission electron microscopy (TEM), X-ray diffraction analysis (XRD), Fourier transform infrared spectroscopy (FTIR), and LC-MS/MS.

This Letter presents a search at the LHC for s-channel single top-quark production in proton-proton collisions at a centre-of-mass energy of 8 TeV. The analyzed data set was recorded by the ATLAS detector and corresponds to an integrated luminosity of 20.3 fb−1. Selected events contain one charged lepton, large missing transverse momentum and exactly two b-tagged jets. A multivariate event classifier based on boosted decision trees is developed to discriminate s-channel single top-quark events from the main background contributions. The signal extraction is based on a binned maximum-likelihood fit of the output classifier distribution. The analysis leads to an upper limit on the s-channel single top-quark production cross-section of 14.6 pb at the 95% confidence level. The fit gives a cross-section of σs=5.0±4.3 pb, consistent with the Standard Model expectation.
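The binned maximum-likelihood extraction described above can be illustrated with a toy version: a Poisson likelihood over a few classifier-output bins, scanned for the best-fit signal strength μ. The bin contents below are invented for illustration and are not ATLAS data or templates:

```python
import math

def nll(mu, n_obs, sig, bkg):
    """Binned Poisson negative log-likelihood (constant terms dropped)."""
    total = 0.0
    for n, s, b in zip(n_obs, sig, bkg):
        lam = mu * s + b          # expected yield in this bin
        total += lam - n * math.log(lam)
    return total

def fit_mu(n_obs, sig, bkg, grid):
    """Scan a grid of signal strengths and keep the likelihood maximum."""
    return min(grid, key=lambda mu: nll(mu, n_obs, sig, bkg))

# Hypothetical classifier-output bins (illustrative numbers only).
sig = [1.0, 3.0, 8.0, 15.0]        # expected signal per bin at mu = 1
bkg = [50.0, 30.0, 12.0, 5.0]      # expected background per bin
n_obs = [52, 34, 20, 17]           # "observed" counts
grid = [i / 100 for i in range(0, 301)]
mu_hat = fit_mu(n_obs, sig, bkg, grid)
```

A real analysis additionally profiles systematic-uncertainty nuisance parameters in the same fit; this sketch keeps only the statistical core.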

A search for new charged massive gauge bosons, called W′, is performed with the ATLAS detector at the LHC, in proton–proton collisions at a centre-of-mass energy of √s = 8 TeV, using a dataset corresponding to an integrated luminosity of 20.3 fb−1. This analysis searches for W′ bosons in the W′→tb̄ decay channel in final states with electrons or muons, using a multivariate method based on boosted decision trees. The search covers masses between 0.5 and 3.0 TeV, for right-handed or left-handed W′ bosons. No significant deviation from the Standard Model expectation is observed, and limits are set on the W′→tb̄ cross-section times branching ratio and on the W′-boson effective couplings as a function of the W′-boson mass using the CLs procedure. For a left-handed (right-handed) W′ boson, masses below 1.70 (1.92) TeV are excluded at 95% confidence level.
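The CLs procedure named above can be sketched for the simplest possible case, a one-bin counting experiment in which the test statistic is the observed count itself. The numbers are invented, and this is of course not the analysis's actual multivariate limit-setting machinery:

```python
import math

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson(lam) count."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i)
               for i in range(k + 1))

def cls(n_obs, b, s):
    """CLs = CL_{s+b} / CL_b for a one-bin counting experiment."""
    cl_sb = poisson_cdf(n_obs, s + b)   # p-value under signal + background
    cl_b = poisson_cdf(n_obs, b)        # p-value under background only
    return cl_sb / cl_b

def excluded_at_95(n_obs, b, s):
    """A signal hypothesis is excluded at 95% CL when CLs < 0.05."""
    return cls(n_obs, b, s) < 0.05

# Toy counting experiment: expect b = 10 background events, observe 8.
large_signal_excluded = excluded_at_95(8, 10.0, 10.0)  # sizeable signal
small_signal_excluded = excluded_at_95(8, 10.0, 1.0)   # tiny signal
```

Dividing by CL_b is what protects the method from excluding signals to which the experiment has no sensitivity when a downward background fluctuation occurs.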

This Letter reports a measurement of the exclusive γγ→ℓ+ℓ− (ℓ = e, μ) cross-section in proton–proton collisions at a centre-of-mass energy of 7 TeV by the ATLAS experiment at the LHC, based on an integrated luminosity of 4.6 fb−1. For electron or muon pairs satisfying exclusive selection criteria, a fit to the dilepton acoplanarity distribution is used to extract the fiducial cross-sections. The cross-section in the electron channel is determined to be σexcl.(γγ→e+e−) = 0.428 ± 0.035 (stat.) ± 0.018 (syst.) pb for a phase-space region with invariant mass of the electron pairs greater than 24 GeV, in which both electrons have transverse momentum pT > 12 GeV and pseudorapidity |η| < 2.4. For muon pairs with invariant mass greater than 20 GeV, muon transverse momentum pT > 10 GeV and pseudorapidity |η| < 2.4, the cross-section is determined to be σexcl.(γγ→μ+μ−) = 0.628 ± 0.032 (stat.) ± 0.021 (syst.) pb. When proton absorptive effects due to the finite size of the proton are taken into account in the theory calculation, the measured cross-sections are found to be consistent with the theory prediction.

This Letter presents measurements of correlated production of nearby jets in Pb+Pb collisions at √sNN = 2.76 TeV using the ATLAS detector at the Large Hadron Collider. The measurement was performed using 0.14 nb−1 of data recorded in 2011. The production of correlated jet pairs was quantified using the rate, RΔR, of "neighbouring" jets that accompany "test" jets within a given range of angular distance, ΔR, in the pseudorapidity–azimuthal angle plane. The jets were measured in the ATLAS calorimeter and were reconstructed using the anti-kt algorithm with radius parameters d = 0.2, 0.3, and 0.4. RΔR was measured in different Pb+Pb collision centrality bins, characterized by the total transverse energy measured in the forward calorimeters. A centrality dependence of RΔR is observed for all three jet radii, with RΔR found to be lower in central collisions than in peripheral collisions. The ratios formed by the RΔR values in different centrality bins and the values in the 40–80% centrality bin are presented.
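The ΔR pairing underlying RΔR can be sketched as a minimal computation of the η–φ distance (with azimuthal wraparound) and of the per-test-jet neighbour rate. The toy jets below are invented, not ATLAS data:

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance in the pseudorapidity-azimuth plane."""
    deta = eta1 - eta2
    # Wrap the azimuthal difference into (-pi, pi].
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(deta, dphi)

def r_delta_r(test_jets, all_jets, dr_lo, dr_hi):
    """Mean number of neighbouring jets per test jet with dr_lo <= dR < dr_hi."""
    pairs = 0
    for t in test_jets:
        for j in all_jets:
            if j is t:
                continue  # a jet is not its own neighbour
            if dr_lo <= delta_r(t[0], t[1], j[0], j[1]) < dr_hi:
                pairs += 1
    return pairs / len(test_jets)

# Toy jets as (eta, phi) tuples; values are illustrative only.
jets = [(0.0, 0.0), (0.3, 0.1), (1.5, 2.0), (-0.2, 3.1)]
rate = r_delta_r(jets, jets, 0.2, 1.0)
```

In the measurement itself the test and neighbouring jets also carry transverse-energy requirements and the rate is corrected for detector effects; this sketch keeps only the geometric counting.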

An observation of the Λ0b→ψ(2S)Λ0 decay and a comparison of its branching fraction with that of the Λ0b→J/ψΛ0 decay have been made with the ATLAS detector in proton–proton collisions at √s = 8 TeV at the LHC, using an integrated luminosity of 20.6 fb−1. The J/ψ and ψ(2S) mesons are reconstructed in their decays to a muon pair, while the Λ0→pπ− decay is exploited for the Λ0 baryon reconstruction. The Λ0b baryons are reconstructed with transverse momentum pT > 10 GeV and pseudorapidity |η| < 2.1. The measured branching ratio of the Λ0b→ψ(2S)Λ0 and Λ0b→J/ψΛ0 decays is Γ(Λ0b→ψ(2S)Λ0)/Γ(Λ0b→J/ψΛ0) = 0.501 ± 0.033 (stat) ± 0.019 (syst), lower than the expectation from the covariant quark model.

OBJECTIVE: To evaluate the calibration accuracy of sphygmomanometers and the physical condition of the cuff-bladder, bulb, pump, and valve. METHODS: Six hundred and forty-five aneroid sphygmomanometers were evaluated, 521 used in private practice and 124 used in hospitals. Aneroid manometers were tested against a properly calibrated mercury manometer and were considered calibrated when the error was ≤3 mm Hg. The physical condition of the cuff-bladders, bulbs, pumps, and valves was also evaluated. RESULTS: Of the aneroid sphygmomanometers tested, 51% of those used in private practice and 56% of those used in hospitals were found to be inaccurately calibrated. Of these, the magnitude of inaccuracy ranged from 4 to 8 mm Hg in 70% and 51% of the devices, respectively. The problems found in the cuff-bladders, bulbs, pumps, and valves of the private practice and hospital devices were bladder damage (34% vs. 21%, respectively), holes/leaks in the bulbs (22% vs. 4%, respectively), and rubber aging (15% vs. 12%, respectively). Of the devices tested, 72% revealed at least one problem interfering with blood pressure measurement accuracy. CONCLUSION: Most of the manometers evaluated, whether used in private practice or in hospitals, were found to be inaccurate and unreliable, and their use may jeopardize the diagnosis and treatment of arterial hypertension.
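The pass/fail criterion used in the study (a device counts as calibrated when its readings agree with the mercury reference to within 3 mm Hg) can be expressed as a small sketch; the paired readings below are hypothetical:

```python
def is_calibrated(aneroid, mercury, tol=3):
    """Device passes if every paired reading is within tol mm Hg of the reference."""
    return all(abs(a - m) <= tol for a, m in zip(aneroid, mercury))

def miscalibration_rate(devices):
    """Fraction of devices failing the +/-3 mm Hg criterion."""
    failed = sum(1 for an, hg in devices if not is_calibrated(an, hg))
    return failed / len(devices)

# Hypothetical paired readings (aneroid vs. reference mercury), in mm Hg.
devices = [
    ([120, 80, 100], [121, 79, 102]),   # within 3 mm Hg -> calibrated
    ([120, 80, 100], [126, 84, 100]),   # off by 6 mm Hg -> not calibrated
    ([140, 90, 110], [139, 91, 110]),   # calibrated
]
rate = miscalibration_rate(devices)     # 1 of 3 devices fails here
```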

In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling, both in the field of mechanics and in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of the mechanical properties of timber and their inference, accounting for prior information obtained at different scales. These methods allow distinct reference properties of timber, such as density, bending stiffness and strength, to be analysed, and information obtained through different non-destructive, semi-destructive or destructive tests to be considered hierarchically. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
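As a minimal illustration of the two estimation approaches named above, the sketch below computes the closed-form maximum-likelihood estimates for a normally distributed sample and a conjugate normal-normal Bayesian update of the mean with known noise variance. The density values and the prior are invented, not data from the chapter:

```python
def mle_normal(xs):
    """Closed-form maximum-likelihood estimates (mean, variance) for a normal sample."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

def bayes_update_mean(prior_mu, prior_var, xs, noise_var):
    """Conjugate normal-normal update of the mean, with known noise variance."""
    n = len(xs)
    xbar = sum(xs) / n
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + n * xbar / noise_var)
    return post_mu, post_var

# Hypothetical density measurements (kg/m^3); prior loosely from a strength class.
sample = [452.0, 438.0, 465.0, 447.0, 459.0]
mu_hat, var_hat = mle_normal(sample)
post_mu, post_var = bayes_update_mean(450.0, 400.0, sample, noise_var=900.0)
```

The posterior mean lands between the prior mean and the sample mean, with a variance smaller than either source alone, which is exactly how prior information from a coarser scale tempers test data in the hierarchical setting.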

A study on the ecology of the phlebotomine sandfly fauna in a restricted focus of cutaneous leishmaniasis in northern Venezuela was undertaken in order to identify the species responsible for transmission. The study area and the catching methods for phlebotomine sandflies are described. A total of 9,061 females and 1,662 males were collected during a one-year study. Twelve species of Lutzomyia and one species of Brumptomyia were identified. The absolute and relative abundance and the occurrence of each species were determined. The relative occurrence allowed the common species, viz. L. panamensis, L. ovallesi, L. gomezi, L. trinidadensis, L. atroclavata, L. cayennensis, L. shannoni and L. olmeca bicolor, to be distinguished from the rare species, viz. L. punctigeniculata, L. rangeliana, L. evansi and L. dubitans. General comments on the species composition of the sandfly fauna in this locality are made.

Non-structural protein 2 (NS2) plays an important role in hepatitis C virus (HCV) assembly, but neither the exact contribution of this protein to the assembly process nor its complete structure is known. In this study we used a combination of genetic, biochemical and structural methods to decipher the role of NS2 in infectious virus particle formation. A large panel of NS2 mutations targeting the N-terminal membrane-binding region was generated. They were selected based on a membrane topology model that we established by determining the NMR structures of the N-terminal NS2 transmembrane segments. Mutants affected in virion assembly, but not RNA replication, were selected for pseudoreversion in cell culture. Rescue mutations restoring virus assembly to various degrees emerged in E2, p7, NS3 and NS2 itself, arguing for an interaction between these proteins. To confirm this assumption we developed a fully functional JFH1 genome expressing an N-terminally tagged NS2, demonstrating efficient pull-down of NS2 with p7, E2 and NS3 and, to a lesser extent, NS5A. Several of the mutations blocking virus assembly disrupted some of these interactions, which were restored to various degrees by those pseudoreversions that also restored assembly. Immunofluorescence analyses revealed a time-dependent colocalization of NS2 with E2 at sites close to lipid droplets (LDs), together with NS3 and NS5A. Importantly, NS2 of a mutant defective in assembly abrogates NS2 colocalization with E2 and NS3 around LDs, which is restored by a pseudoreversion in p7, whereas NS5A is recruited to LDs in an NS2-independent manner. In conclusion, our results suggest that NS2 orchestrates HCV particle formation through its participation in multiple protein-protein interactions required for recruitment to assembly sites in close proximity to LDs.

Quantitative or algorithmic trading is the automation of investment decisions obeying fixed or dynamic sets of rules to determine trading orders. It has grown to account for up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject in which publications are scarce and infrequent. We review the basic mathematical concepts needed to model financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following or mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signals they generate pass the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, on the other hand, is fairly simple; however, it can be cast into a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck process and its variations.
A model for forecasting any economic or financial magnitude may be defined with scientific rigour yet lack economic value and be useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving. For this reason we emphasize the calibration of the strategies' parameters to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
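As a hedged illustration of the mean-reversion side discussed above (the thesis itself was implemented in MATLAB), the following Python sketch generates entry/exit signals from a rolling z-score of a pairs-trading spread. The window length and thresholds are arbitrary choices for illustration, not the thesis's calibrated values:

```python
def zscore_signal(spread, window, entry=2.0, exit=0.5):
    """Mean-reversion positions on a spread: -1 short, +1 long, 0 flat."""
    signals, pos = [], 0
    for t in range(len(spread)):
        hist = spread[max(0, t - window):t]
        if len(hist) < window:
            signals.append(0)     # not enough history yet
            continue
        mean = sum(hist) / window
        var = sum((x - mean) ** 2 for x in hist) / window
        sd = var ** 0.5 or 1e-12  # guard against a flat window
        z = (spread[t] - mean) / sd
        if pos == 0 and z > entry:
            pos = -1              # spread unusually rich: short it
        elif pos == 0 and z < -entry:
            pos = 1               # spread unusually cheap: long it
        elif pos != 0 and abs(z) < exit:
            pos = 0               # spread has reverted: close the trade
        signals.append(pos)
    return signals
```

In a realistic backtest the window, entry and exit thresholds would themselves be recalibrated on a rolling basis, which is exactly the parameter-tracking issue raised above.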