930 results for "Large detector systems for particle and astroparticle physics"


Relevance: 100.00%

Abstract:

The smart grid concept is a key issue for future power systems, namely at the distribution level, with deep implications for the operation and planning of these systems. Several advantages and benefits are recognized for both the technical and economic operation of the power system and of the electricity markets. The increasing integration of demand response and distributed generation resources, mostly small-scale and distributed in nature, leads to the need for aggregating entities such as Virtual Power Players. Operation business models become more complex in the context of smart grid operation. Computational intelligence methods can provide suitable solutions for the resource scheduling problem under its time constraints. This paper proposes a methodology for the joint dispatch of demand response and distributed generation to provide energy and reserve by a virtual power player that operates a distribution network. The optimal schedule minimizes the operation costs and is obtained using a particle swarm optimization approach, which is compared with a deterministic approach used as the reference methodology. The proposed method is applied to a 33-bus distribution network with 32 medium-voltage consumers and 66 distributed generation units.
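The particle swarm step can be sketched as a standard global-best PSO over resource set-points. This is a minimal illustration, not the paper's formulation: the cost coefficients, capacities, demand value, and penalty weight below are all hypothetical, and the objective is a simple linear cost with a quadratic demand-balance penalty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 5 dispatchable resources (DG units / demand response),
# each with a linear cost coefficient and a capacity limit, and a total demand.
cost = np.array([20.0, 25.0, 30.0, 35.0, 40.0])   # cost per MWh (illustrative)
cap = np.array([10.0, 8.0, 6.0, 6.0, 4.0])        # capacity per resource (MWh)
demand = 25.0                                      # energy to schedule (MWh)

def objective(x):
    # Operation cost plus a heavy penalty for unmet or excess demand.
    return cost @ x + 1e4 * (x.sum() - demand) ** 2

# Standard global-best PSO.
n_particles, n_iter = 30, 200
w, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(0, cap, size=(n_particles, len(cap)))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.apply_along_axis(objective, 1, pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, cap)               # enforce capacity limits
    vals = np.apply_along_axis(objective, 1, pos)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(gbest, objective(gbest))
```

In the paper the swarm result is benchmarked against a deterministic solver; here the penalty formulation simply keeps the schedule near the demand balance while the swarm trades off cheap and expensive resources.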

Relevance: 100.00%

Abstract:

The transverse polarization of Λ and Λ¯ hyperons produced in proton--proton collisions at a center-of-mass energy of 7 TeV is measured. The analysis uses 760 μb−1 of minimum-bias data collected by the ATLAS detector at the LHC in 2010. The measured transverse polarization, averaged over Feynman xF from 5×10−5 to 0.01 and transverse momentum pT from 0.8 to 15 GeV, is −0.010 ± 0.005 (stat) ± 0.004 (syst) for Λ and 0.002 ± 0.006 (stat) ± 0.004 (syst) for Λ¯. It is also measured as a function of xF and pT, but no significant dependence on these variables is observed. Prior to this measurement, the polarization had been measured at fixed-target experiments with center-of-mass energies up to about 40 GeV. The ATLAS results are compatible with the extrapolation of a fit to previous measurements into the xF range covered by this measurement.

Relevance: 100.00%

Abstract:

A search for a heavy, CP-odd Higgs boson, A, decaying into a Z boson and a 125 GeV Higgs boson, h, with the ATLAS detector at the LHC is presented. The search uses proton–proton collision data at a centre-of-mass energy of 8 TeV corresponding to an integrated luminosity of 20.3 fb−1. Decays of the CP-even h boson to ττ or bb pairs with the Z boson decaying to electron or muon pairs are considered, as well as h→bb decays with the Z boson decaying to neutrinos. No evidence for the production of an A boson in these channels is found, and the 95% confidence-level upper limits derived for σ(gg→A)×BR(A→Zh)×BR(h→ff¯) are 0.098–0.013 pb for f=τ and 0.57–0.014 pb for f=b in the range mA = 220–1000 GeV. The results are combined and interpreted in the context of two-Higgs-doublet models.

Relevance: 100.00%

Abstract:

The results of a dedicated search for pair production of scalar partners of charm quarks are reported. The search is based on an integrated luminosity of 20.3 fb−1 of pp collisions at √s = 8 TeV recorded with the ATLAS detector at the LHC. The search is performed using events with large missing transverse momentum and at least two jets, where the two leading jets are each tagged as originating from c-quarks. Events containing isolated electrons or muons are vetoed. In an R-parity-conserving minimal supersymmetric scenario in which a single scalar-charm state is kinematically accessible, and where it decays exclusively into a charm quark and a neutralino, 95% confidence-level upper limits are obtained in the scalar-charm–neutralino mass plane such that, for neutralino masses below 200 GeV, scalar-charm masses up to 490 GeV are excluded.

Relevance: 100.00%

Abstract:

This Letter presents a search for a hidden-beauty counterpart of the X(3872) in the mass ranges 10.05–10.31 GeV and 10.40–11.00 GeV, in the channel Xb→π+π−Υ(1S)(→μ+μ−), using 16.2 fb−1 of √s = 8 TeV pp collision data collected by the ATLAS detector at the LHC. No evidence for new narrow states is found, and upper limits are set on the product of the Xb cross section and branching fraction, relative to those of the Υ(2S), at the 95% confidence level using the CLs approach. These limits range from 0.8% to 4.0%, depending on mass. For masses above 10.1 GeV, the expected upper limits from this analysis are the most restrictive to date. Searches for production of the Υ(1³DJ), Υ(10860), and Υ(11020) states also reveal no significant signals.

Relevance: 100.00%

Abstract:

A search for new charged massive gauge bosons, called W′, is performed with the ATLAS detector at the LHC, in proton--proton collisions at a centre-of-mass energy of √s = 8 TeV, using a dataset corresponding to an integrated luminosity of 20.3 fb−1. This analysis searches for W′ bosons in the W′→tb¯ decay channel in final states with electrons or muons, using a multivariate method based on boosted decision trees. The search covers masses between 0.5 and 3.0 TeV, for right-handed or left-handed W′ bosons. No significant deviation from the Standard Model expectation is observed and limits are set on the W′→tb¯ cross-section times branching ratio and on the W′-boson effective couplings as a function of the W′-boson mass using the CLs procedure. For a left-handed (right-handed) W′ boson, masses below 1.70 (1.92) TeV are excluded at 95% confidence level.

Relevance: 100.00%

Abstract:

This Letter reports a measurement of the exclusive γγ→ℓ+ℓ−(ℓ=e,μ) cross-section in proton--proton collisions at a centre-of-mass energy of 7 TeV by the ATLAS experiment at the LHC, based on an integrated luminosity of 4.6 fb−1. For the electron or muon pairs satisfying exclusive selection criteria, a fit to the dilepton acoplanarity distribution is used to extract the fiducial cross-sections. The cross-section in the electron channel is determined to be σexcl.γγ→e+e−=0.428±0.035(stat.)±0.018(syst.) pb for a phase-space region with invariant mass of the electron pairs greater than 24 GeV, in which both electrons have transverse momentum pT>12 GeV and pseudorapidity |η|<2.4. For muon pairs with invariant mass greater than 20 GeV, muon transverse momentum pT>10 GeV and pseudorapidity |η|<2.4, the cross-section is determined to be σexcl.γγ→μ+μ−=0.628±0.032(stat.)±0.021(syst.) pb. When proton absorptive effects due to the finite size of the proton are taken into account in the theory calculation, the measured cross-sections are found to be consistent with the theory prediction.

Relevance: 100.00%

Abstract:

Measurements of the total and differential cross sections of Higgs boson production are performed using 20.3 fb−1 of pp collisions produced by the Large Hadron Collider at a center-of-mass energy of √s = 8 TeV and recorded by the ATLAS detector. Cross sections are obtained from measured H→γγ and H→ZZ∗→4ℓ event yields, which are combined accounting for detector efficiencies, fiducial acceptances and branching fractions. Differential cross sections are reported as a function of Higgs boson transverse momentum, Higgs boson rapidity, number of jets in the event, and transverse momentum of the leading jet. The total production cross section is determined to be σ(pp→H) = 33.0 ± 5.3 (stat) ± 1.6 (syst) pb. The measurements are compared to state-of-the-art predictions.

Relevance: 100.00%

Abstract:

A search for the decay of neutral, weakly interacting, long-lived particles using data collected by the ATLAS detector at the LHC is presented. This analysis uses the full dataset recorded in 2012: 20.3 fb−1 of proton--proton collision data at √s = 8 TeV. The search employs techniques for reconstructing decay vertices of long-lived particles decaying to jets in the inner tracking detector and muon spectrometer. Signal events require at least two reconstructed vertices. No significant excess of events over the expected background is found, and limits as a function of proper lifetime are reported for the decay of the Higgs boson and other scalar bosons to long-lived particles and for Hidden Valley Z′ and Stealth SUSY benchmark models. The first search results for displaced decays in Z′ and Stealth SUSY models are presented. The upper bounds of the excluded proper lifetimes are the most stringent to date.

Relevance: 100.00%

Abstract:

An observation of the Λ0b→ψ(2S)Λ0 decay and a comparison of its branching fraction with that of the Λ0b→J/ψΛ0 decay are reported, using the ATLAS detector in proton--proton collisions at √s = 8 TeV at the LHC with an integrated luminosity of 20.6 fb−1. The J/ψ and ψ(2S) mesons are reconstructed in their decays to a muon pair, while the Λ0→pπ− decay is exploited for the Λ0 baryon reconstruction. The Λ0b baryons are reconstructed with transverse momentum pT > 10 GeV and pseudorapidity |η| < 2.1. The measured ratio of branching fractions is Γ(Λ0b→ψ(2S)Λ0)/Γ(Λ0b→J/ψΛ0) = 0.501 ± 0.033 (stat) ± 0.019 (syst), lower than the expectation from the covariant quark model.

Relevance: 100.00%

Abstract:

Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of global system performance is usually lacking to support both design and operation adequately. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, as well as operation in degraded mode. Its implementation in a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, contributing actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are a typical example of large complex systems made of multiple hierarchical layers. The proposed approach proves appropriate for assessing the global risk and availability levels of the system as well as for identifying its vulnerabilities. This enriched-FMECA approach overcomes some of the limitations and pitfalls previously reported for classical FMECA approaches.
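The criticality-ranking step of a classical FMECA can be sketched with the conventional Risk Priority Number, RPN = severity × occurrence × detectability. The components, failure modes, and scores below are hypothetical, not taken from the signalling system studied in the paper.

```python
from dataclasses import dataclass

# Illustrative (hypothetical) failure modes for a signalling subsystem.
@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int      # consequence of the failure (10 = catastrophic)
    occurrence: int    # likelihood of the failure (10 = very frequent)
    detection: int     # difficulty of detection (10 = undetectable)

    @property
    def rpn(self) -> int:
        # Classical Risk Priority Number, each factor scored 1-10.
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("track circuit", "false occupancy", 4, 6, 3),
    FailureMode("signal lamp", "lamp out", 8, 4, 2),
    FailureMode("interlocking", "wrong route set", 10, 2, 5),
    FailureMode("axle counter", "miscount", 7, 3, 6),
]

# Rank failure modes by criticality to prioritise mitigation effort.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.component:15s} {fm.mode:20s} RPN={fm.rpn}")
```

The enriched approach in the paper goes well beyond this flat table (functional layers, degraded modes, human factors), but the underlying inductive ranking has this shape.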

Relevance: 100.00%

Abstract:

We study particle dispersion advected by a synthetic turbulent flow from a Lagrangian perspective and focus on the two-particle and cluster dispersion by the flow. It has been recently reported that Richardson's law for the two-particle dispersion can stem from different dispersion mechanisms, and can be dominated by either diffusive or ballistic events. The nature of the Richardson dispersion depends on the parameters of our flow and is discussed in terms of the values of a persistence parameter expressing the relative importance of the two above-mentioned mechanisms. We support this analysis by studying the distribution of interparticle distances, the relative velocity correlation functions, as well as the relative trajectories.
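The kind of diagnostic used to tell diffusive from ballistic dispersion apart can be sketched as follows. This is not the paper's synthetic turbulent flow: as a stand-in, pairs of independent Brownian particles are generated, for which the mean-square separation ⟨r²(t)⟩ should grow linearly in time (scaling exponent ≈ 1, the diffusive regime), whereas ballistic events would give exponent ≈ 2.

```python
import numpy as np

rng = np.random.default_rng(1)

# Diagnostic sketch: given pair trajectories, compute the mean-square
# separation <r^2(t)>, whose scaling in time distinguishes diffusive
# (<r^2> ~ t) from ballistic (<r^2> ~ t^2) dispersion.
n_pairs, n_steps, dt = 2000, 500, 0.01

# Toy model: each particle of a pair performs independent 2D Brownian
# motion, so the relative motion is purely diffusive.
steps = rng.normal(scale=np.sqrt(dt), size=(n_pairs, n_steps, 2, 2))
traj = steps.cumsum(axis=1)                    # trajectories of both particles
sep = traj[:, :, 0, :] - traj[:, :, 1, :]      # relative trajectory per pair
msd = (sep ** 2).sum(axis=-1).mean(axis=0)     # <r^2(t)> averaged over pairs

# Estimate the scaling exponent alpha in <r^2> ~ t^alpha by a log-log fit.
t = dt * np.arange(1, n_steps + 1)
alpha = np.polyfit(np.log(t), np.log(msd), 1)[0]
print(f"scaling exponent ~ {alpha:.2f}")
```

Replacing the Brownian steps with velocities drawn from a correlated synthetic flow would shift the exponent toward the ballistic value at short times, which is the regime the persistence parameter in the paper quantifies.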

Relevance: 100.00%

Abstract:

Particle density, gravimetric and volumetric water contents, and porosity are basic concepts for characterizing porous systems such as soils. This paper proposes an experimental method to measure these physical properties, applicable in experimental physics classes, in porous-media samples consisting of spheres with the same diameter (monodisperse medium) and with different diameters (polydisperse medium). Soil samples are not used, given the difficulty of working with this porous medium in laboratories dedicated to teaching basic experimental physics. The paper describes the method to be followed and the results of two case studies, one in a monodisperse medium and the other in a polydisperse medium. The particle density results were very close to theoretical values, with a relative deviation (RD) of −2.9 % for the lead spheres and +0.1 % for the iron spheres. The RD of porosity was also low: −3.6 % for the lead spheres and −1.2 % for the iron spheres, when comparing procedures (using particle and porous-medium densities versus the saturated volumetric water content) and monodisperse versus polydisperse media.
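The basic relations involved can be illustrated with a short worked computation. All numbers below are hypothetical, chosen only so the quantities are mutually consistent; they are not the paper's measurements.

```python
# Particle density, bulk density, porosity and water contents for a
# hypothetical sample of spheres in a container of known volume.
m_spheres = 500.0        # mass of the spheres (g)
v_spheres = 63.7         # volume of the spheres by water displacement (cm^3)
v_container = 100.0      # total volume of the porous medium (cm^3)
m_water_sat = 36.3       # mass of water filling the pores at saturation (g)

rho_particle = m_spheres / v_spheres            # particle density (g/cm^3)
rho_bulk = m_spheres / v_container              # bulk density (g/cm^3)
porosity = 1.0 - rho_bulk / rho_particle        # total porosity (fraction)

# Water contents at saturation (water density taken as 1 g/cm^3):
theta_g = m_water_sat / m_spheres               # gravimetric (g water / g solid)
theta_v = m_water_sat / v_container             # volumetric (cm^3 / cm^3)

# Consistency check used in the paper's comparison of procedures:
# at saturation the volumetric water content equals the porosity.
print(f"porosity = {porosity:.3f}, theta_v = {theta_v:.3f}")
```

Comparing the porosity obtained from the densities with the saturated volumetric water content is exactly the cross-check between procedures whose relative deviations the paper reports.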

Relevance: 100.00%

Abstract:

Conservation biology is commonly associated with the protection of small populations threatened with extinction. Nevertheless, overabundant populations, or populations likely to expand too far, may also need to be managed in order to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct models and methods are needed. The aim of this work was to develop predictive models of large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded strongly in Switzerland since its reintroduction at the beginning of the 20th century, served as the example species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, to which density dependence, environmental stochasticity and regulation culling were added. This model was implemented in a management-support software package named SIM-Ibex, which allows census-data maintenance, automated parameter estimation, and the tuning and simulation of regulation strategies. However, population dynamics are driven not only by demographic factors but also by dispersal and the colonisation of new areas; both habitat suitability and obstacles to dispersal therefore had to be modelled. A software package named Biomapper was thus developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from environmental predictors and species-observation data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover data importation, predictor preparation, ENFA and habitat-suitability map computation, and result validation and processing; a module also allows mapping of dispersal barriers and corridors. The application domain of ENFA was explored by means of a virtual species distribution. A comparison with a method commonly used to build habitat-suitability maps, the Generalised Linear Model (GLM), showed that ENFA is particularly well suited to cryptic or expanding species.

Demographic and landscape information were finally merged into a global model. A cellular-automaton approach was chosen, both to satisfy the constraints of realistic landscape modelling and those imposed by large populations: the study area is modelled by a lattice of hexagonal cells, each characterised by fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction and survival, as well as dispersal, under the influence of density dependence and stochasticity. A software tool named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of dynamics models (e.g. computed by SIM-Ibex) and a habitat-suitability map (e.g. computed by Biomapper); and (2) running simulations. It allows studying the expansion of an invading species across a complex landscape made of areas of varying quality and containing obstacles to dispersal. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is currently used by wildlife managers and government inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to plant propagation or the dispersal of flying animals. As these tools were designed to build a complex, realistic model from raw data, and as they feature an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, these approaches can also address theoretical questions in population and landscape ecology.
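The age-structured Leslie-matrix core described above can be sketched as follows. This is a minimal illustration, not the thesis's fitted model: the vital rates, initial age distribution, and density-dependence constant are hypothetical, and density dependence is added as a simple Beverton-Holt style damping of the whole projection.

```python
import numpy as np

# Hypothetical vital rates for a 4-age-class female population.
fecundity = np.array([0.0, 1.0, 1.5, 1.5])   # offspring per female, per class
survival = np.array([0.6, 0.8, 0.9])         # survival to the next age class

# Leslie matrix: fecundities on the first row, survivals on the subdiagonal.
L = np.zeros((4, 4))
L[0, :] = fecundity
L[1, 0], L[2, 1], L[3, 2] = survival

# Density-independent asymptotic growth rate = dominant eigenvalue of L.
lam = max(abs(np.linalg.eigvals(L)))
print(f"lambda = {lam:.3f}")                 # > 1: the population grows

# Projection with Beverton-Holt density dependence: growth is damped as
# the total population N approaches the (hypothetical) constant K, so the
# trajectory settles at the equilibrium N* = (lambda - 1) * K.
K = 500.0
n = np.array([50.0, 20.0, 10.0, 5.0])        # initial age distribution
for _ in range(200):
    n = (L @ n) / (1.0 + n.sum() / K)
print(f"equilibrium population ~ {n.sum():.0f}")
```

SIM-Ibex layers environmental stochasticity and culling on top of this deterministic core; the eigenvalue computed here is the quantity that culling strategies effectively push back toward 1.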