985 results for IMPACT-PARAMETER


Relevance: 60.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 60.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 60.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 60.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 60.00%

Abstract:

The possibility of generalizing gravity in 2+1 dimensions to include higher-derivative terms, thereby allowing for a dynamical theory, opens up a variety of new and interesting questions. This is in great contrast with pure Einstein gravity, a generally covariant theory with no degrees of freedom - a peculiarity that, in a sense, renders it a little insipid and odorless. Research on the gravity of particles moving in a plane, that is, living in flatland, within the context of higher-derivative gravity leads to novel and interesting effects, for instance the generation of gravity, antigravity, and gravitational shielding through the interaction of massive scalar bosons via graviton exchange. In addition, the gravitational deflection angle of a photon, unlike that of Einstein gravity, depends on the impact parameter. On the other hand, the great drawback of using linearized general relativity to describe a gravitating string is that this description leads to some unphysical results, such as: (i) the lack of a gravity force in the nonrelativistic limit; (ii) gravitational deflection independent of the impact parameter. Interestingly enough, the effective cure for these pathologies is the replacement of linearized gravity by linearized higher-derivative gravity. We address these issues here.

Relevance: 60.00%

Abstract:

One of the problems in the analysis of nucleus-nucleus collisions is to obtain information on the value of the impact parameter b. This work consists in the application of pattern recognition techniques aimed at associating values of b to groups of events. To this end, a support vector machine (SVM) classifier is adopted to analyze multifragmentation reactions. This method allows backtracing the values of b through a particular multidimensional analysis. The SVM classification consists of two main phases. In the first, known as the training phase, the classifier learns to discriminate events generated by two different models: Classical Molecular Dynamics (CMD) and Heavy-Ion Phase-Space Exploration (HIPSE), for the reaction 58Ni + 48Ca at 25 AMeV. In the second, known as the test phase, what has been learned is checked on new events generated by the same models. These results have been compared to the ones obtained through other techniques for backtracing the impact parameter. Our tests show that, following this approach, central and peripheral collisions for the CMD events are always better classified than with the other backtracing techniques. We have finally performed the SVM classification on the experimental data measured by the NUCL-EX collaboration with the CHIMERA apparatus for the above reaction.
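To make the two-phase procedure above concrete, the following minimal Python sketch trains a support vector machine on events from two classes and checks it on held-out events. It is only an illustration: the event features, sample sizes and SVM settings are placeholders, not the observables or hyperparameters used in the actual CMD/HIPSE analysis.

```python
# Illustrative sketch only: trains an SVM to separate events from two
# hypothetical model classes, then evaluates on held-out events, mirroring
# the training/test phases described above.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Stand-in event features (e.g. multiplicity, transverse energy, ...);
# a real analysis would use multidimensional observables per event.
X_cmd = rng.normal(loc=0.0, scale=1.0, size=(1000, 8))    # "CMD-like" events
X_hipse = rng.normal(loc=0.7, scale=1.2, size=(1000, 8))  # "HIPSE-like" events
X = np.vstack([X_cmd, X_hipse])
y = np.concatenate([np.zeros(1000), np.ones(1000)])       # class labels

# Training phase: the classifier learns to discriminate the two models.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

# Test phase: what has been learned is checked on unseen events.
print("test accuracy:", clf.score(X_test, y_test))
```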

Relevance: 60.00%

Abstract:

One of the main targets of the CMS experiment is the search for the Standard Model Higgs boson. The 4-lepton channel (from the Higgs decay h->ZZ->4l, l = e,mu) is one of the most promising. The analysis is based on the identification of two opposite-sign, same-flavor lepton pairs: the leptons are required to be isolated and to come from the same primary vertex. The Higgs would be statistically revealed by the presence of a resonance peak in the 4-lepton invariant mass distribution. The 4-lepton analysis at CMS is presented, covering its most important aspects: lepton identification, isolation variables, impact parameter, kinematics, event selection, background control and statistical analysis of the results. The search leads to evidence for a signal with a statistical significance of more than four standard deviations. The excess of data with respect to the background-only predictions indicates the presence of a new boson, with a mass of about 126 GeV/c², decaying to two Z bosons, whose characteristics are compatible with those of the SM Higgs boson.
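As a small illustration of the quantity in which the resonance peak would appear, the following Python sketch builds the 4-lepton invariant mass from the kinematics of two opposite-sign, same-flavor lepton pairs. The four-momentum values are toy numbers, and this is not CMS analysis code.

```python
# Minimal sketch (not CMS code): builds the 4-lepton invariant mass from
# two opposite-sign, same-flavor lepton pairs, the quantity in which a
# Higgs signal would appear as a resonance peak.
import math

def four_vector(pt, eta, phi, mass):
    """Return (E, px, py, pz) from collider kinematic variables."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(px**2 + py**2 + pz**2 + mass**2)
    return (e, px, py, pz)

def invariant_mass(vectors):
    e = sum(v[0] for v in vectors)
    px = sum(v[1] for v in vectors)
    py = sum(v[2] for v in vectors)
    pz = sum(v[3] for v in vectors)
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

# Toy candidate: two Z-like dimuon pairs (pt in GeV, muon mass ~0.106 GeV).
m_mu = 0.106
leptons = [
    four_vector(45.0,  0.3,  0.1, m_mu),   # mu+
    four_vector(40.0, -0.2,  2.9, m_mu),   # mu-
    four_vector(30.0,  1.1, -1.5, m_mu),   # mu+
    four_vector(25.0, -0.8,  1.0, m_mu),   # mu-
]
print("m(4l) =", round(invariant_mass(leptons), 1), "GeV")
```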

Relevance: 60.00%

Abstract:

Antihydrogen holds the promise to test, for the first time, the universality of free fall with a system composed entirely of antiparticles. The AEgIS experiment at CERN's antiproton decelerator aims to measure the gravitational interaction between matter and antimatter by measuring the deflection of a beam of antihydrogen in the Earth's gravitational field (g). The principle of the experiment is as follows: cold antihydrogen atoms are synthesized in a Penning-Malmberg trap, Stark accelerated towards a moiré deflectometer, the classical counterpart of an atom interferometer, and annihilate on a position-sensitive detector. Crucial to the success of the experiment is the spatial precision of the position-sensitive detector. We propose a novel free-fall detector based on a hybrid of two technologies: emulsion detectors, which have an intrinsic spatial resolution of 50 nm but no temporal information, and a silicon strip / scintillating fiber tracker to provide timing and positional information. In 2012 we tested emulsion films in vacuum with antiprotons from CERN's antiproton decelerator. The annihilation vertices could be observed directly on the emulsion surface using the microscope facility available at the University of Bern. The annihilation vertices were successfully reconstructed with a resolution of 1–2 μm on the impact parameter. If such a precision can be realized in the final detector, Monte Carlo simulations suggest that of order 500 antihydrogen annihilations will be sufficient to determine g with a 1% accuracy. This paper presents current research towards the development of this technology for use in the AEgIS apparatus and prospects for the realization of the final detector.

Relevance: 40.00%

Abstract:

The objective of the current study is to evaluate the fidelity of the load cell reading during impact testing in a drop-weight impactor using lumped parameter modeling. For the most common configuration of a moving impactor-load cell system, in which the dynamic load is transferred from the impactor head to the load cell, a quantitative assessment is made of the possible discrepancy that can result in the load cell response. A 3-DOF (degrees-of-freedom) LPM (lumped parameter model) is considered to represent a given impact testing set-up. In this model, a test specimen in the form of a steel hat section, similar to the front rails of cars, is represented by a nonlinear spring, while the load cell is assumed to behave in a linear manner due to its high stiffness. Assuming a given load-displacement response obtained in an actual test as the true behavior of the specimen, the numerical solution of the governing differential equations, following an implicit time integration scheme, is shown to yield an excellent reproduction of the mechanical behavior of the specimen, thereby confirming the accuracy of the numerical approach. The spring representing the load cell, however, predicts a response that qualitatively matches the assumed load-displacement response of the test specimen but with a perceptibly lower magnitude of load.
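The sketch below illustrates the kind of lumped-parameter calculation described above: a linear 3-DOF mass-spring-damper chain (impactor head, load cell, specimen) integrated with the implicit Newmark average-acceleration scheme, tracking the force carried by the "load cell" spring over time. All masses, stiffnesses and the initial impact velocity are hypothetical placeholders, and the specimen is modeled here as a linear spring rather than the nonlinear spring used in the study.

```python
# Simplified sketch (not the paper's model or parameters): a linear 3-DOF
# lumped-parameter system integrated with the implicit Newmark
# average-acceleration scheme.
import numpy as np

# Hypothetical masses (kg), stiffnesses (N/m) and damping.
m = np.diag([50.0, 5.0, 20.0])         # impactor head, load cell, specimen
k1, k2, k3 = 5e7, 2e7, 1e6             # load cell stiff, specimen spring soft
K = np.array([[ k1,    -k1,      0.0],
              [-k1,  k1 + k2,   -k2 ],
              [ 0.0,   -k2,   k2 + k3]])
C = 0.001 * K                           # light stiffness-proportional damping

dt, n_steps = 1e-5, 2000
beta, gamma = 0.25, 0.5                 # average acceleration (unconditionally stable)

u = np.zeros(3)
v = np.array([5.0, 5.0, 0.0])           # impactor and load cell arrive at 5 m/s
F = np.zeros(3)                          # no external force after contact
a = np.linalg.solve(m, F - C @ v - K @ u)

K_eff = K + (gamma / (beta * dt)) * C + (1.0 / (beta * dt**2)) * m
history = []
for _ in range(n_steps):
    rhs = (F
           + m @ (u / (beta * dt**2) + v / (beta * dt) + (1/(2*beta) - 1) * a)
           + C @ ((gamma/(beta*dt)) * u + (gamma/beta - 1) * v
                  + dt * (gamma/(2*beta) - 1) * a))
    u_new = np.linalg.solve(K_eff, rhs)
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1/(2*beta) - 1) * a
    v_new = v + dt * ((1 - gamma) * a + gamma * a_new)
    u, v, a = u_new, v_new, a_new
    history.append(k1 * (u[0] - u[1]))   # force seen by the load-cell spring

print("peak load-cell force (N):", max(abs(f) for f in history))
```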

Relevance: 40.00%

Abstract:

In the two-dimensional parameter space of an impact-pair system, shrimp-shaped periodic windows are embedded in chaotic regions. We show that a weak periodic forcing generates new periodic windows near the unperturbed one, with the same shape and periodicity. The new periodic windows thus extend the parameter range over which controlled periodic oscillations replace the chaotic oscillations. We identify periodic and chaotic attractors by their largest Lyapunov exponents.
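Since the impact-pair equations are not reproduced in the abstract, the following sketch uses the Hénon map as a stand-in system to show the classification criterion: scan a two-dimensional parameter plane and label each point periodic or chaotic according to the sign of the largest Lyapunov exponent, estimated from tangent-vector growth.

```python
# Illustrative sketch only: the Henon map stands in for the impact-pair system.
# Each point of a 2-D parameter plane is labeled periodic (largest Lyapunov
# exponent <= 0), chaotic (> 0) or escaping.
import numpy as np

def largest_lyapunov(a, b, n_transient=500, n_iter=3000):
    """Largest Lyapunov exponent of the Henon map via tangent-vector growth."""
    x, y = 0.1, 0.1
    v = np.array([1.0, 0.0])
    log_sum = 0.0
    for i in range(n_transient + n_iter):
        J = np.array([[-2.0 * a * x, 1.0],     # Jacobian at the current point
                      [b,            0.0]])
        x, y = 1.0 - a * x * x + y, b * x      # Henon map step
        if abs(x) > 1e6:                       # orbit escaped to infinity
            return np.nan
        v = J @ v
        norm = np.linalg.norm(v)
        v /= norm                              # renormalize the tangent vector
        if i >= n_transient:
            log_sum += np.log(norm)
    return log_sum / n_iter

# Scan a small region of the (a, b) parameter plane and classify each point.
a_vals = np.linspace(1.0, 1.4, 41)
b_vals = np.linspace(0.2, 0.4, 21)
labels = np.empty((len(b_vals), len(a_vals)), dtype="U8")
for i, b in enumerate(b_vals):
    for j, a in enumerate(a_vals):
        lam = largest_lyapunov(a, b)
        labels[i, j] = "escape" if np.isnan(lam) else ("chaotic" if lam > 0 else "periodic")

print(labels[::5, ::10])
```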

Relevance: 40.00%

Abstract:

In a general-purpose cloud system, efficiencies are still to be gained by supporting diverse applications and their requirements within the storage system used for a private cloud. Supporting such diverse requirements poses a significant challenge in a storage system that supports fine-grained configuration of a variety of parameters. This paper uses the Ceph distributed file system, and in particular its global parameters, to show how a single changed parameter can affect performance across a range of access patterns when tested with an OpenStack cloud system.
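A minimal sketch of how such a single-parameter comparison might be benchmarked is given below, using the standard rados bench tool to time write and sequential-read access patterns against a pool. The pool name is hypothetical, the global parameter under test would be changed in ceph.conf and the cluster restarted between runs, and this is not the harness used in the paper.

```python
# Hedged sketch, not the paper's actual test harness: times a Ceph pool with
# "rados bench" for write and sequential-read access patterns. The global
# parameter under test would be changed in the ceph.conf [global] section and
# the cluster restarted between runs; "bench-pool" is an assumed pool name.
import subprocess

POOL = "bench-pool"   # assumed, created beforehand
SECONDS = 60          # duration of each benchmark run

def rados_bench(mode, extra=()):
    cmd = ["rados", "bench", "-p", POOL, str(SECONDS), mode, *extra]
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# Write pattern first, keeping the benchmark objects so they can be read back.
write_report = rados_bench("write", ("--no-cleanup",))

# Sequential-read pattern over the objects written above.
seq_report = rados_bench("seq")

# Remove the benchmark objects afterwards.
subprocess.run(["rados", "-p", POOL, "cleanup"], check=True)

# The last lines of each report contain the summary throughput/latency figures.
print("\n".join(write_report.splitlines()[-8:]))
print("\n".join(seq_report.splitlines()[-8:]))
```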

Relevance: 30.00%

Abstract:

The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of global warming on the built environment, building simulation techniques together with forecast weather data are often necessary. Since all building simulation programs require hourly meteorological input data for thermal comfort and energy evaluation, the provision of suitable weather data becomes critical. Based on a review of existing weather data generation models, this paper presents an effective method to generate approximate future hourly weather data suitable for studying the impact of global warming. Depending on the level of information available for the prediction of future weather conditions, it is shown that the method of retaining the current level, the constant-offset method, or the diurnal modelling method may be used to generate the future hourly variation of an individual weather parameter. An example of the application of this method to different global warming scenarios in Australia is presented. Since there is no reliable projection of possible changes in air humidity, solar radiation or wind characteristics, these parameters have been assumed, as a first approximation, to remain at the current level. A sensitivity test of their impact on building energy performance shows that there is generally a good linear relationship between building cooling load and changes in solar radiation, relative humidity or wind speed.
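Of the generation methods named above, the constant-offset idea is the simplest to illustrate: each hourly value of a weather parameter in a current reference year is shifted by a projected change, while the other parameters are retained at the current level. The sketch below assumes a simple hourly record structure and a +2 °C offset purely for illustration; it does not reproduce the paper's exact formulation.

```python
# Illustrative sketch of the constant-offset idea (not the paper's exact
# formulation): each hourly dry-bulb temperature in a current reference year
# is shifted by a projected warming offset, while humidity, solar radiation
# and wind are retained at the current level.
from dataclasses import dataclass, replace
from typing import List

@dataclass
class HourlyRecord:
    dry_bulb_c: float        # dry-bulb temperature, degrees C
    rel_humidity: float      # relative humidity, %
    global_solar: float      # global solar radiation, W/m2
    wind_speed: float        # wind speed, m/s

def apply_constant_offset(year: List[HourlyRecord], offset_c: float) -> List[HourlyRecord]:
    """Return a synthetic future year with temperature shifted by offset_c."""
    return [replace(hour, dry_bulb_c=hour.dry_bulb_c + offset_c) for hour in year]

# Toy current-year data (8760 identical hours, just for demonstration).
current_year = [HourlyRecord(24.0, 60.0, 500.0, 3.0)] * 8760
future_year = apply_constant_offset(current_year, offset_c=2.0)   # assumed +2 C scenario
print(future_year[0])
```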

Relevance: 30.00%

Abstract:

The common approach to estimating bus dwell time at a BRT station is to apply the traditional dwell time methodology derived for suburban bus stops. In spite of being sensitive to boarding and alighting passenger numbers and, to some extent, to the fare collection media, these traditional dwell time models do not account for platform crowding. Moreover, they fall short in accounting for the effects of passengers walking along a relatively long BRT platform. Using the experience from Brisbane busway (BRT) stations, a new variable, Bus Lost Time (LT), is introduced into the traditional dwell time model. The bus lost time variable captures the impact of passenger walking and platform crowding on bus dwell time, the two characteristics which differentiate a BRT station from a bus stop. This paper reports the development of a methodology to estimate the bus lost time experienced by buses at a BRT platform. Results were compared with the Transit Capacity and Quality of Service Manual (TCQSM) approach to dwell time and station capacity estimation. When the bus lost time was used in the dwell time calculations, the BRT station platform capacity was found to be reduced by 10.1%.
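The following sketch shows how a lost-time term could enter a TCQSM-style dwell time estimate. The per-passenger service times, door dead time and lost-time value are hypothetical placeholders, not the coefficients calibrated from the Brisbane busway data.

```python
# Hedged sketch: a TCQSM-style dwell time estimate with an added bus lost
# time (LT) term as proposed above. All coefficients are hypothetical
# placeholders, not the values calibrated in the study.
def dwell_time_s(boarding, alighting,
                 t_board=3.0,       # s per boarding passenger (assumed)
                 t_alight=2.0,      # s per alighting passenger (assumed)
                 t_doors=4.0,       # door open/close dead time, s (assumed)
                 lost_time=0.0):    # extra LT from platform walking/crowding, s
    # Single-channel simplification: all passengers served through one door.
    passenger_service = boarding * t_board + alighting * t_alight
    return t_doors + passenger_service + lost_time

# Same passenger demand, with and without the lost-time term.
print(dwell_time_s(10, 6))                 # traditional estimate
print(dwell_time_s(10, 6, lost_time=8.0))  # with an assumed 8 s of bus lost time
```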

Relevance: 30.00%

Abstract:

Vertical vegetation is vegetation growing on, or adjacent to, the unused sunlit exterior surfaces of buildings in cities. Vertical vegetation can improve the energy efficiency of the building on which it is installed, mainly by insulating, shading and transpiring moisture from foliage and substrate. Several design parameters may affect the extent of the vertical vegetation's improvement of energy performance, for example the choice of vegetation, growing medium geometry, north/south aspect and others. The purpose of this study is to quantitatively map out the contribution of several parameters to energy savings in a subtropical setting. The method is thermal simulation, based on EnergyPlus configured to reflect the special characteristics of vertical vegetation. Thermal simulation results show that yearly cooling energy savings can reach 25% with realistic design choices in subtropical environments, while heating energy savings are negligible. The most important parameter is the aspect of the walls covered by vegetation: vertical vegetation covering walls facing north (south for the northern hemisphere) results in the highest energy savings. In making plant selections, the most significant parameter is Leaf Area Index (LAI). Plants with larger LAI, preferably LAI > 4, contribute to greater savings, whereas vertical vegetation with LAI < 2 can actually consume energy. The choice of growing medium and its thickness influence both heating and cooling energy consumption; changing the growing medium thickness from 6 cm to 8 cm causes a dramatic increase in energy savings, from 2% to 18%. For cooling, it is best to use a growing material with high water retention, due to the importance of evapotranspiration for cooling. Similarly, for increased savings in cooling energy, sufficient irrigation is required; insufficient irrigation results in the vertical vegetation requiring more energy to cool the building. To conclude, the choice of design parameters for vertical vegetation is crucial in ensuring that it contributes to energy savings rather than energy consumption. Optimal design decisions can create a dramatic sustainability enhancement for the built environment in subtropical climates.

Relevance: 30.00%

Abstract:

In January 2011, Brisbane, Australia, experienced a major river flooding event. We aimed to investigate its effects on air quality and assess the role of prompt cleaning activities in reducing the airborne exposure risk. A comprehensive, multi-parameter indoor and outdoor measurement campaign was conducted in 41 residential houses, 2 and 6 months after the flood. The median indoor air concentrations of supermicrometer particle number (PN), PM10, fungi and bacteria 2 months after the flood were comparable to those previously measured in Brisbane. These were 2.88 p cm⁻³, 15 µg m⁻³, 804 cfu m⁻³ and 177 cfu m⁻³ for flood-affected houses (AFH), and 2.74 p cm⁻³, 15 µg m⁻³, 547 cfu m⁻³ and 167 cfu m⁻³ for non-affected houses (NFH), respectively. The I/O (indoor/outdoor) ratios of these pollutants were 1.08, 1.38, 0.74 and 1.76 for AFH and 1.03, 1.32, 0.83 and 2.17 for NFH, respectively. The average total elemental load (including transition metals) in indoor dust was 2296 ± 1328 µg m⁻² for AFH and 1454 ± 678 µg m⁻² for NFH, respectively. In general, the differences between AFH and NFH were not statistically significant, implying the absence of a measurable effect of the flood on air quality. We postulate that this was due to the very swift and effective cleaning of the flooded houses by 60,000 volunteers. Among the various cleaning methods, the use of both detergent and bleach was the most efficient at controlling indoor bacteria; all cleaning methods were equally effective for indoor fungi. This study provides quantitative evidence of the significant impact of immediate post-flood cleaning in mitigating the effects of flooding on indoor bioaerosol contamination and other pollutants.