919 results for one-boson-exchange models


Relevance: 30.00%

Abstract:

Cash-in-advance models usually require agents to reallocate money and bonds at fixed intervals, for example every month or quarter. I show that fixed periods underestimate the welfare cost of inflation. I use a model in which agents choose how often they exchange bonds for money. In the benchmark specification, the welfare cost of ten percent inflation, relative to zero inflation, increases from 0.1 percent of income with fixed periods to one percent with optimal periods. The results are robust to different preferences, to different compositions of income in bonds or money, and to the introduction of capital and labor.
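
To make the mechanism concrete, the sketch below uses a simple Baumol-Tobin-style money-demand setup, which is not the paper's model: the income, real interest rate and per-exchange cost are hypothetical, and the welfare cost of inflation is approximated as the change in foregone interest plus transaction costs. It only illustrates why letting agents choose the exchange frequency can raise the measured cost relative to a fixed monthly period.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Baumol-Tobin-style illustration (not the paper's model): an agent spends
# income Y evenly over the year, holds wealth in bonds at nominal rate i, and
# pays a fixed cost c per bond-to-money exchange.  With n exchanges per year,
# average money holdings are Y / (2n), so the cost of holding money is the
# foregone interest plus the transaction costs.  Parameters are illustrative.
Y, c, r = 1.0, 0.002, 0.02             # income, per-exchange cost, real rate

def cost(n, i):
    """Foregone interest plus transaction costs for n exchanges per year."""
    return i * Y / (2 * n) + c * n

def optimal_cost(i):
    res = minimize_scalar(cost, bounds=(0.1, 1e3), args=(i,), method="bounded")
    return res.fun

# Welfare cost of 10% inflation relative to 0%, as a share of income:
i0, i1 = r + 0.00, r + 0.10
fixed = cost(12, i1) - cost(12, i0)              # exchange every month (fixed)
optimal = optimal_cost(i1) - optimal_cost(i0)    # frequency chosen optimally
print(f"fixed monthly periods: {fixed:.2%} of income")
print(f"optimal periods      : {optimal:.2%} of income")
```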

Relevance: 30.00%

Abstract:

A potentially renewable and sustainable source of energy is the chemical energy associated with the solvation of salts. Mixing two aqueous streams with different saline concentrations is spontaneous and releases energy. The global theoretically obtainable power from salinity gradient energy, due to the discharge of the world's rivers into the oceans, has been estimated to be in the range of 1.4–2.6 TW. Reverse electrodialysis (RED) is one of the emerging membrane-based technologies for harvesting salinity gradient energy. A common RED stack is composed of alternately arranged cation- and anion-exchange membranes stacked between two electrodes. The compartments between the membranes are alternately fed with concentrated (e.g., sea water) and dilute (e.g., river water) saline solutions. Migration of the respective counter-ions through the membranes leads to an ionic current between the electrodes, where an appropriate redox pair converts the chemical salinity gradient energy into electrical energy. Given the need for new sources of energy for power generation, the present study aims to better understand and address the current challenges associated with RED stack design, fluid dynamics, ionic mass transfer and long-term RED stack performance with natural saline solutions as feed waters. Chronopotentiometry was used to determine the diffusion boundary layer (DBL) thickness from diffusion relaxation data, and flow entrance effects on mass transfer were found to benefit power generation in RED stacks. Increasing the linear flow velocity also decreases the DBL thickness, but at the cost of a higher pressure drop. The pressure drop inside RED stacks was successfully simulated by the mathematical model developed here, which includes the contribution of several pressure drops that had not been considered before. The effect of each pressure drop on RED stack performance was identified and rationalized, and guidelines for planning and/or optimizing RED stacks were derived. The design of new profiled membranes with a chevron corrugation structure was proposed using computational fluid dynamics (CFD) modeling. The performance of the suggested corrugation geometry was compared with existing geometries, as well as with the use of conductive and non-conductive spacers. According to the estimations, the use of chevron structures yields the highest net power density values, offering the best compromise between the mass transfer coefficient and the pressure drop. Finally, long-term experiments with natural waters were performed, during which fouling occurred. For the first time, 2D fluorescence spectroscopy was used to monitor RED stack performance, with a dedicated focus on following fouling on ion-exchange membrane surfaces. To extract relevant information from the fluorescence spectra, parallel factor analysis (PARAFAC) was performed. The information obtained was then used to predict net power density, stack electric resistance and pressure drop by multivariate statistical models based on projection to latent structures (PLS) modeling. The use in such models of 2D fluorescence data, containing hidden information about fouling on membrane surfaces that is extractable by PARAFAC, considerably improved the fit of the models to the experimental data.
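
As a rough orientation to the quantities involved (not taken from the thesis), the sketch below estimates the open-circuit voltage of a single RED cell pair with the Nernst equation and a maximum gross power density from a simple equivalent-circuit model, then subtracts an assumed pumping loss to obtain a net power density. The permselectivity, cell-pair area resistance, feed concentrations and pumping loss are assumed, order-of-magnitude values.

```python
import numpy as np

# Back-of-the-envelope RED estimate (illustrative values, not the thesis model).
R, T, F = 8.314, 298.15, 96485.0       # gas constant (J/mol/K), temperature (K), Faraday (C/mol)
alpha = 0.95                           # assumed average membrane permselectivity
c_sea, c_river = 0.5, 0.017            # feed concentrations in mol/L (typical sea/river water)

# Open-circuit voltage of one cell pair (one CEM + one AEM) for a monovalent
# salt, neglecting activity coefficients:
E_pair = 2 * alpha * (R * T / F) * np.log(c_sea / c_river)

# Maximum gross power density from a simple equivalent circuit, P = E^2 / (4R),
# with an assumed area resistance per cell pair (membranes + compartments):
R_cell = 2e-3                          # ohm·m^2 (= 20 ohm·cm^2, assumed)
P_gross = E_pair**2 / (4 * R_cell)     # W per m^2 of cell-pair area

P_pump = 1.0                           # W/m^2, assumed pumping loss
print(f"open-circuit voltage per cell pair: {E_pair * 1e3:.0f} mV")
print(f"gross power density: {P_gross:.1f} W/m^2, net ~ {P_gross - P_pump:.1f} W/m^2")
```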

Relevance: 30.00%

Abstract:

Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations towards regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed because of the urgency of sheltering affected populations. However, by neglecting exposure risks in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, which in turn will allow new houses to withstand the passage of time and natural disasters in the safest way possible. This master thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this end, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology for flood-resilient housing models was proposed, identifying key criteria that housing should meet. The methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives – one of the countries most vulnerable to sea level rise resulting from climate change – was analyzed in light of housing recovery in a post-disaster scenario. This analysis was carried out using the proposed methodology, with the intent of assessing the resilience to floods of the housing built in the aftermath of the 2004 Indian Ocean Tsunami.

Relevance: 30.00%

Abstract:

This Working Project studies five portfolios of currency carry trades formed with the G10 currencies. Performance varies among strategies, and the most basic one presents the worst results. I also study the equity and Pure FX risk factors that may explain the portfolios' returns: equity factors do not explain these returns, while the Pure FX factors do for some of the strategies. Downside risk measures indicate the importance of using regime indicators to avoid losses. I conclude that although VAR and threshold regression models with a variety of regime indicators do not allow different regimes to be identified, a defined exogenous threshold on real exchange rates, a liquidity indicator and the volatilities of the spot exchange rates make it possible to increase the average returns and reduce the drawdowns of the carry trades.
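
A minimal sketch of the kind of regime filter described above, with simulated monthly carry returns and a simulated spot-volatility indicator (the data, threshold value and filter rule here are hypothetical, not the Working Project's): the carry position is held only while last month's volatility indicator is below an exogenous threshold.

```python
import numpy as np
import pandas as pd

# Illustrative threshold-based regime filter on a carry-trade return series.
rng = np.random.default_rng(0)
dates = pd.date_range("2005-01-01", periods=120, freq="MS")
carry = pd.Series(rng.normal(0.004, 0.025, len(dates)), index=dates)      # simulated monthly carry returns
vol = pd.Series(np.abs(rng.normal(0.08, 0.03, len(dates))), index=dates)  # simulated spot-volatility indicator

threshold = 0.12                        # exogenous volatility threshold (assumed)
in_market = vol.shift(1) < threshold    # decide with last month's indicator
filtered = carry.where(in_market, 0.0)  # stay in cash when the filter is off

def summary(r):
    cum = (1 + r).cumprod()
    max_dd = (cum / cum.cummax() - 1).min()
    return f"annualised mean {r.mean() * 12:.1%}, max drawdown {max_dd:.1%}"

print("unconditional carry :", summary(carry))
print("threshold-filtered  :", summary(filtered))
```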

Relevance: 30.00%

Abstract:

Both culture coverage and digital journalism are contemporary phenomena that have undergone several transformations within a short period of time. Whenever the media enter a period of uncertainty such as the present one, there is an attempt to innovate in order to seek sustainability, skip the crisis or find a new public. This indicates that there are new trends to be understood and explored: how are media innovating in a digital environment? Not only does the professional debate about the future of journalism justify the need to explore the issue, but so do the academic approaches to cultural journalism. However, none of the studies so far have considered innovation as a motto or driver and tried to explain how the media are covering culture, achieving sustainability and engaging with readers in a digital environment. This research examines how European media which specialize in culture or have an important cultural section are innovating in a digital environment. Specifically, we look at how these innovation strategies are pursued in relation to the approach to culture and dominant cultural areas, editorial models, the use of digital tools for telling stories, overall brand positioning and extensions, engagement with the public, and business models. We conducted a mixed-methods study combining case studies of four media projects, integrating qualitative web feature and content analysis with quantitative web content analysis. The four case studies chosen were two major general-interest journalistic brands which started as print newspapers – The Guardian (London, UK) and Público (Lisbon, Portugal) – a magazine specialized in international affairs, culture and design – Monocle (London, UK) – and a native digital media project launched by a cultural organization – Notodo, by La Fábrica. Findings suggest, on the one hand, that we are witnessing a paradigm shift in culture coverage in a digital environment, challenging traditional boundaries related to cultural themes and scope, angles, genres, content format and delivery, engagement and business models. Innovation in the four case studies lies especially along the product dimensions (format and content), brand positioning and process (business model and ways to engage with users). On the other hand, there are still perennial values that are crucial to innovation and sustainability, such as commitment to journalism, consistency (towards the reader, brand extensions and the advertiser), intelligent differentiation and the capability of knowing what innovation means and how it can be applied, since this thesis also confirms that one formula doesn't suit all. Changing mindsets, overcoming cultural inertia and optimizing the memory of the websites, looking at them as living, organic bodies which continuously interact with readers in many different ways and not as a closed collection of articles, are still the main challenges for some media.

Relevance: 30.00%

Abstract:

Master's dissertation in Finance

Relevance: 30.00%

Abstract:

Various differential cross-sections are measured in top-quark pair (tt¯) events produced in proton–proton collisions at a centre-of-mass energy of √s = 7 TeV at the LHC with the ATLAS detector. These differential cross-sections are measured in a data set corresponding to an integrated luminosity of 4.6 fb−1. The differential cross-sections are presented in terms of kinematic variables of a top-quark proxy referred to as the pseudo-top-quark, whose dependence on theoretical models is minimal. The pseudo-top-quark can be defined in an analogous way in terms of either reconstructed detector objects or stable particles. The measurements are performed on tt¯ events in the lepton+jets channel, requiring exactly one charged lepton and at least four jets, with at least two of them tagged as originating from a b-quark. The hadronic and leptonic pseudo-top-quarks are defined via the hadronic or leptonic decay mode of the W boson produced by the top-quark decay in events with a single charged lepton. The cross-section is measured as a function of the transverse momentum and rapidity of both the hadronic and the leptonic pseudo-top-quark, as well as of the transverse momentum, rapidity and invariant mass of the pseudo-top-quark pair system. The measurements are corrected for detector effects and are presented within a kinematic range that closely matches the detector acceptance. The differential cross-section measurements of the pseudo-top-quark variables are compared with several Monte Carlo models that implement next-to-leading-order or leading-order multi-leg matrix-element calculations.
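
The pseudo-top-quark observables above are built by summing four-vectors of reconstructed (or stable-particle-level) objects. The toy sketch below shows only this four-vector bookkeeping for one hypothetical event; the actual ATLAS definition additionally fixes the neutrino longitudinal momentum with a W-mass constraint and prescribes the jet assignments.

```python
import numpy as np

# Toy four-vector bookkeeping for pseudo-top-style observables (illustrative
# only).  Four-vectors are (E, px, py, pz) in GeV.
def four_vec(pt, eta, phi, m):
    px, py, pz = pt * np.cos(phi), pt * np.sin(phi), pt * np.sinh(eta)
    E = np.sqrt(m**2 + px**2 + py**2 + pz**2)
    return np.array([E, px, py, pz])

def mass(p):
    E, px, py, pz = p
    return np.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Hypothetical event: hadronic side = two light jets (W candidate) + b-jet,
# leptonic side = charged lepton + neutrino estimate + b-jet.
had_top = four_vec(80, 0.5, 0.1, 10) + four_vec(45, -0.2, 0.9, 8) + four_vec(70, 0.3, -1.2, 5)
lep_top = four_vec(60, -0.8, 2.5, 0.0) + four_vec(50, -0.5, 2.9, 0.0) + four_vec(65, -1.1, 2.0, 5)

ttbar = had_top + lep_top
pt_ttbar = np.hypot(ttbar[1], ttbar[2])
print(f"m(hadronic pseudo-top) ~ {mass(had_top):.0f} GeV")
print(f"m(pseudo-top pair) ~ {mass(ttbar):.0f} GeV, pT ~ {pt_ttbar:.0f} GeV")
```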

Relevance: 30.00%

Abstract:

We report the observation of Higgs boson decays to WW∗ based on an excess over background of 6.1 standard deviations in the dilepton final state, where the Standard Model expectation is 5.8 standard deviations. Evidence for the vector-boson fusion (VBF) production process is obtained with a significance of 3.2 standard deviations. The results are obtained from a data sample corresponding to an integrated luminosity of 25 fb−1 from √s = 7 and 8 TeV pp collisions recorded by the ATLAS detector at the LHC. For a Higgs boson mass of 125.36 GeV, the ratio of the measured value to the expected value of the total production cross section times branching fraction is 1.09 +0.16/−0.15 (stat.) +0.17/−0.14 (syst.). The corresponding ratios for the gluon-fusion and vector-boson-fusion production mechanisms are 1.02 ± 0.19 (stat.) +0.22/−0.18 (syst.) and 1.27 +0.44/−0.40 (stat.) +0.30/−0.21 (syst.), respectively. At √s = 8 TeV, the total production cross sections are measured to be σ(gg→H→WW∗) = 4.6 ± 0.9 (stat.) +0.8/−0.7 (syst.) pb and σ(VBF H→WW∗) = 0.51 +0.17/−0.15 (stat.) +0.13/−0.08 (syst.) pb. The fiducial cross section is determined for the gluon-fusion process in exclusive final states with zero or one associated jet.

Relevance: 30.00%

Abstract:

A search for a heavy, CP-odd Higgs boson, A, decaying into a Z boson and a 125 GeV Higgs boson, h, with the ATLAS detector at the LHC is presented. The search uses proton–proton collision data at a centre-of-mass energy of 8 TeV corresponding to an integrated luminosity of 20.3 fb−1. Decays of the CP-even h boson to ττ or bb pairs with the Z boson decaying to electron or muon pairs are considered, as well as h→bb decays with the Z boson decaying to neutrinos. No evidence for the production of an A boson in these channels is found, and the 95% confidence level upper limits derived for σ(gg→A) × BR(A→Zh) × BR(h→ff¯) are 0.098–0.013 pb for f=τ and 0.57–0.014 pb for f=b in the range mA = 220–1000 GeV. The results are combined and interpreted in the context of two-Higgs-doublet models.

Relevance: 30.00%

Abstract:

A search is presented for the direct pair production of a chargino and a neutralino, pp → χ̃₁±χ̃₂⁰, where the chargino decays to the lightest neutralino and a W boson, χ̃₁± → χ̃₁⁰(W± → ℓ±ν), while the neutralino decays to the lightest neutralino and the 125 GeV Higgs boson, χ̃₂⁰ → χ̃₁⁰(h → bb/γγ/ℓ±νqq). The final states considered for the search have large missing transverse momentum, an isolated electron or muon, and one of the following: either two jets identified as originating from bottom quarks, or two photons, or a second electron or muon with the same electric charge. The analysis is based on 20.3 fb−1 of √s = 8 TeV proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with the Standard Model expectations, and limits are set in the context of a simplified supersymmetric model.

Relevance: 30.00%

Abstract:

A search for heavy leptons decaying to a Z boson and an electron or a muon is presented. The search is based on pp collision data taken at √s = 8 TeV by the ATLAS experiment at the CERN Large Hadron Collider, corresponding to an integrated luminosity of 20.3 fb−1. Three high-transverse-momentum electrons or muons are selected, with two of them required to be consistent with originating from a Z boson decay. No significant excess above Standard Model background predictions is observed, and 95% confidence level limits on the production cross section of high-mass trilepton resonances are derived. The results are interpreted in the context of vector-like lepton and type-III seesaw models. For the vector-like lepton model, most heavy lepton mass values in the range 114–176 GeV are excluded. For the type-III seesaw model, most mass values in the range 100–468 GeV are excluded.

Relevance: 30.00%

Abstract:

A search for the Standard Model Higgs boson produced in association with a pair of top quarks, tt¯H, is presented. The analysis uses 20.3 fb−1 of pp collision data at √s = 8 TeV, collected with the ATLAS detector at the Large Hadron Collider during 2012. The search is designed for the H→bb¯ decay mode and uses events containing one or two electrons or muons. In order to improve the sensitivity of the search, events are categorised according to their jet and b-tagged jet multiplicities. A neural network is used to discriminate between signal and background events, the latter being dominated by tt¯+jets production. In the single-lepton channel, variables calculated using a matrix element method are included as inputs to the neural network to improve discrimination of the irreducible tt¯+bb¯ background. No significant excess of events above the background expectation is found and an observed (expected) limit of 3.4 (2.2) times the Standard Model cross section is obtained at 95% confidence level. The ratio of the measured tt¯H signal cross section to the Standard Model expectation is found to be μ = 1.5 ± 1.1, assuming a Higgs boson mass of 125 GeV.
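
As an illustration of the discriminant construction described above (and not the ATLAS analysis itself), the toy sketch below trains a small neural network on synthetic features loosely standing in for jet multiplicity, b-tagged jet multiplicity and a matrix-element-based variable; all distributions and network settings are invented for the example.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Toy signal-vs-background discriminant: signal-like events tend to have more
# jets, more b-tagged jets and a higher matrix-element-like score than the
# tt+jets-like background.  All numbers are synthetic.
rng = np.random.default_rng(1)
n = 5000
sig = np.column_stack([rng.poisson(6, n), rng.poisson(4, n), rng.normal(1.0, 0.5, n)])
bkg = np.column_stack([rng.poisson(5, n), rng.poisson(2, n), rng.normal(0.0, 0.5, n)])
X = np.vstack([sig, bkg]).astype(float)
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"toy separation (test accuracy): {clf.score(X_te, y_te):.2f}")
```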

Relevance: 30.00%

Abstract:

A search has been performed for pair production of heavy vector-like down-type (B) quarks. The analysis explores the lepton-plus-jets final state, characterized by events with one isolated charged lepton (electron or muon), significant missing transverse momentum and multiple jets. One or more jets are required to be tagged as arising from b-quarks, and at least one pair of jets must be tagged as arising from the hadronic decay of an electroweak boson. The analysis uses the full data sample of pp collisions recorded in 2012 by the ATLAS detector at the LHC, operating at a center-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 20.3 fb−1. No significant excess of events is observed above the expected background. Limits are set on vector-like B production, as a function of the B branching ratios, assuming the allowable decay modes are B→Wt/Zb/Hb. In the chiral limit with a branching ratio of 100% for the decay B→Wt, the observed (expected) 95% CL lower limit on the vector-like B mass is 810 GeV (760 GeV). In the case where the vector-like B quark has branching ratio values corresponding to those of an SU(2) singlet state, the observed (expected) 95% CL lower limit on the vector-like B mass is 640 GeV (505 GeV). The same analysis, when used to investigate pair production of a colored, charge 5/3 exotic fermion T5/3, with subsequent decay T5/3→Wt, sets an observed (expected) 95% CL lower limit on the T5/3 mass of 840 GeV (780 GeV).

Relevance: 30.00%

Abstract:

The production of a W boson decaying to eν or μν in association with a W or Z boson decaying to two jets is studied using 4.6 fb−1 of proton–proton collision data at √s = 7 TeV recorded with the ATLAS detector at the LHC. The combined WW+WZ cross section is measured with a significance of 3.4σ and is found to be 68 ± 7 (stat.) ± 19 (syst.) pb, in agreement with the Standard Model expectation of 61.1 ± 2.2 pb. The distribution of the transverse momentum of the dijet system is used to set limits on anomalous contributions to the triple gauge coupling vertices and on parameters of an effective-field-theory model.

Relevance: 30.00%

Abstract:

Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb−1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered, with increasing missing transverse momentum requirements between ETmiss > 150 GeV and ETmiss > 700 GeV. Good agreement is observed between the number of events in data and the Standard Model expectations. The results are translated into exclusion limits on models with large extra spatial dimensions, pair production of weakly interacting dark matter candidates, and production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.