942 results for Search for the Truth


Relevance: 100.00%

Abstract:

A search for the pair production of heavy leptons (N⁰, L±) predicted by the type-III seesaw theory, formulated to explain the origin of small neutrino masses, is presented. The decay channels N⁰ → W±ℓ∓ (ℓ = e, μ, τ) and L± → W±ν (ν = νe, νμ, ντ) are considered. The analysis is performed using the final state that contains two leptons (electrons or muons), two jets from a hadronically decaying W boson, and large missing transverse momentum. The data used in the measurement correspond to an integrated luminosity of 20.3 fb⁻¹ of pp collisions at √s = 8 TeV collected by the ATLAS detector at the LHC. No evidence of heavy lepton pair production is observed. Heavy leptons with masses below 325--540 GeV are excluded at the 95% confidence level, depending on the theoretical scenario considered.

Relevance: 100.00%

Abstract:

A search for the production of single top quarks in association with missing energy is performed in proton--proton collisions at a centre-of-mass energy of √s = 8 TeV with the ATLAS experiment at the Large Hadron Collider, using data collected in 2012 corresponding to an integrated luminosity of 20.3 fb⁻¹. In this search, the W boson from the top quark is required to decay into an electron or a muon and a neutrino. No deviation from the Standard Model prediction is observed, and upper limits are set on the production cross-section for resonant and non-resonant production of an invisible exotic state in association with a right-handed top quark. In the case of resonant production, for a spin-0 resonance with a mass of 500 GeV, an effective coupling strength above 0.15 is excluded at 95% confidence level for the top quark and an invisible spin-1/2 state with mass between 0 GeV and 100 GeV. In the case of non-resonant production, an effective coupling strength above 0.2 is excluded at 95% confidence level for the top quark and an invisible spin-1 state with mass between 0 GeV and 657 GeV.

Relevance: 100.00%

Abstract:

The ATLAS detector at the Large Hadron Collider at CERN is used to search for the decay of a scalar boson to a pair of long-lived particles, neutral under the Standard Model gauge group, in 20.3 fb⁻¹ of data collected in proton--proton collisions at √s = 8 TeV. This search is sensitive to long-lived particles that decay to Standard Model particles, producing jets at the outer edge of the ATLAS electromagnetic calorimeter or inside the hadronic calorimeter. No significant excess of events is observed. Limits are reported on the product of the scalar boson production cross-section and the branching ratio into long-lived neutral particles, as a function of the proper lifetime of the particles. Limits are reported for boson masses from 100 GeV to 900 GeV and long-lived neutral particle masses from 10 GeV to 150 GeV.

Relevance: 100.00%

Abstract:

A search for flavour-changing neutral current decays of a top quark to an up-type quark (q = u, c) and the Standard Model Higgs boson, where the Higgs boson decays to bb̄, is presented. The analysis searches for top-quark pair events in which one top quark decays to Wb, with the W boson decaying leptonically, and the other top quark decays to Hq. The search is based on pp collisions at √s = 8 TeV recorded in 2012 with the ATLAS detector at the CERN Large Hadron Collider and uses an integrated luminosity of 20.3 fb⁻¹. Data are analysed in the lepton-plus-jets final state, characterised by an isolated electron or muon and at least four jets. The search exploits the high multiplicity of b-quark jets characteristic of signal events, and employs a likelihood discriminant that uses the kinematic differences between the signal and the background, which is dominated by tt̄ → WbWb decays. No significant excess of events above the background expectation is found, and observed (expected) 95% CL upper limits of 0.56% (0.42%) and 0.61% (0.64%) are derived for the t → Hc and t → Hu branching ratios, respectively. The combination of this search with other ATLAS searches in the H → γγ, WW* and ττ decay modes significantly improves the sensitivity, yielding observed (expected) 95% CL upper limits on the t → Hc and t → Hu branching ratios of 0.46% (0.25%) and 0.45% (0.29%), respectively. The corresponding combined observed (expected) upper limits on the |λtcH| and |λtuH| couplings are 0.13 (0.10) and 0.13 (0.10), respectively. These are the most restrictive direct bounds on tqH interactions measured so far.

Relevance: 100.00%

Abstract:

Many extensions of the Standard Model posit the existence of heavy particles with long lifetimes. This article presents the results of a search for events containing at least one long-lived particle that decays at a significant distance from its production point into two leptons or into five or more charged particles. This analysis uses a data sample of proton--proton collisions at √s = 8 TeV corresponding to an integrated luminosity of 20.3 fb⁻¹ collected in 2012 by the ATLAS detector operating at the Large Hadron Collider. No events are observed in any of the signal regions, and limits are set on model parameters within supersymmetric scenarios involving R-parity violation, split supersymmetry, and gauge mediation. In some of the search channels, the trigger and search strategy are based only on the decay products of individual long-lived particles, irrespective of the rest of the event. In these cases, the provided limits can easily be reinterpreted in different scenarios.

Relevance: 100.00%

Abstract:

We accomplish two goals. First, we provide a non-cooperative foundation for the use of the Nash bargaining solution in search markets. This finding should help to close the rift between the search and the matching-and-bargaining literature. Second, we establish that the diversity of quality offered (at an increasing price-quality ratio) in a decentralized market is an equilibrium phenomenon - even in the limit as search frictions disappear.

Relevance: 100.00%

Abstract:

This paper applies probability and decision theory in the graphical interface of an influence diagram to study the formal requirements of rationality which justify the individualization of a person found through a database search. The decision-theoretic part of the analysis studies the parameters that a rational decision maker would use to individualize the selected person. The modeling part (in the form of an influence diagram) clarifies the relationships between this decision and the ingredients that make up the database search problem, i.e., the results of the database search and the different pairs of propositions describing whether an individual is at the source of the crime stain. These analyses evaluate the desirability associated with the decision of 'individualizing' (and 'not individualizing'). They point out that this decision is a function of (i) the probability that the individual in question is, in fact, at the source of the crime stain (i.e., the state of nature), and (ii) the decision maker's preferences among the possible consequences of the decision (i.e., the decision maker's loss function). We discuss the relevance and argumentative implications of these insights with respect to recent comments in specialized literature, which suggest points of view that are opposed to the results of our study.
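For illustration, the decision-theoretic core of this analysis can be reduced to a comparison of expected losses. The sketch below is not the paper's influence diagram; it only shows how the two ingredients identified above, the probability that the selected person is the source of the crime stain and the decision maker's loss function, combine into a minimum-expected-loss decision. The function name, the variable names and all numeric losses are hypothetical.

```python
# Illustrative sketch (not the paper's model): expected-loss comparison for the
# decision "individualize" vs. "do not individualize" a person found through a
# database search. All numeric loss values below are hypothetical.

def expected_loss(p_source: float, loss: dict) -> dict:
    """p_source: probability that the selected person is the source of the stain.
    loss: losses assigned to each (decision, state-of-nature) consequence."""
    return {
        "individualize":
            p_source * loss[("individualize", "source")]
            + (1.0 - p_source) * loss[("individualize", "not_source")],
        "do_not_individualize":
            p_source * loss[("do_not_individualize", "source")]
            + (1.0 - p_source) * loss[("do_not_individualize", "not_source")],
    }

# Hypothetical loss function: a false individualization is costed far more
# heavily than a missed individualization; correct decisions cost nothing.
loss = {
    ("individualize", "source"): 0.0,
    ("individualize", "not_source"): 100.0,
    ("do_not_individualize", "source"): 1.0,
    ("do_not_individualize", "not_source"): 0.0,
}

losses = expected_loss(p_source=0.995, loss=loss)
print(losses, "->", min(losses, key=losses.get))  # decision minimizing expected loss
```

With this particular loss table, individualization only becomes the minimum-expected-loss decision once the source probability exceeds roughly 0.99; changing the relative weight given to a false individualization moves that threshold, which is exactly the dependence on the decision maker's preferences described in point (ii).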

Relevance: 100.00%

Abstract:

The vast majority of users do not look at results beyond the second page returned by a search engine, so if a site fails to appear among the top 20 results (i.e., within the first two pages), it can be said that the page does not have good SEO and is therefore effectively invisible to users. The overall objective of this project is to conduct a study to discover which factors do (and do not) determine the positioning of websites in a search engine.

Relevance: 100.00%

Abstract:

In this paper I analyze the difficult question of the truth of mature scientific theories by tackling the problem of the truth of laws. After introducing the main philosophical positions in the field of scientific realism, I discuss and then counter the two main arguments against realism, namely the pessimistic meta-induction and the abstract and idealized character of scientific laws. I conclude by defending the view that well-confirmed physical theories are true only relative to certain values of the variables that appear in the laws.

Relevance: 100.00%

Abstract:

This thesis introduces the Salmon Algorithm, a search meta-heuristic which can be used for a variety of combinatorial optimization problems. This algorithm is loosely based on the path finding behaviour of salmon swimming upstream to spawn. There are a number of tunable parameters in the algorithm, so experiments were conducted to find the optimum parameter settings for different search spaces. The algorithm was tested on one instance of the Traveling Salesman Problem and found to have superior performance to an Ant Colony Algorithm and a Genetic Algorithm. It was then tested on three coding theory problems - optimal edit codes, optimal Hamming distance codes, and optimal covering codes. The algorithm produced improvements on the best known values for five of six of the test cases using edit codes. It matched the best known results on four out of seven of the Hamming codes as well as three out of three of the covering codes. The results suggest the Salmon Algorithm is competitive with established guided random search techniques, and may be superior in some search spaces.
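The abstract does not specify the Salmon Algorithm's mechanics, so no attempt is made to reproduce it here. As a point of reference only, the sketch below shows a generic guided random search of the kind such meta-heuristics are benchmarked against on the Travelling Salesman Problem: random restarts combined with 2-opt local improvement. All function names and parameter values are illustrative.

```python
# Illustrative baseline only (not the Salmon Algorithm): random restarts with
# 2-opt local improvement on a small random Travelling Salesman instance.
import math
import random

def tour_length(tour, coords):
    """Total length of a closed tour over the given city coordinates."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, coords):
    """Repeatedly reverse tour segments while doing so shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(candidate, coords) < tour_length(tour, coords):
                    tour, improved = candidate, True
    return tour

def guided_random_search(coords, restarts=10, seed=0):
    """Best 2-opt local optimum found over several random starting tours."""
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):
        tour = list(range(len(coords)))
        rng.shuffle(tour)
        tour = two_opt(tour, coords)
        if best is None or tour_length(tour, coords) < tour_length(best, coords):
            best = tour
    return best

rng = random.Random(42)
cities = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(15)]
best = guided_random_search(cities)
print(round(tour_length(best, cities), 1))
```

Here the restarts parameter plays the same role as the tunable parameters discussed in the abstract: it trades run time for solution quality, which is why such settings are typically determined experimentally for each search space.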

Relevance: 100.00%

Abstract:

In 2008, the Truth and Reconciliation Commission of Canada (TRC) was initiated to address the historical and contemporary injustices and impacts of Indian Residential Schools. Of the many goals of the TRC, I focus on reconciliation and how the TRC aims to promote this through public education and engagement. To explore this, I consider two questions: 1) who does the TRC include in the process of reconciliation? And 2) how might I, as someone who is not Indigenous (specifically, as someone who is “white”), be engaged by the TRC? Ethical queries arise which speak to broader concerns about the TRC’s capability to fulfill its public education goals. I raise several concerns about whether the TRC’s plan to convoke the collective will result in over-simplifying the process by relying on blunt, poorly defined identity categories that erase the heterogeneity of those residing in Canada, as well as the complexity of the conflict among us. I attempt to situate myself in-between proclamations of “success” or “failure” of the TRC, to better understand what can be learned from contested truths and experiences of uncertainty.

Relevance: 100.00%

Abstract:

Truth commissions and criminal trials have come to be perceived as complementary transitional justice mechanisms. However, where effective prosecutions are dependent on the exchange of information and transfer of suspects between states under existing mutual legal assistance and extradition arrangements, the operation of a truth commission in the state of territoriality may act as an obstacle to international cooperation. At the same time, requests for assistance from a third state pursuing prosecutions may impact negatively on the truth commission process in the requested state by inhibiting those reluctant to become involved in criminal proceedings from offering testimony. This article demonstrates a practical discord between these bodies when they operate in different states and questions whether they can truly be considered “complementary”.

Relevance: 100.00%

Abstract:

The Stochastic Diffusion Search (SDS) was developed as a solution to the best-fit search problem. Thus, as a special case it is capable of solving the transform-invariant pattern recognition problem. SDS is efficient and, although inherently probabilistic, produces very reliable solutions in widely ranging search conditions. However, to date a systematic formal investigation of its properties has not been carried out. This thesis addresses this problem. The thesis reports results pertaining to the global convergence of SDS as well as characterising its time complexity. However, the main emphasis of the work is on the resource allocation aspect of Stochastic Diffusion Search operation. The thesis introduces a novel model of the algorithm, generalising an Ehrenfest Urn Model from statistical physics. This approach makes it possible to obtain a thorough characterisation of the response of the algorithm in terms of the parameters describing the search conditions in the case of a unique best-fit pattern in the search space. This model is further generalised in order to account for different search conditions: two solutions in the search space, and search for a unique solution in a noisy search space. An approximate solution in the case of two alternative solutions is also proposed and compared with predictions of the extended Ehrenfest Urn model. The analysis performed enabled a quantitative characterisation of the Stochastic Diffusion Search in terms of exploration and exploitation of the search space. It appeared that SDS is biased towards the latter mode of operation. This novel perspective on the Stochastic Diffusion Search led to an investigation of extensions of the standard SDS that would strike a different balance between these two modes of search space processing. Thus, two novel algorithms were derived from the standard Stochastic Diffusion Search, ‘context-free’ and ‘context-sensitive’ SDS, and their properties were analysed with respect to resource allocation. It appeared that they shared some of the desired features of their predecessor but also possessed some properties not present in the classic SDS. The theory developed in the thesis was illustrated throughout with carefully chosen simulations of a best-fit search for a string pattern, a simple but representative domain, enabling careful control of search conditions.
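As a concrete reference point, the sketch below gives a minimal version of standard Stochastic Diffusion Search applied to best-fit string search, the illustrative domain used throughout the thesis: each agent holds a candidate position, performs a cheap partial test (one randomly chosen character of the model), and inactive agents either copy the hypothesis of a randomly polled active agent or re-sample at random (the diffusion step). The agent count, iteration limit and halting threshold are illustrative choices, not values taken from the thesis.

```python
# Minimal sketch of standard SDS for best-fit string search.
# Parameter values (n_agents, max_iters, threshold) are illustrative only.
import random
from collections import Counter

def sds_string_search(text, model, n_agents=100, max_iters=200, threshold=0.8, seed=0):
    rng = random.Random(seed)
    positions = range(len(text) - len(model) + 1)
    hypotheses = [rng.choice(positions) for _ in range(n_agents)]  # one hypothesis per agent
    active = [False] * n_agents

    for _ in range(max_iters):
        # Test phase: each agent checks one randomly chosen character of the model
        # against the text at its hypothesised position (a cheap partial evaluation).
        for a in range(n_agents):
            offset = rng.randrange(len(model))
            active[a] = text[hypotheses[a] + offset] == model[offset]

        # Diffusion phase: each inactive agent polls a random agent and copies its
        # hypothesis if that agent is active; otherwise it re-samples at random.
        for a in range(n_agents):
            if not active[a]:
                peer = rng.randrange(n_agents)
                hypotheses[a] = hypotheses[peer] if active[peer] else rng.choice(positions)

        # Halt once a large enough cluster of agents agrees on one position.
        best_pos, count = Counter(hypotheses).most_common(1)[0]
        if count >= threshold * n_agents:
            return best_pos
    return Counter(hypotheses).most_common(1)[0][0]

text = "the truth is rarely pure and never simple"
print(sds_string_search(text, "never"))  # with a perfect match present, converges to index 29
```

Roughly speaking, the partial evaluation in the test phase is what keeps each iteration cheap, while the copying of active hypotheses in the diffusion phase is the source of the exploitation bias discussed above.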

Relevance: 100.00%

Abstract:

This study examines the evolution of prices in markets with Internet price-comparison search engines. The empirical study analyzes laboratory data on prices available to informed consumers, for two industry sizes and two conditions on the sample (complete and incomplete). Distributions are typically bimodal. One of the two modes of the distribution, corresponding to monopoly pricing, attracts an increasing share of pricing strategies over time. The second, corresponding to interior pricing, follows a decreasing trend. Monopoly pricing can serve as a means of insurance against more competitive (but riskier) behavior. In fact, experimental subjects who initially earn low profits due to interior pricing are more likely to switch to monopoly pricing than subjects who experience good returns from the start.

Relevance: 100.00%

Abstract:

We present the results of a search for the effects of large extra spatial dimensions in pp̄ collisions at √s = 1.96 TeV in events containing a pair of energetic muons. The data correspond to 246 pb⁻¹ of integrated luminosity collected by the D0 experiment at the Fermilab Tevatron Collider. Good agreement with the expected background was found, yielding no evidence for large extra dimensions. We set 95% C.L. lower limits on the fundamental Planck scale between 0.85 and 1.27 TeV within several formalisms. These are the most stringent limits achieved in the dimuon channel to date.