85 results for R-matrix theory


Relevance:

30.00%

Publisher:

Abstract:

We report the observation of electroweak single top quark production in 3.2 fb⁻¹ of pp̅ collision data collected by the Collider Detector at Fermilab at √s = 1.96 TeV. Candidate events in the W+jets topology with a leptonically decaying W boson are classified as signal-like by four parallel analyses based on likelihood functions, matrix elements, neural networks, and boosted decision trees. These results are combined using a super discriminant analysis based on genetically evolved neural networks in order to improve the sensitivity. This combined result is further combined with that of a search for a single top quark signal in an orthogonal sample of events with missing transverse energy plus jets and no charged lepton. We observe a signal consistent with the standard model prediction but inconsistent with the background-only model by 5.0 standard deviations, with a median expected sensitivity in excess of 5.9 standard deviations. We measure a production cross section of 2.3 +0.6 -0.5 (stat+sys) pb, extract the value of the Cabibbo-Kobayashi-Maskawa matrix element |Vtb| = 0.91 ± 0.11 (stat+sys) ± 0.07 (theory), and set a lower limit |Vtb| > 0.71 at the 95% C.L., assuming m_t = 175 GeV/c².
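
The |Vtb| value above is extracted from the measured cross section under the standard assumption that single top production scales as |Vtb|²; a minimal sketch of that arithmetic follows, where the standard model reference cross section is an illustrative assumption chosen to reproduce the quoted central value rather than a number taken from the abstract.

```python
# Illustrative |Vtb| extraction, assuming sigma(single top) = |Vtb|^2 * sigma_SM.
# sigma_SM is an assumed reference value (~2.8 pb at m_t = 175 GeV/c^2), chosen
# only so that the quoted central value of 0.91 is reproduced.
import math

def vtb(sigma_measured_pb, sigma_sm_pb):
    """|Vtb| under the assumption sigma_measured = |Vtb|^2 * sigma_sm."""
    return math.sqrt(sigma_measured_pb / sigma_sm_pb)

print(round(vtb(2.3, 2.8), 2))  # ~0.91, in line with the value quoted above
```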

Relevance:

30.00%

Publisher:

Abstract:

We report the observation of electroweak single top quark production in 3.2 fb^-1 of ppbar collision data collected by the Collider Detector at Fermilab at sqrt{s} = 1.96 TeV. Candidate events in the W+jets topology with a leptonically decaying W boson are classified as signal-like by four parallel analyses based on likelihood functions, matrix elements, neural networks, and boosted decision trees. These results are combined using a super discriminant analysis based on genetically evolved neural networks in order to improve the sensitivity. This combined result is further combined with that of a search for a single top quark signal in an orthogonal sample of events with missing transverse energy plus jets and no charged lepton. We observe a signal consistent with the standard model prediction but inconsistent with the background-only model by 5.0 standard deviations, with a median expected sensitivity in excess of 5.9 standard deviations. We measure a production cross section of 2.3 +0.6 -0.5 (stat+sys) pb, extract the CKM matrix element value |Vtb| = 0.91 ± 0.11 (stat+sys) ± 0.07 (theory), and set a lower limit |Vtb| > 0.71 at the 95% confidence level, assuming m_t = 175 GeV/c^2.

Relevance:

30.00%

Publisher:

Abstract:

We report a measurement of the single top quark production cross section in 2.2 fb^-1 of ppbar collision data collected by the Collider Detector at Fermilab at sqrt{s} = 1.96 TeV. Candidate events are classified as signal-like by three parallel analyses which use likelihood, matrix element, and neural network discriminants. These results are combined in order to improve the sensitivity. We observe a signal consistent with the standard model prediction, but inconsistent with the background-only model by 3.7 standard deviations with a median expected sensitivity of 4.9 standard deviations. We measure a cross section of 2.2 +0.7 -0.6 (stat+sys) pb, extract the CKM matrix element value |V_{tb}| = 0.88 +0.13 -0.12 (stat+sys) ± 0.07 (theory), and set the limit |V_{tb}| > 0.66 at the 95% C.L.
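
For orientation, the significances quoted above (3.7 standard deviations observed, 4.9 expected) can be translated into one-sided Gaussian tail probabilities by the usual convention; the small sketch below only performs that conversion and is not part of the analysis itself.

```python
# Convert a significance quoted in Gaussian standard deviations into the
# conventional one-sided tail probability (p-value).
from math import erf, sqrt

def one_sided_p_value(n_sigma):
    return 0.5 * (1.0 - erf(n_sigma / sqrt(2.0)))

for s in (3.7, 4.9):
    print(f"{s} sigma  ->  p ≈ {one_sided_p_value(s):.1e}")
```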

Relevance:

30.00%

Publisher:

Abstract:

A precision measurement of the top quark mass m_t is obtained using a sample of ttbar events from ppbar collisions at the Fermilab Tevatron with the CDF II detector. Selected events require an electron or muon, large missing transverse energy, and exactly four high-energy jets, at least one of which is tagged as coming from a b quark. A likelihood is calculated using a matrix element method with quasi-Monte Carlo integration taking into account finite detector resolution and jet mass effects. The event likelihood is a function of m_t and a parameter Δ_JES to calibrate the jet energy scale in situ. Using a total of 1087 events, a value of m_t = 173.0 +/- 1.2 GeV/c^2 is measured.
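
The likelihood described above depends jointly on m_t and a jet-energy-scale parameter calibrated in situ. The toy below is only a schematic of that two-parameter maximization: it replaces the matrix element integration with Gaussian shapes for reconstructed top- and W-mass observables, and every distribution and numeric value in it is an assumption for illustration.

```python
# Toy two-parameter fit in the spirit of the in-situ JES calibration described
# above: a W-mass-like observable pins down the jet energy scale (JES), and a
# top-mass-like observable then determines m_t. Gaussian shapes and all numeric
# values are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)
true_mt, true_jes, n_events = 173.0, 1.00, 1087
m_top_rec = true_jes * rng.normal(true_mt, 15.0, n_events)  # reconstructed top mass
m_w_rec   = true_jes * rng.normal(80.4, 8.0, n_events)      # reconstructed W mass

def neg_log_likelihood(mt, jes, sig_top=15.0, sig_w=8.0):
    r_top = (m_top_rec - jes * mt) / sig_top    # top-mass term depends on m_t and JES
    r_w   = (m_w_rec - jes * 80.4) / sig_w      # W-mass term constrains JES in situ
    return 0.5 * (np.sum(r_top**2) + np.sum(r_w**2))

mt_grid  = np.linspace(168.0, 178.0, 101)
jes_grid = np.linspace(0.96, 1.04, 81)
nll = np.array([[neg_log_likelihood(m, j) for j in jes_grid] for m in mt_grid])
i, j = np.unravel_index(np.argmin(nll), nll.shape)
print(f"fitted m_t ≈ {mt_grid[i]:.1f} GeV/c^2, JES ≈ {jes_grid[j]:.3f}")
```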

Relevance:

30.00%

Publisher:

Abstract:

We report a measurement of the top quark mass, m_t, obtained from ppbar collisions at sqrt(s) = 1.96 TeV at the Fermilab Tevatron using the CDF II detector. We analyze a sample corresponding to an integrated luminosity of 1.9 fb^-1. We select events with an electron or muon, large missing transverse energy, and exactly four high-energy jets in the central region of the detector, at least one of which is tagged as coming from a b quark. We calculate a signal likelihood using a matrix element integration method, with effective propagators to take into account assumptions on event kinematics. Our event likelihood is a function of m_t and a parameter JES that determines the calibration of the jet energies in situ. We use a neural network discriminant to distinguish signal from background events. We also apply a cut on the peak value of each event likelihood curve to reduce the contribution of background and badly reconstructed events. Using the 318 events that pass all selection criteria, we find m_t = 172.7 +/- 1.8 (stat. + JES) +/- 1.2 (syst.) GeV/c^2.
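
One ingredient mentioned above is a neural network discriminant that separates signal-like from background-like events. The sketch below illustrates the idea on invented two-feature toy data with a small scikit-learn classifier; the actual analysis uses its own network, inputs and training, so everything here is an assumption for illustration.

```python
# Schematic of separating signal-like from background-like events with a neural
# network discriminant. The two-feature Gaussian toy data and the scikit-learn
# model are illustrative assumptions, not the analysis inputs.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
n = 2000
signal     = rng.normal([1.0, 1.0], [1.0, 1.0], size=(n, 2))     # toy "signal" features
background = rng.normal([-1.0, -1.0], [1.5, 1.5], size=(n, 2))   # toy "background" features
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0).fit(X, y)
scores = clf.predict_proba(X)[:, 1]   # discriminant output in [0, 1]
selected = scores > 0.8               # keep only the most signal-like events
print(f"selected {selected.sum()} of {len(y)} events, "
      f"signal purity ≈ {y[selected].mean():.2f}")
```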

Relevance:

30.00%

Publisher:

Abstract:

We present a measurement of the top quark mass in the all-hadronic channel (tt̄ → bb̄ q₁q̄₂q₃q̄₄) using 943 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV collected with the CDF II detector at Fermilab. We apply the standard model production and decay matrix element (ME) to tt̄ candidate events. We calculate per-event probability densities according to the ME calculation and construct template models of signal and background. The scale of the jet energy is calibrated using additional templates formed with the invariant mass of pairs of jets. These templates form an overall likelihood function that depends on the top quark mass and on the jet energy scale (JES). We estimate both by maximizing this function. Given 72 observed events, we measure a top quark mass of 171.1 ± 3.7 (stat.+JES) ± 2.1 (syst.) GeV/c². The combined uncertainty on the top quark mass is 4.3 GeV/c².
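
The combined uncertainty quoted in the last sentence is consistent with adding the stat.+JES and systematic components in quadrature, assuming they are independent; the sketch below reproduces the 4.3 GeV/c² figure from the two quoted components.

```python
# Combine the quoted uncertainty components in quadrature (assumed independent).
from math import hypot

stat_plus_jes = 3.7   # GeV/c^2
syst          = 2.1   # GeV/c^2
print(f"combined uncertainty ≈ {hypot(stat_plus_jes, syst):.1f} GeV/c^2")  # ≈ 4.3
```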

Relevance:

30.00%

Publisher:

Abstract:

We report the first observation of single top quark production using 3.2 fb^-1 of pbar p collision data at sqrt{s} = 1.96 TeV collected by the Collider Detector at Fermilab. The significance of the observed data is 5.0 standard deviations, and the expected sensitivity for standard model production and decay is in excess of 5.9 standard deviations. Assuming m_t = 175 GeV/c^2, we measure a cross section of 2.3 +0.6 -0.5 (stat+syst) pb, extract the CKM matrix element value |V_{tb}| = 0.91 ± 0.11 (stat+syst) ± 0.07 (theory), and set the limit |V_{tb}| > 0.71 at the 95% C.L.

Relevance:

30.00%

Publisher:

Abstract:

Governance has been one of the most popular buzzwords in recent political science. As with any term shared by numerous fields of research, as well as everyday language, governance is encumbered by a jungle of definitions and applications. This work elaborates on the concept of network governance. Network governance refers to complex policy-making situations, where a variety of public and private actors collaborate in order to produce and define policy. Governance consists of processes in which autonomous, self-organizing networks of organizations exchange information and deliberate. Network governance is a theoretical concept that corresponds to an empirical phenomenon. This phenomenon is often used to describe a historical development: changes in the political processes of Western societies since the 1980s. In this work, empirical governance networks are used as an organizing framework, and the concepts of autonomy, self-organization and network structure are developed as tools for empirical analysis of any complex decision-making process. This work develops this framework and explores the governance networks in the case of environmental policy-making in the City of Helsinki, Finland. The crafting of a local ecological sustainability programme required support and knowledge from all sectors of administration, a number of entrepreneurs and companies, and the inhabitants of Helsinki. The policy process relied explicitly on networking, with public and private actors collaborating to design policy instruments. Communication between individual organizations led to the development of network structures and patterns. This research analyses these patterns and their effects on policy choice by applying the methods of social network analysis. A variety of social network analysis methods are used to uncover different features of the networked process. Links between individual network positions, network subgroup structures and macro-level network patterns are compared to the types of organizations involved and the final policy instruments chosen. By using governance concepts to depict a policy process, the work aims to assess whether they contribute to models of policy-making. The conclusion is that the governance literature sheds light on events that would otherwise go unnoticed, or whose conceptualization would remain atheoretical. The framework of network governance should be in the toolkit of the policy analyst.
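
The analysis above relates individual network positions, subgroup structures and macro-level patterns. As a purely illustrative sketch of those three levels of description (the toy edge list, node names and the choice of the networkx library are assumptions, not the study's data or methods), the same kinds of quantities can be computed as follows.

```python
# Illustrative social-network-analysis measures on a toy collaboration graph:
# node-level position (degree centrality), subgroup structure (modularity
# communities) and a macro-level pattern (graph density). The edge list is an
# invented example, not data from the study.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [
    ("city_env_office", "ngo_a"), ("city_env_office", "company_x"),
    ("city_env_office", "city_planning"), ("ngo_a", "residents_assoc"),
    ("company_x", "company_y"), ("city_planning", "company_y"),
    ("residents_assoc", "ngo_b"), ("ngo_a", "ngo_b"),
]
g = nx.Graph(edges)

centrality = nx.degree_centrality(g)               # individual network positions
communities = greedy_modularity_communities(g)     # network subgroup structures
density = nx.density(g)                            # macro-level network pattern

print(max(centrality, key=centrality.get))         # most central actor
print([sorted(c) for c in communities])
print(f"network density ≈ {density:.2f}")
```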

Relevance:

30.00%

Publisher:

Abstract:

Wealthy individuals - business angels who invest a share of their net worth in entrepreneurial ventures - form an essential part of an informal venture capital market that can secure funding for entrepreneurial ventures. In Finland, business angels represent an untapped pool of capital that can contribute to fostering entrepreneurial development. In addition, business angels can bridge knowledge gaps in new business ventures by making their human capital available. This study has two objectives. The first is to gain an understanding of the characteristics and investment behaviour of Finnish business angels, with the strongest focus on due diligence procedures and involvement post investment. The second objective is to assess whether agency theory and incomplete contracting theory are useful theoretical lenses in the arena of business angels. To achieve the second objective, this study investigates i) how risk is mitigated in the investment process, ii) how uncertainty influences the comprehensiveness of due diligence, and iii) how control is allocated post investment. Research hypotheses are derived from assumptions underlying agency theory and incomplete contracting theory. The data for this study comprise interviews with 53 business angels; in terms of sample size this is the largest study on Finnish business angels. The research hypotheses are tested using regression analysis. This study suggests that the Finnish informal venture capital market comprises a limited number of business angels whose style of investing closely resembles that of their formal counterparts. Much focus is placed on managing risks prior to making the investment through strong selectiveness and relatively comprehensive due diligence. The involvement is rarely on a day-to-day basis, and many business angels seem to see board membership as a more suitable alternative than involvement in the operations of an entrepreneurial venture. The uncertainty involved does not seem to drive an increase in due diligence. On the contrary, it would appear that due diligence is more rigorous in safer later-stage investments and when the business angels have considerable previous experience as investors. Finnish business angels' involvement post investment is best explained by their degree of ownership in the entrepreneurial venture. It seems that when investors feel they are sufficiently rewarded, in terms of an adequate equity stake, they are willing to involve themselves actively in their investments. The lack of support for a relationship between increased uncertainty and the comprehensiveness of due diligence may partly be explained by an increasing trend towards portfolio diversification. This is triggered by a taxation system that favours investments through investment companies rather than direct investments. Many business angels appear to have replaced a specialization strategy that builds on reducing uncertainty with a diversification strategy that builds on reducing firm-specific (idiosyncratic) risk by holding shares in ventures whose returns are not expected to exhibit a strong positive correlation.

Relevance:

30.00%

Publisher:

Abstract:

This report presents a new theory of internal marketing. The thesis has developed as a case study in retrospective action research. This began with the personal involvement of the author in an action research project for customer service improvement at a large Australian retail bank. In other words, much of the theory-generating ‘research’ took place after the original project ‘action’ had wound down. The key theoretical proposition is that internal marketing is a relationship development strategy for the purpose of knowledge renewal. In the banking case, exchanges of value between employee participants emerged as the basis for relationship development, with synergistic benefits for customers, employees and the bank. Relationship development turned out to be the mediating variable between the learning activity of employee participants at the project level and success in knowledge renewal at the organisational level. Relationship development was also a pivotal factor in the motivation and customer consciousness of employees. The conclusion reached is that the strength of relationship-mediated internal marketing is in combining a market focused commitment and employee freedom in project work to achieve knowledge renewal. The forgotten truth is that organisational knowledge can be renewed through dialogue and learning, through being trustworthy, and by gaining the trust of employees in return.

Relevance:

30.00%

Publisher:

Abstract:

The goods-dominated marketing model has major shortcomings as a guiding marketing theory. Its marketing mix approach is mainly geared towards buying and does not include consumption as an integral part of marketing theory. Although it is during the process of consuming goods and services that value is generated for customers and the foundation for repeat purchasing and customer relationships is laid, this process is left outside the scope of marketing. The focus in service marketing is not on a product but on interactions in service encounters. Consumption has become an integral part of a holistic marketing model. Value propositions other than standardized, goods-based ones can be better understood by taking a service-based approach. It is concluded that marketing based on a goods logic is but a special case of marketing based on a service logic, applicable only in certain contexts with standardized products.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to deepen the understanding of market segmentation theory by studying the evolution of the concept and by identifying the antecedents and consequences of the theory. The research method was influenced by content analysis and meta-analysis. The evolution of market segmentation theory was studied as a reflection of the evolution of marketing theory. According to this study, the theory of market segmentation has its roots in microeconomics and it has been influenced by different disciplines, such as motivation research and buyer behaviour theory. Furthermore, this study suggests that the evolution of market segmentation theory can be divided into four major eras: the era of foundations, development and blossoming, stillness and stagnation, and the era of re-emergence. Market segmentation theory emerged in the mid-1950s and flourished between the mid-1950s and the late 1970s. During the 1980s the theory lost the interest of the scientific community and no significant contributions were made. Now, towards the dawn of the new millennium, new approaches have emerged and market segmentation has gained new attention.