29 results for Graph energy


Relevance:

20.00%

Publisher:

Abstract:

We report on a search for the standard model Higgs boson produced in association with a $W$ or $Z$ boson in $p\bar{p}$ collisions at $\sqrt{s} = 1.96$ TeV recorded by the CDF II experiment at the Tevatron in a data sample corresponding to an integrated luminosity of 2.1 fb$^{-1}$. We consider events which have no identified charged leptons, an imbalance in transverse momentum, and two or three jets where at least one jet is consistent with originating from the decay of a $b$ hadron. We find good agreement between data and predictions. We place 95% confidence level upper limits on the production cross section for several Higgs boson masses ranging from 110 to 150 GeV/$c^2$. For a mass of 115 GeV/$c^2$ the observed (expected) limit is 6.9 (5.6) times the standard model prediction.


A precision measurement of the top quark mass m_t is obtained using a sample of ttbar events from ppbar collisions at the Fermilab Tevatron with the CDF II detector. Selected events require an electron or muon, large missing transverse energy, and exactly four high-energy jets, at least one of which is tagged as coming from a b quark. A likelihood is calculated using a matrix element method with quasi-Monte Carlo integration taking into account finite detector resolution and jet mass effects. The event likelihood is a function of m_t and a parameter DJES to calibrate the jet energy scale in situ. Using a total of 1087 events, a value of m_t = 173.0 +/- 1.2 GeV/c^2 is measured.


We present a signature-based search for anomalous production of events containing a photon, two jets, of which at least one is identified as originating from a b quark, and missing transverse energy. The search uses data corresponding to an integrated luminosity of 2.0 fb^-1 from p-pbar collisions at a center-of-mass energy of sqrt(s) = 1.96 TeV, collected with the CDF II detector at the Fermilab Tevatron. From 6,697,466 events with a photon candidate with transverse energy ET > 25 GeV, we find 617 events with missing transverse energy > 25 GeV and two or more jets with ET > 15 GeV, at least one identified as originating from a b quark, versus an expectation of 607 ± 113 events. Increasing the requirement on missing transverse energy to 50 GeV, we find 28 events versus an expectation of 30 ± 11 events. We find no indications of non-standard-model phenomena.


All companies have a portfolio of customer relationships. From a managerial standpoint the value of these customer relationships is a key issue. The aim of the paper is to introduce a conceptual framework for customers’ energy towards a service provider. Customer energy is defined as the cognitive, affective and behavioural effort a customer puts into the purchase of an offering. It is based on two dimensions: life theme involvement and relationship commitment. Data from a survey study of 425 customers of an online gambling site was combined with data about their individual purchases and activity. Analysis showed that involvement and commitment influence both customer behaviour and attitudes. Customer involvement was found to be strongly related to overall spending within a consumption area, whereas relationship commitment is a better predictor of the amount of money spent at a particular company. Dividing the customers into four different involvement / commitment segments revealed differences in churn rates, word-of-mouth, brand attitude, switching propensity and the use of the service for socializing. The framework provides a tool for customer management by revealing differences in fundamental drivers of customer behaviour resulting in completely new customer portfolios. Knowledge of customer energy allows companies to manage their communication and offering development better and provides insight into the risk of losing a customer.


A distributed system is a collection of networked autonomous processing units which must work in a cooperative manner. Currently, large-scale distributed systems, such as various telecommunication and computer networks, are abundant and used in a multitude of tasks. The field of distributed computing studies what can be computed efficiently in such systems. Distributed systems are usually modelled as graphs where nodes represent the processors and edges denote communication links between processors. This thesis concentrates on the computational complexity of the distributed graph colouring problem. The objective of the graph colouring problem is to assign a colour to each node in such a way that no two nodes connected by an edge share the same colour. In particular, it is often desirable to use only a small number of colours. This task is a fundamental symmetry-breaking primitive in various distributed algorithms. A graph that has been coloured in this manner using at most k different colours is said to be k-coloured. This work examines the synchronous message-passing model of distributed computation: every node runs the same algorithm, and the system operates in discrete synchronous communication rounds. During each round, a node can communicate with its neighbours and perform local computation. In this model, the time complexity of a problem is the number of synchronous communication rounds required to solve the problem. It is known that 3-colouring any k-coloured directed cycle requires at least ½(log* k - 3) communication rounds and is possible in ½(log* k + 7) communication rounds for all k ≥ 3. This work shows that for any k ≥ 3, colouring a k-coloured directed cycle with at most three colours is possible in ½(log* k + 3) rounds. In contrast, it is also shown that for some values of k, colouring a directed cycle with at most three colours requires at least ½(log* k + 1) communication rounds. 
Furthermore, in the case of directed rooted trees, reducing a k-colouring into a 3-colouring requires at least log* k + 1 rounds for some values of k and is possible in log* k + 3 rounds for all k ≥ 3. The new positive and negative results are derived using computational methods, as the existence of distributed colouring algorithms corresponds to the colourability of so-called neighbourhood graphs. The colourability of these graphs is analysed using Boolean satisfiability (SAT) solvers. Finally, this thesis shows that similar methods are applicable in capturing the existence of distributed algorithms for other graph problems, such as the maximal matching problem.
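As an illustration of the colour-reduction primitive discussed above (a minimal sketch in the style of the classic Cole–Vishkin bit-comparison technique, not code taken from the thesis; function and variable names are my own), each node in a directed cycle can shrink the colour space from k to O(log k) in a single synchronous communication round:

```python
def colour_reduction_round(colours):
    """One synchronous round of colour reduction on a directed cycle.

    colours[i] is the current colour of node i; node i's predecessor
    in the cycle is node (i - 1) mod n.  Each node finds the lowest
    bit position b where its colour differs from its predecessor's
    and takes 2*b + (own bit at b) as its new colour.  Adjacent nodes
    either pick different positions b, or the same position with
    differing bits, so the new colouring is again proper.
    """
    n = len(colours)
    new = []
    for i in range(n):
        diff = colours[i] ^ colours[(i - 1) % n]  # nonzero if colouring is proper
        b = (diff & -diff).bit_length() - 1       # lowest differing bit position
        new.append(2 * b + ((colours[i] >> b) & 1))
    return new
```

Iterating this round O(log* k) times brings any proper k-colouring down to a constant number of colours, after which a constant number of further rounds suffices to reach three colours, matching the ½(log* k) + O(1) bounds quoted above.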


Hollow atoms in which the K shell is empty while the outer shells are populated allow studying a variety of important and unusual properties of atoms. The diagram x-ray emission lines of such atoms, the K-h alpha(1,2) hypersatellites (HSs), were measured for the 3d transition metals, Z=23-30, with high energy resolution using photoexcitation by monochromatized synchrotron radiation. Good agreement with ab initio relativistic multiconfigurational Dirac-Fock calculations was found. The measured HS intensity variation with the excitation energy yields accurate values for the excitation thresholds, excludes contributions from shake-up processes, and indicates that a nonshake process dominates near threshold. The Z variation of the HS shifts from the diagram line K alpha(1,2), the K-h alpha(1)-K-h alpha(2) splitting, and the K-h alpha(1)/K-h alpha(2) intensity ratio, derived from the measurements, are also discussed with a particular emphasis on the QED corrections and Breit interaction.


Gene mapping is a systematic search for genes that affect observable characteristics of an organism. In this thesis we offer computational tools to improve the efficiency of (disease) gene-mapping efforts. In the first part of the thesis we propose an efficient simulation procedure for generating realistic genetic data from isolated populations. Simulated data is useful for evaluating hypothesised gene-mapping study designs and computational analysis tools. As an example of such evaluation, we demonstrate how a population-based study design can be a powerful alternative to traditional family-based designs in association-based gene-mapping projects. In the second part of the thesis we consider a prioritisation of a (typically large) set of putative disease-associated genes acquired from an initial gene-mapping analysis. Prioritisation is necessary to be able to focus on the most promising candidates. We show how to harness current biomedical knowledge for the prioritisation task by integrating various publicly available biological databases into a weighted biological graph. We then demonstrate how to find and evaluate connections between entities, such as genes and diseases, from this unified schema by graph mining techniques. Finally, in the last part of the thesis, we define the concept of reliable subgraph and the corresponding subgraph extraction problem. Reliable subgraphs concisely describe strong and independent connections between two given vertices in a random graph, and hence they are especially useful for visualising such connections. We propose novel algorithms for extracting reliable subgraphs from large random graphs. The efficiency and scalability of the proposed graph mining methods are backed by extensive experiments on real data. While our application focus is in genetics, the concepts and algorithms can be applied to other domains as well.
We demonstrate this generality by considering coauthor graphs in addition to biological graphs in the experiments.
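The connection strength between two vertices in a random graph, as used above, can be approximated by straightforward Monte Carlo sampling. The sketch below is illustrative only (names and the sampling approach are my own, not the extraction algorithms proposed in the thesis): it estimates the probability that two vertices are connected when each edge exists independently with its own probability.

```python
import random
from collections import deque

def connection_prob(nodes, edges, s, t, trials=20000, seed=0):
    """Monte Carlo estimate of the probability that s and t are
    connected in a random graph.  edges is a list of (u, v, p)
    triples: the undirected edge {u, v} exists with probability p,
    independently of all other edges."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Sample one realisation of the random graph.
        adj = {v: [] for v in nodes}
        for u, v, p in edges:
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
        # Breadth-first search from s; count a hit if t is reached.
        seen = {s}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        hits += t in seen
    return hits / trials
```

For a two-edge path s–a–t with both edge probabilities 0.9, the true connection probability is 0.9 × 0.9 = 0.81, and the estimate converges to it as the number of trials grows.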


This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data, and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most out of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are not only interested in detecting an intruder but also in locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating–dominating codes are more appropriate.
This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating–dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs – these are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating–dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
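The gap between detecting and locating an intruder drawn above can be made concrete with a toy check (an illustrative sketch with my own naming, not code from the thesis): a sensor set may dominate a graph, i.e. detect an intruder anywhere, yet give two vertices the same sensor signature, so the intruder cannot be located.

```python
def is_dominating(adj, D):
    """True if every vertex is in D or has a neighbour in D
    (an intruder anywhere triggers at least one sensor)."""
    return all(v in D or any(u in D for u in adj[v]) for v in adj)

def is_identifying_code(adj, C):
    """True if every vertex's signature N[v] ∩ C is nonempty and
    distinct from every other vertex's signature, so the triggered
    sensors pinpoint the intruder's vertex."""
    signatures = {}
    for v in adj:
        sig = frozenset(C & ({v} | set(adj[v])))  # closed neighbourhood ∩ C
        if not sig or sig in signatures.values():
            return False  # undetectable, or indistinguishable from another vertex
        signatures[v] = sig
    return True
```

On the three-vertex path 0–1–2, the single centre vertex {1} is a dominating set but gives every vertex the identical signature {1}, so it is not an identifying code; the two endpoints {0, 2} do form an identifying code.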


The present study evaluates the feasibility of undelimbed Scots pine (Pinus sylvestris L.) for integrated production of pulp and energy in a kraft pulp mill from the technical, economic and environmental points of view, focusing on the potential of bundle harvesting. The feasibility of tree sections for pulp production was tested by conducting an industrial wood-handling experiment, laboratory cooking and bleaching trials, using conventional small-diameter Scots pine pulpwood as a reference. These trials showed that undelimbed Scots pine sections can be processed in favourable conditions as a blend with conventional small-diameter pulpwood without reducing the pulp quality. However, fibre losses at various phases of the process may increase when using undelimbed material. In the economic evaluation, both pulp production and wood procurement costs were considered, using the relative wood paying capability of a kraft pulp mill as a determinant. The calculations were made for three Scots pine first-thinning stands with the breast-height diameter of the removal (6–12 cm) as the main distinctive factor. The supply chains included in the comparison were based on cut-to-length harvesting, whole-tree harvesting and bundle harvesting (whole-tree bundling). With the current ratio of pulp and energy prices, the wood paying capability declines with an increase in the proportion of the energy fraction of the raw material. The supply system based on the cut-to-length method was the most efficient option, resulting in the highest residual value at stump in most cases. A decline in the pulp price and an increase in the energy price improved the competitiveness of the whole-tree systems. With short truck transportation distances and low pulp prices, however, the harvesting of loose whole trees can result in higher residual value at stump in small-diameter stands.
While savings in transportation costs did not compensate for the high cutting and compaction costs of the second prototype of the bundle harvester, an increase in transportation distances improved its competitiveness. Since harvesting undelimbed assortments increases nutrient export from the site, which can affect soil productivity, the whole-tree alternatives included in the present study cannot be recommended on infertile peatlands and mineral soils. The harvesting of loose whole trees or bundled whole trees implies a reduction in protective logging residues and an increase in site traffic or payloads. These factors increase the risk of soil damage, especially on peat soils with poor bearing capacity. Within the wood procurement parameters which were examined, the CO2 emissions of the supply systems varied from 13 to 27 kg/m³. Compaction of whole trees into bundles reduced emissions from transportation by 30–39%, but these reductions were insufficient to compensate for the increased emissions from cutting and compaction.


This dissertation examines the impacts of energy and climate policies on the energy and forest sectors, focusing on the case of Finland. The thesis consists of an introduction article and four separate studies. The dissertation was motivated by the climate concern and the increasing demand for renewable energy. In particular, the renewable energy consumption and greenhouse gas emission reduction targets of the European Union were driving this work. In Finland, both the forest and energy sectors are in key roles in achieving these targets. In fact, the separation between the forest and energy sectors is diminishing, as the energy sector is utilizing increasing amounts of wood in energy production and the forest sector is becoming an increasingly important energy producer. The objective of this dissertation is to identify and measure the impacts of climate and energy policies on the forest and energy sectors. In climate policy the focus is on emissions trading, and in energy policy the dissertation focuses on the promotion of renewable forest-based energy use. The dissertation relies on empirical numerical models that are based on microeconomic theory. Numerical partial equilibrium mixed complementarity problem models were constructed to study the markets under scrutiny. The separate studies focus on co-firing of wood biomass and fossil fuels, liquid biofuel production in the pulp and paper industry, and the impacts of climate policy on the pulp and paper sector. The dissertation shows that policies promoting wood-based energy may have unexpected negative impacts. When a feed-in tariff is imposed together with emissions trading, the production of renewable electricity in some plants might decrease as the emissions price increases. The dissertation also shows that in liquid biofuel production, an investment subsidy may cause high direct policy costs and other negative impacts when compared to other policy instruments.
The results of the dissertation also indicate that from the climate mitigation perspective, perfect competition is the favored wood market competition structure, at least if the emissions trading system is not global. In conclusion, this dissertation suggests that when promoting the use of wood biomass in energy production, the favored policy instruments are subsidies that directly promote renewable energy production (i.e. a production subsidy, renewables subsidy or feed-in premium). The policy instrument should also be designed to be dependent on the emissions price or on the substitute price. In addition, this dissertation shows that when planning policies to promote wood-based renewable energy, the goals of the policy scheme should be clear before decisions are made on the choice of policy instruments.