978 results for IMPROVED PROTOCOL
Abstract:
We present a search for the standard model Higgs boson produced with a Z boson in 4.1 fb^-1 of data collected with the CDF II detector at the Tevatron. In events consistent with the decay of the Higgs boson to a bottom-quark pair and the Z boson to electrons or muons, we set 95% credibility level upper limits on the ZH production cross section times the H -> b bbar branching ratio. Improved analysis methods enhance signal sensitivity by 20% relative to previous searches beyond the gain due to the larger data sample. At a Higgs boson mass of 115 GeV/c^2 we set a limit of 5.9 times the standard model value.
Abstract:
A popular dynamic imaging technique, k-t BLAST (ktB), is studied here for BAR imaging. ktB exploits correlations in k-space and time to reconstruct the image time series from only a fraction of the data. The algorithm works by unwrapping the aliased Fourier conjugate of k-t space (y-f space). The unwrapping step relies on an estimate of the true y-f space, obtained by acquiring densely sampled low k-space data. The drawbacks of this method include a separate training scan, blurred training estimates, and aliased phase maps. The proposed changes are the incorporation of phase information from the training map and the use of a generalized-series-extrapolated training map. The proposed technique is compared with ktB on real fMRI data. The proposed changes allow ktB to operate at an acceleration factor of 6. Performance is evaluated by comparing activation maps obtained from the reconstructed images. An improvement of up to 10 dB is observed in the PSNR of the activation maps. In addition, a 10% reduction in RMSE is obtained over the entire time series of fMRI images. The peak improvement of the proposed method over ktB is 35%, averaged over five data sets. (C) 2010 Elsevier Inc. All rights reserved.
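The evaluation above reports PSNR gains on activation maps and RMSE reductions over the image time series. As a minimal sketch of how those two metrics are typically computed (the exact peak definition used in the paper is an assumption here; this takes the peak from the reference image):

```python
import numpy as np

def psnr(reference, test):
    """Peak signal-to-noise ratio in dB between two image arrays."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    peak = np.max(np.abs(reference))  # assumed peak definition
    return 10.0 * np.log10(peak ** 2 / mse)

def rmse(reference, test):
    """Root-mean-square error between two image arrays (or time series)."""
    return np.sqrt(np.mean((reference.astype(float) - test.astype(float)) ** 2))
```

For example, a reconstruction that is uniformly off by 0.1 from a unit-intensity reference scores a PSNR of 20 dB and an RMSE of 0.1.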
Abstract:
Concerning the L2-stability of feedback systems containing a linear time-varying operator, some of the stringent restrictions that earlier criteria impose on the multiplier, as well as on the linear part of the system, are relaxed.
Abstract:
Suvi Nenonen. Customer asset management in action: using customer portfolios for allocating resources across business-to-business relationships for improved shareholder value. Customers are crucial assets to all firms, as customers are the ultimate source of all cash flows. Despite the financial importance of customer relationships, for decades there has been a lack of suitable frameworks explaining how customer relationships contribute to firm financial performance and how this contribution can be actively managed. To facilitate a better understanding of the customer asset, contemporary marketing has investigated the use of financial theories and asset management practices in the customer relationship context. Building on this, marketing academics have promoted the customer lifetime value concept as a solution for valuing and managing customer relationships for optimal financial outcomes. However, the empirical investigation of customer asset management lags behind these conceptual developments. Additionally, practitioners have not embraced the use of customer lifetime value in guiding managerial decisions, especially in the business-to-business context. The thesis points out that there are fundamental differences between customer relationships and investment instruments as investment targets, effectively eliminating the possibility of using financial theories in a customer relationship context or of optimizing the customer base as a single investment portfolio. As an alternative, the thesis proposes a customer portfolio approach for allocating resources across the customer base for improved shareholder value. In the customer portfolio approach, the customer base of a firm is divided into multiple portfolios based on the customer relationships' potential to contribute to shareholder value creation.
After this, customer management concepts are tailored to each customer portfolio, each designed to improve shareholder value in its own respect. Effective customer asset management with the customer portfolio approach therefore requires that firms be able to manage multiple parallel customer management concepts, or business models, simultaneously. The thesis is one of the first empirical studies of customer asset management, bringing empirical evidence from multiple business-to-business case studies on how customer portfolio models can be formed, how customer portfolios can be managed, and how customer asset management has contributed to firm financial performance.
Abstract:
In this two-part series of papers, a generalized non-orthogonal amplify and forward (GNAF) protocol, which generalizes several known cooperative diversity protocols, is proposed. Transmission in the GNAF protocol comprises two phases: the broadcast phase and the cooperation phase. In the broadcast phase, the source broadcasts its information to the relays as well as the destination. In the cooperation phase, the source and the relays together transmit a space-time code in a distributed fashion. The GNAF protocol relaxes the constraints imposed by the protocol of Jing and Hassibi on the code structure. In Part I of this paper, a code design criterion is obtained and it is shown that the GNAF protocol is both delay efficient and coding gain efficient. Moreover, the GNAF protocol enables the use of sphere decoders at the destination with a non-exponential maximum-likelihood (ML) decoding complexity. In Part II, several low decoding complexity code constructions are studied and a lower bound on the diversity-multiplexing gain tradeoff of the GNAF protocol is obtained.
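The two-phase structure described above can be illustrated with a toy numerical sketch. This is not the paper's distributed space-time code: it assumes real-valued signals, unit channel gains, a simple power-normalizing amplify step, and naive averaging at the destination, purely to show the broadcast and cooperation phases.

```python
import numpy as np

rng = np.random.default_rng(0)

def gnaf_round(symbol, n_relays=2, snr_db=20.0):
    """One round of a two-phase amplify-and-forward protocol (toy sketch).

    Broadcast phase: the source transmits `symbol`; the relays and the
    destination each receive a noisy copy.  Cooperation phase: the relays
    scale their noisy observations to unit average power and forward them.
    The destination then averages all observations.
    """
    noise_std = 10 ** (-snr_db / 20)
    # Broadcast phase: source -> destination and source -> relays
    at_dest = symbol + noise_std * rng.standard_normal()
    at_relays = symbol + noise_std * rng.standard_normal(n_relays)
    # Cooperation phase: amplify (power normalization) and forward
    gain = 1.0 / np.sqrt(1.0 + noise_std ** 2)
    relayed = gain * at_relays + noise_std * rng.standard_normal(n_relays)
    # Destination combines the direct and relayed observations
    estimate = (at_dest + relayed.sum() / gain) / (1 + n_relays)
    return estimate
```

At high SNR the combined estimate closely tracks the transmitted symbol, which is the intuition behind the cooperative diversity gain the protocol formalizes.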
Abstract:
In many applications of wireless ad hoc networks, wireless nodes are owned by rational and intelligent users. In this paper, we call nodes selfish if they are owned by independent users whose only objective is to maximize their individual gains. In such situations, it may not be possible to use the existing protocols for wireless ad hoc networks, as these protocols assume that nodes follow the prescribed protocol without deviation. Stimulating cooperation among these nodes is an interesting and challenging problem. Providing incentives and pricing the transactions are well-known approaches to stimulate cooperation. In this paper, we present a game theoretic framework for a truthful broadcast protocol and a strategy-proof pricing mechanism called the Immediate Predecessor Node Pricing Mechanism (IPNPM). The phrase strategy-proof here means that truthful revelation of cost is a weakly dominant strategy (in game theoretic terms) for each node. In order to steer our mechanism-design approach towards practical implementation, we compute the payments to nodes using a distributed algorithm. We also propose a new protocol for broadcast in wireless ad hoc networks with selfish nodes based on IPNPM. The features of the proposed broadcast protocol are reliability and a significantly reduced number of packet forwards compared to the number of network nodes, which in turn leads to less system-wide power consumption to broadcast a single packet. Our simulation results show the efficacy of the proposed broadcast protocol.
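The strategy-proofness property above (truthful cost revelation as a weakly dominant strategy) is the same property achieved by classical second-price rules. The sketch below is a generic second-price selection rule, not the paper's IPNPM: the cheapest declared node is chosen to forward and is paid the second-lowest declared cost, so under-declaring risks forwarding at a loss and over-declaring risks losing the payment.

```python
def select_and_pay(declared_costs):
    """Pick the node with the lowest declared cost and pay it the
    second-lowest declared cost (second-price rule).  With this rule,
    truthfully declaring one's real cost is a weakly dominant strategy."""
    order = sorted(range(len(declared_costs)), key=lambda i: declared_costs[i])
    winner = order[0]
    payment = declared_costs[order[1]]
    return winner, payment
```

For declared costs [3, 1, 2], node 1 wins and is paid 2: its payment depends only on the other nodes' declarations, which is what removes the incentive to lie.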
Abstract:
We describe the ongoing design and implementation of a sensor network for agricultural management targeted at resource-poor farmers in India. Our focus on semi-arid regions led us to concentrate on water-related issues. Throughout 2004, we carried out a survey on the information needs of the population living in a cluster of villages in our study area. The results highlighted the potential that environment-related information has for the improvement of farming strategies in the face of highly variable conditions, in particular for risk management strategies (choice of crop varieties, sowing and harvest periods, prevention of pests and diseases, efficient use of irrigation water, etc.). This led us to advocate an original use of Information and Communication Technologies (ICT). We believe our demand-driven approach to the design of appropriate ICT tools targeted at the resource-poor to be relatively new. In order to go beyond a purely technocratic approach, we adopted an iterative, participatory methodology.
Abstract:
Existing protocols for archival systems make use of verifiable shares in conjunction with a proactive secret sharing scheme to achieve high availability and long-term confidentiality, in addition to data integrity. In this paper, we extend an existing protocol (Wong et al. [9]) to handle more realistic situations. For example, the protocol of Wong et al. assumes that the recipients of the secret shares are all trustworthy; we relax this by requiring only that a majority is trustworthy.
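The "majority is trustworthy" relaxation above rests on threshold secret sharing: with Shamir's scheme, any k of n shares reconstruct the secret, so a k-of-n majority suffices even if the remaining holders are unavailable or corrupt. A minimal sketch of Shamir sharing over a prime field (the field modulus and interface here are illustrative choices, not the paper's; it also omits the verifiability and proactive refresh layers the paper builds on):

```python
import random

P = 2_147_483_647  # prime field modulus (an assumed choice)

def split(secret, n, k):
    """Split `secret` into n Shamir shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

With n = 5 and k = 3, any three shares recover the secret, so a trustworthy majority of share holders is enough for availability.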
Abstract:
A variety of ketoxime ethyl carbonates, easily prepared from the oximes and ethyl chloroformate, undergo the Beckmann rearrangement in excellent yields (generally 75-99%) upon treatment with one equivalent of boron trifluoride etherate in dichloromethane solution at room temperature. (C) 2000 Elsevier Science Ltd. All rights reserved.
Abstract:
We consider the slotted ALOHA protocol on a channel with a capture effect. There are M
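As a hedged illustration of slotted ALOHA with capture, here is a minimal Monte Carlo sketch. All modeling choices are assumptions, not from the paper: M users each transmit independently with probability p per slot, a slot succeeds if exactly one user transmits, and on a collision the receiver still captures one packet with a fixed probability.

```python
import random

def slotted_aloha_throughput(m_users, p_transmit, capture_prob=0.0,
                             n_slots=100_000, seed=1):
    """Monte Carlo throughput (successful packets per slot) of slotted
    ALOHA with a simple capture model: a lone transmission always
    succeeds; a collision is captured with probability `capture_prob`."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_slots):
        transmitters = sum(rng.random() < p_transmit for _ in range(m_users))
        if transmitters == 1:
            successes += 1
        elif transmitters > 1 and rng.random() < capture_prob:
            successes += 1
    return successes / n_slots
```

Without capture, the estimate matches the textbook value M * p * (1 - p)^(M-1); any positive capture probability raises the throughput, which is the effect such channels exploit.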
Abstract:
An important issue in the design of a distributed computing system (DCS) is the development of a suitable protocol. This paper presents an effort to systematize the protocol design procedure for a DCS. Protocol design and development can be divided into six phases: specification of the DCS, specification of protocol requirements, protocol design, specification and validation of the designed protocol, performance evaluation, and hardware/software implementation. This paper describes techniques for the second and third phases, while the first phase has been considered by the authors in their earlier work. Matrix-based and set-theoretic approaches are used for the specification of a DCS and for the specification of the protocol requirements. These two formal specification techniques form the basis of a simple and straightforward procedure for the design of the protocol. The applicability of this design procedure is illustrated with the example of a computing system on board a spacecraft. A Petri-net based approach has been adopted to model the protocol. The methodology developed in this paper can be used in other DCS applications.
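Petri-net modeling, as used above for the protocol, reduces to a simple token game: a transition fires when every input place holds a token, consuming one token per input place and producing one per output place. A minimal sketch (the place names below are hypothetical, not from the paper):

```python
def fire(marking, transition):
    """Fire a Petri-net transition if it is enabled.

    `marking` maps each place name to its token count; `transition` is a
    pair (input_places, output_places).  Returns the new marking, or
    None when some input place lacks a token (transition not enabled).
    """
    inputs, outputs = transition
    if any(marking.get(p, 0) < 1 for p in inputs):
        return None  # not enabled
    new = dict(marking)
    for p in inputs:
        new[p] -= 1       # consume one token per input place
    for p in outputs:
        new[p] = new.get(p, 0) + 1  # produce one token per output place
    return new
```

For a protocol step such as "an idle node issues a request", firing (['idle'], ['waiting']) moves the token from `idle` to `waiting`; validation then amounts to exploring which markings are reachable.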
Abstract:
With the objective of better understanding the significance of New Car Assessment Program (NCAP) tests conducted by the National Highway Traffic Safety Administration (NHTSA), head-on collisions between two identical cars of different sizes and between cars and a pickup truck are studied in the present paper using LS-DYNA models. Available finite element models of a compact car (Dodge Neon), midsize car (Dodge Intrepid), and pickup truck (Chevrolet C1500) are first improved and validated by comparing the analysis-based vehicle deceleration pulses against corresponding NCAP crash test histories reported by NHTSA. In confirmation of prevalent perception, simulation-based results indicate that an NCAP test against a rigid barrier is a good representation of a collision between two similar cars approaching each other at a speed of 56.3 km/h (35 mph), both in terms of peak deceleration and intrusions. However, analyses carried out for collisions between two incompatible vehicles, such as an Intrepid or Neon against a C1500, point to the inability of the NCAP tests to represent the substantially higher intrusions in the front upper regions experienced by the cars, although peak decelerations in the cars are comparable to those observed in NCAP tests. In an attempt to improve the capability of a front NCAP test to better represent real-world crashes between incompatible vehicles, i.e., ones with contrasting ride height and lower body stiffness, two modified rigid barriers are studied. One of these barriers, which is of stepped geometry with a curved front face, leads to significantly improved correlation of intrusions in the upper regions of cars with respect to those yielded in the simulation of collisions between incompatible vehicles, while yielding vehicle peak decelerations similar to those obtained in NCAP tests.