107 results for Sense and signification


Relevance: 30.00%

Abstract:

We show how to build full-diversity product codes under both iterative encoding and decoding over non-ergodic channels, in the presence of block erasure and block fading. The concept of a rootcheck, or root subcode, is introduced by generalizing the principle recently invented for low-density parity-check codes. We also describe some channel-related graphical properties of the new family of product codes, a family referred to as root product codes.
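The row/column iteration behind product-code decoding can be sketched with the simplest member of the family: a single-parity-check product code over an erasure channel. This is a toy stand-in for illustration only, not the root product construction of the abstract; all names and parameters are illustrative.

```python
import numpy as np

def encode_spc_product(data):
    """Encode a k x k data block into a (k+1) x (k+1) single-parity-check product code."""
    k = data.shape[0]
    code = np.zeros((k + 1, k + 1), dtype=int)
    code[:k, :k] = data
    code[:k, k] = data.sum(axis=1) % 2        # row parity bits
    code[k, :] = code[:k, :].sum(axis=0) % 2  # column parities (incl. parity-on-parity)
    return code

def decode_erasures(received):
    """Iteratively fill erasures (marked -1): any row or column with a single
    erasure is solvable from its parity; alternate passes until nothing changes."""
    c = received.copy()
    changed = True
    while changed and (c == -1).any():
        changed = False
        for axis in (0, 1):
            view = c if axis == 1 else c.T    # rows, then columns
            for row in view:
                idx = np.where(row == -1)[0]
                if len(idx) == 1:             # single erasure: even parity fixes it
                    row[idx[0]] = row[row != -1].sum() % 2
                    changed = True
    return c

rng = np.random.default_rng(0)
data = rng.integers(0, 2, (4, 4))
cw = encode_spc_product(data)
rx = cw.copy()
rx[0, 0] = rx[0, 1] = rx[2, 3] = -1           # erase a few symbols
print(np.array_equal(decode_erasures(rx), cw))  # → True
```

Note that row 0 holds two erasures and is unsolvable on its own; the column pass resolves them one at a time, which is the iterative gain the product structure provides.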

Relevance: 30.00%

Abstract:

The spectral efficiency achievable with joint processing of pilot and data symbol observations is compared with that achievable through the conventional (separate) approach of first estimating the channel on the basis of the pilot symbols alone, and subsequently detecting the data symbols. Studied on the basis of a mutual information lower bound, joint processing is found to provide a non-negligible advantage relative to separate processing, particularly for fast fading. It is shown that, regardless of the fading rate, only a very small number of pilot symbols (at most one per transmit antenna and per channel coherence interval) should be transmitted if joint processing is allowed.
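The conventional separate approach can be sketched as a least-squares channel estimate computed from the pilots alone. This is a minimal scalar flat-fading illustration; the pilot pattern, SNR, and seed are assumptions for the sketch, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pilots, snr = 4, 100.0                     # pilots per coherence block, linear SNR
h = rng.normal() + 1j * rng.normal()         # flat-fading channel coefficient
pilots = np.ones(n_pilots)                   # known unit-energy pilot symbols
noise = (rng.normal(size=n_pilots)
         + 1j * rng.normal(size=n_pilots)) / np.sqrt(2 * snr)
y = h * pilots + noise                       # pilot-phase observations

# Least-squares estimate: project the observations onto the known pilots.
h_hat = (pilots.conj() @ y) / (pilots.conj() @ pilots)
print(abs(h_hat - h))                        # small; shrinks as n_pilots * snr grows
```

Joint processing, by contrast, would also exploit the data-phase observations when inferring `h`, which is where the advantage quantified in the abstract comes from.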

Relevance: 30.00%

Abstract:

For single-user MIMO communication with uncoded and coded QAM signals, we propose bit and power loading schemes that rely only on channel distribution information at the transmitter. To that end, we develop the relationship between the average bit error probability at the output of a ZF linear receiver and the bit rates and powers allocated at the transmitter. This relationship, and the fact that a ZF receiver decouples the MIMO parallel channels, allow leveraging bit loading algorithms already existing in the literature. We solve dual bit rate maximization and power minimization problems and present performance results that illustrate the gains of the proposed scheme with respect to a non-optimized transmission.
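The decoupling property the abstract leans on is standard: after zero-forcing equalization of y = Hx + n, stream k sees an effective SNR of P_k / (σ² [(HᴴH)⁻¹]_kk), and these per-stream SNRs are what a bit-loading algorithm consumes. A minimal sketch (antenna counts and noise level are arbitrary choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
nt, nr, sigma2 = 2, 4, 0.1                  # tx/rx antennas, noise variance
H = (rng.normal(size=(nr, nt))
     + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)

# ZF inverts the channel; stream k's noise is amplified by [(H^H H)^{-1}]_kk.
G = np.linalg.inv(H.conj().T @ H)
post_zf_snr = 1.0 / (sigma2 * np.real(np.diag(G)))  # unit power per stream
print(post_zf_snr)  # per-stream SNRs fed to the bit/power loading stage
```

With these decoupled SNRs in hand, any single-carrier bit-loading routine from the literature can be applied stream by stream, which is the leverage the abstract describes.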

Relevance: 30.00%

Abstract:

The optimization of the pilot overhead in wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depends on the square root of the normalized Doppler frequency. It is also shown that the widely used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
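The square-root law can be illustrated numerically: if the optimized overhead behaves as c·√(f_D), with f_D the normalized Doppler frequency, then quadrupling the Doppler doubles the overhead. The constant c below is hypothetical, not a value from the paper.

```python
import math

c = 0.4  # hypothetical proportionality constant from the order expansion

def pilot_overhead(doppler):
    """Fraction of symbols spent on pilots under the sqrt(Doppler) law."""
    return c * math.sqrt(doppler)

ratio = pilot_overhead(0.04) / pilot_overhead(0.01)
print(ratio)  # ≈ 2.0: a 4x faster channel costs only 2x the overhead
```

The same scaling explains the multiantenna result quoted above: multiplying f_D by the number of transmit antennas N_t multiplies the overhead by √N_t.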

Relevance: 30.00%

Abstract:

This paper presents our investigation of the iterative decoding performance of some sparse-graph codes on block-fading Rayleigh channels. The considered code ensembles are standard LDPC codes and Root-LDPC codes, previously proposed and shown to attain full transmission diversity. We study the iterative threshold performance of these codes as a function of the fading gains of the transmission channel and propose a numerical approximation of the iterative threshold versus fading gains, for both LDPC and Root-LDPC codes. Also, we show analytically that, in the case of 2 fading blocks, the iterative threshold of Root-LDPC codes is proportional to (α1 α2)^(-1), where α1 and α2 are the corresponding fading gains. From this result, the full diversity property of Root-LDPC codes immediately follows.

Relevance: 30.00%

Abstract:

The purpose of this paper is to examine (1) some of the models commonly used to represent fading, and (2) the information-theoretic metrics most commonly used to evaluate performance over those models. We raise the question of whether these models and metrics remain adequate in light of the advances that wireless systems have undergone over the last two decades. Weaknesses are pointed out, and ideas on possible fixes are put forth.

Relevance: 30.00%

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by just incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
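The ILS loop described above, a perturbation and an acceptance criterion wrapped around a makespan evaluation, can be sketched in a few lines. This is a stripped-down illustration with a plain swap perturbation and no embedded local search or biased-randomized start, so it is emphatically not the ILS-ESP algorithm itself; the instance data are made up.

```python
import random

def makespan(seq, proc):
    """Completion time of the last job on the last machine for a permutation."""
    m = len(proc[0])
    c = [0.0] * m
    for job in seq:
        c[0] += proc[job][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + proc[job][k]
    return c[-1]

def ils(proc, iters=200, seed=42):
    rng = random.Random(seed)
    n = len(proc)
    best = list(range(n))
    rng.shuffle(best)                       # random starting permutation
    best_cost = makespan(best, proc)
    cur, cur_cost = best[:], best_cost
    for _ in range(iters):
        cand = cur[:]                       # perturbation: swap two random jobs
        i, j = rng.sample(range(n), 2)
        cand[i], cand[j] = cand[j], cand[i]
        cost = makespan(cand, proc)
        if cost <= cur_cost:                # acceptance: keep non-worsening moves
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
    return best, best_cost

proc = [[3, 2], [1, 4], [2, 2], [4, 1]]     # 4 jobs x 2 machines (toy instance)
seq, cost = ils(proc)
print(seq, cost)
```

ILS-ESP's contribution, per the abstract, is precisely in making the perturbation, acceptance, and initialization stages parameter-free; the skeleton of the loop is unchanged.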

Relevance: 30.00%

Abstract:

We analyze a standard environment of adverse selection in credit markets. In our environment, entrepreneurs who are privately informed about the quality of their projects need to borrow in order to invest. Conventional wisdom says that, in this class of economies, the competitive equilibrium is typically inefficient. We show that this conventional wisdom rests on one implicit assumption: entrepreneurs can only access monitored lending. If a new set of markets is added to provide entrepreneurs with additional funds, efficiency can be attained in equilibrium. An important characteristic of these additional markets is that lending in them must be unmonitored, in the sense that it does not condition total borrowing or investment by entrepreneurs. This makes it possible to attain efficiency by pooling all entrepreneurs in the new markets while separating them in the markets for monitored loans.

Relevance: 30.00%

Abstract:

Excess entry refers to the high failure rate of new entrepreneurial ventures. Economic explanations suggest 'hit and run' entrants and risk-seeking behavior. A psychological explanation is that people (entrepreneurs) are overconfident in their abilities (Camerer & Lovallo, 1999). Characterizing entry decisions as ambiguous gambles, we alternatively suggest following Heath and Tversky (1991) that people seek ambiguity when the source of uncertainty is related to their competence. Overconfidence, as such, plays no role. This hypothesis is confirmed in an experimental study that also documents the phenomenon of reference group neglect. Finally, we emphasize the utility that people gain from engaging in activities that contribute to a sense of competence. This is an important force in economic activity that deserves more explicit attention.

Relevance: 30.00%

Abstract:

In this paper we propose the infimum of the Arrow-Pratt index of absolute risk aversion as a measure of global risk aversion of a utility function. We then show that, for any given arbitrary pair of distributions, there exists a threshold level of global risk aversion such that all increasing concave utility functions with at least as much global risk aversion would rank the two distributions in the same way. Furthermore, this threshold level is sharp in the sense that, for any lower level of global risk aversion, we can find two utility functions in this class yielding opposite preference relations for the two distributions.
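The Arrow-Pratt index A(x) = -u''(x)/u'(x) and its infimum can be checked numerically. For CARA utility the index is constant, while for log utility A(x) = 1/x, so the true infimum over (0, ∞) is 0 and any grid minimum merely reflects the truncation point. The utilities and grid below are illustrative choices, not taken from the paper.

```python
import numpy as np

def arrow_pratt(u, x, h=1e-5):
    """Absolute risk aversion A(x) = -u''(x)/u'(x) via central differences."""
    u1 = (u(x + h) - u(x - h)) / (2 * h)
    u2 = (u(x + h) - 2 * u(x) + u(x - h)) / h ** 2
    return -u2 / u1

grid = np.linspace(0.5, 10, 500)

cara = lambda x: -np.exp(-2.0 * x)   # CARA with coefficient a = 2: A(x) = 2
crra = lambda x: np.log(x)           # log utility: A(x) = 1/x

inf_cara = min(arrow_pratt(cara, x) for x in grid)
inf_crra = min(arrow_pratt(crra, x) for x in grid)
print(round(inf_cara, 3), round(inf_crra, 3))  # ≈ 2.0 and ≈ 0.1 (= 1/grid-max)
```

Under the paper's measure, the CARA function here is globally more risk averse than the log function, whose global risk aversion is zero.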

Relevance: 30.00%

Abstract:

Given the current economic environment, high-growth companies are particularly relevant for their contribution to employment generation and wealth. This paper discusses the results of a survey conducted to gain a deeper understanding of high-growth cooperatives through analyzing their financial profiles and then identifying key contributing factors to their growth. To do this, we compared this particular sample with other cooperatives and with other high-growth mercantile companies. The results show the main drivers of high-growth companies' success: according to the surveyed group, these are competitive advantages based on modern management techniques, quality and productivity, innovation, and internationalization. Additionally, we have observed some financial strengths and weaknesses; in this sense, they are undercapitalized companies with unbalanced growth.

Relevance: 30.00%

Abstract:

We analyze a standard environment of adverse selection in credit markets. In our environment, entrepreneurs who are privately informed about the quality of their projects need to borrow from banks. Conventional wisdom says that, in this class of economies, the competitive equilibrium is typically inefficient. We show that this conventional wisdom rests on one implicit assumption: entrepreneurs can only borrow from banks. If an additional market is added to provide entrepreneurs with additional funds, efficiency can be attained in equilibrium. An important characteristic of this additional market is that it must be non-exclusive, in the sense that entrepreneurs must be able to simultaneously borrow from many different lenders operating in it. This makes it possible to attain efficiency by pooling all entrepreneurs in the new market while separating them in the market for bank loans.



Relevance: 30.00%

Abstract:

This paper is concerned with the realism of mechanisms that implement social choice functions in the traditional sense. Will agents actually play the equilibrium assumed by the analysis? As an example, we study the convergence and stability properties of Sjöström's (1994) mechanism, on the assumption that boundedly rational players find their way to equilibrium using monotonic learning dynamics and also with fictitious play. This mechanism implements most social choice functions in economic environments using as a solution concept the iterated elimination of weakly dominated strategies (only one round of deletion of weakly dominated strategies is needed). There are, however, many sets of Nash equilibria whose payoffs may be very different from those desired by the social choice function. With monotonic dynamics we show that many equilibria in all the sets of equilibria we describe are the limit points of trajectories that have completely mixed initial conditions. The initial conditions that lead to these equilibria need not be very close to the limiting point. Furthermore, even if the dynamics converge to the "right" set of equilibria, they can still converge to quite a poor outcome in welfare terms. With fictitious play, if the agents have completely mixed prior beliefs, beliefs and play converge to the outcome the planner wants to implement.
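Fictitious play itself, the second dynamic the abstract considers, is easy to sketch in isolation: each player best-responds to the empirical mixture of the opponent's past play. In the toy symmetric 2x2 coordination game below (payoffs and prior counts are arbitrary choices, unrelated to Sjöström's mechanism), beliefs and play quickly concentrate on a pure equilibrium:

```python
import numpy as np

# Row player's payoffs in a symmetric 2x2 coordination game.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])

# Prior beliefs encoded as fictitious action counts for each player.
counts = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

for _ in range(500):
    for i in (0, 1):
        belief = counts[1 - i] / counts[1 - i].sum()  # opponent's empirical mix
        br = int(np.argmax(A @ belief))               # best response to belief
        counts[i][br] += 1                            # record the action played

freqs = [c / c.sum() for c in counts]
print(freqs[0].round(2), freqs[1].round(2))  # both concentrate on action 0
```

In this game the empirical frequencies converge to the payoff-dominant equilibrium; the abstract's point is that the analogous belief convergence holds for the mechanism's much larger game when priors are completely mixed.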