871 results for Optimal Redundancy


Relevance:

20.00%

Publisher:

Abstract:

In the context of fading channels it is well established that, with a constrained transmit power, the bit rates achievable by signals that are not peaky vanish as the bandwidth grows without bound. Stepping back from the limit, we characterize the highest bit rate achievable by such non-peaky signals and the approximate bandwidth where that apex occurs. As it turns out, the gap between the highest rate achievable without peakedness and the infinite-bandwidth capacity (with unconstrained peakedness) is small for virtually all settings of interest to wireless communications. Thus, although strictly achieving capacity in wideband fading channels does require signal peakedness, bit rates not far from capacity can be achieved with conventional signaling formats that do not exhibit the serious practical drawbacks associated with peakedness. In addition, we show that the asymptotic decay of bit rate in the absence of peakedness usually takes hold at bandwidths so large that wideband fading models are called into question. Rather, ultrawideband models ought to be used.
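
The vanishing-rate effect and the finite-bandwidth peak can be illustrated numerically. The sketch below is not the paper's characterization; it evaluates the standard channel-uncertainty lower bound on the rate of i.i.d. Gaussian (non-peaky) signalling over a Rayleigh block-fading channel, with purely illustrative values for P/N0 and for the number of symbols per coherence block.

```python
# Toy numeric sketch (not the paper's exact bounds): rate achievable by i.i.d.
# Gaussian ("non-peaky") signalling on a Rayleigh block-fading channel, using the
# standard channel-uncertainty lower bound
#   R(B) >= B * ( E[log2(1 + rho*|h|^2)] - (1/Nc) * log2(1 + Nc*rho) ),
# where rho = P/(N0*B) is the per-symbol SNR and Nc is the number of symbols per
# independent fading block.  All parameter values below are illustrative.
import numpy as np

P_over_N0 = 1e6      # received SNR per unit bandwidth, P/N0 (in Hz); assumed value
Nc = 200             # symbols per coherence block (~ coherence time x coherence bandwidth); assumed
rng = np.random.default_rng(0)
h2 = rng.exponential(size=100_000)          # |h|^2 for Rayleigh fading, E[|h|^2] = 1

bandwidths = np.logspace(4, 9, 200)         # 10 kHz ... 1 GHz
rates = []
for B in bandwidths:
    rho = P_over_N0 / B                     # per-symbol SNR shrinks as bandwidth grows
    coherent = np.mean(np.log2(1.0 + rho * h2))    # rate with perfect channel knowledge
    penalty = np.log2(1.0 + Nc * rho) / Nc         # cost of the unknown channel per symbol
    rates.append(B * max(coherent - penalty, 0.0)) # bit/s achievable without peakedness
rates = np.array(rates)

B_star = bandwidths[rates.argmax()]
print(f"peak non-peaky rate: {rates.max():.3e} bit/s at B ~ {B_star:.3e} Hz")
print(f"infinite-bandwidth capacity P/N0*log2(e): {P_over_N0 * np.log2(np.e):.3e} bit/s")
```

Under these toy numbers the non-peaky rate peaks at a finite bandwidth and then decays toward zero, while remaining within a modest gap of the infinite-bandwidth capacity, which is the qualitative behaviour the abstract describes.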

Relevance:

20.00%

Publisher:

Abstract:

We study optimal public rationing of an indivisible good and private sector price responses. Consumers differ in their wealth and costs of provision. Due to a limited budget, some consumers must be rationed. Public rationing determines the characteristics of consumers who seek supply from the private sector, where a firm sets prices based on consumers' cost information and in response to the rationing rule. We consider two information regimes. In the first, the public supplier rations consumers according to their wealth information. In equilibrium, the public supplier must ration both rich and poor consumers. Supplying all poor consumers would leave only rich consumers in the private market, and the firm would react by setting a high price. Rationing some poor consumers is optimal and induces a price reduction in the private market. In the second information regime, the public supplier rations consumers according to their wealth and cost information. In equilibrium, consumers are allocated the good if and only if their costs are below a threshold. Wealth information is not used. Rationing based on cost results in higher equilibrium total consumer surplus than rationing based on wealth. [Authors]
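
The pricing logic can be made concrete with a deliberately stylized example; the single-price monopolist and all numbers below are my own simplifications, not the paper's model. Leaving some rationed poor consumers in the private market changes the demand the firm faces and can pull its profit-maximizing price down for everyone.

```python
# Stylized sketch of the abstract's pricing intuition (all numbers invented): a
# monopolist serves whoever the public supplier rations out, and leaving some poor
# consumers in the private market can pull its price down.
def monopoly_price(buyers, unit_cost=1.0):
    """Pick the profit-maximizing price among the buyers' willingness-to-pay levels."""
    prices = sorted({wtp for wtp, _ in buyers}, reverse=True)
    return max(prices, key=lambda p: (p - unit_cost) * sum(m for wtp, m in buyers if wtp >= p))

rich = (10.0, 1.0)              # (willingness to pay, mass)

# Regime 1: the public supplier serves every poor consumer -> only the rich buy privately.
print(monopoly_price([rich]))                    # -> 10.0, a high price

# Regime 2: some poor consumers are rationed and must turn to the private market.
rationed_poor = (4.0, 2.5)
print(monopoly_price([rich, rationed_poor]))     # -> 4.0, the price falls for everyone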

Relevance:

20.00%

Publisher:

Abstract:

The paper proposes an approach aimed at detecting optimal model parameter combinations to achieve the most representative description of uncertainty in the model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation is to be computed for a particular generated model. SVM is particularly well suited to classification problems in high-dimensional spaces, which it tackles in a non-parametric and non-linear way. SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost function classification and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features a highly variable response due to the combination of parameter values.
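
A minimal sketch of this workflow, assuming a scikit-learn SVM and a toy quadratic cost in place of the actual forward-model misfit: label sampled parameter vectors as good- or poor-fitting, train the classifier, and use the distance to its decision boundary to choose the next parameter combinations to simulate.

```python
# Minimal sketch of the idea (not the authors' code): label sampled parameter
# vectors by thresholding a cost function, train an SVM classifier on those labels,
# and use the decision boundary to pick the next parameter combinations worth
# running the expensive forward model on.  The quadratic toy cost is illustrative.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def cost(theta):                      # stand-in for a forward-model misfit
    return np.sum((theta - 0.3) ** 2, axis=1)

theta = rng.uniform(0, 1, size=(200, 5))                              # samples in a 5-D parameter space
labels = (cost(theta) < np.quantile(cost(theta), 0.2)).astype(int)    # 1 = good-fitting region

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(theta, labels)

# Candidate points closest to the decision boundary are the most uncertain ones;
# they are natural candidates for the next batch of forward simulations.
candidates = rng.uniform(0, 1, size=(5000, 5))
uncertainty = np.abs(clf.decision_function(candidates))
next_batch = candidates[np.argsort(uncertainty)[:20]]
print(next_batch.shape)                            # (20, 5)
```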

Relevance:

20.00%

Publisher:

Abstract:

It has long been standard in agency theory to search for incentive-compatible mechanisms on the assumption that people care only about their own material wealth. However, this assumption is clearly refuted by numerous experiments, and we feel that it may be useful to consider nonpecuniary utility in mechanism design and contract theory. Accordingly, we devise an experiment to explore optimal contracts in an adverse-selection context. A principal proposes one of three contract menus, each of which offers a choice of two incentive-compatible contracts, to two agents whose types are unknown to the principal. The agents know the set of possible menus, and choose either to accept one of the two contracts offered in the proposed menu or to reject the menu altogether; a rejection by either agent leads to lower (and equal) reservation payoffs for all parties. While all three possible menus favor the principal, they do so to varying degrees. We observe numerous rejections of the more lopsided menus, and behavior approaches an equilibrium where one of the more equitable contract menus (which one depends on the reservation payoffs) is proposed and agents accept a contract, selecting actions according to their types. Behavior is largely consistent with all recent models of social preferences, strongly suggesting there is value in considering nonpecuniary utility in agency theory.
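
For readers unfamiliar with the terminology, the following toy check (with invented transfers, outputs, and cost types, not the experiment's parameters) spells out what it means for a two-contract menu to be incentive compatible: each agent type must weakly prefer its designated contract to the other contract and to the reservation payoff.

```python
# Toy incentive-compatibility check for a two-type adverse-selection menu
# (numbers invented).  A contract is (transfer t paid to the agent, required
# output q); an agent whose private cost per unit of output is c gets t - c*q.
menu = {"low_cost": (6.0, 4.0), "high_cost": (4.0, 2.0)}   # type -> (t, q), hypothetical
costs = {"low_cost": 1.0, "high_cost": 1.8}
reservation = 0.0

def utility(contract, c):
    t, q = contract
    return t - c * q

for own_type, c in costs.items():
    own = utility(menu[own_type], c)
    best_other = max(utility(k, c) for name, k in menu.items() if name != own_type)
    assert own >= reservation, f"{own_type}: participation violated"
    assert own >= best_other,  f"{own_type}: would rather mimic the other type"
    print(own_type, "truthfully picks its own contract, payoff", own)
```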

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we examine the design of permit trading programs when the objective is to minimize the cost of achieving an ex ante pollution target, that is, one defined in expectation rather than as an ex post deterministic value. We consider two potential sources of uncertainty, the presence of either of which can make our model appropriate: incomplete information on abatement costs and uncertain delivery coefficients. In such a setting, we find three distinct features that depart from the well-established results on permit trading: (1) the regulator’s information on firms’ abatement costs can matter; (2) the optimal permit cap is not necessarily equal to the ex ante pollution target; and (3) the optimal trading ratio is not necessarily equal to the delivery coefficient, even when the latter is known with certainty. Intuitively, since the regulator is only required to meet a pollution target on average, she can set the trading ratio and total permit cap such that there will be more pollution when abatement costs are high and less pollution when abatement costs are low. Information on firms’ abatement costs is important for the regulator to induce the optimal alignment between the pollution level and abatement costs.
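
The intuition in the last two sentences can be reproduced in a stripped-down example (quadratic abatement costs and made-up numbers, not the paper's model): minimizing expected abatement cost subject only to an expected-pollution constraint yields less abatement, hence more pollution, in the high-cost state.

```python
# Stripped-down illustration of the ex-ante-target logic (toy numbers, not the
# paper's model): the regulator only needs expected pollution to hit the target,
# so the cost-minimizing plan abates less (pollutes more) in the high-cost state.
import numpy as np
from scipy.optimize import minimize

p = np.array([0.5, 0.5])          # probability of each abatement-cost state
theta = np.array([1.0, 4.0])      # marginal-cost slope: low-cost vs high-cost state
e0, target = 10.0, 6.0            # baseline emissions and ex ante pollution target

def expected_cost(a):             # quadratic abatement cost theta/2 * a^2 in each state
    return np.sum(p * theta * a ** 2 / 2)

cons = {"type": "eq", "fun": lambda a: np.sum(p * (e0 - a)) - target}   # E[pollution] = target
res = minimize(expected_cost, x0=np.array([2.0, 2.0]), constraints=[cons])

print("abatement by state :", res.x)                       # more abatement in the cheap state
print("pollution by state :", e0 - res.x)                  # more pollution when abatement is costly
print("expected pollution :", np.sum(p * (e0 - res.x)))    # equals the target of 6
```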

Relevance:

20.00%

Publisher:

Abstract:

Abiotic factors such as climate and soil determine the species' fundamental niche, which is further constrained by biotic interactions such as interspecific competition. To parameterize this realized niche, species distribution models (SDMs) most often relate species occurrence data to abiotic variables, but few SDM studies include biotic predictors to help explain species distributions. Therefore, most predictions of species distributions under future climates assume implicitly that biotic interactions remain constant or exert only minor influence on large-scale spatial distributions, which is also largely expected for species with high competitive ability. We examined the extent to which variance explained by SDMs can be attributed to abiotic or biotic predictors and how this depends on species traits. We fit generalized linear models for 11 common tree species in Switzerland using three different sets of predictor variables: biotic, abiotic, and the combination of both sets. We used variance partitioning to estimate the proportion of the variance explained by biotic and abiotic predictors, jointly and independently. Inclusion of biotic predictors improved the SDMs substantially. The joint contribution of biotic and abiotic predictors to explained deviance was relatively small (~9%) compared to the contribution of each predictor set individually (~20% each), indicating that the additional information on the realized niche brought by adding other species as predictors was largely independent of the abiotic (topo-climatic) predictors. The influence of biotic predictors was relatively high for species that preferentially grow under low disturbance and low abiotic stress, species with long seed dispersal distances, species with high shade tolerance as juveniles and adults, and species that occur frequently and are dominant across the landscape. The influence of biotic variables on SDM performance indicates that community composition and other local biotic factors or abiotic processes not included in the abiotic predictors strongly influence predictions of species distributions. Improved prediction of species' potential distributions in future climates and communities may assist strategies for sustainable forest management.
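
A schematic version of the variance-partitioning step, using placeholder predictors and simulated presence/absence data rather than the Swiss forest data: fit three logistic GLMs per species (abiotic only, biotic only, both) and split the explained deviance into independent and joint contributions.

```python
# Schematic variance partitioning for one species (column names and data are
# placeholders): fit three binomial GLMs -- abiotic predictors only, biotic
# predictors only, and both -- and decompose the explained deviance.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def explained_deviance(y, X):
    fit = sm.GLM(y, sm.add_constant(X), family=sm.families.Binomial()).fit()
    return 1.0 - fit.deviance / fit.null_deviance

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "temp": rng.normal(size=n), "slope": rng.normal(size=n),               # abiotic (topo-climatic)
    "fagus_cover": rng.normal(size=n), "picea_cover": rng.normal(size=n),  # biotic (other species)
})
logit = 1.2 * df["temp"] - 0.8 * df["fagus_cover"] + 0.5 * df["picea_cover"]
df["presence"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))                 # simulated occurrences

d_abiotic = explained_deviance(df["presence"], df[["temp", "slope"]])
d_biotic  = explained_deviance(df["presence"], df[["fagus_cover", "picea_cover"]])
d_both    = explained_deviance(df["presence"], df[["temp", "slope", "fagus_cover", "picea_cover"]])

print("independent abiotic :", d_both - d_biotic)
print("independent biotic  :", d_both - d_abiotic)
print("joint contribution  :", d_abiotic + d_biotic - d_both)
```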

Relevance:

20.00%

Publisher:

Abstract:

Health literacy is defined as "the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions." Low health literacy mainly affects certain at-risk populations, limiting their access to care, their interactions with caregivers, and their self-management. Although screening tests exist, their routine use is not advisable; the interventions recommended in practice consist instead of reducing barriers to patient-caregiver communication. It is thus important to consider not only the population's health literacy but also the communication skills of a health system that tends to become ever more complex.

Relevance:

20.00%

Publisher:

Abstract:

A critical feature of cooperative animal societies is the reproductive skew, a shorthand term for the degree to which a dominant individual monopolizes overall reproduction in the group. Our theoretical analysis of the evolutionarily stable skew in matrifilial (i.e., mother-daughter) societies, in which relatednesses to offspring are asymmetrical, predicts that reproductive skews in such societies should tend to be greater than those of semisocial societies (i.e., societies composed of individuals of the same generation, such as siblings), in which relatednesses to offspring are symmetrical. Quantitative data on reproductive skews in semisocial and matrifilial associations within the same species for 17 eusocial Hymenoptera support this prediction. Likewise, a survey of reproductive partitioning within 20 vertebrate societies demonstrates that complete reproductive monopoly is more likely to occur in matrifilial than in semisocial societies, also as predicted by the optimal skew model.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: Surface magnetic resonance imaging (MRI) for aortic plaque assessment is limited by the trade-off between penetration depth and signal-to-noise ratio (SNR). For imaging the deep-seated aorta, a combined surface and transesophageal MRI (TEMRI) technique was developed 1) to determine the individual contributions of the TEMRI and surface coils to the combined signal, 2) to measure the signal improvement of combined surface and TEMRI over surface MRI alone, and 3) to assess the reproducibility of plaque dimension analysis. METHODS AND RESULTS: In 24 patients, six black-blood proton-density/T2-weighted fast spin-echo images were obtained using three surface coils and one TEMRI coil for SNR measurements. Reproducibility of plaque dimensions (combined surface and TEMRI) was measured in 10 patients. TEMRI contributed 68% of the signal in the aortic arch and descending aorta, whereas the overall signal gain using the combined technique was up to 225%. Plaque volume measurements had an intraclass correlation coefficient as high as 0.97. CONCLUSION: Plaque volume measurements for the quantification of aortic plaque size are highly reproducible for combined surface and TEMRI. The TEMRI coil contributes considerably to the aortic MR signal. The combined surface and TEMRI approach improves the aortic signal significantly as compared to surface coils alone. CONDENSED ABSTRACT: Conventional MRI aortic plaque visualization is limited by the penetration depth of MRI surface coils and may lead to suboptimal image quality with insufficient reproducibility. By combining a transesophageal MRI (TEMRI) coil with surface MRI coils, we enhanced local and overall image SNR for improved image quality and reproducibility.

Relevance:

20.00%

Publisher:

Abstract:

An incentives-based theory of policing is developed which can explain the phenomenon of random “crackdowns,” i.e., intermittent periods of high interdiction/surveillance. For a variety of police objective functions, random crackdowns can be part of the optimal monitoring strategy. We demonstrate support for the implications of the crackdown theory using traffic data gathered by the Belgian Police Department and use the model to estimate the deterrence effect of additional resources spent on speeding interdiction.
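
The logic behind randomized crackdowns can be conveyed with a textbook inspection game; the payoff structure and numbers below are illustrative and are not the paper's estimated model. With costly monitoring, neither always cracking down nor never cracking down is stable, so the equilibrium mixes.

```python
# Textbook inspection-game sketch (not the paper's model) of why randomized
# crackdowns arise: with monitoring costly, the only equilibrium of the
# simultaneous-move game is in mixed strategies.  All payoffs are invented.
g = 5.0    # driver's gain from speeding when not caught
f = 20.0   # fine if caught during a crackdown
c = 2.0    # police cost of running a crackdown
h = 10.0   # harm of an undetected violation, borne by the authority

p_crackdown = g / (g + f)   # makes the driver indifferent between speeding and not
q_speed     = c / h         # makes the authority indifferent between cracking down and not

print(f"crackdown probability: {p_crackdown:.2f}")   # intermittent, not constant, enforcement
print(f"violation rate       : {q_speed:.2f}")
```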

Relevance:

20.00%

Publisher:

Abstract:

This paper studies monetary and fiscal policy interactions in a two-country model where taxes on firms' sales are chosen optimally and monetary policy is set cooperatively. It turns out that in a two-country setting, non-cooperative fiscal policy makers have an incentive to change taxes on sales in response to shock realizations in order to reduce output. Therefore, whether fiscal policy is set cooperatively or not matters for optimal monetary policy decisions. Indeed, as already shown in the literature, the cooperative monetary policy maker implements the flexible-price allocation only when special conditions on the value of the distortions underlying the economy are met. However, if non-cooperative fiscal policy makers set taxes on firms' sales in response to shock realizations, these conditions cannot be satisfied; conversely, when fiscal policy is cooperative, these conditions are fulfilled. We conclude that whether implementing the flexible-price allocation is optimal depends on the fiscal policy regime.

Relevance:

20.00%

Publisher:

Abstract:

The achievable region approach seeks solutions to stochastic optimisation problems by: (i) characterising the space of all possible performances (the achievable region) of the system of interest, and (ii) optimising the overall system-wide performance objective over this space. This is radically different from conventional formulations based on dynamic programming. The approach is explained with reference to a simple two-class queueing system. Powerful new methodologies due to the authors and co-workers are deployed to analyse a general multiclass queueing system with parallel servers and then to develop an approach to optimal load distribution across a network of interconnected stations. Finally, the approach is used for the first time to analyse a class of intensity control problems.
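
A small concrete instance of the approach for the two-class case, using the standard M/M/1 non-preemptive priority (Cobham) formulas with illustrative parameters: mean waiting times attainable by work-conserving policies satisfy a conservation law, the two strict-priority rules are the extreme points, and minimizing a linear holding cost over the region lands on the vertex picked by the classical c-mu rule.

```python
# Two-class M/M/1 illustration of the achievable-region idea (illustrative numbers).
# Under any non-preemptive work-conserving policy, sum_i rho_i * W_i is invariant,
# so the achievable waiting-time vectors lie on a segment whose endpoints are the
# two strict-priority policies; a linear objective is minimised at a vertex.
lam = (0.3, 0.5)          # arrival rates of classes 0 and 1
mu  = (1.0, 2.0)          # exponential service rates
c   = (4.0, 1.0)          # holding cost per waiting job and unit time

rho = [l / m for l, m in zip(lam, mu)]
W0 = sum(l * 2.0 / m ** 2 for l, m in zip(lam, mu)) / 2.0   # mean residual work

def mean_waits(order):
    """Cobham's formula: mean waiting time per class under a strict priority order."""
    W, served = {}, 0.0
    for k in order:
        W[k] = W0 / ((1.0 - served) * (1.0 - served - rho[k]))
        served += rho[k]
    return [W[0], W[1]]

for order in [(0, 1), (1, 0)]:
    W = mean_waits(order)
    cost = sum(ci * li * wi for ci, li, wi in zip(c, lam, W))
    conserve = sum(ri * wi for ri, wi in zip(rho, W))        # invariant across policies
    print(order, [round(w, 3) for w in W], "cost", round(cost, 3), "rho*W", round(conserve, 3))

# The c-mu rule gives priority to the class with the larger c_i * mu_i; here class 0.
print("c*mu:", [ci * mi for ci, mi in zip(c, mu)])
```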

Relevance:

20.00%

Publisher:

Abstract:

Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex in the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999), "Restless bandits, partial conservation laws, and indexability," forthcoming in Advances in Applied Probability, Vol. 33, No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL), which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
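
For reference, here is the classical rule that the paper's dynamic indices extend, sketched with made-up data; the extended indices for convex holding costs are not reproduced here. With linear holding cost rates c_i and expected processing times p_i, Smith's rule serves jobs in nonincreasing order of c_i/p_i.

```python
# Classical Smith's rule (the linear-cost baseline extended by the paper): schedule
# jobs in nonincreasing order of c_i / p_i to minimise total weighted completion
# time on a single machine.  Data below are made up for illustration.
jobs = [                      # (name, expected processing time p, holding cost rate c)
    ("A", 3.0, 6.0),
    ("B", 5.0, 5.0),
    ("C", 2.0, 2.0),
]

schedule = sorted(jobs, key=lambda j: j[2] / j[1], reverse=True)   # Smith's index c/p

t, total_cost = 0.0, 0.0
for name, p, c in schedule:
    t += p                              # completion time of this job
    total_cost += c * t                 # weighted completion-time contribution
print([j[0] for j in schedule], "weighted completion time:", total_cost)
```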

Relevance:

20.00%

Publisher:

Abstract:

To understand whether retailers should consider consumer returns when merchandising, we study how the optimal assortment of a price-taking retailer is influenced by its return policy. The retailer selects its assortment from an exogenous set of horizontally differentiated products. Consumers make purchase and keep/return decisions in nested multinomial logit fashion. Our main finding is that the optimal assortment has a counterintuitive structure for relatively strict return policies: It is optimal to offer a mix of the most popular and most eccentric products when the refund amount is sufficiently low, which can be viewed as a form of risk sharing between the retailer and consumers. In contrast, if the refund is sufficiently high, or when returns are disallowed, optimal assortment is composed of only the most popular products (a common finding in the literature). We provide preliminary empirical evidence for one of the key drivers of our results: more eccentric products have higher probability of return conditional on purchase. In light of our analytical findings and managerial insights, we conclude that retailers should take their return policies into account when merchandising.
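
A much-simplified sketch of the trade-off being studied, using a plain multinomial logit purchase model and invented attractions, prices, and return probabilities instead of the paper's nested MNL: enumerate every assortment under two refund levels and compare expected profit.

```python
# Simplified sketch (plain MNL, invented numbers; not the paper's nested-MNL model):
# each offered product is bought with an MNL probability and then returned with a
# product-specific probability; the retailer compares all assortments for a given
# refund level.
from itertools import combinations
from math import exp

# (name, MNL attraction, selling price, probability of return after purchase)
products = [("popular", 1.5, 10.0, 0.05),
            ("middling", 1.0, 10.0, 0.15),
            ("eccentric", 0.6, 10.0, 0.40)]

def expected_profit(assortment, refund):
    denom = 1.0 + sum(exp(v) for _, v, _, _ in assortment)     # 1.0 = no-purchase option
    profit = 0.0
    for _, v, price, ret in assortment:
        buy = exp(v) / denom
        profit += buy * ((1 - ret) * price + ret * (price - refund))   # returned units give back `refund`
    return profit

for refund in (2.0, 10.0):                                      # strict vs full-refund policy
    best = max((s for k in range(1, 4) for s in combinations(products, k)),
               key=lambda s: expected_profit(s, refund))
    print(f"refund {refund:>4}: offer {[p[0] for p in best]}, "
          f"profit {expected_profit(best, refund):.2f}")
```

With these toy numbers the eccentric product enters the best assortment only under the low refund, which mirrors the qualitative pattern described in the abstract.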