967 results for optimal foraging theory


Relevance:

80.00%

Publisher:

Abstract:

Many theorists have wrestled with the question of how people balance their need to be included in social groups with their need to be different and distinctive. This question is particularly salient for researchers from the social identity perspective, who have traditionally viewed individual differentiation within groups as inimical to group identification. In this article we present a number of strategies that people can use to balance their need to belong and their need to be different without violating social identity principles. First, drawing on optimal distinctiveness theory, we discuss four ways in which the need to belong and the need to be different can be reconciled by maximizing group distinctiveness. We then discuss four ways in which it is possible to achieve individual differentiation within a group while at the same time demonstrating group identification. These strategies are discussed and integrated with reference to recent empirical research and to the social identity perspective.

Relevance:

80.00%

Publisher:

Abstract:

The goal of this manuscript is to introduce a framework for considering designs for population pharmacokinetic or pharmacokinetic-pharmacodynamic studies. A standard one-compartment pharmacokinetic model with first-order input and elimination is considered. A series of theoretical designs is considered that explores the influence of optimizing the allocation of sampling times, of allocating patients to elementary designs, of sparse sampling and unbalanced designs, and of single- vs. multiple-dose designs. It was found that what appears to be relatively sparse sampling (fewer blood samples per patient than the number of fixed-effects parameters to estimate) can also be highly informative. Overall, it is evident that exploring the population design space can yield many parsimonious designs that are efficient for parameter estimation and that may not otherwise have been considered without the aid of optimal design theory.
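The kind of design comparison described above can be sketched in a few lines. This is a minimal illustration, not the authors' design software: the one-compartment model with first-order absorption and elimination is standard, but all parameter values and candidate sampling times below are invented, and the Fisher information is approximated for fixed effects only via finite-difference sensitivities.

```python
import numpy as np

def conc(t, ka, ke, V, dose=100.0):
    """One-compartment model: first-order absorption (ka) and elimination (ke)."""
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def fisher_det(times, ka=1.0, ke=0.1, V=20.0, sigma=0.5):
    """D-optimality score: determinant of a fixed-effects Fisher information
    matrix built from finite-difference sensitivities of the prediction."""
    t = np.asarray(times, dtype=float)
    theta = np.array([ka, ke, V])
    eps = 1e-6
    J = np.empty((t.size, theta.size))
    for j in range(theta.size):
        up, dn = theta.copy(), theta.copy()
        up[j] += eps
        dn[j] -= eps
        J[:, j] = (conc(t, *up) - conc(t, *dn)) / (2 * eps)
    F = J.T @ J / sigma**2   # additive-error, fixed-effects information
    return float(np.linalg.det(F))

# Two hand-picked candidate designs (hours after a single dose):
d_rich = fisher_det(np.linspace(0.5, 24.0, 8))   # 8 samples per patient
d_sparse = fisher_det([0.5, 2.0, 12.0])          # as few samples as parameters
```

Optimal design algorithms search this design space systematically (e.g. maximizing det F, the D-optimality criterion); the sketch only scores two hand-picked designs, and even the three-sample design yields a nonsingular information matrix.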

Relevance:

80.00%

Publisher:

Abstract:

123 p.

Relevance:

80.00%

Publisher:

Abstract:

This paper analyzes the impact of a geographical social grouping (neighborhood) and its relative perceived size on the group’s spontaneous identification level and place satisfaction, as well as on the intensity of and motives for discrimination against inhabitants of other places. Two studies are presented: an experimental one using the minimal group categorization paradigm and an on-site investigation of a city neighborhood. Consistent with the predictions, the results showed that smaller neighborhoods reported higher identification and satisfaction with the place of residence, as well as higher discrimination against other neighborhoods. In line with optimal distinctiveness theory (ODT), the findings showed that the motivation for discrimination varies as a function of in-group size: members of larger groups discriminate by increasing the differentiation between the in-group and the out-group, whereas members of smaller groups discriminate by increasing the value of the in-group. Furthermore, the results were consistent with a social identity theory and ODT account of research showing the non-trivial nature of geographically bounded social groupings, their importance in a diverse set of contexts, and their impact on inter-neighborhood relationships.

Relevance:

40.00%

Publisher:

Abstract:

An incentives-based theory of policing is developed which can explain the phenomenon of random “crackdowns,” i.e., intermittent periods of high interdiction/surveillance. For a variety of police objective functions, random crackdowns can be part of the optimal monitoring strategy. We demonstrate support for implications of the crackdown theory using traffic data gathered by the Belgian Police Department and use the model to estimate the deterrence effect of additional resources spent on speeding interdiction.
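Why randomized, intermittent enforcement can be optimal is visible already in the textbook inspection game. The sketch below is a hedged illustration, not the paper's model, and every payoff number is invented: in the mixed-strategy equilibrium each side randomizes to keep the other indifferent, so crackdowns occur with some interior probability rather than always or never.

```python
def inspection_game_equilibrium(gain, fine, cost, benefit):
    """Mixed-strategy equilibrium of a simple inspection game.
    gain:    driver's payoff from undetected speeding
    fine:    driver's penalty if caught during a crackdown
    cost:    police cost of running a crackdown period
    benefit: police payoff per violator caught
    Returns (p*, q*): crackdown probability and speeding probability."""
    p_star = gain / (gain + fine)   # makes drivers indifferent to speeding
    q_star = cost / benefit         # makes police indifferent to cracking down
    return p_star, q_star

# With these invented numbers, crackdowns happen 10% of the time.
p, q = inspection_game_equilibrium(gain=10, fine=90, cost=1, benefit=5)
```

The key qualitative point matches the abstract: for a range of objective functions, the best the police can do is commit to a random monitoring intensity, since any deterministic schedule would be anticipated and exploited.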

Relevance:

40.00%

Publisher:

Abstract:

The method of stochastic dynamic programming is widely used in behavioral ecology, but it has shortcomings stemming from its use of fixed temporal limits. The authors present an alternative approach based on the methods of the theory of restoration (renewal theory). The suggested method uses cumulative energy reserves per unit time as its criterion, which leads to stationary cycles in the state space. This approach allows optimal feeding to be studied by analytic methods.
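For contrast, here is a minimal version of the criticized finite-horizon approach: a generic foraging SDP in the usual behavioral-ecology style, with invented parameters, not the authors' renewal-theory method. It makes the role of the temporal limit explicit: the terminal condition at t = T drives the entire backward recursion, which is exactly the dependence the alternative criterion avoids.

```python
import numpy as np

def foraging_sdp(T=20, max_state=10, p_find=0.6, gain=2, cost=1, predation=0.02):
    """Backward induction over energy reserves x in 0..max_state.
    Each period the animal rests (certain loss of `cost`) or forages
    (risking predation for a chance at `gain`); reserves of 0 mean death.
    Maximizes the probability of surviving the horizon with reserves left."""
    V = np.zeros((T + 1, max_state + 1))
    V[T, 1:] = 1.0                      # alive at the end counts as success
    policy = np.zeros((T, max_state + 1), dtype=int)
    for t in range(T - 1, -1, -1):
        for x in range(1, max_state + 1):   # x == 0 is absorbing (dead)
            rest = V[t + 1, max(x - cost, 0)]
            forage = (1 - predation) * (
                p_find * V[t + 1, min(x - cost + gain, max_state)]
                + (1 - p_find) * V[t + 1, max(x - cost, 0)]
            )
            policy[t, x] = int(forage > rest)
            V[t, x] = max(rest, forage)
    return V, policy

V, policy = foraging_sdp()
```

Note how an animal at the lowest viable reserve level must forage (resting guarantees starvation), while the value function everywhere depends on distance to the horizon T; the renewal-style criterion of reserves per unit time removes that dependence.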

Relevance:

40.00%

Publisher:

Abstract:

Optimization of quantum measurement processes has a pivotal role in carrying out better, more accurate or less disrupting, measurements and experiments on a quantum system. In particular, convex optimization, i.e., identifying the extreme points of the convex sets and subsets of quantum measuring devices, plays an important part in quantum optimization, since the typical figures of merit for measuring processes are affine functionals. In this thesis, we discuss results determining the extreme quantum devices and their relevance, e.g., in quantum-compatibility-related questions. In particular, we see that a compatible device pair where one device is extreme can be joined into a single apparatus in an essentially unique way. Moreover, we show that the question whether a pair of quantum observables can be measured jointly can often be formulated in a weaker form when some of the observables involved are extreme. Another major line of research treated in this thesis deals with convex analysis of special restricted quantum device sets, covariance structures or, in particular, generalized imprimitivity systems. Some results on the structure of covariant observables and instruments are listed, as well as results identifying the extreme points of covariance structures in quantum theory. As a special case study, not published anywhere before, we study the structure of Euclidean-covariant localization observables for spin-0 particles. We also discuss the general form of Weyl-covariant phase-space instruments. Finally, certain optimality measures originating from convex geometry are introduced for quantum devices: boundariness, measuring how ‘close’ to the algebraic boundary of the device set a quantum apparatus is, and the robustness of incompatibility, quantifying the level of incompatibility for a quantum device pair by measuring the highest amount of noise the pair tolerates without becoming compatible. Boundariness is further associated with minimum-error discrimination of quantum devices, and robustness of incompatibility is shown to behave monotonically under certain compatibility-non-decreasing operations. Moreover, the value of robustness of incompatibility is given for a few special device pairs.

Relevance:

40.00%

Publisher:

Abstract:

This paper considers tests which maximize the weighted average power (WAP). The focus is on determining WAP tests subject to an uncountable number of equalities and/or inequalities. The unifying theory allows us to obtain tests with correct size, similar tests, and unbiased tests, among others. A WAP test may be randomized and its characterization is not always possible. We show how to approximate the power of the optimal test by sequences of nonrandomized tests. Two alternative approximations are considered. The first approach considers a sequence of similar tests for an increasing number of boundary conditions. This discretization allows us to implement the WAP tests in practice. The second method finds a sequence of tests which approximate the WAP test uniformly. This approximation allows us to show that WAP similar tests are admissible. The theoretical framework is readily applicable to several econometric models, including the important class of the curved-exponential family. In this paper, we consider the instrumental variable model with heteroskedastic and autocorrelated errors (HAC-IV) and the nearly integrated regressor model. In both models, we find WAP similar and (locally) unbiased tests which dominate other available tests.

Relevance:

40.00%

Publisher:

Abstract:

This paper shows that deriving optimal policy and consistent policy outcomes requires control-theory and game-theory solution techniques, respectively. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the social loss function cannot serve as a direct loss function for the central bank. Accordingly, we employ implementation theory to design a central bank loss function (mechanism design) with consistent targets, while the social loss function serves as a social welfare criterion. That is, with the correct mechanism design for the central bank loss function, optimal policy and consistent policy become identical. In other words, optimal policy proves implementable (consistent).
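The kind of target inconsistency described above can be illustrated with the textbook Barro-Gordon setup; this is a generic sketch, not the model used in the paper. Suppose society targets output above the natural rate:

```latex
L^{\mathrm{soc}} = (\pi - \pi^*)^2 + \lambda\,\bigl(y - y^{n} - k\bigr)^2,
\qquad k > 0,
\qquad y = y^{n} + (\pi - \pi^{e}).
```

Minimizing $L^{\mathrm{soc}}$ over $\pi$ and imposing rational expectations ($\pi^{e} = \pi$) gives the consistent-policy outcome $\pi = \pi^* + \lambda k$: an inflation bias with no output gain. Delegating to a central bank whose loss function uses the consistent target $k = 0$ eliminates the bias, while $L^{\mathrm{soc}}$ is retained only as the welfare criterion, which is the mechanism-design idea described above.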

Relevance:

40.00%

Publisher:

Abstract:

Monitoring of marine reserves has traditionally focused on the task of rejecting the null hypothesis that marine reserves have no impact on the population and community structure of harvested populations. We consider the role of monitoring of marine reserves in gaining information needed for management decisions. In particular we use a decision-theoretic framework to answer the question: how long should we monitor the recovery of an over-fished stock to determine the fraction of that stock to reserve? This exposes a natural tension between the cost (in terms of time and money) of additional monitoring and the benefit of more accurately parameterizing a population model for the stock, which in turn leads to a better decision about the optimal reserve size with respect to harvesting. We found that the optimal monitoring time frame is rarely more than 5 years. A higher economic discount rate decreased the optimal monitoring time frame, making the expected benefit of more certainty about parameters in the system negligible compared with the expected gain from earlier exploitation.
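The tension between monitoring cost and decision quality can be caricatured in a few lines. This is a hedged sketch with invented functional forms and parameters, not the authors' population model: decision quality is assumed to saturate with monitoring time, while the payoff from eventual exploitation is discounted for every year of delay.

```python
import numpy as np

def optimal_monitoring_years(discount_rate, base_value=1.0, info_gain=0.3,
                             learn_rate=0.5, horizon=30):
    """Choose the monitoring duration m maximizing discounted decision value.
    Monitoring for m years delays exploitation (worth delta**m) but improves
    the reserve-size decision, with diminishing returns in m."""
    m = np.arange(horizon + 1)
    delta = 1.0 / (1.0 + discount_rate)
    value = delta**m * (base_value + info_gain * (1.0 - np.exp(-learn_rate * m)))
    return int(m[np.argmax(value)])

short = optimal_monitoring_years(discount_rate=0.10)
longer = optimal_monitoring_years(discount_rate=0.01)
```

Under these made-up parameters the optimal window stays short and shrinks as the discount rate rises, mirroring the paper's qualitative finding; the actual numbers depend entirely on the assumed learning and discounting curves.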

Relevance:

40.00%

Publisher:

Abstract:

The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining quickly the appropriate data increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
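The three Halstead metrics named above are simple functions of operator and operand counts. A minimal sketch of how they are computed (the example counts below are invented for illustration, not taken from the study's queries):

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead metrics from distinct operators (n1), distinct operands (n2)
    and their total occurrence counts (N1, N2)."""
    length = N1 + N2                       # program length N
    volume = length * math.log2(n1 + n2)   # V = N * log2(vocabulary)
    difficulty = (n1 / 2.0) * (N2 / n2)    # D = (n1/2) * (N2/n2)
    effort = difficulty * volume           # E = D * V
    return length, difficulty, effort

# Invented counts for a small query: 5 distinct operators used 9 times,
# 8 distinct operands used 12 times.
length, difficulty, effort = halstead(n1=5, n2=8, N1=9, N2=12)
```

Averaging such per-query scores over a representative sample of information requests gives the weighted average complexity used to rank the candidate schema instantiations.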

Relevance:

40.00%

Publisher:

Abstract:

A theory of an optimal distribution of the gain of in-line amplifiers in dispersion-managed transmission systems is developed. As an example of the application of the general method, a design of the line with periodically imbalanced in-line amplification is proposed.

Relevance:

30.00%

Publisher:

Abstract:

The optimal discrimination of nonorthogonal quantum states with minimum error probability is a fundamental task in quantum measurement theory as well as an important primitive in optical communication. In this work, we propose and experimentally realize a new and simple quantum measurement strategy capable of discriminating two coherent states with smaller error probabilities than can be obtained using the standard measurement devices: the Kennedy receiver and the homodyne receiver.
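For context, the standard benchmarks have textbook closed forms (these are not the receiver proposed in this work). Assuming the common binary case of discriminating |α⟩ from |−α⟩ with equal priors, the quantum-optimal Helstrom bound and the two standard receivers compare as follows:

```python
import math

def helstrom(alpha):
    """Quantum-optimal (Helstrom) error probability for |alpha> vs |-alpha>,
    equal priors; the state overlap is |<-alpha|alpha>|^2 = exp(-4 alpha^2)."""
    return 0.5 * (1.0 - math.sqrt(1.0 - math.exp(-4.0 * alpha**2)))

def homodyne(alpha):
    """Ideal homodyne receiver: Gaussian-overlap error erfc(sqrt(2) alpha)/2."""
    return 0.5 * math.erfc(math.sqrt(2.0) * alpha)

def kennedy(alpha):
    """Kennedy receiver: displace one state to vacuum and photon-count;
    it errs only when the displaced |2 alpha> produces no click."""
    return 0.5 * math.exp(-4.0 * alpha**2)

# At alpha = 0.5 the Helstrom bound beats both standard receivers,
# which is the gap the proposed measurement strategy aims to close.
errors = (helstrom(0.5), homodyne(0.5), kennedy(0.5))
```

Note the crossover between the two standard devices: homodyne detection wins at small amplitudes while the Kennedy receiver wins at large ones, yet neither reaches the Helstrom bound at any finite amplitude.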