142 results for stochastic optimisation threshold policy


Relevance: 20.00%

Abstract:

Animals can often coordinate their actions to achieve mutually beneficial outcomes. However, this can result in a social dilemma when uncertainty about the behavior of partners creates multiple fitness peaks. Strategies that minimize risk ("risk dominant") instead of maximizing reward ("payoff dominant") are favored in economic models when individuals learn behaviors that increase their payoffs. Specifically, such strategies are shown to be "stochastically stable" (a refinement of evolutionary stability). Here, we extend the notion of stochastic stability to biological models of continuous phenotypes at a mutation-selection-drift balance. This allows us to make a unique prediction for long-term evolution in games with multiple equilibria. We show how genetic relatedness, due to limited dispersal and scaled to account for local competition, can crucially affect the stochastically stable outcome of coordination games. We find that positive relatedness (weak local competition) increases the chance that the payoff-dominant strategy is stochastically stable, even when it is not risk dominant. Conversely, negative relatedness (strong local competition) increases the chance that strategies evolve that are neither payoff nor risk dominant. Extending our results to large multiplayer coordination games, we find that negative relatedness can create competition so extreme that the game effectively changes to a hawk-dove game and a stochastically stable polymorphism between the alternative strategies evolves. These results demonstrate the usefulness of stochastic stability in characterizing the long-term evolution of continuous phenotypes: the outcomes of multiplayer games can be reduced to the generic equilibria of two-player games, and the effect of spatial structure can be analyzed readily.
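The distinction between payoff dominance and risk dominance in a symmetric 2x2 coordination game can be sketched as follows, using the Harsanyi-Selten comparison of deviation losses. The stag-hunt payoff values below are illustrative only and are not taken from the study.

```python
# Payoff vs risk dominance in a symmetric 2x2 coordination game.
# Payoffs: a = u(A,A), b = u(A,B), c = u(B,A), d = u(B,B);
# the stag-hunt numbers below are hypothetical, not from the paper.

def payoff_dominant(a, b, c, d):
    """A is payoff dominant if coordinating on A beats coordinating on B."""
    return "A" if a > d else "B"

def risk_dominant(a, b, c, d):
    """Harsanyi-Selten: A risk-dominates B iff a - c > d - b."""
    return "A" if (a - c) > (d - b) else "B"

# Stag hunt: A = stag, B = hare
a, b, c, d = 4, 0, 3, 3
print(payoff_dominant(a, b, c, d))  # A (stag): 4 > 3
print(risk_dominant(a, b, c, d))    # B (hare): 4 - 3 < 3 - 0
```

With these payoffs the two criteria disagree, which is exactly the tension the abstract describes: the risk-dominant strategy can be selected even though it is not payoff dominant.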

Relevance: 20.00%

Abstract:

Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p. 37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, or non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful to gain insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis.
In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement of the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly through containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that allows us to learn about the relative importance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative on reservation wage movements over the unemployment spell. The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies.
The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency on top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches, such as the provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.

Relevance: 20.00%

Abstract:

Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
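The S-system form the abstract refers to writes each state as a difference of two power-law terms, dX_i/dt = α_i ∏_j X_j^{g_ij} − β_i ∏_j X_j^{h_ij}. A minimal sketch of generating a synthetic time course from such a system is shown below; the 2-dimensional system, its rate constants and its exponents are all hypothetical, chosen only to illustrate the model class, not taken from the paper.

```python
# Minimal S-system sketch: dX_i/dt = alpha_i * prod_j X_j**g[i][j]
#                                  - beta_i  * prod_j X_j**h[i][j].
# All parameter values below are hypothetical illustrations.

def s_system_rhs(x, alpha, beta, g, h):
    """Right-hand side of the S-system ODEs at state x."""
    n = len(x)
    dx = []
    for i in range(n):
        prod_pos, prod_neg = alpha[i], beta[i]
        for j in range(n):
            prod_pos *= x[j] ** g[i][j]   # production term
            prod_neg *= x[j] ** h[i][j]   # degradation term
        dx.append(prod_pos - prod_neg)
    return dx

def euler_time_course(x0, alpha, beta, g, h, dt=0.01, steps=500):
    """Forward-Euler integration; returns the sampled trajectory."""
    traj, x = [list(x0)], list(x0)
    for _ in range(steps):
        dx = s_system_rhs(x, alpha, beta, g, h)
        x = [xi + dt * d for xi, d in zip(x, dx)]
        traj.append(list(x))
    return traj

# Hypothetical 2-variable cascade
alpha, beta = [2.0, 3.0], [1.5, 2.0]
g = [[0.0, -0.5], [0.8, 0.0]]   # kinetic orders of production
h = [[0.6, 0.0], [0.0, 0.7]]    # kinetic orders of degradation
traj = euler_time_course([1.0, 1.0], alpha, beta, g, h)
```

Parameter estimation then amounts to recovering α, β, g and h from a trajectory like `traj`, which is the hard inverse problem the abstract addresses.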

Relevance: 20.00%

Abstract:

OBJECTIVES: Hypoglycaemia (glucose <2.2 mmol/l) is a defining feature of severe malaria, but the significance of other levels of blood glucose has not previously been studied in children with severe malaria. METHODS: A prospective study of 437 consecutive children with presumed severe malaria was conducted in Mali. We defined hypoglycaemia as <2.2 mmol/l, low glycaemia as 2.2-4.4 mmol/l and hyperglycaemia as >8.3 mmol/l. Associations between glycaemia and case fatality were analysed for 418 children using logistic regression models and a receiver operating characteristic (ROC) curve. RESULTS: There was a significant difference between blood glucose levels in children who died (median 4.6 mmol/l) and survivors (median 7.6 mmol/l, P < 0.001). Case fatality declined from 61.5% of the hypoglycaemic children to 46.2% of those with low glycaemia, 13.4% of those with normal glycaemia and 7.6% of those with hyperglycaemia (P < 0.001). Logistic regression showed an adjusted odds ratio (AOR) of 0.75 (0.64-0.88) for case fatality per 1 mmol/l increase in baseline blood glucose. Compared to a normal blood glucose, hypoglycaemia and low glycaemia both significantly increased the odds of death (AOR 11.87, 2.10-67.00; and 5.21, 1.86-14.63, respectively), whereas hyperglycaemia reduced the odds of death (AOR 0.34, 0.13-0.91). The ROC curve [area under the curve 0.753 (95% CI 0.684-0.820)] indicated that glycaemia had a moderate predictive value for death and identified an optimal threshold at glycaemia <6.1 mmol/l (sensitivity 64.5%, specificity 75.1%). CONCLUSIONS: If there is a threshold of blood glucose which defines a worse prognosis, it is at a higher level than the current definition of 2.2 mmol/l.
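The "optimal threshold" reported from the ROC analysis is typically the cut-off that maximises Youden's J statistic (sensitivity + specificity − 1). A minimal sketch of that selection step is shown below; the toy glucose values are invented and do not reproduce the Mali cohort, and the rule "predict death if glucose < t" mirrors the finding that lower glycaemia carries the worse prognosis.

```python
# Choosing a prognostic glucose cut-off by maximising Youden's J
# (sensitivity + specificity - 1). Data below are hypothetical.

def best_threshold(glucose_died, glucose_survived):
    """Scan candidate cut-offs t; predict death when glucose < t."""
    best_t, best_j = None, -1.0
    for t in sorted(set(glucose_died + glucose_survived)):
        sens = sum(g < t for g in glucose_died) / len(glucose_died)
        spec = sum(g >= t for g in glucose_survived) / len(glucose_survived)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

died = [2.0, 3.0, 4.0, 5.0, 6.0]       # mmol/l, invented
survived = [5.0, 7.0, 8.0, 9.0, 10.0]  # mmol/l, invented
t, j = best_threshold(died, survived)
print(t, round(j, 2))  # 7.0 0.8
```

On real data the same scan would be run over the 418 observed glucose values, and the study's <6.1 mmol/l cut-off corresponds to the J-maximising point on its ROC curve.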

Relevance: 20.00%

Abstract:

PURPOSE: To explore detainees' and staff's attitudes towards tobacco use, in order to assist prison administrators in developing an ethically acceptable tobacco control policy based on stakeholders' opinions. DESIGN: Qualitative study based on in-depth semi-structured interviews with 31 prisoners and 27 staff before (T1) and after (T2) the implementation of a new smoke-free regulation (2009) in a Swiss male post-trial prison of 120 detainees and 120 employees. RESULTS: At T1, smoking was allowed in common indoor rooms and at most working places. Both groups of participants expressed the need for a more uniform and stricter regulation, with general opposition towards a total smoking ban. Fears and difficulties expressed regarding a stricter regulation included increased stress on detainees and strain on staff, violence, riots, loss of control over detainees, and changes in social life. At T2, participants expressed predominantly satisfaction. They reported a reduction in their own tobacco use and better protection against second-hand smoke. However, enforcement was incomplete. The debate was felt to be concentrated on regulation only, leaving aside the subject of support for tobacco reduction or cessation. CONCLUSION: Besides an appropriate smoke-free regulation, further developments are necessary in order to achieve a comprehensive tobacco control policy in prisons.

Relevance: 20.00%

Abstract:

Cultural variation in a population is affected by the rate of occurrence of cultural innovations, whether such innovations are preferred or eschewed, how they are transmitted between individuals in the population, and the size of the population. An innovation, such as a modification in an attribute of a handaxe, may be lost or may become a property of all handaxes, which we call "fixation of the innovation." Alternatively, several innovations may attain appreciable frequencies, in which case properties of the frequency distribution (for example, of handaxe measurements) are important. Here we apply the Moran model from the stochastic theory of population genetics to study the evolution of cultural innovations. We obtain the probability that an initially rare innovation becomes fixed, and the expected time this takes. When variation in cultural traits is due to recurrent innovation, copy error, and sampling from generation to generation, we describe properties of this variation, such as the level of heterogeneity expected in the population. For all of these, we determine the effect of the mode of social transmission: conformist, where there is a tendency for each naïve newborn to copy the most popular variant; pro-novelty bias, where the newborn prefers a specific variant if it exists among those it samples; and one-to-many transmission, where the variant one individual carries is copied by all newborns while that individual remains alive. We compare our findings with those predicted by prevailing theories for rates of cultural change and the distribution of cultural variation.
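The fixation probability the abstract refers to has a standard closed form in the simplest Moran setting with constant relative fitness r (the transmission biases studied in the paper modify this baseline). A minimal sketch, with illustrative parameter values:

```python
# Fixation probability of a variant under the basic Moran model with
# constant relative fitness r. Standard result:
#   rho_i = (1 - r**-i) / (1 - r**-N),  reducing to i/N when r = 1.
# Parameter values below are illustrative, not from the paper.

def fixation_probability(i, r, N):
    """Probability that i initial copies of the variant reach fixation."""
    if r == 1.0:            # neutral variant: pure drift
        return i / N
    return (1.0 - r ** -i) / (1.0 - r ** -N)

# A single neutral innovation in a population of 100 fixes with prob 1/N
print(fixation_probability(1, 1.0, 100))           # 0.01
# A favoured variant (r = 2) starting from one copy in N = 10
print(round(fixation_probability(1, 2.0, 10), 4))  # 0.5005
```

Note that even a strongly favoured rare innovation fixes with probability well below 1, which is why the expected fixation time and the balance with recurrent innovation matter for the distribution of cultural variation.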

Relevance: 20.00%

Abstract:

The development of CT applications might become a public health problem if no effort is made to justify and optimise the examinations. This paper presents some hints to ensure that the risk-benefit balance remains in favour of the patient, especially when dealing with examinations of young patients. In this context, particular attention has to be paid to the justification of the examination. When performing the acquisition, one needs to optimise the extent of the volume investigated together with the number of acquisition sequences used. Finally, the use of automatic exposure systems, now available on all units, together with Diagnostic Reference Levels (DRLs), should help radiologists control the exposure of their patients.