940 results for: Complex network. Optimal path. Optimal path cracks
Abstract:
The deep drop of the fertility rate in Italy to among the lowest in the world challenges contemporary theories of childbearing and family building. Among high-income countries, Italy was presumed to have characteristics of family values and female labor force participation that would favor higher fertility than its European neighbors to the north. We test competing economic and cultural explanations, drawing on new nationally representative, longitudinal data to examine first union, first birth, and second birth. Our event history analysis finds some support for economic determinants of family formation and fertility, but the clear importance of regional differences and of secularization suggests that such an explanation is at best incomplete and that cultural and ideational factors must be considered.
Abstract:
We study optimal public rationing of an indivisible good and private sector price responses. Consumers differ in their wealth and costs of provision. Due to a limited budget, some consumers must be rationed. Public rationing determines the characteristics of consumers who seek supply from the private sector, where a firm sets prices based on consumers' cost information and in response to the rationing rule. We consider two information regimes. In the first, the public supplier rations consumers according to their wealth information. In equilibrium, the public supplier must ration both rich and poor consumers. Supplying all poor consumers would leave only rich consumers in the private market, and the firm would react by setting a high price. Rationing some poor consumers is optimal, and induces a price reduction in the private market. In the second information regime, the public supplier rations consumers according to both their wealth and cost information. In equilibrium, consumers are allocated the good if and only if their costs are below a threshold. Wealth information is not used. Rationing based on cost results in higher equilibrium total consumer surplus than rationing based on wealth. [Authors]
Abstract:
"Sitting between your past and your future doesn't mean you are in the present." — Dakota Skye. Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending what is expected from the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by the systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations of these systems in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts. In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, which rely on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure, starting from either a regular or a random one.
The outcome is remarkable, as the resulting topologies share properties of both regular and random networks, and display similarities to Watts-Strogatz's small-world networks found in social systems. Moreover, the performance and the tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones. In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean network model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place in Boolean networks and proposed a more biologically plausible cascading update scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of previous GRN models, yet with superior resistance to perturbations. We believe they are one step closer to the biological reality.
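As an illustration of the Kauffman model discussed above, the following is a minimal synchronous random Boolean network in Python. The network size, connectivity and seed are arbitrary choices for this sketch, not values taken from the thesis:

```python
import random

def random_boolean_network(n, k, seed=0):
    """Build a Kauffman-style random Boolean network: each of the
    n nodes reads k randomly chosen inputs and applies a randomly
    drawn Boolean function, stored as a truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every node fires at once (the scheme the
    thesis argues is biologically improbable and replaces with a
    cascading one)."""
    new_state = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for i in ins:
            idx = (idx << 1) | state[i]
        new_state.append(table[idx])
    return new_state

# The state space is finite (2**n states), so iterating from any start
# must eventually revisit a state; the cycle entered is an attractor.
inputs, tables = random_boolean_network(n=8, k=2, seed=42)
state = [0, 1, 0, 1, 1, 0, 0, 1]
seen = {}
while tuple(state) not in seen:
    seen[tuple(state)] = len(seen)
    state = step(state, inputs, tables)
print("states visited before revisiting one:", len(seen))
```

Because the dynamics are deterministic and the state space finite, the trajectory always falls onto an attractor; robustness studies of the kind described above then ask how perturbing a single bit changes which attractor is reached.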
Abstract:
The paper proposes an approach aimed at detecting optimal model parameter combinations to achieve the most representative description of uncertainty in the model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation is to be computed for a particular generated model. SVM is particularly designed to tackle classification problems in high-dimensional spaces in a non-parametric and non-linear way. SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost function classification, and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features a highly variable response due to combinations of parameter values.
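A minimal sketch of this idea using scikit-learn's `SVC` with an RBF kernel; the disc-shaped "good fit" region below stands in for the expensive fluid-flow forward model and is purely illustrative:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in for the cost-function classification: a model
# "fits well" when its two parameters fall inside a disc. The real
# forward model (fluid flow through porous media) is expensive, which
# is the whole reason for training the classifier.
def is_good_fit(theta):
    return float(np.linalg.norm(theta - 0.5) < 0.3)

# Pilot phase: run the expensive forward model on a small sample.
pilot = rng.random((80, 2))
labels = np.array([is_good_fit(t) for t in pilot])

# Non-parametric, non-linear classifier in parameter space.
clf = SVC(kernel="rbf", C=10.0, probability=True).fit(pilot, labels)

# Exploration phase: only run forward simulations where the classifier
# is uncertain, i.e. near its decision boundary.
candidates = rng.random((1000, 2))
p_good = clf.predict_proba(candidates)[:, 1]
uncertain = candidates[(p_good > 0.25) & (p_good < 0.75)]
print(f"forward simulations needed: {len(uncertain)} of {len(candidates)}")
```

The probability band (0.25, 0.75) is an arbitrary uncertainty threshold; the paper's iterative scheme would retrain the classifier after each batch of new simulations.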
Abstract:
It has long been standard in agency theory to search for incentive-compatible mechanisms on the assumption that people care only about their own material wealth. However, this assumption is clearly refuted by numerous experiments, and we feel that it may be useful to consider nonpecuniary utility in mechanism design and contract theory. Accordingly, we devise an experiment to explore optimal contracts in an adverse-selection context. A principal proposes one of three contract menus, each of which offers a choice of two incentive-compatible contracts, to two agents whose types are unknown to the principal. The agents know the set of possible menus, and choose to either accept one of the two contracts offered in the proposed menu or to reject the menu altogether; a rejection by either agent leads to lower (and equal) reservation payoffs for all parties. While all three possible menus favor the principal, they do so to varying degrees. We observe numerous rejections of the more lopsided menus, and approach an equilibrium where one of the more equitable contract menus (which one depends on the reservation payoffs) is proposed and agents accept a contract, selecting actions according to their types. Behavior is largely consistent with all recent models of social preferences, strongly suggesting there is value in considering nonpecuniary utility in agency theory.
Abstract:
In this paper, we examine the design of permit trading programs when the objective is to minimize the cost of achieving an ex ante pollution target, that is, one that is defined in expectation rather than an ex post deterministic value. We consider two potential sources of uncertainty, the presence of either of which can make our model appropriate: incomplete information on abatement costs and uncertain delivery coefficients. In such a setting, we find three distinct features that depart from the well-established results on permit trading: (1) the regulator’s information on firms’ abatement costs can matter; (2) the optimal permit cap is not necessarily equal to the ex ante pollution target; and (3) the optimal trading ratio is not necessarily equal to the delivery coefficient even when it is known with certainty. Intuitively, since the regulator is only required to meet a pollution target on average, she can set the trading ratio and total permit cap such that there will be more pollution when abatement costs are high and less pollution when abatement costs are low. Information on firms’ abatement costs is important in order for the regulator to induce the optimal alignment between pollution level and abatement costs.
Abstract:
A critical feature of cooperative animal societies is the reproductive skew, a shorthand term for the degree to which a dominant individual monopolizes overall reproduction in the group. Our theoretical analysis of the evolutionarily stable skew in matrifilial (i.e., mother-daughter) societies, in which relatednesses to offspring are asymmetrical, predicts that reproductive skews in such societies should tend to be greater than those of semisocial societies (i.e., societies composed of individuals of the same generation, such as siblings), in which relatednesses to offspring are symmetrical. Quantitative data on reproductive skews in semisocial and matrifilial associations within the same species for 17 eusocial Hymenoptera support this prediction. Likewise, a survey of reproductive partitioning within 20 vertebrate societies demonstrates that complete reproductive monopoly is more likely to occur in matrifilial than in semisocial societies, also as predicted by the optimal skew model.
Abstract:
OBJECTIVE: Surface magnetic resonance imaging (MRI) for aortic plaque assessment is limited by the trade-off between penetration depth and signal-to-noise ratio (SNR). For imaging the deep-seated aorta, a combined surface and transesophageal MRI (TEMRI) technique was developed 1) to determine the individual contributions of the TEMRI and surface coils to the combined signal, 2) to measure the signal improvement of combined surface and TEMRI over surface MRI, and 3) to assess the reproducibility of plaque dimension analysis. METHODS AND RESULTS: In 24 patients, six black-blood proton-density/T2-weighted fast-spin-echo images were obtained using three surface coils and one TEMRI coil for SNR measurements. Reproducibility of plaque dimensions (combined surface and TEMRI) was measured in 10 patients. TEMRI contributed 68% of the signal in the aortic arch and descending aorta, whereas the overall signal gain using the combined technique was up to 225%. Plaque volume measurements had an intraclass correlation coefficient as high as 0.97. CONCLUSION: Plaque volume measurements for the quantification of aortic plaque size are highly reproducible for combined surface and TEMRI. The TEMRI coil contributes considerably to the aortic MR signal. The combined surface and TEMRI approach improves the aortic signal significantly as compared to surface coils alone. CONDENSED ABSTRACT: Conventional MRI aortic plaque visualization is limited by the penetration depth of MRI surface coils and may lead to suboptimal image quality with insufficient reproducibility. By combining a transesophageal MRI (TEMRI) coil with surface MRI coils, we enhanced local and overall image SNR for improved image quality and reproducibility.
Abstract:
An incentives-based theory of policing is developed which can explain the phenomenon of random "crackdowns," i.e., intermittent periods of high interdiction/surveillance. For a variety of police objective functions, random crackdowns can be part of the optimal monitoring strategy. We demonstrate support for implications of the crackdown theory using traffic data gathered by the Belgian Police Department, and use the model to estimate the deterrence effect of additional resources spent on speeding interdiction.
Abstract:
This paper studies monetary and fiscal policy interactions in a two-country model, where taxes on firms' sales are optimally chosen and monetary policy is set cooperatively. It turns out that in a two-country setting, non-cooperative fiscal policy makers have an incentive to change taxes on sales depending on shock realizations in order to reduce output production. Therefore, whether fiscal policy is set cooperatively or not matters for optimal monetary policy decisions. Indeed, as already shown in the literature, the cooperative monetary policy maker implements the flexible-price allocation only when special conditions on the value of the distortions underlying the economy are met. However, if non-cooperative fiscal policy makers set taxes on firms' sales depending on shock realizations, these conditions cannot be satisfied; conversely, when fiscal policy is cooperative, these conditions are fulfilled. We conclude that whether implementing the flexible-price allocation is optimal or not depends on the fiscal policy regime.
Abstract:
Revenue management practices often include overbooking capacity to account for customers who make reservations but do not show up. In this paper, we consider the network revenue management problem with no-shows and overbooking, where the show-up probabilities are specific to each product. No-show rates differ significantly by product (for instance, each itinerary and fare combination for an airline) as sale restrictions and the demand characteristics vary by product. However, models that consider no-show rates for each individual product are difficult to handle, as the state space in dynamic programming formulations (or the variable space in approximations) increases significantly. In this paper, we propose a randomized linear program to jointly make the capacity control and overbooking decisions with product-specific no-shows. We establish that our formulation gives an upper bound on the optimal expected total profit and that our upper bound is tighter than a deterministic linear programming upper bound that appears in the existing literature. Furthermore, we show that our upper bound is asymptotically tight in a regime where the leg capacities and the expected demand are scaled linearly at the same rate. We also describe how the randomized linear program can be used to obtain a bid-price control policy. Computational experiments indicate that our approach is quite fast, scales to industrial problems, and can provide significant improvements over standard benchmarks.
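For context, the deterministic linear programming benchmark mentioned above can be sketched with `scipy.optimize.linprog`; the legs, fares, demands and capacities below are made-up toy numbers, not data from the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance: 2 flight legs, 3 products (itinerary-fare combinations).
# A[i, j] = 1 if product j uses leg i (hypothetical incidence matrix).
A = np.array([[1, 0, 1],
              [0, 1, 1]])
fares = np.array([100.0, 120.0, 180.0])
expected_demand = np.array([40.0, 30.0, 25.0])
capacity = np.array([50.0, 45.0])

# Deterministic LP (the benchmark bound the paper tightens):
#   max  fares . x   s.t.  A x <= capacity,  0 <= x <= E[demand]
# linprog minimizes, so negate the objective.
res = linprog(-fares, A_ub=A, b_ub=capacity,
              bounds=list(zip(np.zeros(3), expected_demand)),
              method="highs")
upper_bound = -res.fun

# The duals of the leg-capacity rows act as bid prices: accept a
# product only if its fare exceeds the sum of bid prices on its legs.
bid_prices = res.ineqlin.marginals  # nonpositive by linprog convention
print("DLP revenue upper bound:", round(upper_bound, 2))
```

The randomized LP of the paper would instead resample the demands and average the resulting duals, which is what tightens the bound; this sketch shows only the deterministic baseline.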
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex in the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has the largest index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999), "Restless bandits, partial conservation laws, and indexability," forthcoming in Advances in Applied Probability, Vol. 33, No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL), which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
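The classical Smith's rule that this paper generalizes can be sketched in a few lines of Python; the job data below are hypothetical:

```python
def smith_order(jobs):
    """Smith's rule (weighted shortest processing time): sequence jobs
    in nonincreasing order of holding-cost rate / processing time.
    Optimal for linear holding costs on a single machine; the paper's
    extended-class indices generalize this to convex costs."""
    return sorted(jobs, key=lambda j: j["rate"] / j["time"], reverse=True)

def total_weighted_completion(seq):
    """Sum of rate * completion time over jobs processed in order."""
    t, cost = 0.0, 0.0
    for j in seq:
        t += j["time"]
        cost += j["rate"] * t
    return cost

# Hypothetical jobs; Smith index = rate / time in the comments.
jobs = [{"rate": 3.0, "time": 2.0},   # index 1.5
        {"rate": 1.0, "time": 1.0},   # index 1.0
        {"rate": 4.0, "time": 1.0}]   # index 4.0
seq = smith_order(jobs)
print(total_weighted_completion(seq))  # -> 17.0
```

With these numbers the rule schedules the index-4.0 job first, giving total weighted completion time 4*1 + 3*3 + 1*4 = 17; any other order costs more.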