962 results for Constrained Minimization
Abstract:
We explore the finish-to-start precedence relations of project activities used in scheduling problems. From these relations, we devise a method to identify groups of activities that could execute concurrently, i.e., activities in the same group can all execute in parallel. The method derives a new set of relations to describe the concurrency; these relations are then represented as an undirected graph, and solving the maximal cliques problem on that graph identifies the groups. We provide a running example with a project from our previous studies in resource-constrained project cost minimization, together with an example application of the concurrency detection method: the evaluation of resource stress.
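A minimal sketch of the clique-based grouping step described above, using the networkx library; the activity names and concurrency pairs are hypothetical placeholders, not data from the paper:

```python
# Sketch: identify groups of mutually concurrent activities as the maximal
# cliques of an undirected "concurrency" graph (hypothetical example data).
import networkx as nx

# Pairs of activities that may execute in parallel (derived, in the paper,
# from the finish-to-start precedence relations).
concurrency_pairs = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

G = nx.Graph()
G.add_edges_from(concurrency_pairs)

# Each maximal clique is a group whose activities can all run concurrently.
for group in nx.find_cliques(G):
    print(sorted(group))
# e.g. ['A', 'B', 'C'] and ['C', 'D']
```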
Abstract:
"Series title: Springerbriefs in applied sciences and technology, ISSN 2191-530X"
Abstract:
Quenching process, TRIP, J2-plasticity theory, phase transition, distortion
Abstract:
Magdeburg, Univ., Faculty of Electrical Engineering and Information Technology, dissertation, 2012
Abstract:
This paper analyzes the role of financial development as a source of endogenous instability in small open economies. By assuming that firms face credit constraints, our model displays complex dynamic behavior for intermediate values of the parameter representing the level of financial development of the economy. The basic implication of our model is that economies experiencing a process of financial development are more unstable than both very underdeveloped and very developed economies. Our instability concept means that small shocks have a persistent effect on the long-run behavior of the model, and also that economies can exhibit cycles with a very high period or even chaotic dynamic patterns.
Abstract:
Recently, several school districts in the US have adopted or are considering adopting the Student-Optimal Stable Mechanism or the Top Trading Cycles Mechanism to assign children to public schools. There is clear evidence that for school districts that employ (variants of) the so-called Boston Mechanism the transition would lead to efficiency gains. The first two mechanisms are strategy-proof, but in practice student assignment procedures prevent students from submitting a preference list that contains all their acceptable schools. Therefore, any desirable property of the mechanisms is likely to get distorted. We study the non-trivial preference revelation game where students can only declare up to a fixed number (quota) of schools to be acceptable. We focus on the stability of the Nash equilibrium outcomes. Our main results identify rather stringent necessary and sufficient conditions on the priorities to guarantee stability. This stands in sharp contrast with the Boston Mechanism, which yields stable Nash equilibrium outcomes independently of the quota. Hence, the transition to either of the two mechanisms is likely to come with a higher risk that students seek legal action, as lower-priority students may occupy more preferred schools.
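Since the Student-Optimal Stable Mechanism is student-proposing deferred acceptance, a minimal sketch of how a quota-truncated list enters the procedure may help; all names, priorities, and capacities below are hypothetical, not taken from the paper:

```python
# Sketch: student-proposing deferred acceptance (the Student-Optimal Stable
# Mechanism) run on quota-truncated preference lists (hypothetical data).

def deferred_acceptance(student_prefs, school_priorities, capacities):
    # student_prefs: student -> list of schools, already truncated to the quota
    # school_priorities: school -> list of students, highest priority first
    # capacities: school -> number of seats
    rank = {s: {st: i for i, st in enumerate(order)}
            for s, order in school_priorities.items()}
    next_choice = {st: 0 for st in student_prefs}
    held = {s: [] for s in school_priorities}
    free = list(student_prefs)
    while free:
        st = free.pop()
        prefs = student_prefs[st]
        if next_choice[st] >= len(prefs):
            continue  # truncated list exhausted: the student stays unassigned
        school = prefs[next_choice[st]]
        next_choice[st] += 1
        held[school].append(st)
        held[school].sort(key=lambda x: rank[school][x])
        if len(held[school]) > capacities[school]:
            free.append(held[school].pop())  # reject lowest-priority holder
    return held

# With a quota of 2, a student who lists only two schools and is rejected by
# both remains unassigned, which is one way truncation can break stability.
```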
Abstract:
The Whitehead minimization problem consists in finding a minimum-size element in the automorphic orbit of a word, a cyclic word, or a finitely generated subgroup in a free group of finite rank. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependence on the rank of the free group. It follows that the primitivity problem – to decide whether a word is an element of some basis of the free group – and the free factor problem can also be solved in polynomial time.
Abstract:
It is often alleged that high auction prices inhibit service deployment. We investigate this claim under the extreme case of financially constrained bidders. If demand is just slightly elastic, auctions maximize consumer surplus if consumer surplus is a convex function of quantity (a common assumption), or if consumer surplus is concave and the proportion of expenditure spent on deployment is greater than one over the elasticity of demand. The latter condition appears to be true for most of the large telecom auctions in the US and Europe. Thus, even if high auction prices inhibit service deployment, auctions appear to be optimal from the consumers' point of view.
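In symbols, the second condition reads as follows (our paraphrase; the notation is not taken from the paper):

```latex
% Our paraphrase of the stated condition: with consumer surplus S(q)
% concave in quantity q, demand elasticity \varepsilon, total expenditure E,
% and expenditure on deployment D, auctions maximize consumer surplus when
\[
  \frac{D}{E} \;>\; \frac{1}{\varepsilon}.
\]
```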
Abstract:
The literature on school choice assumes that families can submit a preference list over all the schools they want to be assigned to. However, in many real-life instances families are only allowed to submit a list containing a limited number of schools. Subjects' incentives are drastically affected, as more individuals manipulate their preferences. Including a safety school in the constrained list explains most manipulations. Competitiveness across schools plays an important role. Constraining choices increases segregation and affects the stability and efficiency of the final allocation. Remarkably, the constraint significantly reduces the proportion of subjects playing a dominated strategy.
Abstract:
Diffusion MRI is a well-established imaging modality providing a powerful way to probe the structure of the white matter non-invasively. Despite its potential, the intrinsically long scan times of these sequences have hampered their use in clinical practice. For this reason, a large variety of methods have recently been proposed to shorten the acquisition times. Among them, spherical deconvolution approaches have gained a lot of interest for their ability to reliably recover the intra-voxel fiber configuration with a relatively small number of data samples. To overcome the intrinsic instabilities of deconvolution, these methods use regularization schemes generally based on the assumption that the fiber orientation distribution (FOD) to be recovered in each voxel is sparse. The well-known Constrained Spherical Deconvolution (CSD) approach resorts to Tikhonov regularization, based on an ℓ2-norm prior, which promotes a weak version of sparsity. Also, in the last few years compressed sensing has been advocated to further accelerate the acquisitions, and ℓ1-norm minimization is generally employed as a means to promote sparsity in the recovered FODs. In this paper, we provide evidence that the use of an ℓ1-norm prior to regularize this class of problems is somewhat inconsistent with the fact that the fiber compartments all sum up to unity. To overcome this ℓ1 inconsistency while simultaneously exploiting sparsity more optimally than through an ℓ2 prior, we reformulate the reconstruction problem as a constrained formulation between a data term and a sparsity prior consisting of an explicit bound on the ℓ0-norm of the FOD, i.e., on the number of fibers. The method has been tested both on synthetic and real data. Experimental results show that the proposed ℓ0 formulation significantly reduces modeling errors compared to the state-of-the-art ℓ2 and ℓ1 regularization approaches.
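Schematically, the constrained formulation described above can be written as follows (notation ours; the paper's exact data operator and constraints may differ):

```latex
% Schematic l0-constrained FOD reconstruction (notation ours):
% y = measured diffusion signal, A = spherical convolution operator,
% x = discretized FOD, k = assumed bound on the number of fiber compartments.
\[
  \min_{x \ge 0} \; \| A x - y \|_2^2
  \quad \text{subject to} \quad \| x \|_0 \le k .
\]
```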
Abstract:
We show that standard expenditure multipliers capture economy-wide effects of new government projects only when financing constraints are not binding. In actual policy making, however, new projects usually need financing. Under liquidity constraints, new projects are subject to two opposite effects: an income effect and a set of spending substitution effects. The former is the traditional, unrestricted multiplier effect; the latter results from the expenditure reallocation needed to uphold effective financing constraints. Unrestricted multipliers will therefore be, as a general rule, upward biased, and policy designs based upon them should be reassessed in the light of the countervailing substitution effects.
Abstract:
Inspired by experiments that use single-particle tracking to measure the regions of confinement of selected chromosomal regions within cell nuclei, we have developed an analytical approach that takes into account various possible positions and shapes of the confinement regions. We show, in particular, that confinement of a particle to a subregion that is entirely enclosed within a spherical volume can lead to a higher limit of the mean square radial displacement than the one associated with a particle that can explore the entire spherical volume. Finally, we apply the theory to analyse the motion of extrachromosomal chromatin rings within nuclei of living yeast.
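For reference, the whole-sphere baseline against which such a higher limit is measured is a standard textbook value, not a result of the paper:

```latex
% Standard reference value: for a particle that explores an entire sphere of
% radius R uniformly, positions at widely separated times are independent and
% uniform, so the long-time mean square displacement saturates at
\[
  \lim_{t \to \infty}
  \langle \, | \mathbf{r}(t) - \mathbf{r}(0) |^2 \, \rangle
  \;=\; 2 \, \langle r^2 \rangle
  \;=\; 2 \cdot \tfrac{3}{5} R^2
  \;=\; \tfrac{6}{5} R^2 .
\]
% A suitably placed sub-region, e.g. a thin shell near the boundary
% (where the two-point value approaches 2R^2), can exceed this plateau.
```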
Abstract:
Long-term outcomes after kidney transplantation remain suboptimal, despite the great achievements observed in recent years with the use of modern immunosuppressive drugs. Currently, the calcineurin inhibitors (CNIs) cyclosporine and tacrolimus remain the cornerstones of immunosuppressive regimens in many centers worldwide, regardless of their well-described side-effects, including nephrotoxicity. In this article, we review recent CNI-minimization strategies in kidney transplantation, while emphasizing the importance of long-term follow-up and patient monitoring. Finally, accumulating data indicate that low-dose CNI-based regimens would provide an interesting balance between efficacy and toxicity.
Abstract:
In this paper we axiomatize the strong constrained egalitarian solution (Dutta and Ray, 1991) over the class of weak superadditive games using constrained egalitarianism, order-consistency, and converse order-consistency. JEL classification: C71, C78. Keywords: Cooperative TU-game, strong constrained egalitarian solution, axiomatization.
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they operate. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the application. For example, in a P2P file sharing application, while the user is downloading some file, the application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth, and memory, or the end-user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively.

To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically offer reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations.

The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that provides an approximated view of the system or of part of it; this view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize the broadcast reliability (a sketch of this scoring idea follows below). Here, the broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of message quotas reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the available memory at processes into account by limiting the view they have to maintain of the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that tends toward the global tree overlay and adapts to some constraints of the underlying system.
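One natural reading of the path-reliability score mentioned above is the product of link reliabilities along each root-to-node path; the sketch below assumes independent link failures, and the tree and probabilities are hypothetical, not data from the thesis:

```python
# Sketch: score a broadcast tree overlay by each node's delivery probability,
# computed as the product of link reliabilities on its path from the root.

tree = {               # parent -> children (hypothetical overlay)
    "root": ["a", "b"],
    "a": ["c"],
    "b": [],
    "c": [],
}
link_reliability = {("root", "a"): 0.95, ("root", "b"): 0.90, ("a", "c"): 0.80}

def delivery_probabilities(tree, link_reliability, root="root"):
    prob = {root: 1.0}
    stack = [root]
    while stack:
        node = stack.pop()
        for child in tree.get(node, []):
            # A node receives the broadcast only if every link on its path
            # from the root delivers (independence of link failures assumed).
            prob[child] = prob[node] * link_reliability[(node, child)]
            stack.append(child)
    return prob

probs = delivery_probabilities(tree, link_reliability)
expected_coverage = sum(probs.values())   # expected number of receivers
```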
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize the reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.