66 results for Mixed integer programming
Abstract:
Background: Optimization methods allow designing changes in a system so that specific goals are attained. These techniques are fundamental for metabolic engineering. However, they are not directly applicable for investigating the evolution of metabolic adaptation to environmental changes. Although biological systems have evolved by natural selection into well-adapted systems, we can hardly expect actual metabolic processes to sit at the theoretical optimum that would result from an optimization analysis. More likely, natural systems are to be found in a feasible region compatible with global physiological requirements. Results: We first present a new method for globally optimizing nonlinear models of metabolic pathways based on the Generalized Mass Action (GMA) representation. The optimization task is posed as a nonconvex nonlinear programming (NLP) problem that is solved by an outer-approximation algorithm. This method relies on iteratively solving reduced NLP slave subproblems and mixed-integer linear programming (MILP) master problems, which provide valid upper and lower bounds, respectively, on the global solution to the original NLP. The capabilities of this method are illustrated through its application to the anaerobic fermentation pathway in Saccharomyces cerevisiae. We next introduce a method to identify the feasible parameter regions that allow a system to meet a set of physiological constraints that can be represented mathematically through algebraic equations. This technique applies the outer-approximation algorithm iteratively over a reduced search space in order to identify regions that contain feasible solutions to the problem and to discard others in which no feasible solution exists. As an example, we characterize the feasible enzyme activity changes that are compatible with an appropriate adaptive response of the yeast Saccharomyces cerevisiae to heat shock. Conclusion: Our results show the utility of the suggested approach for investigating the evolution of adaptive responses to environmental changes. The proposed method can be used in other important applications, such as the evaluation of parameter changes that are compatible with health and disease states.
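The iterative structure of an outer-approximation algorithm of this kind can be summarized with a short sketch. The loop below is a minimal illustration under assumed interfaces: solve_nlp_subproblem and solve_milp_master are hypothetical callbacks supplied by the user (they are not functions from the paper), the former returning a feasible point, its objective and linearization cuts, the latter returning a new discrete choice and a valid lower bound.

```python
# Skeleton of an outer-approximation loop (illustrative only; the two solver
# callbacks are hypothetical placeholders, not code from the paper).

def outer_approximation(solve_nlp_subproblem, solve_milp_master,
                        initial_choice, tol=1e-6, max_iter=50):
    """solve_nlp_subproblem(choice) -> (x, objective, cuts): reduced nonconvex NLP
    for a fixed discrete choice; gives a valid upper bound when feasible.
    solve_milp_master(cuts) -> (choice, objective): MILP over the accumulated
    linearizations; gives a valid lower bound on the global minimum."""
    upper, lower, incumbent, cuts = float("inf"), float("-inf"), None, []
    choice = initial_choice
    for _ in range(max_iter):
        x, obj_nlp, new_cuts = solve_nlp_subproblem(choice)   # slave subproblem
        if obj_nlp < upper:
            upper, incumbent = obj_nlp, x                     # best solution so far
        cuts.extend(new_cuts)
        choice, obj_milp = solve_milp_master(cuts)            # master problem
        lower = max(lower, obj_milp)
        if upper - lower <= tol:                              # bounds have closed
            break
    return incumbent, lower, upper
```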
Abstract:
Annualising work hours (AH) is a means of achieving flexibility in the use of human resources to face the seasonal nature of demand. In Corominas et al. (1), two MILP models are used to solve the problem of planning staff working hours over an annual horizon. The costs due to overtime and to the employment of temporary workers are minimised, and the distribution of working time over the course of the year for each worker and the distribution of working time provided by temporary workers are regularised. In the aforementioned paper, the following is assumed: (i) the holiday weeks are fixed a priori and (ii) workers from different categories who are able to perform a specific type of task have the same efficiency; moreover, the values of the binary variables (and others) in the second model are fixed to those obtained in the first model (thus, in the second model these intervene as constants and not as variables, resulting in an LP model). In the present paper, these assumptions are relaxed and a more general problem is solved. The computational experiment leads to the conclusion that MILP is a technique suited to dealing with the problem.
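As a rough, self-contained illustration of the kind of MILP involved (invented data, not the models of Corominas et al.), the sketch below plans weekly overtime hours (continuous) and the number of temporary workers hired each week (integer) so that demand is covered at minimum cost; it uses scipy.optimize.milp, available in SciPy 1.9 or later.

```python
# Toy MILP in the spirit of annualised-hours planning (illustrative data only).
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

demand   = np.array([160, 210, 250, 180])  # required hours per week (hypothetical)
regular  = 180.0    # regular in-house hours available each week
temp_hrs = 40.0     # hours contributed by one temporary worker per week
c_over   = 25.0     # cost per overtime hour
c_temp   = 900.0    # cost per temporary worker and week
o_max    = 30.0     # overtime cap per week

T = len(demand)
# Decision vector x = [overtime_1..overtime_T, temps_1..temps_T]
c = np.concatenate([np.full(T, c_over), np.full(T, c_temp)])

# Coverage: regular + overtime_t + temp_hrs * temps_t >= demand_t
A = np.hstack([np.eye(T), temp_hrs * np.eye(T)])
coverage = LinearConstraint(A, lb=demand - regular, ub=np.inf)

integrality = np.concatenate([np.zeros(T), np.ones(T)])  # temps are integer
bounds = Bounds(lb=np.zeros(2 * T),
                ub=np.concatenate([np.full(T, o_max), np.full(T, np.inf)]))

res = milp(c, constraints=coverage, integrality=integrality, bounds=bounds)
print(res.x[:T], res.x[T:], res.fun)  # overtime hours, temporary workers, total cost
```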
Abstract:
The problem of stability analysis for a class of neutral systems with mixed time-varying neutral, discrete and distributed delays and nonlinear parameter perturbations is addressed. By introducing a novel Lyapunov-Krasovskii functional and combining the descriptor model transformation, the Leibniz-Newton formula, some free-weighting matrices, and a suitable change of variables, new sufficient conditions are established for the stability of the considered system, which are neutral-delay-dependent, discrete-delay-range-dependent, and distributed-delay-dependent. The conditions are presented in terms of linear matrix inequalities (LMIs) and can be efficiently solved using convex programming techniques. Two numerical examples are given to illustrate the efficiency of the proposed method.
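The conditions in the paper are considerably more elaborate, but the general pattern, casting stability as feasibility of a set of LMIs and checking it with a convex solver, can be illustrated on the simplest possible case: the Lyapunov inequality A^T P + P A < 0, P > 0 for a delay-free linear system. The sketch below uses cvxpy with an invented test matrix and assumes an SDP-capable solver is installed.

```python
# Simplest LMI-based stability test (Lyapunov inequality), illustrative only:
# find P = P^T > 0 with A^T P + P A < 0 for a delay-free system dx/dt = A x.
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])          # hypothetical stable test matrix
n = A.shape[0]
eps = 1e-6

P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]

prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve()
print("LMIs feasible:", prob.status == cp.OPTIMAL)
```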
Abstract:
This paper estimates a translog stochastic frontier production function for a panel of 150 mixed Catalan farms over the period 1989-1993, in order to measure and explain variation in technical inefficiency scores with a one-stage approach. The model uses gross value added as the aggregate output measure. Total employment, fixed capital, current assets, specific costs and overhead costs are introduced into the model as inputs. Stochastic frontier estimates are compared with those obtained using a linear programming method with a two-stage approach. The translog stochastic frontier specification appears to be an appropriate representation of the data, technical change was rejected, and the technical inefficiency effects were statistically significant. The mean technical efficiency over the period analyzed was estimated at 64.0%. Farm inefficiency levels were found to be significantly (at the 5% level) and positively correlated with the number of economic size units.
Abstract:
We describe an equivalence of categories between the category of mixed Hodge structures and a category of vector bundles on the toric complex projective plane that satisfy a certain semistability condition. We then apply this correspondence to define an invariant that generalises the notion of R-split mixed Hodge structure, and we compute extensions in the category of mixed Hodge structures in terms of extensions of the corresponding vector bundles. We also give a relative version of this correspondence and apply it to define stratifications of the bases of variations of mixed Hodge structure.
Abstract:
We study the optimal public intervention in setting minimum training standards for specialized medical care. The skills physicians acquire through their training allow them to improve their performance as providers of care and to earn some monopoly rents. Our aim is to characterize the most efficient regulation in this field, taking into account different regulatory frameworks. We find that the situation existing in some countries, in which the amount of specialization is controlled and the costs of this specialization process are publicly financed, can be supported as the best possible intervention.
Abstract:
Currently, the response of most operational instrumentation and personal dosimeters used in radiation protection for neutron dosimetry is highly dependent on the energy of the neutron spectra to be analysed, especially in neutron fields with a significant intermediate-energy component. Consequently, interpreting the readings of these devices is complicated unless the spectral distribution of the neutron fluence at the points of interest is known beforehand. In recent years, the Radiation Physics Group of the Universitat Autònoma de Barcelona (GFR-UAB) has developed a neutron spectrometer based on a Bonner Sphere System (BSS) with a 3He proportional counter as the active detector. The main advantages of BSS neutron spectrometers are their isotropic response, the possibility of discriminating the neutron component from the gamma component in mixed fields, and their high neutron sensitivity at the dose levels analysed. With these characteristics, BSS neutron spectrometers comply with the standards of the latest ICRP recommendations and can also be used in neutron dosimetry to measure doses in the energy range from thermal up to 20 MeV, spanning nine orders of magnitude. Within the framework of the collaboration between the GFR-UAB and the Laboratorio Nazionale di Frascati – Istituto Nazionale di Fisica Nucleare (LNF-INFN), a comparative BSS spectrometry exercise was carried out with the quasi-monoenergetic 2.5 MeV and 14 MeV beams of the ENEA Fast Neutron Generator. In this exercise the neutron spectrum was determined at different distances from the accelerator target, using the FRUIT code recently developed by the LNF group. The results show good agreement between the two spectrometers and between the measured and simulated data.
Abstract:
We first recall the construction of the Chow motive modelling intersection cohomology of a proper surface X and study its fundamental properties. Using Voevodsky's category of effective geometrical motives, we then study the motive of the exceptional divisor D in a non-singular blow-up of X. If all geometric irreducible components of D are of genus zero, then Voevodsky's formalism allows us to construct certain one-extensions of Chow motives, as canonical subquotients of the motive with compact support of the smooth part of X. Specializing to Hilbert-Blumenthal surfaces, we recover a motivic interpretation of a recent construction of A. Caspar.
Abstract:
We propose a mixed finite element method for a class of nonlinear diffusion equations, based on their interpretation as gradient flows in optimal transportation metrics. We introduce an appropriate linearization of the optimal transport problem, which leads to a mixed symmetric formulation. This formulation preserves the maximum principle for the semi-discrete scheme as well as for the fully discrete scheme for a certain class of problems. In addition, solutions of the mixed formulation maintain exponential convergence in the relative entropy towards the steady state in the case of a nonlinear Fokker-Planck equation with uniformly convex potential. We demonstrate the behavior of the proposed scheme with 2D simulations of the porous medium equation and of blow-up questions in the Patlak-Keller-Segel model.
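The paper's scheme is a mixed finite element discretization; as a much plainer point of reference (explicitly not the authors' method), the sketch below integrates the one-dimensional porous medium equation u_t = (u^m)_xx with an explicit finite-difference step on a periodic grid, with a time step chosen small enough for stability.

```python
# Plain explicit finite-difference integration of the 1D porous medium equation
# u_t = (u^m)_xx; this is NOT the mixed finite element scheme of the paper,
# only a simple reference discretization on a periodic grid.
import numpy as np

m, N, L = 2, 200, 10.0
dx = L / N
dt = 1e-4                         # small step; explicit scheme needs dt ~ dx^2
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
u = np.maximum(1.0 - x**2, 0.0)   # compactly supported initial datum

for _ in range(5000):
    v = u**m
    lap = (np.roll(v, -1) - 2 * v + np.roll(v, 1)) / dx**2
    u = np.maximum(u + dt * lap, 0.0)   # clip to preserve nonnegativity

print("mass:", u.sum() * dx)      # total mass is (approximately) conserved
```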
Abstract:
This note develops a flexible methodology for splicing economic time series that avoids the extreme assumptions implicit in the procedures most commonly used in the literature. It allows the user to split the required correction to the older of the series being linked between its levels and growth rates on the basis what he knows or conjectures about the persistence of the factors that account for the discrepancy between the two series that emerges at their linking point. The time profile of the correction is derived from the assumption that the error in the older series reflects the inadequate coverage of emerging sectors or activities that grow faster than the aggregate.
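A minimal sketch of this type of splicing rule, illustrating the general idea rather than the exact procedure of the note: a fraction theta of the (log) discrepancy at the linking point is treated as a pure level error and applied uniformly to the whole back series, while the remaining 1 - theta is interpreted as an accumulated growth-rate error and phased out linearly going back in time.

```python
# Illustrative splicing of an old annual series onto a new one (invented data).
# theta of the gap at the link year is a level error (applied uniformly backwards);
# (1 - theta) is an accumulated growth error, phased out linearly back in time.
import numpy as np

old = np.array([100.0, 104.0, 109.0, 115.0, 122.0])   # old series, ends at link year
new_at_link = 130.0                                    # new series value at link year
theta = 0.4                                            # share treated as level error

gap = np.log(new_at_link) - np.log(old[-1])            # log discrepancy at link point
n = len(old)
weights = np.linspace(0.0, 1.0, n)                     # 0 at the start, 1 at the link
correction = theta * gap + (1.0 - theta) * gap * weights
spliced = np.exp(np.log(old) + correction)

print(spliced)        # spliced[-1] equals new_at_link by construction
```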
Abstract:
This paper studies endogenous mergers of complements with mixed bundling, allowing both for joint and for separate consumption. After a merger, partner firms decrease the price of the bundled system. Moreover, when markets for individual components are sufficiently important, partner firms raise the prices of stand-alone products, exploiting their monopoly power in local markets and making substitute 'mix-and-match' composite products less attractive to consumers. Even though these effects favor the profitability of mergers, merging is not always an equilibrium outcome. The reason is that outsiders respond by cutting their prices to retain their market share, and mergers can be unprofitable when competition is intense. From the welfare analysis, we observe that the number of mergers in equilibrium may be either excessive (when markets for individual components are important) or suboptimal (when markets for individual components are less important). Keywords: complements; merger; mixed bundling; separate consumption. JEL classification: L13; L41; D43.
Abstract:
This project analyses the advantages and disadvantages of hardware-oriented versus software-oriented programming through the development of two designs, a chronometer and a frequency meter, implemented in each of the two programming approaches. Since both applications require high timing precision (μs) and flexibility in control, the proposed final solution is a mixed design with two dedicated hardware modules (chronometer and frequency meter) integrated into a NIOS CPU on an FPGA. The two modules are controlled by software running on an embedded Linux system (μCLinux).
Abstract:
In this work we present a Web-based tool developed with the aim of reinforcing the teaching and learning of introductory programming courses. This tool provides support for both teaching and learning. From the teacher's perspective, the system introduces important gains with respect to the classical teaching methodology: it reinforces lecture and laboratory sessions, makes it possible to give personalized attention to the student, assesses the degree of participation of the students and, most importantly, performs a continuous assessment of the student's progress. From the student's perspective, it provides a learning framework, consisting of a help environment and a correction environment, which facilitates their personal work. With this tool, students are more motivated to do programming.
Abstract:
In this paper, we address the problem of mitigating structural vibrations caused by seismic motions through the design of a semiactive controller based on mixed H2/H∞ control theory. The vibrations are mitigated by a semiactive damper installed at the bottom of the structure. By semiactive damper we mean a device that can absorb but cannot inject energy into the system. Sufficient conditions for the design of the desired control are given in terms of linear matrix inequalities (LMIs). A controller that guarantees asymptotic stability and a mixed H2/H∞ performance is then developed. An algorithm is proposed to handle the semiactive nature of the actuator. The performance of the controller is experimentally evaluated in a real-time hybrid testing facility that consists of a physical specimen (a small-scale magnetorheological damper) and a numerical model (a large-scale three-story building).
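A common way of handling the semiactive nature of such a device, once a desired active force has been computed from a state-feedback law, is to command it only when the damper can actually realize it, i.e. when the force is dissipative. The sketch below shows a generic clipping rule of this kind (a standard heuristic, not necessarily the algorithm proposed in the paper); the gain K is an assumed, already-designed feedback matrix.

```python
# Generic "clipped" semiactive control heuristic (illustrative, not the paper's
# algorithm): a desired force from a state-feedback law is commanded only when
# the damper, which can dissipate but not inject energy, is able to realize it.
import numpy as np

def semiactive_force(K, x, rel_velocity, f_max):
    """K: state-feedback gain (assumed already designed, e.g. via mixed H2/Hinf
    synthesis); x: current state vector; rel_velocity: velocity across the damper."""
    f_desired = float(-K @ x)                   # force an active device would apply
    dissipative = f_desired * rel_velocity < 0  # damper force must oppose the motion
    if not dissipative:
        return 0.0                              # device cannot inject energy
    return float(np.clip(f_desired, -f_max, f_max))

# Example call with invented numbers:
K = np.array([12.0, 3.5])          # hypothetical gain for a 2-state model
print(semiactive_force(K, np.array([0.02, -0.15]), rel_velocity=-0.15, f_max=50.0))
```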
Abstract:
The evaluation of large projects raises well-known difficulties because, by definition, they modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that would exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based on the integration of projects into optimization models or, alternatively, on iterative procedures with information exchange between two organizational levels. The newer methodologies applied since then are based on variational inequalities, bilevel programming, and linear or nonlinear complementarity. Their foundations and different applications related to project evaluation are explored. In fact, these new tools are closely related to one another and can treat more complex cases involving, for example, the reaction of agents to policies or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.