957 results for Optimal reactive dispatch problem


Relevance:

30.00%

Publisher:

Abstract:

When a structural problem is posed, the intention is normally to obtain the best possible solution, understanding as best the solution that, while satisfying the structural, use and other requirements, has the lowest physical cost. In a first approximation the physical cost can be represented by the self-weight of the structure, which allows the search for the best solution to be posed as the search for the lightest one. From a practical point of view, obtaining good solutions, that is, solutions whose cost is only slightly higher than that of the best solution, is as important a task as obtaining absolute optima, something that is in general hardly tractable. In order to have a measure of efficiency that makes comparison between solutions possible, the following definition of structural efficiency is proposed: the ratio between the useful load to be carried and the total load to be accounted for (the sum of the useful load and the self-weight). The structural form can be considered to be composed of four concepts which, together with the material, define a structure: size, schema, proportion and thickness. Galileo (1638) proposed the existence of an insurmountable size for every structural problem, the size at which self-weight alone exhausts a structure of a given schema and proportion. This size, or structural scope, is different for each material used; the only information about the material needed to determine it is the ratio between its strength and its specific weight, a quantity we call the scope of the material. For structures whose size is very small in relation to their structural scope the above definition of efficiency is useless. In this case, structures of "null size" in which the self-weight is negligible compared with the useful load, the dimensionless quantity we call the Michell number is proposed as the measure of cost; it derives from the "quantity" introduced by A. G. M. Michell in his seminal article of 1904, developed from an 1870 lemma of J. C. Maxwell. At the end of the last century, R. Aroca combined the theories of Galileo and of Maxwell and Michell, proposing a design rule of easy application (the GA rule) that allows the scope and the efficiency of a structural form to be estimated. The present work studies the efficiency of truss-like structures in bending problems, taking the influence of size into account. On the one hand, in the case of structures of null size, near-optimal schemas are explored by means of several minimization methods, with the aim of obtaining forms whose cost (measured by their Michell number) is very close to that of the absolute optimum while achieving an important reduction in complexity. On the other hand, a method is presented for determining the structural scope of truss-like structures (taking into account the local effect of bending in their members), comparing its results with those obtained by applying the GA rule and showing the conditions under which the rule is applicable. Finally, lines of future research are identified: the measurement of complexity, the accounting of the cost of foundations, and the extension of the minimization methods when self-weight is taken into account. ABSTRACT When a structural problem is posed, the intention is usually to obtain the best solution, understanding this as the solution that, while fulfilling the different requirements (structural, use, etc.), has the lowest physical cost.
In a first approximation, the physical cost can be represented by the self-weight of the structure; this allows the search for the best solution to be treated as the search for the one with the lowest self-weight. From a practical point of view, obtaining good solutions, i.e. solutions with a somewhat higher but comparable physical cost to the optimum, can be as important as finding the optimal ones, because the latter is, in general, not a tractable task. In order to have a measure of efficiency that allows the comparison of different solutions, a definition of structural efficiency is proposed: the ratio between the useful load and the total load, i.e. the useful load plus the self-weight resulting from the structural sizing. The structural form can be considered to be formed by four concepts which, together with its material, completely define a particular structure. These are: size, schema, slenderness or proportion, and thickness. Galileo (1638) postulated the existence of an insurmountable size for structural problems, the size at which a structure with a given schema and a given slenderness is only able to resist its self-weight. Such a size, or structural scope, is different for every material used; the only information about the material needed to determine it is the ratio between its allowable stress and its specific weight: a characteristic length that we name the material structural scope. The definition of efficiency given above is not useful for structures that are small in comparison with the insurmountable size. In this case, structures of null size, in which the self-weight is negligible in comparison with the useful load, we use as the measure of cost the dimensionless magnitude that we call Michell's number, an amount derived from the "quantity" introduced by A. G. M. Michell in his seminal article published in 1904, developed out of a result from J. C. Maxwell of 1870. R. Aroca joined the theories of Galileo and those of Maxwell and Michell, obtaining a design rule of direct application (which we denominate the "GA rule") that allows the estimation of the structural scope and the efficiency of a structural schema. In this work the efficiency of truss-like structures resolving bending problems is studied, taking into consideration the influence of size. On the one hand, in the case of structures of null size, near-optimal layouts are explored using several minimization methods, in order to obtain forms with a cost near the absolute optimum but with a significant reduction in complexity. On the other hand, a method for determining the insurmountable size of truss-like structures is presented, taking into account local bending effects. The results are checked against the GA rule, showing the conditions in which it is applicable. Finally, some directions for future research are proposed: the measure of complexity, the cost of foundations, and the extension of the optimization methods to take the self-weight into account.
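In formula form, the two cost measures defined above can be restated as follows (the symbols P for the useful load, W for the self-weight, f for the allowable stress and gamma for the specific weight are introduced here for illustration and do not come from the abstract):

\[
\eta = \frac{P}{P + W},
\qquad
\mathcal{A} = \frac{f}{\gamma},
\]

where eta is the structural efficiency and the characteristic length A is the scope of the material. For structures much smaller than their structural scope, W is negligible against P and eta tends to 1 for any design, which is why the dimensionless Michell number is used as the cost measure in that regime.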

Relevance:

30.00%

Publisher:

Abstract:

The optimal design of a vertical cantilever beam is presented in this paper. The beam is assumed to be embedded in an elastic Winkler soil and subjected to several loads: a point force at the tip section, its self-weight and a uniformly distributed load along its length. The optimal design problem is to find the beam of a given length and minimum volume such that the resulting compressive stresses are admissible. This problem is analysed according to linear elasticity theory and within different alternative structural models: column, Navier-Bernoulli beam-column and Timoshenko beam-column (i.e. with shear strain), under conservative loads, typically constant-direction loads. The results obtained in each case are compared in order to evaluate the sensitivity of the numerical results to the choice of model. The optimal beam design is described by the distribution of section properties (area, second moment of area, shear area, etc.) along the beam span and by the corresponding total beam volume. Other situations, some of them very interesting from a theoretical point of view, with follower loads (the Beck and Leipholz problems), are also discussed, leaving the numerical details and results for future work.
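As an illustration of the simplest of these models, the following is a minimal sketch of minimum-volume sizing for the pure column case (axial compression only), with the cross-section of each segment chosen so that the compressive stress just reaches the admissible value. The loads, admissible stress and unit weight are hypothetical numbers, and the Winkler soil and beam-column effects studied in the paper are not modelled:

# Minimal sketch: minimum-volume sizing of an axially compressed cantilever
# (column model only). Segments are sized from the tip downwards so that the
# compressive stress in every segment equals the admissible stress.
# All numbers are illustrative assumptions, not values from the paper.

L = 10.0          # beam length [m]
n = 100           # number of segments
dx = L / n
P = 50e3          # point force at the tip section [N]
q = 2e3           # uniformly distributed load along the length [N/m]
sigma_adm = 10e6  # admissible compressive stress [Pa]
gamma = 24e3      # specific weight of the material [N/m^3]

areas = []
N = P             # axial force carried just above the current segment
for _ in range(n):
    # smallest area A with (N + q*dx + gamma*A*dx) / A <= sigma_adm
    A = (N + q * dx) / (sigma_adm - gamma * dx)
    areas.append(A)
    N += q * dx + gamma * A * dx   # axial force at the bottom of the segment

volume = sum(A * dx for A in areas)
print(f"total volume = {volume:.4f} m^3, base area = {areas[-1]:.6f} m^2")

Sizing from the tip downwards works here because, in the column model, the axial force in each segment depends only on the loads and self-weight above it.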

Relevance:

30.00%

Publisher:

Abstract:

Presentation at the 11th European Symposium of the Working Party on Computer Aided Process Engineering, Kolding, Denmark, May 27-30, 2001.

Relevance:

30.00%

Publisher:

Abstract:

In this work, we present a systematic method for the optimal development of bioprocesses that relies on the combined use of simulation packages and optimization tools. One of the main advantages of our method is that it allows for the simultaneous optimization of all the individual components of a bioprocess, including the main upstream and downstream units. The design task is mathematically formulated as a mixed-integer dynamic optimization (MIDO) problem, which is solved by a decomposition method that iterates between primal and master sub-problems. The primal dynamic optimization problem optimizes the operating conditions, bioreactor kinetics and equipment sizes, whereas the master level entails the solution of a tailored mixed-integer linear programming (MILP) model that decides on the values of the integer variables (i.e., the number of equipment units in parallel and topological decisions). The dynamic optimization primal sub-problems are solved via a sequential approach that integrates the process simulator SuperPro Designer® with an external NLP solver implemented in Matlab®. The capabilities of the proposed methodology are illustrated through its application to a typical fermentation process and to the production of the amino acid L-lysine.
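A schematic sketch of such a primal-master iteration on a toy problem: one integer decision (a hypothetical number of parallel units n) and one continuous decision (a hypothetical operating variable x). The cost model, the bounds and the enumerative "master" with integer cuts are illustrative stand-ins; the actual method couples SuperPro Designer with an external NLP solver and a tailored MILP master:

# Schematic of a primal-master decomposition on a toy bioprocess model.
# solve_primal(n): for a fixed integer decision n, optimize the continuous
#                  operating variable x (stand-in for the dynamic optimization).
# master         : here a simple integer-cut enumeration that proposes the next
#                  untested n (stand-in for the tailored MILP of the paper).
from scipy.optimize import minimize_scalar

def total_cost(n, x):
    """Illustrative annualized cost: capital grows with n, operating cost
    falls with n and depends nonlinearly on the operating variable x."""
    capital = 120.0 * n
    operating = 400.0 / n + 5.0 * (x - 2.0) ** 2 * n
    return capital + operating

def solve_primal(n):
    """Optimize the continuous variable for fixed n (gives an upper bound)."""
    res = minimize_scalar(lambda x: total_cost(n, x), bounds=(0.5, 5.0),
                          method="bounded")
    return res.fun, res.x

candidates = [1, 2, 3, 4, 5]      # feasible numbers of parallel units
excluded = set()                  # integer cuts accumulated so far
best = (float("inf"), None, None)

while True:
    remaining = [n for n in candidates if n not in excluded]
    if not remaining:             # master infeasible: every integer cut off
        break
    n = remaining[0]              # master proposal (toy stand-in)
    cost, x_opt = solve_primal(n) # primal sub-problem
    excluded.add(n)               # add an integer cut on this n
    if cost < best[0]:
        best = (cost, n, x_opt)

print(f"best design: n = {best[1]}, x = {best[2]:.3f}, cost = {best[0]:.1f}")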

Relevance:

30.00%

Publisher:

Abstract:

In this work we study Forward Osmosis (FO) as an emerging desalination technology and its capability to replace, totally or partially, Reverse Osmosis (RO) in order to reduce the large amount of energy required by current desalination plants. For this purpose, we propose a superstructure that includes both membrane-based desalination technologies and allows the selection of either one of the technologies or a combination of the two in the search for the optimal configuration of the network. The optimization problem is solved for a seawater desalination plant with a given fresh-water production. The results obtained show that the optimal solution combines both desalination technologies to reduce not only the energy consumption but also the total cost of the desalination process in comparison with the same plant operating with RO only.

Relevance:

30.00%

Publisher:

Abstract:

The rapid increase in the number of immigrants from outside the EU coming to Germany has become the paramount political issue. According to new estimates, the number of individuals expected to arrive in Germany in 2015 and apply for asylum there is 800,000, which is nearly twice as many as estimated in earlier forecasts. Various administrative, financial and social problems related to the influx of migrants are becoming increasingly apparent. The problem of 'refugees' (in public debate, the terms 'immigrants', 'refugees', 'illegal immigrants' and 'economic immigrants' have not been clearly defined and have often been used interchangeably) has been building up for over a year. Despite this, it was disregarded by Angela Merkel's government, which was preoccupied with debates on how to rescue Greece. It was only daily reports of refugee centres being set on fire that convinced Chancellor Merkel to speak out and to make the immigration problem a priority issue (Chefsache). Neither the ruling coalition nor the opposition parties have a consistent idea of how Germany should react to the growing number of refugees; in this matter, divisions run across parties. Various solutions have been proposed, from liberalisation of laws on the right to stay in Germany to combating illegal immigration more effectively, which would be possible if asylum-granting procedures were accelerated. The proposed solutions have not been properly thought through; instead, they are reactive measures inspired by the results of opinion polls, which is why their assumptions are often contradictory. The situation is similar regarding the actions proposed by Chancellor Merkel, which involve faster procedures to expel individuals with no right to stay in Germany and a plan to convince other EU states to accept 'refugees'. None of these ideas is new; they were already present in the German internal debate.

Relevance:

30.00%

Publisher:

Abstract:

Discoloration and mineralization of Reactive Red HE-3B were studied using a laponite clay-based Fe nanocomposite (Fe-Lap-RD) as a heterogeneous catalyst in the presence of H2O2 and UV light. Our experimental results clearly indicate that Fe-Lap-RD mainly consists of Fe2O3 (maghemite) and Fe2Si4O10(OH)2 (iron silicate hydroxide), which have tetragonal and monoclinic structures respectively, and has a high specific surface area (472 m²/g) as well as a high total pore volume (0.547 cm³/g). It was observed that the discoloration of HE-3B follows much faster kinetics than the mineralization of HE-3B. It was also found that the initial HE-3B concentration, the H2O2 concentration, the UV light wavelength and power, and the Fe-Lap-RD catalyst loading are the four main factors that significantly influence the mineralization of HE-3B. Under optimal conditions, complete discoloration of 100 mg/L HE-3B can be achieved in 30 min and the total organic carbon removal ratio can reach 76% in 120 min, illustrating that Fe-Lap-RD has a high photocatalytic activity in the photo-assisted discoloration and mineralization of HE-3B in the presence of UV light (254 nm) and H2O2. (C) 2003 Elsevier Science Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Background. The present paper describes a component of a large population cost-effectiveness study that aimed to identify the averted burden and economic efficiency of current and optimal treatment for the major mental disorders. This paper reports the findings for the anxiety disorders (panic disorder/agoraphobia, social phobia, generalized anxiety disorder, post-traumatic stress disorder and obsessive-compulsive disorder). Method. Outcome was calculated as averted 'years lived with disability' (YLD), a population summary measure of disability burden. Costs were the direct health care costs in 1997-8 Australian dollars. The cost per YLD averted (efficiency) was calculated for those already in contact with the health system for a mental health problem (current care) and for a hypothetical optimal care package of evidence-based treatment for this same group. Data sources included the Australian National Survey of Mental Health and Well-being and published treatment effects and unit costs. Results. Current coverage was around 40% for most disorders, with the exception of social phobia at 21%. Receipt of interventions consistent with evidence-based care ranged from 32% of those in contact with services for social phobia to 64% for post-traumatic stress disorder. The cost of this care was estimated at $400 million, resulting in a cost per YLD averted ranging from $7,761 for generalized anxiety disorder to $34,389 for panic/agoraphobia. Under optimal care, costs remained similar but health gains were increased substantially, reducing the cost per YLD to less than $20,000 for all disorders. Conclusions. Evidence-based care for anxiety disorders would produce greater population health gain at a similar cost to that of current care, resulting in a substantial increase in the cost-effectiveness of treatment.
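The efficiency measure reported in the Results is simply the ratio of the cost of a care package to the disability burden it averts; restated as a formula (a restatement of the definition above, not an expression quoted from the paper):

\[
\text{cost per YLD averted} = \frac{\text{direct health care cost of the care package}}{\text{YLD averted by that package}}.
\]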

Relevance:

30.00%

Publisher:

Abstract:

Power systems rely greatly on ancillary services to maintain operational security. As one of the most important ancillary services, spinning reserve must be provided effectively in the deregulated market environment. This paper focuses on the design of an integrated market for both electricity and the spinning reserve service, with particular emphasis on the coordinated dispatch of bulk power and spinning reserve. A new market dispatching mechanism has been developed to minimize the ISO's total payment while ensuring system security. Genetic algorithms are used to find global optimal solutions for this dispatching problem. Case studies and corresponding analyses have been carried out to demonstrate and discuss the efficiency and usefulness of the proposed market.
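A minimal sketch of a genetic algorithm applied to a toy coordinated energy/spinning-reserve dispatch. The three units, their offer prices, the demand and reserve requirement, and the penalty-based constraint handling are all illustrative assumptions, not the market model or test cases of the paper:

# Toy genetic algorithm for coordinated energy / spinning-reserve dispatch.
# Three units offer energy and reserve at given prices (illustrative numbers);
# the objective is the ISO's total payment, with penalties enforcing the
# demand, reserve-requirement and capacity constraints.
import random

random.seed(0)

CAP     = [200.0, 150.0, 100.0]  # unit capacities [MW]
PRICE_E = [20.0, 25.0, 32.0]     # energy offer prices [$/MWh]
PRICE_R = [5.0, 4.0, 3.0]        # spinning-reserve offer prices [$/MW]
DEMAND, RESERVE = 300.0, 60.0    # system demand and reserve requirement
PENALTY = 1e4

def payment(ind):
    e, r = ind[:3], ind[3:]
    cost = sum(pe * x for pe, x in zip(PRICE_E, e)) + \
           sum(pr * x for pr, x in zip(PRICE_R, r))
    # constraint violations: energy balance, reserve requirement, capacities
    viol = abs(sum(e) - DEMAND) + max(0.0, RESERVE - sum(r))
    viol += sum(max(0.0, ei + ri - c) for ei, ri, c in zip(e, r, CAP))
    return cost + PENALTY * viol

def random_individual():
    return [random.uniform(0, c) for c in CAP] + \
           [random.uniform(0, c / 2) for c in CAP]

def crossover(a, b):
    w = random.random()
    return [w * x + (1 - w) * y for x, y in zip(a, b)]

def mutate(ind):
    return [max(0.0, x + random.gauss(0, 5.0)) if random.random() < 0.2 else x
            for x in ind]

pop = [random_individual() for _ in range(80)]
for gen in range(300):
    pop.sort(key=payment)
    next_pop = pop[:4]                         # elitism
    while len(next_pop) < len(pop):
        a, b = random.sample(pop[:40], 2)      # truncation selection
        next_pop.append(mutate(crossover(a, b)))
    pop = next_pop

best = min(pop, key=payment)
print("energy dispatch :", [round(x, 1) for x in best[:3]])
print("reserve dispatch:", [round(x, 1) for x in best[3:]])
print("ISO payment     :", round(payment(best), 1))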

Relevance:

30.00%

Publisher:

Abstract:

The buffer allocation problem (BAP) is a well-known difficult problem in the design of production lines. We present a stochastic algorithm for solving the BAP, based on the cross-entropy method, a new paradigm for stochastic optimization. The algorithm involves the following iterative steps: (a) the generation of buffer allocations according to a certain random mechanism, followed by (b) the modification of this mechanism on the basis of cross-entropy minimization. Through various numerical experiments we demonstrate the efficiency of the proposed algorithm and show that the method can quickly generate (near-)optimal buffer allocations for fairly large production lines.
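A minimal sketch of the two iterative steps of the cross-entropy method on a toy buffer-allocation instance. The "throughput" function is a hypothetical stand-in for the production-line evaluation, and the Poisson sampling mechanism, elite fraction and smoothing factor are illustrative choices rather than the algorithm settings of the paper:

# Cross-entropy sketch for a toy buffer allocation problem: K buffer slots,
# each size drawn from an independent Poisson whose mean is updated from the
# elite samples. The "throughput" below is a hypothetical stand-in for a
# production-line simulation, not a model from the paper.
import numpy as np

rng = np.random.default_rng(1)
K = 5                       # number of buffer locations
budget = 20                 # soft limit on total buffer spaces

def throughput(b):
    """Toy objective: diminishing returns per buffer, penalty over budget."""
    gain = np.sum(1.0 - np.exp(-0.5 * b))
    penalty = 0.05 * max(0, b.sum() - budget)
    return gain - penalty

means = np.full(K, 4.0)     # parameters of the random sampling mechanism
n_samples, n_elite, smoothing = 200, 20, 0.7

for it in range(40):
    # (a) generate buffer allocations from the current random mechanism
    samples = rng.poisson(means, size=(n_samples, K))
    scores = np.array([throughput(b) for b in samples])
    # (b) modify the mechanism by cross-entropy minimization: for Poisson
    #     proposals this reduces to the mean of the elite samples
    elite = samples[np.argsort(scores)[-n_elite:]]
    means = smoothing * elite.mean(axis=0) + (1 - smoothing) * means

best = np.round(means).astype(int)
print("near-optimal allocation:", best, "throughput:", round(throughput(best), 3))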

Relevance:

30.00%

Publisher:

Abstract:

For quantum systems with linear dynamics in phase space much of classical feedback control theory applies. However, there are some questions that are sensible only for the quantum case: given a fixed interaction between the system and the environment, what is the optimal measurement on the environment for a particular control problem? We show that for a broad class of optimal (state-based) control problems (the stationary linear-quadratic-Gaussian class), this question is a semidefinite program. Moreover, the answer also applies to Markovian (current-based) feedback.
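For readers unfamiliar with the problem class, the following is a generic example of posing and solving a small semidefinite program with CVXPY. The matrices and constraints are arbitrary illustrations and are unrelated to the specific program derived in the paper:

# Generic semidefinite program: minimize trace(C X) over PSD matrices X with
# trace(X) = 1 (this recovers the smallest eigenvalue of C). Data are arbitrary.
import cvxpy as cp
import numpy as np

np.random.seed(0)
n = 4
A = np.random.randn(n, n)
C = A @ A.T + np.eye(n)                    # a symmetric positive definite cost matrix

X = cp.Variable((n, n), PSD=True)          # matrix variable constrained to be PSD
constraints = [cp.trace(X) == 1]           # normalization constraint
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
prob.solve()

print("optimal value:", round(prob.value, 4))
print("optimal X diagonal:", np.round(np.diag(X.value), 4))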

Relevance:

30.00%

Publisher:

Abstract:

In the Majoritarian Parliamentary System, the government has a constitutional right to call an early election. This right provides the government with a means of control to achieve its objective of remaining in power for as long as possible. We model the early-election problem mathematically, using opinion-poll data as a stochastic process to proxy the government's probability of re-election. These data measure the difference in popularity between the government and the opposition. We fit a mean-reverting stochastic differential equation to describe the behaviour of the process and consider the possibility for the government to use other control tools, termed 'boosts', to induce shocks to the opinion polls by making timely policy announcements or economic actions. These actions improve the government's popularity and have some impact upon the early-election exercise boundary. © Austral. Mathematical Soc. 2005.
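A minimal simulation sketch of the kind of process described: a mean-reverting (Ornstein-Uhlenbeck) path for the government-opposition popularity gap, with occasional upward jumps standing in for policy "boosts". All parameter values and the boost schedule are illustrative assumptions, not estimates from the paper:

# Euler-Maruyama simulation of a mean-reverting popularity gap with "boosts".
# Parameters are illustrative, not fitted values from the paper.
import numpy as np

rng = np.random.default_rng(42)
T, dt = 4.0, 1.0 / 52.0            # four-year term, weekly steps
n = int(T / dt)
kappa, mu, sigma = 1.5, 0.0, 0.08  # mean-reversion rate, long-run gap, volatility
boost_times = {int(1.0 / dt), int(2.5 / dt)}   # weeks when boosts are applied
boost_size = 0.05                  # jump in the popularity gap per boost

x = np.empty(n + 1)
x[0] = -0.02                       # initial gap (government slightly behind)
for k in range(n):
    drift = kappa * (mu - x[k]) * dt
    shock = sigma * np.sqrt(dt) * rng.standard_normal()
    x[k + 1] = x[k] + drift + shock
    if k in boost_times:
        x[k + 1] += boost_size     # timely announcement shifts the polls

print("final popularity gap:", round(x[-1], 4))
print("fraction of term with government ahead:", round(float(np.mean(x > 0)), 3))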

Relevance:

30.00%

Publisher:

Abstract:

Process optimisation and optimal control of batch and continuous drum granulation processes are studied in this paper. The main focus of the current research has been: (i) construction of optimisation and control relevant, population balance models through the incorporation of moisture content, drum rotation rate and bed depth into the coalescence kernels; (ii) investigation of optimal operational conditions using constrained optimisation techniques; (iii) development of optimal control algorithms based on discretized population balance equations; and (iv) comprehensive simulation studies on optimal control of both batch and continuous granulation processes. The objective of steady state optimisation is to minimise the recycle rate with minimum cost for continuous processes. It has been identified that the drum rotation-rate, bed depth (material charge), and moisture content of solids are practical decision (design) parameters for system optimisation. The objective for the optimal control of batch granulation processes is to maximize the mass of product-sized particles with minimum time and binder consumption. The objective for the optimal control of the continuous process is to drive the process from one steady state to another in a minimum time with minimum binder consumption, which is also known as the state-driving problem. It has been known for some time that the binder spray-rate is the most effective control (manipulative) variable. Although other possible manipulative variables, such as feed flow-rate and additional powder flow-rate have been investigated in the complete research project, only the single input problem with the binder spray rate as the manipulative variable is addressed in the paper to demonstrate the methodology. It can be shown from simulation results that the proposed models are suitable for control and optimisation studies, and the optimisation algorithms connected with either steady state or dynamic models are successful for the determination of optimal operational conditions and dynamic trajectories with good convergence properties. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

We prove upper and lower bounds relating the quantum gate complexity of a unitary operation, U, to the optimal control cost associated to the synthesis of U. These bounds apply for any optimal control problem, and can be used to show that the quantum gate complexity is essentially equivalent to the optimal control cost for a wide range of problems, including time-optimal control and finding minimal distances on certain Riemannian, sub-Riemannian, and Finslerian manifolds. These results generalize the results of [Nielsen, Dowling, Gu, and Doherty, Science 311, 1133 (2006)], which showed that the gate complexity can be related to distances on a Riemannian manifold.

Relevance:

30.00%

Publisher:

Abstract:

An integrated chemical-biological degradation process, combining advanced oxidation by UV/H2O2 followed by aerobic biodegradation, was used to degrade C.I. Reactive Azo Red 195A, a dye commonly used in the textile industry in Australia. An experimental design based on the response surface method was applied to evaluate the interactive effects of the influencing factors (UV irradiation time, initial hydrogen peroxide dosage and the recirculation ratio of the system) on decolourisation efficiency and to optimise the operating conditions of the treatment process. The effects were determined by measuring the dye concentration and the soluble chemical oxygen demand (S-COD). The results showed that dye and S-COD removal were affected by all factors, both individually and interactively. Maximal colour degradation performance was predicted, and experimentally validated, with no recirculation, 30 min UV irradiation and 500 mg H2O2/L. The model predictions for colour removal, based on a three-factor/five-level Box-Wilson central composite design and response surface method analysis, were found to be very close to additional experimental results obtained under near-optimal conditions. This demonstrates the benefit of this approach in achieving good predictions while minimising the number of experiments required. (c) 2006 Elsevier B.V. All rights reserved.
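A minimal sketch of how a rotatable three-factor (five-level) Box-Wilson central composite design can be generated in coded units and a quadratic response surface fitted to it. The number of centre runs, the toy response function and the added noise are placeholders for the measured removal efficiencies; they are not data from this study:

# Sketch: generate a rotatable three-factor Box-Wilson central composite
# design in coded units and fit a quadratic response surface to it.
# The response values come from a toy function used only as a stand-in for
# the measured dye / S-COD removal; they are not data from the study.
import itertools
import numpy as np

alpha = (2 ** 3) ** 0.25                 # rotatable axial distance, about 1.682
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.array([[s * alpha if j == i else 0.0 for j in range(3)]
                  for i in range(3) for s in (-1, 1)])
center = np.zeros((6, 3))                # replicated centre runs (assumed: 6)
design = np.vstack([factorial, axial, center])   # 8 + 6 + 6 = 20 runs
print("number of runs:", len(design))

def toy_response(x):
    """Stand-in response with a maximum inside the experimental region."""
    return 80 - 5 * x[0] ** 2 - 3 * x[1] ** 2 - 4 * x[2] ** 2 + 2 * x[0] * x[1]

rng = np.random.default_rng(0)
y = np.array([toy_response(x) for x in design]) + rng.normal(0, 0.5, len(design))

# Full quadratic model matrix: intercept, linear, interaction and square terms.
def quad_terms(x):
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1 * x2, x1 * x3, x2 * x3, x1 ** 2, x2 ** 2, x3 ** 2]

X = np.array([quad_terms(x) for x in design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted quadratic coefficients:", np.round(coef, 2))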