970 results for optimal solution


Relevance: 30.00%

Abstract:

Energy efficiency plays an important role in reducing CO2 emissions, combating climate change, and improving the competitiveness of the economy. The problem presented here concerns stand-alone diesel gen-sets and their high specific fuel consumption when operating at low loads. The variable-speed gen-set concept is explained as an energy-saving solution that improves the efficiency of this system. This paper details how an optimum fuel-consumption trajectory is obtained from an experimentally measured Diesel engine power map.
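
As an illustration of the kind of map-based optimisation described above, here is a minimal sketch (our construction, not the authors' code): given a measured specific-fuel-consumption map over an engine speed/power grid, pick for each demanded power the speed that minimises fuel use. All grid values below are placeholders.

    # Minimal sketch: for each power level, find the engine speed with the
    # lowest specific fuel consumption (SFC) in a measured map.
    import numpy as np

    speeds = np.linspace(1000.0, 3000.0, 21)          # rpm grid (assumed)
    powers = np.linspace(5.0, 50.0, 10)               # kW grid (assumed)
    # sfc_map[i, j]: measured SFC at speeds[i] and powers[j]; random stand-in here
    sfc_map = np.random.uniform(200.0, 400.0, (21, 10))

    def optimal_trajectory(sfc_map, speeds):
        """For each power level, return the speed with the lowest SFC."""
        best = sfc_map.argmin(axis=0)                 # best speed index per power
        return speeds[best]

    trajectory = optimal_trajectory(sfc_map, speeds)
    for p, s in zip(powers, trajectory):
        print(f"{p:5.1f} kW -> run at {s:6.0f} rpm")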

Relevance: 30.00%

Abstract:

Nowadays, many real-time operating systems discretize time using a system time unit. To take this behavior into account, real-time scheduling algorithms must adopt a discrete-time model in which both the timing requirements of tasks and their time allocations are integer multiples of the system time unit. That is, tasks cannot be executed for less than one time unit, which implies that they always have to complete a minimum amount of work before they can be preempted. Assuming such a discrete-time model, the authors of Zhu et al. (Proceedings of the 24th IEEE International Real-Time Systems Symposium (RTSS 2003), 2003; J Parallel Distrib Comput 71(10):1411–1425, 2011) proposed an efficient "boundary fair" algorithm (named BF) and proved its optimality for the scheduling of periodic tasks while achieving full system utilization. However, BF cannot handle sporadic tasks due to their inherently irregular and unpredictable job release patterns. In this paper, we propose an optimal boundary-fair scheduling algorithm for sporadic tasks (named BF²), which follows the same principle as BF by making scheduling decisions only at job arrival times and (expected) task deadlines. This new algorithm was implemented in Linux, and we show through experiments conducted on a multicore machine that BF² outperforms the state-of-the-art discrete-time optimal scheduler (PD²), thanks to much lower scheduling overhead. Furthermore, these experimental results show that BF² is barely dependent on the length of the system time unit, while PD², the only other existing solution for the scheduling of sporadic tasks in discrete-time systems, sees its number of preemptions and migrations and the time spent making scheduling decisions increase linearly as the time resolution of the system improves.
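
For intuition, the boundary-fair idea can be sketched as follows (a simplified illustration, not the published BF or BF² algorithm): between two consecutive boundaries, each task first receives the integer part of its proportional share, and the leftover time units on the m cores are granted to the tasks with the largest fractional remainders.

    # Illustrative boundary-fair allocation between two boundaries of length L
    # time units on m cores; utilizations are the tasks' utilization factors.
    import math

    def bf_allocation(utilizations, L, m):
        mandatory = [math.floor(u * L) for u in utilizations]
        remainders = [u * L - q for u, q in zip(utilizations, mandatory)]
        spare = m * L - sum(mandatory)                # optional units still free
        order = sorted(range(len(utilizations)), key=lambda i: -remainders[i])
        for i in order[:spare]:
            mandatory[i] += 1                         # grant one optional unit
        return mandatory

    # Three tasks with total utilization 2.0 on m=2 cores, boundary interval L=5:
    print(bf_allocation([0.7, 0.8, 0.5], L=5, m=2))   # -> [4, 4, 2]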

Relevance: 30.00%

Abstract:

An intensive use of dispersed energy resources is expected in future power systems, including distributed generation, especially from renewable sources, and electric vehicles. System operation methods and tools must be adapted to the increased complexity, especially for the optimal resource scheduling problem. Metaheuristics are therefore required to obtain good solutions in a reasonable amount of time. This paper proposes two new heuristics, called naive electric vehicle charge-and-discharge allocation and generation tournament based on cost, developed to obtain an initial solution for the energy resource scheduling methodology based on simulated annealing previously developed by the authors. The case study considers two scenarios, with 1000 and 2000 electric vehicles connected to a distribution network. The proposed heuristics are compared with a deterministic approach, presenting a very small error in the objective function with a low execution time for the scenario with 2000 vehicles.
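
As a hedged illustration of what a "naive" charge allocation heuristic can look like (the rule and names below are ours; the abstract does not spell out the heuristics at this level), a greedy scheme charges each vehicle in the cheapest periods that still meet its energy requirement:

    # Greedy sketch: fill the cheapest hours first until the EV's requirement
    # is met; 1-hour periods and a fixed maximum charge rate are assumed.
    def naive_ev_allocation(required_kwh, prices, max_rate_kw):
        schedule = [0.0] * len(prices)
        for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
            if required_kwh <= 0:
                break
            charge = min(max_rate_kw, required_kwh)
            schedule[hour] = charge
            required_kwh -= charge
        return schedule

    # 24 hourly prices; an EV needs 20 kWh at up to 7 kW:
    prices = [0.10] * 6 + [0.25] * 12 + [0.15] * 6
    print(naive_ev_allocation(20.0, prices, 7.0))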

Relevance: 30.00%

Abstract:

The shifted Legendre orthogonal polynomials are used for the numerical solution of a new formulation of the multi-dimensional fractional optimal control problem (M-DFOCP) with a quadratic performance index. The fractional derivatives are described in the Caputo sense. The Lagrange multiplier method for the constrained extremum and the operational matrix of fractional integrals are used together with the properties of the shifted Legendre orthonormal polynomials. The method reduces the M-DFOCP to a simpler problem that consists of solving a system of algebraic equations. To confirm the efficiency and accuracy of the proposed scheme, some test problems are implemented and their approximate solutions reported.
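
For reference, the standard definitions behind such schemes (our summary in generic notation, not formulas quoted from the paper) are the Caputo derivative named in the abstract and the Riemann-Liouville fractional integral whose operational matrix is used:

    {}^{C}D^{\alpha} f(t) = \frac{1}{\Gamma(n-\alpha)} \int_0^t \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha-n+1}} \, d\tau, \qquad n-1 < \alpha \le n,

    I^{\alpha} f(t) = \frac{1}{\Gamma(\alpha)} \int_0^t (t-\tau)^{\alpha-1} f(\tau) \, d\tau.

In operational-matrix methods, applying I^\alpha to the vector \Phi(t) of shifted Legendre basis polynomials is approximated by a constant matrix, I^{\alpha}\Phi(t) \approx P^{(\alpha)}\Phi(t), which is what reduces the M-DFOCP to a system of algebraic equations.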

Relevance: 30.00%

Abstract:

Project management involves one-time endeavors that demand getting it right the first time. On the other hand, project scheduling, one of the most modeled stages of the project management process, still faces a wide gap between theory and practice. Demanding computational models, and their consequent call for simplification, divert the implementation of such models in project management tools from the actual day-to-day project management process. Special focus is placed on the robustness of the generated project schedules in the face of omnipresent uncertainty. An "easy" way out is to add, more or less cleverly calculated, time buffers, which always increase project duration and, correspondingly, cost. A better approach to dealing with uncertainty seems to be to explore the slack that may be present in a given project schedule, a fortiori when a non-optimal schedule is used. Combining this approach with recent advances in modeling resource allocation and scheduling techniques that cope with increasing resource flexibility, as expressed in "Flexible Resource Constraint Project Scheduling Problem" (FRCPSP) formulations, should be a promising line of research towards more adequate project management tools. In practice, this approach has frequently been used by project managers in an ad hoc way.
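
The slack referred to above is the classical total float of a schedule; a minimal sketch of its computation by a standard CPM forward/backward pass follows (the task network is made up for illustration):

    # Total float (slack) per task via a CPM forward/backward pass; tasks are
    # assumed to be listed in topological order in the dictionaries below.
    def total_float(durations, preds):
        order = list(durations)
        es = {t: 0 for t in order}                    # earliest start
        for t in order:
            for p in preds[t]:
                es[t] = max(es[t], es[p] + durations[p])
        finish = max(es[t] + durations[t] for t in order)
        ls = {t: finish - durations[t] for t in order}  # latest start
        for t in reversed(order):
            succs = [s for s in order if t in preds[s]]
            if succs:
                ls[t] = min(ls[s] for s in succs) - durations[t]
        return {t: ls[t] - es[t] for t in order}

    durations = {"A": 3, "B": 2, "C": 4, "D": 1}
    preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
    print(total_float(durations, preds))   # B has slack 2; A, C, D are critical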

Relevance: 30.00%

Abstract:

We study the existence theory for parabolic variational inequalities in weighted L² spaces with respect to excessive measures associated with a transition semigroup. We characterize the value function of optimal stopping problems for finite- and infinite-dimensional diffusions as a generalized solution of such a variational inequality. The weighted L² setting allows us to cover some singular cases, such as optimal stopping for stochastic equations with degenerate diffusion coefficient. As an application of the theory, we consider the pricing of American-style contingent claims. Among others, we treat the cases of assets with stochastic volatility and with path-dependent payoffs.
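
In generic notation (the standard complementarity form, not quoted from the paper), the variational inequality characterizing the value function v of an optimal stopping problem with payoff \psi and diffusion generator L reads

    v \ge \psi, \qquad \partial_t v + Lv \le 0, \qquad (\partial_t v + Lv)(v - \psi) = 0,

with v = \psi on the stopping (exercise) region. For an American put, for instance, \psi(x) = (K - x)^+; sign conventions vary with whether one solves forward or backward in time.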

Relevance: 30.00%

Abstract:

Pricing American options is an interesting research topic since there is no analytical solution for valuing these derivatives. Different numerical methods have been proposed in the literature, with some, if not all, either limited to a specific payoff or not applicable to multidimensional cases. The application of Monte Carlo methods to pricing American options is a relatively new area that started with Longstaff and Schwartz (2001). Since then, a few variations of that methodology have been proposed. The general conclusion is that Monte Carlo estimators tend to underestimate the true option price. The present paper follows Glasserman and Yu (2004b) and proposes a novel Monte Carlo approach, based on designing "optimal martingales", to determine stopping times. We show that our martingale approach can also be used to compute the dual as described in Rogers (2002).
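
For context, here is a minimal sketch of the Longstaff-Schwartz regression baseline that such methods build on (this is the standard 2001 estimator, not the authors' optimal-martingale construction; all parameters are illustrative):

    # Longstaff-Schwartz American put: simulate GBM paths, then step backwards,
    # regressing continuation values on the spot price for in-the-money paths.
    import numpy as np

    def ls_american_put(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                        steps=50, paths=20000, seed=0):
        rng = np.random.default_rng(seed)
        dt = T / steps
        z = rng.standard_normal((paths, steps))
        S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                                  + sigma * np.sqrt(dt) * z, axis=1))
        cash = np.maximum(K - S[:, -1], 0.0)          # exercise value at maturity
        for t in range(steps - 2, -1, -1):
            cash *= np.exp(-r * dt)                   # discount one step back
            itm = K - S[:, t] > 0                     # regress only in-the-money
            if itm.sum() > 3:
                coef = np.polyfit(S[itm, t], cash[itm], 2)
                cont = np.polyval(coef, S[itm, t])    # continuation estimate
                exercise = np.maximum(K - S[itm, t], 0.0)
                cash[itm] = np.where(exercise > cont, exercise, cash[itm])
        return np.exp(-r * dt) * cash.mean()

    print(f"LS American put estimate: {ls_american_put():.3f}")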

Relevance: 30.00%

Abstract:

In the theoretical macroeconomics literature, fiscal policy is almost uniformly taken to mean taxing and spending by a ‘benevolent government’ that exploits the potential aggregate demand externalities inherent in the imperfectly competitive nature of goods markets. Whilst shown to raise aggregate output and employment, these policies crowd out private consumption and hence typically reduce welfare. In this paper we consider the use of ‘tax-and-subsidise’ instead of ‘tax-and-spend’ policies, on account of their widespread use by governments, even in the recent recession, to stimulate economic activity. Within a static general equilibrium macro-model with imperfectly competitive goods markets, we examine the effect of wage and output subsidies and show that, for a small open economy, positive tax and subsidy rates exist which maximise welfare, rendering non-intervention suboptimal. We also show that, within a two-country setting, a Nash non-cooperative symmetric equilibrium with positive tax and subsidy rates exists, and that cooperation between trading partners in setting these rates is more expansionary and leads to an improvement upon the non-cooperative solution.

Relevance: 30.00%

Abstract:

The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies, minimizing the l2 and l1 total variation (TV) norms of the prior knowledge of the edges of the object; one over-determined multiple-orientation method (COSMOS); and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation time. The l-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
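
A minimal sketch of the closed-form l2 (Tikhonov) dipole inversion that such methods build on (this is the textbook form; the modulation that distinguishes MCF is not reproduced here):

    # l2-regularized QSM dipole inversion in k-space: chi minimizes
    # ||D*chi - phase||^2 + lam*||chi||^2, solved per k-space voxel.
    import numpy as np

    def dipole_kernel(shape, b0_axis=2):
        k = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
        k2 = sum(ki**2 for ki in k)
        k2[0, 0, 0] = 1.0                             # avoid division by zero at DC
        return 1.0 / 3.0 - k[b0_axis]**2 / k2

    def qsm_l2(phase, lam=0.1):
        D = dipole_kernel(phase.shape)
        chi_k = D * np.fft.fftn(phase) / (D**2 + lam)
        return np.real(np.fft.ifftn(chi_k))

    phase = np.random.randn(32, 32, 32)               # stand-in for a field map
    print(qsm_l2(phase).shape)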

Relevance: 30.00%

Abstract:

When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method is the choice of the power of the highest-order term. In this paper we employ a method recently developed for thermal problems, in which the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBI models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and results are compared to the approximate solution. An ablation problem is also analysed and results are compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem where the boundary temperature increases exponentially is analysed next; this highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed, although here no analytical or numerical results are available against which to assess the accuracy.
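
In generic HBIM notation (our summary, not the paper's equations), the approximating profile and the error measure minimised over the exponent n take the form

    u(x,t) \approx \left(1 - \frac{x}{\delta(t)}\right)^{n}, \qquad E_n = \int_0^{\delta(t)} \left( \frac{\partial u}{\partial t} - \frac{\partial^2 u}{\partial x^2} \right)^{2} dx,

where \delta(t) is the heat penetration depth fixed by the heat balance integral and n is chosen to minimise E_n, rather than being fixed a priori.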

Relevance: 30.00%

Abstract:

In a thermally fluctuating long linear polymeric chain in solution, the ends, from time to time, approach each other. At such an instant, the chain can be regarded as closed and will thus form a knot, or rather a virtual knot. Several earlier studies of random knotting demonstrated that simpler knots show their highest occurrence at shorter random-walk lengths than more complex knots do. However, until now there have been no rules that could be used to predict the optimal length of a random walk, i.e. the length at which a given knot reaches its highest occurrence. Using numerical simulations, we show here that a power law accurately describes the relation between the optimal lengths of random walks leading to the formation of different knots and the previously characterized lengths of ideal knots of the corresponding type.
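
The reported relation has the generic power-law form (the fitted exponent and prefactor are not given in this abstract):

    N_{\mathrm{opt}}(K) \propto L_{\mathrm{ideal}}(K)^{\alpha},

where N_opt(K) is the walk length at which knot type K occurs most often and L_ideal(K) is the length of the corresponding ideal knot.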

Relevance: 30.00%

Abstract:

In a previous paper, a novel Generalized Multiobjective Multitree model (GMM-model) was proposed. This model considers, for the first time, multitree-multicast load balancing with splitting in a multiobjective context, whose mathematical solution is a whole Pareto-optimal set that can include more solutions than it has been possible to find in the publications surveyed. To solve the GMM-model, this paper proposes a multi-objective evolutionary algorithm (MOEA) inspired by the Strength Pareto Evolutionary Algorithm (SPEA). Experimental results considering up to 11 different objectives are presented for the well-known NSF network with two simultaneous data flows.
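
The core primitive of SPEA-style MOEAs is the Pareto-dominance test; a minimal sketch follows (our code, assuming all objectives are minimised):

    # Pareto dominance and a non-dominated filter, the building blocks of
    # SPEA-style archives; objective vectors are tuples to be minimised.
    def dominates(a, b):
        """a dominates b: no worse in every objective, better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated(pop):
        return [p for p in pop if not any(dominates(q, p) for q in pop if q != p)]

    # Three solutions evaluated on two objectives, e.g. (delay, cost):
    pop = [(3.0, 5.0), (2.0, 6.0), (4.0, 7.0)]
    print(non_dominated(pop))                          # (4.0, 7.0) is dominated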

Relevance: 30.00%

Abstract:

Pontryagin's maximum principle from optimal control theory is used to find the optimal allocation of energy between growth and reproduction when lifespan may be finite and the trade-off between growth and reproduction is linear. Analyses of the optimal allocation problem to date have generally yielded bang-bang solutions, i.e. determinate growth: life histories in which growth is followed by reproduction, with no intermediate phase of simultaneous reproduction and growth. Here we show that an intermediate strategy (indeterminate growth) can be selected for if the rates of production and mortality either both increase or both decrease with increasing body size; this arises as a singular solution to the problem. Our conclusion is that indeterminate growth is optimal in more cases than was previously realized. The relevance of our results to natural situations is discussed.
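
A generic form of the underlying control problem (our notation, not the paper's): let u(t) \in [0, 1] be the fraction of production P(W) allocated to growth of body mass W, with size-dependent mortality rate \mu(W), so that

    \dot W = u\,P(W), \qquad \max_{u} \int_0^T (1 - u(t))\,P(W(t))\, e^{-\int_0^t \mu(W(s))\,ds}\, dt.

Because the Hamiltonian is linear in u, the maximum principle yields u = 0 or u = 1 (bang-bang) wherever the switching function is nonzero; an interval on which it vanishes identically admits intermediate values of u, the singular arc corresponding to indeterminate growth.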

Relevance: 30.00%

Abstract:

OBJECTIVES: Resuscitation in severe head injury may be detrimental when given with hypotonic fluids. We evaluated the effects of lactated Ringer's solution (sodium 131 mmol/L, 277 mOsm/L) compared with hypertonic saline (sodium 268 mmol/L, 598 mOsm/L) in severely head-injured children over the first 3 days after injury. DESIGN: An open, randomized, prospective study. SETTING: A 16-bed pediatric intensive care unit (ICU) (level III) at a university children's hospital. PATIENTS: A total of 35 consecutive children with head injury. INTERVENTIONS: Thirty-two children with Glasgow Coma Scores of <8 were randomly assigned to receive either lactated Ringer's solution (group 1) or hypertonic saline (group 2). Routine care was standardized and included the following: head positioning at 30 degrees; normothermia (96.8 to 98.6 degrees F [36 to 37 degrees C]); analgesia and sedation with morphine (10 to 30 µg/kg/hr), midazolam (0.2 to 0.3 mg/kg/hr), and phenobarbital; volume-controlled ventilation (PaCO2 of 26.3 to 30 torr [3.5 to 4 kPa]); and optimal oxygenation (PaO2 of 90 to 105 torr [12 to 14 kPa], oxygen saturation of >92%, and hematocrit of >0.30). MEASUREMENTS AND MAIN RESULTS: Mean arterial pressure and intracranial pressure (ICP) were monitored continuously and documented hourly and at every intervention. The means of every 4-hr period were calculated, and serum sodium concentrations were measured at the same times. An ICP of >15 mm Hg was treated with a predefined sequence of interventions, and complications were documented. There was no difference with respect to age, male/female ratio, or initial Glasgow Coma Score. In both groups, there was an inverse correlation between serum sodium concentration and ICP (group 1: r = -.13, r² = .02, p < .03; group 2: r = -.29, r² = .08, p < .001) that disappeared in group 1 and increased in group 2 (group 1: r = -.08, r² = .01, NS; group 2: r = -.35, r² = .12, p < .001). The correlation between serum sodium concentration and cerebral perfusion pressure (CPP) became significant in group 2 after 8 hrs of treatment (r = .2, r² = .04, p = .002). Over time, ICP and CPP did not differ significantly between the groups. However, to keep ICP at <15 mm Hg, group 2 patients required significantly fewer interventions (p < .02). Group 1 patients received less sodium (8.0 ± 4.5 vs. 11.5 ± 5.0 mmol/kg/day, p = .05) and more fluid on day 1 (2850 ± 1480 vs. 2180 ± 770 mL/m², p = .05). They also had a higher frequency of acute respiratory distress syndrome (four vs. zero patients, p = .1) and of more than two complications (six vs. one patient, p = .09). Group 2 patients had significantly shorter ICU stays (11.6 ± 6.1 vs. 8.0 ± 2.4 days; p = .04) and shorter mechanical ventilation times (9.5 ± 6.0 vs. 6.9 ± 2.2 days; p = .1). The survival rate and duration of hospital stay were similar in both groups. CONCLUSIONS: Treatment of severe head injury with hypertonic saline is superior to treatment with lactated Ringer's solution. An increase in serum sodium concentration significantly correlates with lower ICP and higher CPP. Children treated with hypertonic saline require fewer interventions, have fewer complications, and stay a shorter time in the ICU.

Relevance: 30.00%

Abstract:

In this paper, the optimum design of 3R manipulators is formulated and solved using an algebraic formulation of the workspace boundary. Manipulator design can be approached as an optimization problem in which the objective functions are the size of the manipulator and the workspace volume, and the constraints can be given as a prescribed workspace volume. The numerical solution of the optimization problem is investigated using two different numerical techniques, namely sequential quadratic programming and simulated annealing. Numerical examples illustrate the design procedure and show the efficiency of the proposed algorithms.
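
A hedged sketch of the sequential-quadratic-programming route via SciPy's SLSQP (the link-length-to-volume model below is a crude stand-in, not the paper's algebraic workspace formulation):

    # Minimise manipulator "size" (sum of link lengths) subject to a prescribed
    # workspace volume, using scipy's SLSQP implementation of SQP.
    import numpy as np
    from scipy.optimize import minimize

    V_REQUIRED = 50.0                                 # prescribed workspace volume

    def size(lengths):
        return np.sum(lengths)

    def workspace_volume(lengths):
        # Stand-in model: reachable shell between an outer radius sum(l) and an
        # inner dead zone; a real design would use the algebraic boundary.
        r_out = np.sum(lengths)
        r_in = max(lengths[0] - lengths[1] - lengths[2], 0.0)
        return 4.0 / 3.0 * np.pi * (r_out**3 - r_in**3)

    res = minimize(size, x0=[1.0, 1.0, 1.0], method="SLSQP",
                   bounds=[(0.1, 5.0)] * 3,
                   constraints=[{"type": "ineq",
                                 "fun": lambda l: workspace_volume(l) - V_REQUIRED}])
    print(res.x, workspace_volume(res.x))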