53 results for Multiperiod mixed-integer convex model

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:

In this paper, we propose and study a unified mixed-integer programming model that simultaneously optimizes fluence weights and multi-leaf collimator (MLC) apertures in the treatment planning optimization of VMAT, Tomotherapy, and CyberKnife. The contribution of our model is threefold: (i) our model optimizes the fluence and MLC apertures simultaneously for a given set of control points; (ii) our model can incorporate all volume limits or dose upper bounds for organs at risk (OAR) and dose lower bounds for planning target volumes (PTV) as hard constraints, but it can also relax either of these constraint sets in a Lagrangian fashion while keeping the other set as hard constraints; (iii) for faster solutions, we propose several heuristic methods based on the MIP model, as well as a meta-heuristic approach. The meta-heuristic is very efficient in practice, generating dose- and machinery-feasible solutions for problem instances of clinical scale, e.g., obtaining feasible treatment plans for cases with 180 control points, 6750 sample voxels and 18,000 beamlets in 470 seconds, or cases with 72 control points, 8000 sample voxels and 28,800 beamlets in 352 seconds. With discretization and down-sampling of voxels, our method is capable of tackling a treatment field of 8000–64,000 cm³, depending on the ratio of critical structures to unspecified tissues.
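The hard dose constraints in (ii) can be sketched in a few lines: each voxel's dose is a weighted sum of per-aperture contributions, with upper bounds on OAR voxels and lower bounds on PTV voxels. The following is a minimal illustrative feasibility check, not the paper's model; all names and numbers are invented.

```python
def voxel_dose(weights, unit_dose, voxel):
    """Dose at one voxel: sum of fluence weight times per-aperture unit dose."""
    return sum(w * unit_dose[a][voxel] for a, w in enumerate(weights))

def is_dose_feasible(weights, unit_dose, oar_upper, ptv_lower):
    """Hard-constraint check: OAR voxels below their cap, PTV voxels above theirs."""
    for voxel, cap in oar_upper.items():
        if voxel_dose(weights, unit_dose, voxel) > cap:
            return False
    for voxel, floor in ptv_lower.items():
        if voxel_dose(weights, unit_dose, voxel) < floor:
            return False
    return True

# Toy instance: two apertures, three voxels (0 = OAR, 1 and 2 = PTV).
unit_dose = [{0: 0.1, 1: 1.0, 2: 0.5},
             {0: 0.2, 1: 0.4, 2: 1.0}]
feasible = is_dose_feasible([1.0, 1.0], unit_dose,
                            oar_upper={0: 0.5}, ptv_lower={1: 1.0, 2: 1.0})
```

A MIP solver would search over binary aperture-shape variables and continuous weights subject to exactly this kind of constraint set.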

Relevance:

100.00%

Publisher:

Abstract:

Industrial producers face the task of optimizing production processes in an attempt to achieve desired qualities, such as mechanical properties, with the lowest energy consumption. In industrial carbon fiber production, the fibers are processed in bundles (batches) containing several thousand filaments, and consequently energy optimization is a stochastic process, as it involves uncertainty, imprecision, and randomness. This paper presents a stochastic optimization model to reduce energy consumption for a given range of desired mechanical properties. Several processing condition sets are developed, and for each set of conditions, 50 samples of fiber are analyzed for their tensile strength and modulus. The energy consumption during production of the samples is carefully monitored on the processing equipment. Then, five standard distribution functions are examined to determine which best describe the distribution of the mechanical properties of the filaments. The Kolmogorov-Smirnov test is used to verify the goodness of fit and correlation statistics of the distributions. To estimate the parameters of the selected distribution (Weibull), the maximum likelihood, least-squares, and genetic algorithm methods are compared. An array of factors, including the sample size, the confidence level, and the relative error of the estimated parameters, is used for evaluating the tensile strength and modulus properties. The energy consumption and N2 gas cost are modeled by the convex hull method. Finally, in order to optimize carbon fiber production quality, energy consumption, and total cost, mixed-integer linear programming is utilized. The results show that, using the stochastic optimization models, we are able to predict production quality within a given range and minimize the energy consumption of the industrial process.
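The maximum-likelihood step for the Weibull distribution can be sketched generically: bisection on the standard MLE shape equation (which is increasing in the shape parameter), followed by the closed-form scale. This is the textbook estimator, shown on synthetic data purely to illustrate one of the three estimators compared; it is not the paper's implementation.

```python
import math
import random

def weibull_mle(samples, k_lo=0.05, k_hi=50.0, iters=100):
    """Two-parameter Weibull fit: bisection on the MLE shape equation,
    then the closed-form maximum-likelihood scale."""
    logs = [math.log(x) for x in samples]
    mean_log = sum(logs) / len(logs)

    def g(k):  # zero of g is the maximum-likelihood shape; g is increasing in k
        xk = [x ** k for x in samples]
        return sum(v * l for v, l in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    for _ in range(iters):
        mid = 0.5 * (k_lo + k_hi)
        if g(mid) > 0:
            k_hi = mid
        else:
            k_lo = mid
    k = 0.5 * (k_lo + k_hi)
    scale = (sum(x ** k for x in samples) / len(samples)) ** (1.0 / k)
    return k, scale

# Synthetic "tensile strength" data from a known Weibull(scale=3, shape=2).
random.seed(0)
data = [random.weibullvariate(3.0, 2.0) for _ in range(2000)]
shape, scale = weibull_mle(data)
```

With 2000 synthetic samples the recovered shape and scale land close to the generating values, which is the kind of sample-size/relative-error trade-off the abstract describes.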

Relevance:

100.00%

Publisher:

Abstract:

The Kidney Exchange Problem (KEP) is a combinatorial optimization problem that has attracted attention from the integer programming/combinatorial optimisation community in the past few years. Defined on a directed graph, the KEP has two variations: one concerns cycles only, and the other cycles as well as chains on the same graph. We call the former the Cardinality Constrained Multi-cycle Problem (CCMcP) and the latter the Cardinality Constrained Cycles and Chains Problem (CCCCP). The cardinality of cycles is restricted in both the CCMcP and the CCCCP. As for chains, some studies in the literature consider cardinality restrictions, whereas others do not. The CCMcP can be viewed as an Asymmetric Travelling Salesman Problem that does allow subtours, but in which these subtours are cardinality-constrained and it is not necessary to visit all vertices. In the existing literature on the KEP, the cardinality constraint for cycles is usually small (to the best of our knowledge, no more than six). In the CCCCP, each vertex on the directed graph can be included in at most one cycle or chain, but not both. The CCMcP and the CCCCP are interesting and challenging combinatorial optimization problems in their own right, particularly due to their similarities to some problems in the travelling salesman and vehicle routing families. In this paper, our main focus is to review the existing mathematical programming models and solution methods in the literature, analyse the performance of these models, and identify future research directions. Further, we propose a polynomial-sized and an exponential-sized mixed-integer linear programming model, discuss a number of stronger constraints for cardinality-infeasible-cycle elimination for the latter, and present some preliminary numerical results.
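Exponential-sized KEP models typically index variables by the feasible cycles themselves, so an enumerator of directed cycles up to the cardinality cap is a natural building block. A minimal sketch on a toy compatibility digraph (graph and cap invented, not from the paper):

```python
def bounded_cycles(arcs, max_len):
    """All simple directed cycles of length <= max_len. Each cycle is
    reported once, starting from its smallest vertex."""
    graph = {}
    for u, v in arcs:
        graph.setdefault(u, []).append(v)
    cycles = []

    def extend(start, node, path):
        for nxt in graph.get(node, []):
            if nxt == start:
                cycles.append(tuple(path))
            elif nxt > start and nxt not in path and len(path) < max_len:
                extend(start, nxt, path + [nxt])

    for start in sorted(graph):
        extend(start, start, [start])
    return cycles

# 1 -> 2 -> 3 -> 1 (a 3-way exchange) plus a 2-way exchange 1 <-> 3.
arcs = [(1, 2), (2, 3), (3, 1), (1, 3)]
cycles = bounded_cycles(arcs, max_len=3)
```

Lowering the cap to 2 drops the 3-cycle, which is exactly the effect of the cardinality constraint on the model's variable set.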

Relevance:

100.00%

Publisher:

Abstract:

The design space exploration formalism has developed data structures and algorithms of sufficient complexity and scope to support conceptual layout, massing, and enclosure configurations. However, design remains a human enterprise. To support the user in designing with the formalism, we have developed an interaction model that addresses the interleaving of user actions with the formal operations of design space exploration. The central feature of our interaction model is the modeling of control based on mixed initiative: initiative is sometimes taken by the designer and sometimes by the formalism in working on a shared design task. The model comprises three layers: domain, task, and dialogue. In this paper we describe the formulation of the domain layer of our mixed-initiative interaction model for design space exploration. We present the view of the domain as understood in the formalism in terms of the three abstract concepts of state, move, and structure. In order to support mixed initiative, it is necessary to develop a shared view of the domain. The domain layer addresses this problem by mapping the designer's view onto the symbol substrate. First, we present the designer's view of the domain in terms of problems, solutions, choices, and history. Second, we show how this view is interleaved with the symbol substrate through four domain layer constructs: problem state, solution state, choice, and exploration history. The domain layer presents a suitable foundation for integrating the role of the designer with a description formalism. It enables the designer to maintain exploration freedom in terms of formulating and reformulating problems, generating solutions, making choices, and navigating the history of exploration.

Relevance:

100.00%

Publisher:

Abstract:

The asymmetric travelling salesman problem with replenishment arcs (RATSP), arising from work related to aircraft routing, is a generalisation of the well-known ATSP. In this paper, we introduce a polynomial-size mixed-integer linear programming (MILP) formulation for the RATSP, and improve an existing exponential-size ILP formulation of Zhu [The aircraft rotation problem, Ph.D. Thesis, Georgia Institute of Technology, Atlanta, 1994] by proposing two classes of stronger cuts. We show that, under certain conditions, these two classes of stronger cuts are facet-defining for the RATSP polytope, and that ATSP facets can be lifted to give RATSP facets. We implement our polyhedral findings and develop a Lagrangean relaxation (LR)-based branch-and-bound (BNB) algorithm for the RATSP, and compare this method with solving the polynomial-size formulation using ILOG CPLEX 9.0, on both randomly generated problems and aircraft routing problems. Finally, we compare our methods with the existing method of Boland et al. [The asymmetric traveling salesman problem with replenishment arcs, European J. Oper. Res. 123 (2000) 408–427]. It turns out that both of our methods are much faster than that of Boland et al., and that the LR-based BNB method is more efficient for problems that resemble aircraft rotation problems.

Relevance:

100.00%

Publisher:

Abstract:

Wetland and floodplain ecosystems along many regulated rivers are highly stressed, primarily due to a lack of environmental flows of appropriate magnitude, frequency, duration, and timing to support ecological functions. In the absence of increased environmental flows, the ecological health of river ecosystems can be enhanced by the operation of existing and new flow-control infrastructure (weirs and regulators) to return more natural environmental flow regimes to specific areas. However, determining the optimal investment and operation strategies over time is a complex task due to several factors including the multiple environmental values attached to wetlands, spatial and temporal heterogeneity and dependencies, nonlinearity, and time-dependent decisions. This makes for a very large number of decision variables over a long planning horizon. The focus of this paper is the development of a nonlinear integer programming model that accommodates these complexities. The mathematical objective aims to return the natural flow regime of key components of river ecosystems in terms of flood timing, flood duration, and interflood period. We applied a 2-stage recursive heuristic using tabu search to solve the model and tested it on the entire South Australian River Murray floodplain. We conclude that modern meta-heuristics can be used to solve the very complex nonlinear problems with spatial and temporal dependencies typical of environmental flow allocation in regulated river ecosystems. The model has been used to inform the investment in, and operation of, flow-control infrastructure in the South Australian River Murray.
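The tabu-search component can be illustrated generically. The sketch below maximizes a made-up linear stand-in for the flow-regime objective over binary operate/do-not-operate decisions for a handful of regulators; it shows only the basic tabu mechanics (best admissible single-bit flip, short-term memory, aspiration), not the paper's 2-stage recursive heuristic.

```python
import random

def tabu_search(score, n_vars, iters=200, tenure=3, seed=1):
    """Maximize score(x) over x in {0,1}^n by best admissible single-bit
    flips, keeping recently flipped bits tabu for `tenure` iterations."""
    rng = random.Random(seed)
    current = [rng.randint(0, 1) for _ in range(n_vars)]
    best, best_val = current[:], score(current)
    tabu = {}  # bit index -> last iteration at which the bit stays tabu
    for it in range(iters):
        candidates = []
        for i in range(n_vars):
            neighbour = current[:]
            neighbour[i] ^= 1
            val = score(neighbour)
            # aspiration: allow a tabu move if it beats the best found so far
            if tabu.get(i, -1) < it or val > best_val:
                candidates.append((val, i, neighbour))
        val, i, current = max(candidates)  # best admissible move
        tabu[i] = it + tenure
        if val > best_val:
            best, best_val = current[:], val
    return best, best_val

# Stand-in objective: operating regulators 0, 2 and 4 helps; 1 and 3 hurts.
weights = [3, -2, 5, -1, 4]
sol, val = tabu_search(lambda x: sum(w * b for w, b in zip(weights, x)), 5)
```

The tabu list is what lets the method climb out of local optima in the genuinely nonlinear, spatially coupled objective the paper tackles; on this separable toy it simply converges to the obvious optimum.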

Relevance:

100.00%

Publisher:

Abstract:

We investigate the resource-allocation problem in multicell networks, targeting the max-min throughput of all cells. A joint optimization over power control, channel allocation, and user association is considered, and the problem is formulated as a nonconvex mixed-integer nonlinear program (MINLP). To solve this problem, we propose an alternating-optimization-based algorithm, which applies branch-and-bound and simulated annealing to the subproblems at each optimization step. We demonstrate the convergence and efficiency of the proposed algorithm through thorough numerical experiments. The experimental results show that joint optimization over all resources significantly outperforms restricted optimization over individual resources.
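The simulated-annealing part can be sketched on the discrete user-association subproblem: reassign users between cells to maximize the minimum per-cell throughput. Everything below (the throughput table, the linear cooling schedule, all parameters) is illustrative, not the paper's algorithm.

```python
import math
import random

def min_cell_throughput(assign, rate, n_cells):
    """Max-min objective: the worst per-cell throughput sum."""
    per_cell = [0.0] * n_cells
    for user, cell in enumerate(assign):
        per_cell[cell] += rate[user][cell]
    return min(per_cell)

def anneal(rate, n_cells, steps=3000, t0=2.0, seed=7):
    rng = random.Random(seed)
    n_users = len(rate)
    assign = [rng.randrange(n_cells) for _ in range(n_users)]
    best = assign[:]
    for s in range(steps):
        t = t0 * (1 - s / steps) + 1e-9  # linear cooling schedule
        cand = assign[:]
        cand[rng.randrange(n_users)] = rng.randrange(n_cells)  # move one user
        delta = (min_cell_throughput(cand, rate, n_cells)
                 - min_cell_throughput(assign, rate, n_cells))
        # accept improvements always, worsenings with probability exp(delta/t)
        if delta >= 0 or rng.random() < math.exp(delta / t):
            assign = cand
            if (min_cell_throughput(assign, rate, n_cells)
                    > min_cell_throughput(best, rate, n_cells)):
                best = assign[:]
    return best, min_cell_throughput(best, rate, n_cells)

# 4 users, 2 cells: rate[u][c] = throughput user u contributes in cell c.
rate = [[3.0, 1.0], [1.0, 3.0], [2.0, 2.0], [2.0, 2.0]]
best_assign, best_min = anneal(rate, n_cells=2)
```

In the alternating scheme described above, a step like this would be interleaved with branch-and-bound on the continuous power-control subproblem.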

Relevance:

100.00%

Publisher:

Abstract:

With the explosion of big data, processing large numbers of continuous data streams, i.e., big data stream processing (BDSP), has become a crucial requirement for many scientific and industrial applications in recent years. By offering a pool of computation, communication and storage resources, public clouds, like Amazon's EC2, are undoubtedly the most efficient platforms to meet the ever-growing needs of BDSP. Public cloud service providers usually operate a number of geo-distributed datacenters across the globe, and different datacenter pairs incur different inter-datacenter network costs charged by Internet Service Providers (ISPs). Meanwhile, inter-datacenter traffic in BDSP constitutes a large portion of a cloud provider's traffic demand over the Internet and incurs substantial communication cost, which may even become the dominant operational expenditure factor. As datacenter resources are provided in a virtualized way, the virtual machines (VMs) for stream processing tasks can be freely deployed onto any datacenter, provided that the Service Level Agreement (SLA, e.g., quality-of-information) is obeyed. This raises the opportunity, but also a challenge, to exploit inter-datacenter network cost diversity to optimize both VM placement and load balancing towards network cost minimization with guaranteed SLA. In this paper, we first propose a general modeling framework that describes all representative inter-task relationship semantics in BDSP. Based on this framework, we formulate the communication cost minimization problem for BDSP as a mixed-integer linear programming (MILP) problem and prove it to be NP-hard. We then propose a computation-efficient solution based on the MILP. The high efficiency of our proposal is validated by extensive simulation-based studies.
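The cost structure driving such an MILP can be sketched directly: traffic between two communicating tasks placed in different datacenters costs volume times the ISP price for that datacenter pair. The brute-force search below stands in for the MILP on a toy instance; task names, prices, and the locality pins are invented for illustration.

```python
from itertools import product

def network_cost(placement, flows, price):
    """Total inter-datacenter cost of a task -> datacenter placement.
    flows: (task_a, task_b) -> data volume; price: (dc_a, dc_b) -> unit cost."""
    total = 0.0
    for (a, b), volume in flows.items():
        da, db = placement[a], placement[b]
        if da != db:  # intra-datacenter traffic is assumed free here
            total += volume * price[(da, db)]
    return total

def best_placement(free_tasks, pinned, dcs, flows, price):
    """Brute-force optimum over the unpinned tasks (the MILP's job at scale);
    pinned tasks model SLA/locality constraints."""
    best, best_cost = None, float("inf")
    for combo in product(dcs, repeat=len(free_tasks)):
        placement = dict(pinned, **dict(zip(free_tasks, combo)))
        cost = network_cost(placement, flows, price)
        if cost < best_cost:
            best, best_cost = placement, cost
    return best, best_cost

flows = {("src", "filter"): 10.0, ("filter", "sink"): 1.0}
price = {("us", "eu"): 2.0, ("eu", "us"): 2.0}
# The source produces in "us" and the sink consumes in "eu" (SLA pins);
# only the intermediate "filter" task is free to move.
placement, cost = best_placement(["filter"], {"src": "us", "sink": "eu"},
                                 ["us", "eu"], flows, price)
```

Placing the filter next to the heavy src-to-filter flow leaves only the light filter-to-sink flow crossing datacenters, which is the cost-diversity effect the formulation exploits.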

Relevance:

100.00%

Publisher:

Abstract:

After a decade of extensive research on application-specific wireless sensor networks (WSNs), the recent development of information and communication technologies makes it practical to realize software-defined sensor networks (SDSNs), which are able to adapt to various application requirements and to fully explore the resources of WSNs. A sensor node in an SDSN is able to conduct multiple tasks with different sensing targets simultaneously. A given sensing task usually involves multiple sensors to achieve a certain quality-of-sensing, e.g., coverage ratio. It is therefore important to design an energy-efficient sensor scheduling and management strategy with guaranteed quality-of-sensing for all tasks. To this end, three issues are investigated in this paper: 1) the subset of sensor nodes that shall be activated, i.e., sensor activation; 2) the task that each sensor node shall be assigned, i.e., task mapping; and 3) the sampling rate on a sensor for a target, i.e., sensing scheduling. They are jointly considered and formulated as a mixed-integer quadratically constrained programming (MIQP) problem, which is then reformulated into a mixed-integer linear programming (MILP) formulation with low computational complexity via linearization. To deal with dynamic events during SDSN operation, such as sensor node participation and departure, an efficient online algorithm using local optimization is developed. Simulation results show that our proposed online algorithm approaches the globally optimized network energy efficiency with much lower rescheduling time and control overhead.
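A standard linearization of the kind used to turn an MIQP into an MILP (shown generically; this is not necessarily the paper's exact reformulation) replaces the product z = x·y of a binary x and a bounded continuous y (0 ≤ y ≤ M) by four linear inequalities whose only feasible z is exactly x·y:

```python
def product_envelope(x, y, M):
    """Big-M linear constraints that pin z to x * y when x is binary."""
    return [
        lambda z: z <= M * x,            # z forced to 0 when x = 0
        lambda z: z <= y,                # z cannot exceed y
        lambda z: z >= y - M * (1 - x),  # z forced up to y when x = 1
        lambda z: z >= 0,                # nonnegativity
    ]

def feasible_z(x, y, M, grid=101):
    """Grid values of z in [0, M] satisfying the linearized constraints."""
    cons = product_envelope(x, y, M)
    return [round(M * i / (grid - 1), 6)
            for i in range(grid)
            if all(check(M * i / (grid - 1)) for check in cons)]

# With x binary, the envelope leaves exactly z = x * y feasible:
zs_on = feasible_z(x=1, y=0.6, M=1.0)   # only z = 0.6 survives
zs_off = feasible_z(x=0, y=0.6, M=1.0)  # only z = 0.0 survives
```

Replacing every bilinear term this way is what lets a quadratically constrained sensor-scheduling model be handed to an MILP solver.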

Relevance:

100.00%

Publisher:

Abstract:

Due to low electricity rates at nighttime, home charging for electric vehicles (EVs) has conventionally been favored. However, the recent tendency in support of daytime workplace charging that absorbs energy produced by solar photovoltaic (PV) panels appears to be the most promising solution for facilitating higher PV and EV penetration in the power grid. This paper studies the optimal sizing of workplace charging stations considering probabilistic reactive power support for plug-in hybrid electric vehicles (PHEVs), which are powered by PV units in medium-voltage (MV) commercial networks. In this study, analytical expressions are first presented to estimate the size of charging stations integrated with PV units, with the objective of minimizing energy losses. These stations are capable of providing reactive power support to the main grid in addition to charging PHEVs, while accounting for the probability of PV generation. The study is further extended to investigate the impact of time-varying voltage-dependent charging load models on PV penetration. The simulation results obtained on an 18-bus test distribution system show that different charging load models can produce dissimilar levels of PHEV and PV penetration. In particular, the maximum energy loss and peak load reductions of 70.17% and 42.95%, respectively, are achieved for the mixed charging load model, where the system accommodates PHEV and PV penetration levels of 9.51% and 50%, respectively. The results of probabilistic voltage distributions are also thoroughly reported in the paper.

Relevance:

100.00%

Publisher:

Abstract:

In this study we explore a model to optimize Intensive Care Unit (ICU) discharge decisions made prior to service completion as a result of capacity constraints under uncertainty. Discharge prior to service completion, called demand-driven or premature discharge, increases the chance that a patient will be readmitted to the ICU in the near future. Since readmission imposes an additional load on ICUs, the cost of demand-driven discharge relates to the increased readmission chance and the length of stay (LOS) in the ICU after readmission. Hence, the problem is how to select a current ICU patient for demand-driven discharge to accommodate a new critically ill patient. In essence, the problem is formulated as a stochastic dynamic programming model. However, even in the deterministic form, i.e., knowing the arrival and treatment times in advance, solving the dynamic programming model is computationally prohibitive for a sizable problem; this is illustrated by formulating the problem as an integer programming model. The uncertainties and difficulties in the problem are convincing reasons to use an optimization-simulation approach. Thus, using simulation, we evaluate various scenarios, modeling LOS with a Weibull distribution. While it is known that selecting the patient with the lowest readmission risk is optimal under certain conditions when LOS follows a memoryless distribution, we show that when LOS is not memoryless, considering both readmission risk and remaining LOS, rather than readmission risk alone, leads to better results.
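The difference between the two selection rules can be sketched in a few lines: picking the lowest readmission risk can differ from picking the lowest expected extra bed-days (risk times expected post-readmission LOS), which is what matters when stays are not memoryless. The patient numbers below are invented purely for illustration; the paper evaluates such policies by simulation with Weibull-distributed LOS.

```python
def pick_lowest_risk(patients):
    """patients: list of (readmission_risk, expected_readmit_los) tuples.
    Returns the index of the patient with the smallest readmission risk."""
    return min(range(len(patients)), key=lambda i: patients[i][0])

def pick_lowest_expected_load(patients):
    """Minimize expected extra bed-days: risk * expected LOS if readmitted."""
    return min(range(len(patients)),
               key=lambda i: patients[i][0] * patients[i][1])

# Patient 0: low risk, but a very long expected stay if readmitted.
# Patient 1: slightly riskier, but a short expected readmission stay.
patients = [(0.10, 30.0), (0.15, 4.0)]
risk_choice = pick_lowest_risk(patients)           # selects patient 0
load_choice = pick_lowest_expected_load(patients)  # selects patient 1
```

Here the lowest-risk rule discharges patient 0 (expected extra load 3.0 bed-days), while the expected-load rule discharges patient 1 (0.6 bed-days), illustrating why remaining LOS matters alongside risk.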

Relevance:

40.00%

Publisher:

Abstract:

The Operations Research (OR) community has defined many deterministic manufacturing control problems, mainly focused on scheduling. Well-defined benchmark problems provide a mechanism for communicating the effectiveness of different optimization algorithms. Manufacturing problems within industry are stochastic and complex. Common features of these problems include variable demand, machine- and part-specific breakdown patterns, part- and machine-specific process durations, continuous production, Finished Goods Inventory (FGI) buffers, bottleneck machines, and limited production capacity. Discrete Event Simulation (DES) is a commonly used tool for studying manufacturing systems of realistic complexity. There are few reports of detail-rich benchmark problems for the simulation optimization community that are as complex as those faced by production managers. This work details an algorithm that can be used to create single- and multi-stage production control problems. The reported software implementation of the algorithm generates text files in eXtensible Markup Language (XML) format that are easily edited and understood, as well as cross-platform compatible. The distribution and acceptance of benchmark problems generated with the algorithm would enable researchers working on simulation and optimization of manufacturing problems to communicate results effectively, to the benefit of the field in general.
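A generator of this kind might look as follows. The element and attribute names are invented, not the paper's schema; the point is only that a parameterized instance with machine-specific breakdown and processing data serializes naturally to editable, cross-platform XML via the standard library.

```python
import random
import xml.etree.ElementTree as ET

def make_instance(n_machines, n_parts, seed=0):
    """Emit a toy production-control benchmark instance as an XML string."""
    rng = random.Random(seed)
    root = ET.Element("benchmark", stages=str(n_machines))
    for m in range(n_machines):
        machine = ET.SubElement(root, "machine", id=str(m))
        # machine-specific breakdown pattern: mean time between failures
        ET.SubElement(machine, "mtbf").text = str(rng.randint(50, 200))
        for p in range(n_parts):
            # part- and machine-specific processing duration
            ET.SubElement(machine, "process", part=str(p),
                          duration=str(rng.randint(1, 10)))
    return ET.tostring(root, encoding="unicode")

xml_text = make_instance(n_machines=2, n_parts=3)
parsed = ET.fromstring(xml_text)  # round-trips through a standard parser
```

Fixing the seed makes instances reproducible, which is what allows different research groups to benchmark against identical problems.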

Relevance:

40.00%

Publisher:

Abstract:

Computer-based environments for supporting design are complex software artifacts. These tools need to use sound computational formalisms as well as address issues of human usability. The development of interactive and usable generative systems is a significant research area in design computation. Though classical search techniques play a central role in the generative kernels of these "closed-world" systems, the open-ended exploration of design spaces is the desirable goal. In this paper, we present a formal model of exploration that combines search with user-driven exploration. We describe the role of interaction and agency in an experimental mixed-initiative design support system.

Relevance:

40.00%

Publisher:

Abstract:

Many vision problems deal with high-dimensional data, such as motion segmentation and face clustering. However, these high-dimensional data usually lie in a low-dimensional structure. Sparse representation is a powerful principle for solving a number of clustering problems with high-dimensional data. This principle is motivated by an ideal modeling of data points according to linear algebra theory. However, real data in computer vision are unlikely to follow the ideal model perfectly. In this paper, we exploit mixed-norm regularization for sparse subspace clustering. This regularization term is a convex combination of the l1 norm, which promotes sparsity at the individual-entry level, and the l2/1 block norm, which promotes group sparsity. Combining these powerful regularization terms provides more accurate modeling and, subsequently, a better solution for the affinity matrix used in sparse subspace clustering. This helps us achieve better performance on motion segmentation and face clustering problems. The formulation also caters for different types of data corruption. We derive a provably convergent and computationally efficient algorithm based on the alternating direction method of multipliers (ADMM) framework to solve the formulation. We demonstrate that this formulation outperforms other state-of-the-art methods on both motion segmentation and face clustering.
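Within ADMM, each iteration reduces to proximal steps for the two regularizers. The textbook proximal operators for the l1 norm and a single l2/1 group (generic forms, not the paper's full algorithm) can be sketched as:

```python
import math

def prox_l1(v, t):
    """Soft-thresholding: prox of t * ||.||_1, zeroes small entries
    individually (element-wise sparsity)."""
    return [math.copysign(max(abs(x) - t, 0.0), x) for x in v]

def prox_l21(v, t):
    """Block soft-thresholding: prox of t * ||.||_2 on one group,
    which shrinks or zeroes the whole group (group sparsity)."""
    norm = math.sqrt(sum(x * x for x in v))
    if norm <= t:
        return [0.0] * len(v)
    scale = 1.0 - t / norm
    return [scale * x for x in v]

# Element-wise sparsity vs. group sparsity on small examples:
a = prox_l1([3.0, -0.5, 1.0], t=1.0)  # small entries zeroed one by one
b = prox_l21([3.0, 4.0], t=1.0)       # whole block shrunk toward zero
c = prox_l21([0.3, 0.4], t=1.0)       # small block zeroed entirely
```

A convex combination of the two penalties is handled by applying these operators to the corresponding split variables inside the ADMM updates, which is what yields both entry-level and group-level sparsity in the affinity matrix.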