20 results for mixed binary nonlinear programming

in Deakin Research Online - Australia


Relevance:

100.00%

Abstract:

The Kidney Exchange Problem (KEP) is a combinatorial optimization problem that has attracted attention from the integer programming/combinatorial optimisation community in the past few years. Defined on a directed graph, the KEP has two variations: one concerns cycles only, and the other, cycles as well as chains on the same graph. We call the former the Cardinality Constrained Multi-cycle Problem (CCMcP) and the latter the Cardinality Constrained Cycles and Chains Problem (CCCCP). The cardinality of cycles is restricted in both the CCMcP and the CCCCP. As for chains, some studies in the literature consider cardinality restrictions, whereas others do not. The CCMcP can be viewed as an Asymmetric Travelling Salesman Problem that does allow subtours, with the provisos that these subtours are cardinality-constrained and that it is not necessary to visit all vertices. In the existing KEP literature, the cardinality constraint for cycles is usually small (to the best of our knowledge, no more than six). In a CCCCP, each vertex on the directed graph can be included in at most one cycle or chain, but not both. The CCMcP and the CCCCP are interesting and challenging combinatorial optimization problems in their own right, particularly because of their similarities to the travelling-salesman and vehicle-routing families of problems. In this paper, our main focus is to review the existing mathematical programming models and solution methods in the literature, analyse the performance of these models, and identify future research directions. Further, we propose a polynomial-sized and an exponential-sized mixed-integer linear programming model, discuss a number of stronger constraints for cardinality-infeasible-cycle elimination in the latter, and present some preliminary numerical results.
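
The exponential-sized cycle formulation mentioned above is straightforward to illustrate in code. Below is a minimal sketch of a cycle formulation for the CCMcP, assuming the open-source PuLP modelling library with its bundled CBC solver; the example graph, the objective (maximise the number of vertices covered), and all identifiers are illustrative rather than the paper's model.

```python
# Minimal cycle-formulation sketch for the CCMcP (illustrative, not the
# paper's model): enumerate all cardinality-feasible cycles, then select a
# vertex-disjoint subset of maximum total size via a binary program.
import itertools

import pulp

def short_cycles(arcs, n_vertices, max_len):
    """Enumerate directed cycles of length <= max_len by brute force."""
    arc_set = set(arcs)
    cycles = []
    for k in range(2, max_len + 1):
        for combo in itertools.permutations(range(n_vertices), k):
            if combo[0] != min(combo):   # keep one canonical rotation per cycle
                continue
            walk = list(combo) + [combo[0]]
            if all((u, v) in arc_set for u, v in zip(walk, walk[1:])):
                cycles.append(combo)
    return cycles

def solve_ccmcp(arcs, n_vertices, max_len=3):
    cycles = short_cycles(arcs, n_vertices, max_len)
    prob = pulp.LpProblem("CCMcP", pulp.LpMaximize)
    x = {c: pulp.LpVariable(f"x_{i}", cat="Binary") for i, c in enumerate(cycles)}
    prob += pulp.lpSum(len(c) * x[c] for c in cycles)    # vertices covered
    for v in range(n_vertices):                          # each vertex in <= 1 cycle
        prob += pulp.lpSum(x[c] for c in cycles if v in c) <= 1
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return [c for c in cycles if x[c].value() == 1]

# A 2-cycle and a 3-cycle share vertex 1; the solver picks the larger cycle.
print(solve_ccmcp([(0, 1), (1, 0), (1, 2), (2, 3), (3, 1)], n_vertices=4))
```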

Relevance:

100.00%

Abstract:

Molecular geometry, the three-dimensional arrangement of atoms in space, is a major factor determining the properties and reactivity of molecules, biomolecules and macromolecules. Stable molecular conformations can be computed by locating minima on the potential energy surface (PES). This is a very challenging global optimization problem because of the extremely large number of shallow local minima and the complicated landscape of the PES. This paper illustrates the mathematical and computational challenges on one important instance of the problem, the computation of the molecular geometry of oligopeptides, and proposes the use of the Extended Cutting Angle Method (ECAM) to solve it.

ECAM is a deterministic global optimization technique that computes tight lower bounds on the values of the objective function and fathoms those parts of the domain where the global minimum cannot reside. As with any domain-partitioning scheme, its challenge is the extremely large partition of the domain required for accurate lower bounds. We address this challenge by providing an efficient combinatorial algorithm for calculating the lower bounds, and by combining ECAM with a local optimization method while preserving the deterministic character of ECAM.
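
The bounding-and-fathoming mechanism can be sketched compactly. The toy below uses a simple one-dimensional Lipschitz lower bound as a stand-in for ECAM's bounds (ECAM computes tighter bounds from abstract convex underestimates, in higher dimensions); the test function, the Lipschitz constant, and all names are illustrative.

```python
# Toy deterministic global minimization with lower-bound fathoming: intervals
# whose lower bound cannot beat the incumbent are discarded, the rest split.
import heapq
import math

def lipschitz_global_min(f, a, b, L, tol=1e-4):
    def bound(lo, hi):                     # valid lower bound of f on [lo, hi]
        return f((lo + hi) / 2) - L * (hi - lo) / 2
    best_x, best_f = a, f(a)
    heap = [(bound(a, b), a, b)]
    while heap:
        lb, lo, hi = heapq.heappop(heap)
        if lb >= best_f - tol:             # fathom: the minimum cannot lie here
            continue
        mid = (lo + hi) / 2
        if f(mid) < best_f:
            best_x, best_f = mid, f(mid)
        heapq.heappush(heap, (bound(lo, mid), lo, mid))
        heapq.heappush(heap, (bound(mid, hi), mid, hi))
    return best_x, best_f

# A landscape with several shallow local minima, a tiny PES caricature.
print(lipschitz_global_min(lambda x: math.sin(3 * x) + 0.1 * x * x, -3, 3, L=4))
```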


Relevance:

100.00%

Abstract:

The asymmetric travelling salesman problem with replenishment arcs (RATSP), arising from work related to aircraft routing, is a generalisation of the well-known ATSP. In this paper, we introduce a polynomial-size mixed-integer linear programming (MILP) formulation for the RATSP, and improve an existing exponential-size ILP formulation of Zhu [The aircraft rotation problem, Ph.D. Thesis, Georgia Institute of Technology, Atlanta, 1994] by proposing two classes of stronger cuts. We show that, under certain conditions, these two classes of stronger cuts are facet-defining for the RATSP polytope, and that ATSP facets can be lifted to give RATSP facets. We implement our polyhedral findings and develop a Lagrangean relaxation (LR)-based branch-and-bound (BNB) algorithm for the RATSP, and compare this method with solving the polynomial-size formulation using ILOG CPLEX 9.0, on both randomly generated problems and aircraft routing problems. Finally, we compare our methods with the existing method of Boland et al. [The asymmetric traveling salesman problem with replenishment arcs, European J. Oper. Res. 123 (2000) 408–427]. It turns out that both of our methods are much faster than that of Boland et al., and that the LR-based BNB method is more efficient for problems that resemble the aircraft rotation problems.
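
At the heart of an LR-based BNB method is a subgradient loop that iteratively tightens the Lagrangean bound. The sketch below shows that loop on a toy 0-1 problem (choose exactly k items at minimum cost) rather than the RATSP itself; the relaxed constraint, the step rule, and all names are illustrative.

```python
# Generic Lagrangean-relaxation bound: dualize the complicating constraint
# sum(x) == k, solve the trivial relaxed subproblem, update the multiplier.
def lagrangean_bound(costs, k, iters=100, step=0.5):
    lam, best_bound = 0.0, float("-inf")
    for t in range(1, iters + 1):
        # Relaxed subproblem: min sum_i (c_i + lam) * x_i - lam * k, x binary.
        x = [1 if c + lam < 0 else 0 for c in costs]
        bound = sum(c + lam for c, xi in zip(costs, x) if xi) - lam * k
        best_bound = max(best_bound, bound)
        g = sum(x) - k                 # subgradient of the dual function at lam
        lam += (step / t) * g          # diminishing step size
    return best_bound

print(lagrangean_bound([3.0, 1.0, 4.0, 1.5], k=2))   # converges to the optimum 2.5
```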

Relevance:

100.00%

Abstract:

This paper examines children’s multiplatform commissioning at the Australian Broadcasting Corporation (ABC) in the context of the digitalisation of Australian television. A pursuit of audience share and reach to legitimise its recurrent funding engenders a strategy that prioritises the entertainment values of the ABC’s children’s offerings. Nevertheless, these multiplatform texts (comprising complementary ‘on-air’ and ‘online’ textualities) evidence a continuing commitment to a youth-focussed public service remit, and reflect the ABC’s Charter obligations to foster innovation, creativity, participation, citizenship, and the values of social inclusiveness. The analysis focuses on two recent ‘marquee’ drama projects, Dance Academy (a contemporary teen series) and My Place (a historical series for a middle-childhood audience). The research draws on a series of interviews, analysis of policy documents, and textual analysis of the television and multiplatform content. The authors argue that a mixed diet of programming, together with an educative or social-developmental agenda, features in the design of both program and online participation for the public broadcaster.

Relevance:

100.00%

Abstract:

It is important to derive priority weights from interval-valued fuzzy preferences when a pairwise comparative mechanism is used. Focusing on the significance of consistency in the pairwise comparison matrix, we extract two numerical-valued consistent comparison matrices from an interval fuzzy judgement matrix. Both consistent matrices are derived by solving linear or nonlinear programming models with the aid of assessments from Decision Makers (DMs). An interval priority weight vector is then generated from the extracted consistent matrices. In order to retain more of the information hidden in the intervals, a new probability-based method for comparing the interval priority weights is introduced. An algorithm for deriving the final interval priority weights for both consistent and inconsistent interval matrices is proposed. The algorithm is also generalized to handle pairwise comparison matrices with fuzzy numbers. The comparative results from five examples reveal that the proposed method, as compared with eight existing methods, exhibits a smaller degree of uncertainty in the priority weights, and is also more reliable based on the similarity measure.
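
To give a flavour of probability-based comparison of interval priority weights, here is a minimal sketch using the standard possibility-degree formula for intervals; this particular formula is an assumption for illustration and not necessarily the one proposed in the paper.

```python
# Degree to which interval a = (a_lo, a_hi) dominates interval b = (b_lo, b_hi).
def possibility_degree(a, b):
    a_lo, a_hi = a
    b_lo, b_hi = b
    la, lb = a_hi - a_lo, b_hi - b_lo
    if la + lb == 0:                         # both intervals are points
        return 1.0 if a_lo >= b_lo else 0.0
    return min(max((a_hi - b_lo) / (la + lb), 0.0), 1.0)

# Illustrative interval priority weights for three criteria.
weights = {"w1": (0.30, 0.45), "w2": (0.25, 0.40), "w3": (0.20, 0.30)}
for i in weights:
    for j in weights:
        if i < j:
            print(f"P({i} >= {j}) = {possibility_degree(weights[i], weights[j]):.3f}")
```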

Relevance:

100.00%

Abstract:

We investigate the resource-allocation problem in multicell networks, targeting the max-min throughput of all cells. A joint optimization over power control, channel allocation, and user association is considered, and the problem is formulated as a nonconvex mixed-integer nonlinear program (MINLP). To solve this problem, we propose an alternating-optimization-based algorithm, which applies branch-and-bound and simulated annealing to the subproblems at each optimization step. We also demonstrate the convergence and efficiency of the proposed algorithm through thorough numerical experiments. The experimental results show that joint optimization over all resources significantly outperforms restricted optimization over individual resources.
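
The alternating-optimization idea can be sketched as follows: fix all but one resource block, optimize that block, and rotate until the max-min throughput stops improving. In the toy below, exhaustive search stands in for the paper's branch-and-bound and simulated-annealing subproblem solvers, and the throughput surrogate and every name are illustrative.

```python
# Alternating optimization over power levels and channel assignment for a
# toy max-min objective; each step improves (or keeps) the incumbent.
import itertools

def maxmin_throughput(power, channels):
    # SINR-like surrogate: a cell's rate drops with co-channel interference.
    rates = []
    for c in range(len(power)):
        interf = sum(power[o] for o in range(len(power))
                     if o != c and channels[o] == channels[c])
        rates.append(power[c] / (1.0 + interf))
    return min(rates)

def alternating_opt(n_cells=3, n_channels=2, levels=(0.2, 0.6, 1.0), iters=10):
    power, channels = [1.0] * n_cells, [0] * n_cells
    best = maxmin_throughput(power, channels)
    for _ in range(iters):
        # Block 1: fix channels, search power levels (stand-in for B&B).
        for p in itertools.product(levels, repeat=n_cells):
            val = maxmin_throughput(list(p), channels)
            if val > best:
                power, best = list(p), val
        # Block 2: fix power, search channels (stand-in for simulated annealing).
        for ch in itertools.product(range(n_channels), repeat=n_cells):
            val = maxmin_throughput(power, list(ch))
            if val > best:
                channels, best = list(ch), val
    return power, channels, best

print(alternating_opt())
```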

Relevance:

100.00%

Abstract:

With the explosion of big data, processing large numbers of continuous data streams, i.e., big data stream processing (BDSP), has become a crucial requirement for many scientific and industrial applications in recent years. By offering a pool of computation, communication and storage resources, public clouds, like Amazon's EC2, are undoubtedly the most efficient platforms to meet the ever-growing needs of BDSP. Public cloud service providers usually operate a number of geo-distributed datacenters across the globe, and different datacenter pairs incur different inter-datacenter network costs charged by Internet Service Providers (ISPs). Meanwhile, inter-datacenter traffic in BDSP constitutes a large portion of a cloud provider's traffic demand over the Internet and incurs substantial communication cost, which may even become the dominant operational expenditure. As datacenter resources are provided in a virtualized way, the virtual machines (VMs) for stream processing tasks can be freely deployed onto any datacenter, provided that the Service Level Agreement (SLA, e.g., quality-of-information) is obeyed. This raises the opportunity, but also a challenge, to exploit the inter-datacenter network cost diversity to optimize both VM placement and load balancing towards network cost minimization with guaranteed SLA. In this paper, we first propose a general modeling framework that describes all representative inter-task relationship semantics in BDSP. Based on this framework, we formulate the communication cost minimization problem for BDSP as a mixed-integer linear programming (MILP) problem and prove it to be NP-hard. We then propose a computationally efficient solution based on the MILP. The high efficiency of our proposal is validated by extensive simulation-based studies.
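
As a concrete illustration of the formulation style (not the paper's full model, which also covers load balancing and SLA constraints), the sketch below places stream-processing VMs onto datacenters to minimise inter-datacenter traffic cost, assuming the PuLP library; the topology, costs, capacities, and names are invented.

```python
# Tiny VM-placement MILP: x places each task on one datacenter, y linearizes
# the product x[t1][d1] * x[t2][d2] so flow costs stay linear.
import pulp

tasks = ["src", "filter", "sink"]
flows = [("src", "filter", 10), ("filter", "sink", 4)]   # (from, to, volume)
dcs = ["dc0", "dc1"]
cost = {("dc0", "dc0"): 0, ("dc0", "dc1"): 2,            # per-unit network cost
        ("dc1", "dc0"): 2, ("dc1", "dc1"): 0}
cap = {"dc0": 2, "dc1": 2}                               # VM slots per datacenter

prob = pulp.LpProblem("bdsp_placement", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (tasks, dcs), cat="Binary")
y = pulp.LpVariable.dicts("y", (range(len(flows)), dcs, dcs), lowBound=0)

prob += pulp.lpSum(vol * cost[(d1, d2)] * y[i][d1][d2]
                   for i, (_, _, vol) in enumerate(flows)
                   for d1 in dcs for d2 in dcs)          # total network cost
for t in tasks:
    prob += pulp.lpSum(x[t][d] for d in dcs) == 1        # place each task once
for d in dcs:
    prob += pulp.lpSum(x[t][d] for t in tasks) <= cap[d]  # capacity
for i, (t1, t2, _) in enumerate(flows):
    for d1 in dcs:
        for d2 in dcs:
            prob += y[i][d1][d2] >= x[t1][d1] + x[t2][d2] - 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for t in tasks:
    print(t, "->", next(d for d in dcs if x[t][d].value() == 1))
```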

Relevance:

100.00%

Abstract:

After a decade of extensive research on application-specific wireless sensor networks (WSNs), recent developments in information and communication technologies make it practical to realize software-defined sensor networks (SDSNs), which are able to adapt to various application requirements and to fully exploit the resources of WSNs. A sensor node in an SDSN is able to conduct multiple tasks with different sensing targets simultaneously, and a given sensing task usually involves multiple sensors to achieve a certain quality-of-sensing, e.g., coverage ratio. It is therefore important to design an energy-efficient sensor scheduling and management strategy with guaranteed quality-of-sensing for all tasks. To this end, three issues are investigated in this paper: 1) the subset of sensor nodes that shall be activated, i.e., sensor activation; 2) the task that each sensor node shall be assigned, i.e., task mapping; and 3) the sampling rate on a sensor for a target, i.e., sensing scheduling. They are jointly considered and formulated as a mixed-integer programming problem with quadratic constraints (MIQP), which is then reformulated into a mixed-integer linear programming (MILP) formulation with low computational complexity via linearization. To deal with dynamic events such as sensor node participation and departure during SDSN operation, an efficient online algorithm using local optimization is developed. Simulation results show that our proposed online algorithm approaches the globally optimized network energy efficiency with much lower rescheduling time and control overhead.
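
The linearization mentioned above typically replaces a product of a binary variable and a bounded continuous variable (here, "sensor activated" times "sampling rate") with three linear constraints. A minimal, self-contained sketch with PuLP; the bound, objective, and names are invented.

```python
# Standard linearization of z = x * r with x binary and 0 <= r <= R:
#   z <= R * x,  z <= r,  z >= r - R * (1 - x),  z >= 0.
import pulp

R = 10.0                                          # upper bound on sampling rate
prob = pulp.LpProblem("linearize_product", pulp.LpMaximize)
x = pulp.LpVariable("x", cat="Binary")            # sensor activation
r = pulp.LpVariable("r", lowBound=0, upBound=R)   # sampling rate
z = pulp.LpVariable("z", lowBound=0)              # stands in for x * r

prob += z - 0.3 * r                               # toy: reward sensing, charge energy
prob += z <= R * x                                # z forced to 0 when sensor is off
prob += z <= r
prob += z >= r - R * (1 - x)                      # z forced up to r when sensor is on
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(x.value(), r.value(), z.value())            # expect 1.0 10.0 10.0
```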

Relevance:

100.00%

Abstract:

Industrial producers face the task of optimizing their production processes in an attempt to achieve desired qualities, such as mechanical properties, with the lowest energy consumption. In industrial carbon fiber production, the fibers are processed in bundles (batches) containing several thousand filaments, and consequently energy optimization is a stochastic process, as it involves uncertainty, imprecision and randomness. This paper presents a stochastic optimization model to reduce energy consumption for a given range of desired mechanical properties. Several sets of processing conditions are developed, and for each set, 50 fiber samples are analyzed for their tensile strength and modulus. The energy consumption during production of the samples is carefully monitored on the processing equipment. Five standard distribution functions are then examined to determine which best describe the distribution of the mechanical properties of the filaments. The Kolmogorov-Smirnov test is used to verify goodness of fit and correlation statistics. To estimate the parameters of the selected distribution (Weibull), the maximum likelihood, least-squares and genetic algorithm methods are compared. An array of factors, including the sample size, the confidence level, and the relative error of the estimated parameters, is used to evaluate the tensile strength and modulus properties. The energy consumption and N2 gas cost are modeled by the convex hull method. Finally, mixed-integer linear programming is used to optimize carbon fiber production quality, energy consumption, and total cost. The results show that, using the stochastic optimization models, we are able to predict the production quality within a given range and minimize the energy consumption of the industrial process.
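
A hedged sketch of the distribution-fitting step using SciPy: fit a Weibull distribution to (synthetic) tensile-strength samples by maximum likelihood and check the fit with the Kolmogorov-Smirnov test. The generated data and parameter values are illustrative; the paper additionally compares least-squares and genetic-algorithm estimation.

```python
# Weibull fit by maximum likelihood plus a K-S goodness-of-fit check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for 50 measured filament tensile strengths (GPa).
strength = stats.weibull_min.rvs(c=5.0, scale=3.5, size=50, random_state=rng)

shape, loc, scale = stats.weibull_min.fit(strength, floc=0)  # MLE, location fixed
stat, pvalue = stats.kstest(strength, "weibull_min", args=(shape, loc, scale))
print(f"Weibull shape={shape:.2f}, scale={scale:.2f}, KS p-value={pvalue:.3f}")
```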

Relevance:

40.00%

Abstract:

In this paper, we propose and study a unified mixed-integer programming model that simultaneously optimizes fluence weights and multi-leaf collimator (MLC) apertures in the treatment planning optimization of VMAT, Tomotherapy, and CyberKnife. The contribution of our model is threefold: (i) it optimizes the fluence and MLC apertures simultaneously for a given set of control points; (ii) it can incorporate all volume limits or dose upper bounds for organs at risk (OAR) and dose lower bounds for planning target volumes (PTV) as hard constraints, but it can also relax either of these constraint sets in a Lagrangian fashion and keep the other set as hard constraints; and (iii) for faster solutions, we propose several heuristic methods based on the MIP model, as well as a meta-heuristic approach. The meta-heuristic is very efficient in practice, generating dose- and machinery-feasible solutions for problem instances of clinical scale, e.g., obtaining feasible treatment plans for cases with 180 control points, 6750 sample voxels and 18,000 beamlets in 470 seconds, or cases with 72 control points, 8000 sample voxels and 28,800 beamlets in 352 seconds. With discretization and down-sampling of voxels, our method is capable of tackling a treatment field of 8000-64,000 cm3, depending on the ratio of critical structures to unspecified tissues.
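
Point (ii), keeping one constraint set hard while relaxing the other in a Lagrangian fashion, can be illustrated on a toy fluence-only problem. The sketch below, assuming PuLP, keeps the PTV lower bounds hard and penalises OAR overdose through slack variables; the dose-influence matrix, bounds, and penalty weight are invented.

```python
# Toy fluence optimization: dose_v = sum_b D[v][b] * w_b; hard PTV lower
# bounds, OAR upper bounds relaxed via penalised slack (Lagrangian fashion).
import pulp

D = [[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]]   # voxel x beamlet dose influence
ptv, oar = [0, 1], [2]                     # voxel indices by structure
lam = 10.0                                 # penalty weight on OAR overdose

prob = pulp.LpProblem("fluence", pulp.LpMinimize)
w = [pulp.LpVariable(f"w{b}", lowBound=0) for b in range(2)]       # fluence
s = {v: pulp.LpVariable(f"s{v}", lowBound=0) for v in oar}         # slack

dose = lambda v: pulp.lpSum(D[v][b] * w[b] for b in range(2))
for v in ptv:
    prob += dose(v) >= 1.0                 # hard PTV lower bound
for v in oar:
    prob += dose(v) <= 0.8 + s[v]          # relaxed OAR upper bound
prob += pulp.lpSum(w) + lam * pulp.lpSum(s.values())               # objective
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([wi.value() for wi in w], [s[v].value() for v in oar])
```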

Relevance:

30.00%

Abstract:

The aim of this study was to determine whether taste interactions occur when bitter stimuli are mixed. Eight bitter stimuli were employed: denatonium benzoate (DB), quinine-HCl (QHCl), sucrose octaacetate (SOA), urea, L-tryptophan (L-trp), L-phenylalanine (L-phe), ranitidine-HCl, and tetralone. The first experiment constructed individual psychophysical curves for each subject (n = 19) for each compound, to account for individual differences in sensitivity when presenting bitter compounds in experiment 2. Correlation analysis revealed two groupings of bitter compounds at low intensity (group 1: L-trp, L-phe, and ranitidine; group 2: SOA and QHCl), but the correlations within each group decreased as the perceived intensity increased. In experiment 2, intensity ratings and two-alternative forced-choice discrimination tasks showed that bitter compounds generally combine additively in mixture and do not show interactions, with a few specific exceptions. The methods employed detected synergy among sweeteners, but could not detect synergy among these eight bitter compounds. In general, the perceived bitterness of these binary bitter-compound mixtures was an additive function of the total bitter-inducing stimuli in the mouth.

Relevance:

30.00%

Abstract:

In order to study potential mixture interactions among bitter compounds, selected sodium salts were added to five compounds presented either alone or as binary bitter-compound mixtures. Each compound was tested at a concentration that elicited ‘weak’ perceived bitterness. The bitter compounds were mixed at these concentrations to form a subset of possible binary mixtures. For comparison, the concentration of each solitary compound was doubled to measure bitterness inhibition at the higher intensity level elicited by the mixtures. The following sodium salts were tested for bitterness inhibition: 100 mM sodium chloride (salty), 100 mM sodium gluconate (salty), 100 and 20 mM monosodium glutamate (umami), and 50 mM adenosine monophosphate disodium salt (umami). Sucrose (sweet) was also employed as a bitterness suppressor. The sodium salts differentially suppressed the bitterness of the compounds and their binary combinations. Although most bitter compounds were suppressed, the bitterness of tetralone was not, nor was the bitterness of the binary mixtures that contained it. In general, the percent suppression of a binary mixture of compounds was predicted by the average percent suppression of its two components. Within the constraints of the present study, the bitterness of mixtures was suppressed by sodium salts and sucrose independently, with few bitter interactions. This is consistent with observations that the bitter taste system integrates the bitterness of multi-compound solutions linearly.

Relevance:

30.00%

Abstract:

We discuss the implementation of a number of modern methods of global and nonsmooth continuous optimization, based on the ideas of Rubinov, in the programming library GANSO. GANSO implements the derivative-free bundle method, the extended cutting angle method, dynamical-systems-based optimization, and their various combinations and heuristics. We outline the main ideas behind each method, and report on the interfacing with the Matlab and Maple packages.

Relevance:

30.00%

Abstract:

Appropriate training data always play an important role in constructing an efficient classifier for the data mining classification problem. The Support Vector Machine (SVM) is a comparatively new approach to constructing a model/classifier for data analysis, based on Statistical Learning Theory (SLT). SVM transforms the basic constrained optimization problem into a quadratic programming problem, which can be solved parsimoniously through standard methods. Our research applies SVM to classify data sets of a number of different sizes. We found SVM to perform well in discrimination compared with some other existing popular classifiers.
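
A minimal sketch of the kind of experiment described, using scikit-learn (an implementation assumption; the paper does not name one): train an SVM on synthetic data sets of several sizes and report discrimination accuracy on held-out data.

```python
# SVM classification across data sets of different sizes.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

for n in (100, 1000, 5000):
    X, y = make_classification(n_samples=n, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)    # default RBF-kernel SVM
    print(f"n={n}: test accuracy = {clf.score(X_te, y_te):.3f}")
```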