953 results for Complex combinatorial problem
Abstract:
This report outlines the problem of intelligent failure recovery in a problem solver for electrical design. We want our problem solver to learn as much as it can from its mistakes. Thus we cast the engineering design process in terms of Problem Solving by Debugging Almost-Right Plans (PSBDARP), a paradigm for automatic problem solving based on the belief that the creation and removal of "bugs" is an unavoidable part of solving a complex problem. The localization and removal of bugs called for by the PSBDARP theory requires an approach to engineering analysis in which every result carries a justification describing the exact set of assumptions it depends upon. We have developed a program based on Analysis by Propagation of Constraints which can explain the basis of its deductions. In addition to being useful to a PSBDARP designer, these justifications are used in Dependency-Directed Backtracking to limit the combinatorial search in the analysis routines. Although the research we describe is explicitly about electrical circuits, we believe that similar principles and methods are employed by other kinds of engineers, including computer programmers.
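A minimal sketch of the idea behind justification-carrying analysis, assuming a toy Ohm's-law propagator (the Cell/assign names are illustrative, not the authors' program): every deduced value records the assumptions it rests on, so a contradiction yields a small conflict set for Dependency-Directed Backtracking instead of blind chronological search.

```python
# Illustrative sketch only: value propagation with recorded justifications.

class Cell:
    def __init__(self, name):
        self.name = name
        self.value = None
        self.premises = set()   # names of the assumptions this value depends on

class Contradiction(Exception):
    def __init__(self, conflict):
        super().__init__(f"conflict among assumptions {sorted(conflict)}")
        self.conflict = conflict  # the set handed to dependency-directed backtracking

def assign(cell, value, premises):
    """Set a cell's value, recording which assumptions justify it."""
    if cell.value is not None and cell.value != value:
        # The union of both justifications is the culprit set: at least one of
        # these assumptions must be retracted; assumptions outside the set
        # need not be searched over at all.
        raise Contradiction(cell.premises | premises)
    cell.value, cell.premises = value, set(premises)

def ohms_law(v, i, r):
    """One local propagator: deduce whichever of V, I, R is still unknown."""
    if i.value is not None and r.value is not None:
        assign(v, i.value * r.value, i.premises | r.premises)
    elif v.value is not None and r.value is not None and r.value != 0:
        assign(i, v.value / r.value, v.premises | r.premises)
    elif v.value is not None and i.value is not None and i.value != 0:
        assign(r, v.value / i.value, v.premises | i.premises)

v, i, r = Cell("V"), Cell("I"), Cell("R")
assign(r, 100.0, {"R=100"})
assign(i, 0.05, {"guess: I=50mA"})
ohms_law(v, i, r)
print(v.value, v.premises)   # deduced voltage together with the premises it depends on
```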
Abstract:
This study was undertaken to explore gel permeation chromatography (GPC) for estimating molecular weights of proanthocyanidin fractions isolated from sainfoin (Onobrychis viciifolia). The results were compared with data obtained by thiolytic degradation of the same fractions. Polystyrene, polyethylene glycol and polymethyl methacrylate standards were not suitable for estimating the molecular weights of underivatized proanthocyanidins. Therefore, a novel HPLC-GPC method was developed based on two serially connected PolarGel-L columns using DMF that contained 5% water, 1% acetic acid and 0.15 M LiBr at 0.7 ml/min and 50 °C. This yielded a single calibration curve for galloyl glucoses (trigalloyl glucose, pentagalloyl glucose), ellagitannins (pedunculagin, vescalagin, punicalagin, oenothein B, gemin A), proanthocyanidins (procyanidin B2, cinnamtannin B1), and several other polyphenols (catechin, epicatechin gallate, epigallocatechin gallate, amentoflavone). These GPC-predicted molecular weights represented a considerable advance over previously reported HPLC-GPC methods for underivatized proanthocyanidins. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) where the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM), suitable for complex-valued multiple-input–multiple-output processing is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which simultaneously requires the smallest training error and the smallest norm of the output weights. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyper-planes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyper-planes through orthogonal projections. The six hyper-planes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest to real-time applications.
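For orientation, a minimal real-valued ELM sketch (not the CELM of this paper, whose complex kernels and Wirtinger calculus are beyond a short example): the hidden layer is random and fixed, and only the output weights are obtained in closed form, which is where the low training cost of ELM-family classifiers comes from.

```python
import numpy as np

def elm_train(X, y, n_hidden=50, reg=1e-3, seed=0):
    """Train a basic (real-valued) extreme learning machine.
    Hidden-layer weights are random and never updated; only the
    output weights beta are fitted, via regularized least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # random feature map
    # beta = (H'H + reg*I)^{-1} H'y  -- closed-form solution of the KKT system
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# toy binary problem with labels +/-1
X = np.random.default_rng(1).normal(size=(200, 4))
y = np.sign(X[:, 0] * X[:, 1] + 0.1)
W, b, beta = elm_train(X, y)
acc = np.mean(np.sign(elm_predict(X, W, b, beta)) == y)
print(f"training accuracy: {acc:.2f}")
```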
Abstract:
This paper proposes strategies to reduce the number of variables and the combinatorial search space of the multistage transmission expansion planning (TEP) problem. The concept of the binary numeral system (BNS) is used to reduce the number of binary and continuous variables related to the candidate transmission lines and the network constraints connected with them. The construction phase of the greedy randomized adaptive search procedure (GRASP-CP), together with additional constraints obtained from power flow equilibrium in an electric power system, is employed to further reduce the search space. The multistage TEP problem is modeled as a mixed binary linear programming problem and solved using a commercial solver with low computational time. The results of one test system and two real systems are presented in order to show the efficiency of the proposed solution technique. © 1969-2012 IEEE.
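To make the binary-numeral-system idea concrete (the numbers here are hypothetical, not from the paper): if up to n parallel candidate lines may be added in a corridor, a per-line formulation needs n binary variables, while a BNS encoding needs only ceil(log2(n+1)), the number of added lines being read off as a binary number.

```python
import math

def bns_variable_count(max_lines_per_corridor):
    """Binary variables needed to encode 0..n added lines in one corridor:
    per-line encoding vs. binary-numeral-system encoding."""
    n = max_lines_per_corridor
    return n, math.ceil(math.log2(n + 1))

def lines_built(bits):
    """Decode the number of lines added from the BNS binary variables (LSB first)."""
    return sum(b << k for k, b in enumerate(bits))

# e.g. a corridor allowing up to 7 new lines: 7 binaries per-line vs 3 with BNS
print(bns_variable_count(7))        # (7, 3)
print(lines_built([1, 1, 0]))       # bits b0=1, b1=1, b2=0 -> 3 lines
```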
Abstract:
Many combinatorial problems coming from the real world may not have a clear and well-defined structure, typically being complicated by side constraints or being composed of two or more sub-problems, usually not disjoint. Such problems are not well suited to pure approaches based on a single programming paradigm, because a paradigm that can effectively handle one characteristic of a problem may behave inefficiently when facing other characteristics. In these cases, modelling the problem using different programming techniques, trying to "take the best" from each technique, can produce solvers that largely dominate pure approaches. We demonstrate the effectiveness of hybridization and discuss different hybridization techniques by analyzing two classes of problems with particular structures, exploiting Constraint Programming and Integer Linear Programming solving tools, and Algorithm Portfolios and Logic-Based Benders Decomposition as integration and hybridization frameworks.
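A schematic logic-based Benders loop on a toy assignment problem, to make the hybridization concrete (the master/subproblem split, the brute-force "ILP" master and the capacity-check "CP" subproblem are generic placeholders, not the thesis's models):

```python
from itertools import product

# Toy logic-based Benders decomposition (purely illustrative):
# master: assign each job to a machine, minimizing assignment cost;
# subproblem: check that each machine's assigned jobs fit its capacity;
# cut: forbid the offending (job, machine) combination and iterate.

jobs = {"j1": 3, "j2": 4, "j3": 2}          # job sizes
machines = {"m1": 5, "m2": 5}               # machine capacities
cost = {(j, m): (1 if m == "m1" else 2) for j in jobs for m in machines}
cuts = []                                   # no-good cuts returned by the subproblem

def solve_master():
    """Brute-force 'ILP' master: cheapest assignment not excluded by a cut."""
    best = None
    for choice in product(machines, repeat=len(jobs)):
        assign = dict(zip(jobs, choice))
        if any(all(assign[j] == m for j, m in cut) for cut in cuts):
            continue
        c = sum(cost[(j, assign[j])] for j in jobs)
        if best is None or c < best[0]:
            best = (c, assign)
    return best

while True:
    total, assign = solve_master()
    # 'CP'-style subproblem: capacity feasibility check per machine
    overloaded = [m for m in machines
                  if sum(jobs[j] for j in jobs if assign[j] == m) > machines[m]]
    if not overloaded:
        print("optimal:", total, assign)
        break
    m = overloaded[0]
    cuts.append([(j, m) for j in jobs if assign[j] == m])  # Benders no-good cut
```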
Abstract:
Combinatorial Optimization is becoming ever more crucial these days. From the natural sciences to economics, passing through the administration of urban centers and personnel management, methodologies and algorithms with a strong theoretical background and consolidated real-world effectiveness are increasingly requested, in order to quickly find good solutions to complex strategic problems. Resource optimization is nowadays a fundamental ground on which to build the foundations of successful projects. From the theoretical point of view, Combinatorial Optimization rests on stable and strong foundations that allow researchers to face ever more challenging problems. However, from the application point of view, the rate of theoretical development cannot keep up with that enjoyed by modern hardware technologies, especially that of the processor industry. In this work we propose new parallel algorithms, designed to exploit the new parallel architectures available on the market. We found that, by exposing the inherent parallelism of some resolution techniques (such as Dynamic Programming), the computational benefits are remarkable, lowering execution times by more than an order of magnitude and allowing us to address instances of dimensions not possible before. We approached four notable Combinatorial Optimization problems: the Packing Problem, the Vehicle Routing Problem, the Single Source Shortest Path Problem and a Network Design problem. For each of these problems we propose a collection of effective parallel solution algorithms, either for solving the full problem (Guillotine Cuts and SSSPP) or for enhancing a fundamental part of the solution method (VRP and ND). We support our claims by presenting computational results for all problems, either on standard benchmarks from the literature or, when possible, on data from real-world applications, where speed-ups of one order of magnitude are usually attained, not uncommonly scaling up to factors of 40x.
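One way to read "exposing the inherent parallelism" of Dynamic Programming (a generic illustration, not the thesis's algorithms): in a knapsack-style DP each stage depends only on the previous one, so all cells of a stage can be computed concurrently.

```python
from concurrent.futures import ThreadPoolExecutor

def knapsack_parallel(values, weights, capacity, workers=4):
    """0/1 knapsack DP. Each capacity entry of a stage depends only on the
    previous stage, so a whole stage can be filled in parallel (illustrative:
    with CPython threads the gain is pedagogical rather than real)."""
    prev = [0] * (capacity + 1)

    def cell(c, v, w):
        return prev[c] if w > c else max(prev[c], prev[c - w] + v)

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for v, w in zip(values, weights):
            # all cells of the current stage are independent of one another
            prev = list(pool.map(lambda c: cell(c, v, w), range(capacity + 1)))
    return prev[capacity]

print(knapsack_parallel([60, 100, 120], [10, 20, 30], 50))  # 220
```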
Abstract:
The procurement of transportation services via large-scale combinatorial auctions involves a couple of complex decisions whose outcomes highly influence the performance of the tender process. This paper examines the shipper's task of selecting a subset of the submitted bids which efficiently trades off total procurement cost against expected carrier performance. To solve this bi-objective winner determination problem, we propose a Pareto-based greedy randomized adaptive search procedure (GRASP). As a post-optimizer we use a path relinking procedure which is hybridized with branch-and-bound. Several variants of this algorithm are evaluated by means of artificial test instances which comply with important real-world characteristics. The two best variants prove superior to a previously published Pareto-based evolutionary algorithm.
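A generic GRASP construction phase for bid selection, illustrating the mechanism only (the single-objective score and data layout are placeholders, not the paper's bi-objective model): candidates are ranked greedily, a restricted candidate list (RCL) is formed, and one member is drawn at random.

```python
import random

def grasp_construct(bids, budget, alpha=0.3, seed=None):
    """Greedy randomized construction of a bid set (illustrative).
    Each bid is (name, cost, score); higher score/cost is greedier.
    alpha controls the width of the restricted candidate list (RCL)."""
    rng = random.Random(seed)
    selected, spent = [], 0.0
    candidates = list(bids)
    while candidates:
        feasible = [b for b in candidates if spent + b[1] <= budget]
        if not feasible:
            break
        ratios = [b[2] / b[1] for b in feasible]
        lo, hi = min(ratios), max(ratios)
        threshold = hi - alpha * (hi - lo)          # greedy cut-off
        rcl = [b for b, r in zip(feasible, ratios) if r >= threshold]
        pick = rng.choice(rcl)                      # randomized greedy choice
        selected.append(pick[0])
        spent += pick[1]
        candidates.remove(pick)
    return selected, spent

bids = [("carrierA", 40, 30), ("carrierB", 25, 22), ("carrierC", 35, 20)]
print(grasp_construct(bids, budget=70, alpha=0.3, seed=42))
```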
Abstract:
Computational fluid dynamic (CFD) methods are used in this paper to predict the power production of entire wind farms in complex terrain and to shed some light on the wake flow patterns. Two full three-dimensional Navier–Stokes solvers for incompressible fluid flow, employing k − ϵ and k − ω turbulence closures, are used. The wind turbines are modeled as momentum absorbers by means of their thrust coefficient through the actuator disk approach. Alternative methods for estimating the reference wind speed in the calculation of the thrust are tested. The work presented in this paper is part of the work being undertaken within the UpWind Integrated Project, which aims to develop the design tools for the next generation of large wind turbines. In this part of UpWind, the performance of wind farm and wake models is being examined in a complex terrain environment where there are few pre-existing relevant measurements. The focus of the work is to evaluate the performance of CFD models in large wind farm applications in complex terrain and to examine the development of the wakes in such an environment.
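As a pointer to how the actuator disk model enters the momentum equations (the textbook momentum-theory relation, not the paper's solvers): each turbine is imposed as an axial momentum sink of magnitude T = 0.5 ρ C_T A U_ref², distributed over the cells covered by the rotor disk. The values below are illustrative.

```python
import math

def actuator_disk_thrust(rho, ct, rotor_diameter, u_ref):
    """Thrust of a turbine modeled as an actuator disk:
    T = 0.5 * rho * C_T * A * U_ref^2 (standard momentum-theory relation)."""
    area = math.pi * (rotor_diameter / 2.0) ** 2
    return 0.5 * rho * ct * area * u_ref ** 2

# Illustrative values: air density 1.225 kg/m^3, C_T = 0.8,
# 80 m rotor diameter, 8 m/s reference wind speed.
T = actuator_disk_thrust(1.225, 0.8, 80.0, 8.0)
print(f"thrust = {T/1e3:.1f} kN")   # force applied as a momentum sink over the disk cells
```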
Abstract:
This paper presents a robotic complex for drilling and milling sandwich shells made of polymeric composites. The machining of polymeric composite materials poses technological problems. When drilling sandwich shells there is a risk of drill breakage if the tool strikes an internal partition. A sensorized system for the robotic complex is proposed to increase the reliability of the small-size cutting tool.
Abstract:
Cover title.
Abstract:
This thesis is concerned with Organisational Problem Solving. The work reflects the complexities of organisational problem situations and the eclectic approach that has been necessary to gain an understanding of the processes involved. The thesis is structured into three main parts. Part I describes the author's understanding of problems and suitable approaches. Chapter 2 identifies the Transcendental Realist (TR) view of science (Harré 1970, Bhaskar 1975) as the best general framework for identifying suitable approaches to complex organisational problems. Chapter 3 discusses the relationship between Checkland's methodology (1972) and TR. The need to generate iconic (explanatory) models of the problem situation is identified, and the ability of viable system modelling to supplement the modelling stage of the methodology is explored in Chapter 4. Chapter 5 builds further on the methodology to produce an original iconic model of the methodological process. The model characterises the mechanisms of organisational problem situations as well as desirable procedural steps. The Weltanschauungen (W's) or "world views" of key actors are recognised as central to the mechanisms involved. Part II describes the experience which prompted the theoretical investigation. Chapter 6 describes the first year of the project. The success of this stage is attributed to the predominance of a single W. Chapter 7 describes the changes in the organisation which made the remaining phase of the project difficult. These difficulties are attributed to a failure to recognise the importance of differing W's. Part III revisits the theoretical and organisational issues. Chapter 8 identifies a range of techniques embodying W's which are compatible with the framework of Part I and which might usefully supplement it. Chapter 9 characterises possible W's in the sponsoring organisation. Throughout the work, an attempt is made to reflect the process as well as the product of the author's learning.
Abstract:
The major barrier to practical optimization of pavement preservation programming has always been that, for formulations where the identity of individual projects is preserved, the solution space grows exponentially with the problem size to an extent where it can become unmanageable by traditional analytical optimization techniques within reasonable limits. This has been attributed to the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, and the trade-off considerations required between preventive maintenance, rehabilitation and reconstruction, present yet another factor that contributes to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm. A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output from this program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, and the impact on the overall performance of the network, in terms of the performance levels of the recommended optimal M&R strategy. The results show that the SCE is very efficient and consistent in the simultaneous consideration of the trade-off between various pavement preservation strategies, while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated, in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems. It is recommended that for large networks some sort of decomposition technique should be applied to aggregate sections which exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections connected to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space. The study concludes that the robust search characteristics of SCE are well suited to solving the combinatorial problems in long-term network-level pavement M&R programming and provide a rich area for future research.
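A highly simplified evolutionary search over per-segment, per-year treatment choices, shown only to make the encoding concrete (a plain genetic-style loop standing in for the Shuffled Complex Evolution algorithm; all costs, benefits and the budget are hypothetical):

```python
import random

# Each candidate plan picks one treatment per segment per year;
# fitness = total effectiveness, with a penalty for exceeding the budget.
TREATMENTS = {"none": (0, 0), "prev_maint": (5, 3),
              "rehab": (20, 10), "reconstruct": (60, 25)}   # (cost, benefit)
SEGMENTS, YEARS, BUDGET = 6, 3, 120
rng = random.Random(0)

def random_plan():
    return [[rng.choice(list(TREATMENTS)) for _ in range(YEARS)]
            for _ in range(SEGMENTS)]

def fitness(plan):
    cost = sum(TREATMENTS[t][0] for seg in plan for t in seg)
    benefit = sum(TREATMENTS[t][1] for seg in plan for t in seg)
    return benefit - max(0, cost - BUDGET) * 10   # penalize budget overruns

def evolve(generations=200, pop_size=30):
    pop = [random_plan() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = [row[:] for row in parent]
            s, y = rng.randrange(SEGMENTS), rng.randrange(YEARS)
            child[s][y] = rng.choice(list(TREATMENTS))      # point mutation
            children.append(child)
        pop = survivors + children
    best = max(pop, key=fitness)
    return best, fitness(best)

plan, score = evolve()
print("best fitness:", score)
```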