26 results for "Unconstrained minimization"

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 30.00%

Abstract:

Bound-constrained minimization is a subject of active research. To assess the performance of existing solvers, numerical evaluations and comparisons are carried out. Arbitrary decisions that may have a crucial effect on the conclusions of numerical experiments are highlighted in the present work. As a result, a detailed evaluation based on performance profiles is applied to the comparison of bound-constrained minimization solvers. Extensive numerical results are presented and analyzed.
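
The comparison methodology above relies on performance profiles in the sense of Dolan and Moré. A minimal sketch of how such profiles are computed, assuming a cost matrix costs[i, j] (e.g., runtime of solver j on problem i, with infinity marking failures); the data and names are illustrative, not taken from the paper:

```python
import numpy as np

def performance_profile(costs, taus):
    """Dolan-More performance profiles.

    costs: (n_problems, n_solvers) array; costs[i, j] is the cost
           (e.g., runtime) of solver j on problem i, np.inf on failure.
    taus:  1-D array of performance ratios at which to evaluate.

    Returns rho with rho[k, j] = fraction of problems that solver j
    solves within a factor taus[k] of the best solver on that problem.
    """
    best = np.min(costs, axis=1, keepdims=True)   # best cost per problem
    ratios = costs / best                         # performance ratios
    return np.array([(ratios <= t).mean(axis=0) for t in taus])

# Toy usage: 3 problems, 2 solvers; solver 2 fails on problem 2.
costs = np.array([[1.0, 2.0],
                  [5.0, np.inf],
                  [3.0, 3.0]])
print(performance_profile(costs, np.array([1.0, 2.0, 4.0])))
```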

Relevance: 20.00%

Abstract:

In this study, a dynamic programming approach to the unconstrained two-dimensional non-guillotine cutting problem is presented. The method extends the recently introduced recursive partitioning approach for the manufacturer's pallet loading problem. The approach involves two phases and uses bounds based on unconstrained two-staged and non-staged guillotine cutting. The method is able to find the optimal cutting pattern for a large number of problem instances of moderate size known in the literature, and no counterexample was found for which the approach fails to find a known optimal solution. For instances in which the required computer runtime is excessive, the approach is combined with simple heuristics to reduce its running time. Detailed numerical experiments show the reliability of the method. Journal of the Operational Research Society (2012) 63, 183-200. doi: 10.1057/jors.2011.6. Published online 17 August 2011.
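
A flavor of the guillotine-cutting bounds mentioned above: the best value of a w x h plate is either the best single piece that fits or the best split of the plate into two subplates by a vertical or horizontal guillotine cut. A minimal memoized sketch of this classical recursion (piece data are illustrative; this is the bounding subproblem, not the paper's non-guillotine algorithm):

```python
from functools import lru_cache

# (width, height, value) of the available piece types; unlimited copies.
PIECES = [(2, 3, 7), (3, 3, 11), (1, 4, 3)]

@lru_cache(maxsize=None)
def best(w, h):
    """Maximum value obtainable from a w x h plate with guillotine cuts."""
    value = max([v for (pw, ph, v) in PIECES if pw <= w and ph <= h],
                default=0)
    for x in range(1, w // 2 + 1):                 # vertical cuts
        value = max(value, best(x, h) + best(w - x, h))
    for y in range(1, h // 2 + 1):                 # horizontal cuts
        value = max(value, best(w, y) + best(w, h - y))
    return value

print(best(5, 5))
```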

Relevance: 20.00%

Abstract:

After sintering, advanced ceramics invariably exhibit distortions, caused in large part by the heterogeneous distribution of density gradients along the compacted piece. To correct distortions, machining is generally used to manufacture pieces within dimensional and geometric tolerances. Hence, narrow material-removal limit conditions are applied, which minimize the generation of damage. Another alternative is machining the compacted piece before sintering, in the so-called green ceramic stage, which allows machining without damage to mechanical strength. Since the greatest concentration of density gradients is located in the outermost layers of the compacted piece, this study investigated the removal of different allowance values by means of green machining. The output variables are distortion after sintering, tool wear, cutting force, and the surface roughness of the green and sintered ceramics. The following results were noted: less distortion is verified in the sintered piece after removal of a 1 mm allowance; and the higher the tool wear, the worse the surface roughness of both green and sintered pieces.

Relevance: 10.00%

Abstract:

Bernoulli's model for the vibration of beams is often used to predict the bending modulus of elasticity from dynamic tests. However, this model ignores rotary inertia and shear. Such effects can be added to the solution of Bernoulli's equation by means of the correction proposed by Goens (1931) or by Timoshenko (1953), but to apply these corrections it is necessary to know the E/G ratio of the material. The objective of this paper is the determination of the E/G ratio of wood logs by adjusting the analytical solution of the Timoshenko beam model to dynamic testing data of 20 Eucalyptus citriodora logs. The dynamic testing was performed with the logs in free-free suspension. To find the stiffness properties of the logs, residue minimization was carried out using a Genetic Algorithm (GA). From the analysis of the results, one can reasonably assume E/G = 20 for wood logs.
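
A minimal sketch of the residue-minimization step, with an evolutionary optimizer (SciPy's differential evolution) standing in for the paper's GA and a placeholder in place of the analytical Timoshenko frequency solution; all values and the model function are illustrative:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Measured natural frequencies of one log (Hz); illustrative values.
f_measured = np.array([83.3, 162.5, 213.6])

def model_frequencies(E, G):
    """Placeholder for the analytical Timoshenko free-free solution.
    The real model depends on log geometry, density, E and G; here the
    shear correction grows with mode number, the essential trait."""
    base = np.array([1.0, 2.6, 4.7])
    shear = np.array([0.01, 0.03, 0.06])
    return base * np.sqrt(E) / (1.0 + shear * E / G)

def residue(params):
    E, G = params
    return np.sum((model_frequencies(E, G) - f_measured) ** 2)

# Evolutionary global search over plausible stiffness bounds.
result = differential_evolution(residue, bounds=[(5e3, 3e4), (2e2, 3e3)],
                                seed=1, tol=1e-10)
E_opt, G_opt = result.x
print(f"fitted E/G = {E_opt / G_opt:.1f}")   # ~20 for these toy data
```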

Relevance: 10.00%

Abstract:

We compare experimental and predicted differential scanning calorimetry (DSC) curves for palm oil (PO), peanut oil (PeO) and grapeseed oil (GO). The predicted curves are computed from solid-liquid equilibrium modelling and direct minimization of the Gibbs free energy. For PO, the lower the scan rate, the better the agreement. The transition temperatures of PeO and GO were predicted with average deviations of -0.72 degrees C and -1.29 degrees C, respectively, in relation to experimental data from the literature. However, the predicted curves showed other peaks not reported experimentally, as computed DSC curves correspond to an equilibrium hypothesis, which is reached experimentally only for an infinitely small scan rate. The results revealed that transition temperatures predicted under equilibrium hypotheses can be useful in the pre-experimental evaluation of vegetable oil formulations seeking desired melting profiles. (C) 2012 Elsevier B.V. All rights reserved.
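
A minimal sketch of the direct Gibbs minimization idea for an ideal two-component solid-liquid system; real oils require many triacylglycerol pseudo-components and non-ideal models, and all property values below are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

R = 8.314                       # J/(mol K)
n  = np.array([0.6, 0.4])       # overall moles of each pseudo-component
dH = np.array([120e3, 90e3])    # enthalpies of fusion, J/mol
Tm = np.array([318.0, 300.0])   # pure-component melting points, K

def gibbs(s, T):
    """Dimensionless mixing + fusion Gibbs energy; s = solid-phase moles."""
    s = np.clip(s, 1e-12, n)
    l = np.clip(n - s, 1e-12, None)
    x_s, x_l = s / s.sum(), l / l.sum()
    mix = np.sum(s * np.log(x_s)) + np.sum(l * np.log(x_l))
    fus = -np.sum(s * dH * (1.0 - T / Tm)) / (R * T)   # dG_fus ~ dH(1 - T/Tm)
    return mix + fus

def liquid_fraction(T):
    res = minimize(gibbs, x0=0.5 * n, args=(T,),
                   bounds=[(0.0, ni) for ni in n], method="L-BFGS-B")
    return 1.0 - res.x.sum() / n.sum()

# Sweeping T yields a melting curve, the equilibrium analogue of a DSC scan.
for T in range(285, 325, 5):
    print(T, round(liquid_fraction(float(T)), 3))
```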

Relevance: 10.00%

Abstract:

This paper presents a survey of evolutionary algorithms designed for decision-tree induction. In this context, most of the paper focuses on approaches that evolve decision trees as an alternative heuristic to the traditional top-down divide-and-conquer approach. Additionally, we present some alternative methods that make use of evolutionary algorithms to improve particular components of decision-tree classifiers. The paper's original contributions are the following. First, it provides an up-to-date overview that is fully focused on evolutionary algorithms and decision trees and does not concentrate on any specific evolutionary approach. Second, it provides a taxonomy which addresses both works that evolve decision trees and works that design decision-tree components by the use of evolutionary algorithms. Finally, a number of references are provided that describe applications of evolutionary algorithms for decision-tree induction in different domains. At the end of the paper, we address some important issues and open questions that can be the subject of future research.
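
As a toy illustration of the surveyed idea of evolving whole trees rather than growing them top-down, the sketch below runs a mutation-only evolutionary search over small axis-aligned decision trees on a synthetic dataset (everything here is illustrative, not a method from the survey):

```python
import random

random.seed(0)

# Toy dataset: two numeric features in [0, 1], binary class label.
DATA = [((x, y), int(x + y > 1.0))
        for x in [i / 10 for i in range(11)]
        for y in [j / 10 for j in range(11)]]

def random_tree(depth=0):
    """A tree is either a leaf label or (feature, threshold, left, right)."""
    if depth >= 3 or random.random() < 0.3:
        return random.randint(0, 1)
    return (random.randint(0, 1), random.random(),
            random_tree(depth + 1), random_tree(depth + 1))

def classify(tree, x):
    while isinstance(tree, tuple):
        f, t, left, right = tree
        tree = left if x[f] <= t else right
    return tree

def fitness(tree):
    return sum(classify(tree, x) == y for x, y in DATA) / len(DATA)

def mutate(tree, depth=0):
    if random.random() < 0.2:               # replace a random subtree
        return random_tree(depth)
    if isinstance(tree, tuple):
        f, t, left, right = tree
        return (f, t, mutate(left, depth + 1), mutate(right, depth + 1))
    return tree

# Tournament selection plus mutation (no crossover, for brevity).
pop = [random_tree() for _ in range(50)]
for gen in range(40):
    parents = [max(random.sample(pop, 3), key=fitness) for _ in range(50)]
    pop = [mutate(p) for p in parents]

print("accuracy:", fitness(max(pop, key=fitness)))
```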

Relevance: 10.00%

Abstract:

An optimal control strategy for the highly active antiretroviral therapy associated with the acquired immunodeficiency syndrome should be designed on the basis of a comprehensive analysis of the drug chemotherapy behavior in the host tissues, from major viral replication sites to viral sanctuary compartments. Such an approach is critical in order to efficiently explore synergistic, competitive and prohibitive relationships among drugs and, hence, the minimization of therapy costs and side effects. In this paper, a novel mathematical model for HIV-1 drug chemotherapy dynamics in distinct host anatomic compartments is proposed and theoretically evaluated on fifteen conventional antiretroviral drugs. Rather than an interdependence between drug type and its concentration profile in a host tissue, simulated results suggest that such a profile is strongly correlated with the host tissue under consideration. Furthermore, the drug accumulation dynamics are drastically affected by low patient compliance with pharmacotherapy, even when a single dose is missed. (C) 2012 Elsevier Inc. All rights reserved.
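
A minimal sketch of the compartmental idea: bolus doses feed a plasma compartment that exchanges with a sanctuary compartment, and a missed dose depresses the sanctuary trough concentration. The rate constants and dosing schedule are illustrative, not the paper's fitted model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-compartment kinetics: plasma (c1) and a sanctuary site (c2).
# Rate constants (1/h) are illustrative, not fitted to any real drug.
k_el, k_12, k_21 = 0.35, 0.10, 0.05

def pk(t, c):
    c1, c2 = c
    return [-(k_el + k_12) * c1 + k_21 * c2,
            k_12 * c1 - k_21 * c2]

def trough(dose_times, dose=1.0, t_end=72.0):
    """Concentrations at t_end after bolus doses at the given times."""
    c, t0 = np.array([0.0, 0.0]), 0.0
    for t1 in list(dose_times) + [t_end]:
        if t1 > t0:
            c = solve_ivp(pk, (t0, t1), c, max_step=0.5).y[:, -1]
            t0 = t1
        if t1 < t_end:
            c[0] += dose        # instantaneous absorption into plasma
    return c

adherent = trough([0, 12, 24, 36, 48, 60])
missed   = trough([0, 12, 36, 48, 60])    # the 24 h dose is skipped
print("sanctuary concentration, adherent vs missed:",
      adherent[1], missed[1])
```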

Relevance: 10.00%

Abstract:

The aim of this work was to investigate the effect of different feeding times (2, 4 and 6 h) and applied volumetric organic loads (4.5, 6.0 and 7.5 gCOD L^-1 day^-1) on the performance of an anaerobic sequencing batch biofilm reactor (AnSBBR) treating effluent from biodiesel production. Polyurethane foam cubes were used as inert support in the reactor, and mixing was accomplished by recirculating the liquid phase. The effect of feeding time on reactor performance proved more pronounced at higher applied volumetric organic loads (AVOLs). The highest organic material removal efficiency achieved at an AVOL of 4.5 gCOD L^-1 day^-1 was 87 % with 4-h feeding, against 84 % with 2-h and 6-h feeding. At an AVOL of 6.0 gCOD L^-1 day^-1, the highest organic material removal efficiency, 84 %, was achieved with 4-h and 6-h feeding, against 71 % with 2-h feeding. At an AVOL of 7.5 gCOD L^-1 day^-1, the organic material removal efficiency achieved with 4-h feeding was 77 %. Hence, longer feeding times favored the minimization of total volatile acids concentration during the cycle as well as in the effluent, guaranteeing process stability and safety.

Relevance: 10.00%

Abstract:

The single machine scheduling problem with a common due date and non-identical job ready times is examined in this work. Performance is measured by the minimization of the weighted sum of the earliness and tardiness penalties of the jobs. Since this problem is NP-hard, the application of constructive heuristics that exploit specific characteristics of the problem to improve their performance is investigated. The proposed approaches are examined through a computational comparative study on a set of 280 benchmark test problems with up to 1000 jobs.
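
A minimal sketch of the objective and of a generic constructive (greedy-insertion) heuristic; the dispatching rule is an illustration, not one of the paper's heuristics:

```python
from typing import List, Tuple

Job = Tuple[float, float, float, float]  # (processing, ready, w_early, w_tardy)

def cost(sequence: List[Job], due: float) -> float:
    """Weighted earliness-tardiness of a sequence around a common due date."""
    t, total = 0.0, 0.0
    for p, r, we, wt in sequence:
        t = max(t, r) + p                      # wait for release, then run
        total += we * max(due - t, 0.0) + wt * max(t - due, 0.0)
    return total

def construct(jobs: List[Job], due: float) -> List[Job]:
    """Greedy insertion: place each job where it increases cost least."""
    seq: List[Job] = []
    for job in sorted(jobs, key=lambda j: j[1]):   # by ready time
        best = min(range(len(seq) + 1),
                   key=lambda i: cost(seq[:i] + [job] + seq[i:], due))
        seq.insert(best, job)
    return seq

jobs = [(3, 0, 1, 2), (2, 1, 2, 1), (4, 0, 1, 3), (1, 2, 3, 1)]
print(cost(construct(jobs, due=6.0), 6.0))
```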

Relevance: 10.00%

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst-case scenario, the GC_max algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC_max runs in linear time with respect to the image size |C|. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ||F_P||_∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms into the realm of the graph cut energy minimizers, with energy functions ||F_P||_q for q ∈ [1, ∞]. Of these, the best known minimization problem is for the energy ||F_P||_1, which is solved by the classic min-cut/max-flow algorithm, referred to often as the Graph Cut algorithm. We notice that the minimization problem for ||F_P||_q, q ∈ [1, ∞), is identical to that for ||F_P||_1 when the original weight function w is replaced by w^q. Thus, any algorithm GC_sum solving the ||F_P||_1 minimization problem also solves the one for ||F_P||_q with q ∈ [1, ∞), so just two algorithms, GC_sum and GC_max, are enough to solve all ||F_P||_q minimization problems. We also show that, for any fixed weight assignment, the solutions of the ||F_P||_q minimization problems converge to a solution of the ||F_P||_∞ minimization problem (the identity ||F_P||_∞ = lim_{q→∞} ||F_P||_q alone is not enough to deduce that). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. This concentrates on comparing the actual (as opposed to provable worst-scenario) running times of the algorithms, as well as the influence of the choice of seeds on the output.
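
In the notation above, the boundary energies and the reduction to the classical Graph Cut solver read:

```latex
% Boundary energies of an object P with edge weights w(e):
\|F_P\|_q      = \Bigl( \sum_{e \in \mathrm{bd}(P)} w(e)^q \Bigr)^{1/q},
                 \qquad q \in [1, \infty),
\|F_P\|_\infty = \max_{e \in \mathrm{bd}(P)} w(e).
% Since t \mapsto t^{1/q} is increasing, minimizers are preserved:
\operatorname*{arg\,min}_P \|F_P\|_q
  = \operatorname*{arg\,min}_P \sum_{e \in \mathrm{bd}(P)} w(e)^q,
% i.e., a min-cut/max-flow (l_1) solver applied to the weights w^q
% solves the l_q problem, which is the reduction stated in the abstract.
```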

Relevance: 10.00%

Abstract:

Mining operations around the world make extensive use of blasthole sampling for short-term planning, which has two undisputed advantages: (1) blastholes are closely spaced, providing a relatively high sampling density per ton, and (2) there is no additional cost, since the blastholes must be drilled anyway. However, blasthole sampling usually presents poor sampling precision, and the inconstant sampling bias caused by particle size and density segregation is an even more serious problem, generally precluding representativeness. One of the main causes of this bias is a highly varying loss of fines, which can lead to both under- and over-estimation of grade, depending on the ore type and the gangue. This study validates a new, modified sectorial sampler designed to reduce the loss of fines and thereby increase sampling accuracy for narrow-diameter blasthole sampling. First results show a significantly improved estimation of gold grade as well as minimization of the loss of fines.

Relevance: 10.00%

Abstract:

At each outer iteration of standard Augmented Lagrangian methods, one tries to solve a box-constrained optimization problem to some prescribed tolerance. In the continuous world, using exact arithmetic, this subproblem is always solvable. Therefore, the possibility of finishing the subproblem resolution without satisfying the theoretical stopping conditions is not contemplated in the usual convergence theories. However, in practice, one might not be able to solve the subproblem up to the required precision. This may be due to different reasons, one of which is that the presence of an excessively large penalty parameter could impair the performance of the box-constrained optimization solver. In this paper, a practical strategy for decreasing the penalty parameter in situations like the one mentioned above is proposed. More generally, the different decisions that may be taken when, in practice, one is not able to solve the Augmented Lagrangian subproblem are discussed. As a result, an improved Augmented Lagrangian method is presented, which takes numerical difficulties into account in a satisfactory way while preserving a suitable convergence theory. Numerical experiments are presented involving all the CUTEr collection test problems.
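
A minimal sketch of an Augmented Lagrangian outer loop on a toy problem, including a schematic version of the idea above: if the box-constrained subproblem solver fails, back off the penalty parameter instead of insisting on the prescribed tolerance. The update constants and the toy problem are illustrative, not the paper's algorithm:

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: min f(x) s.t. h(x) = 0, 0 <= x <= 2.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
h = lambda x: x[0] + x[1] - 2.0          # single equality constraint

def augmented_lagrangian(x, lam=0.0, rho=10.0, tol=1e-8):
    for _ in range(30):
        L = lambda x: f(x) + lam * h(x) + 0.5 * rho * h(x) ** 2
        sub = minimize(L, x, method="L-BFGS-B", bounds=[(0, 2), (0, 2)])
        if not sub.success:
            rho *= 0.5        # schematic version of the paper's idea:
            continue          # back off the penalty and retry
        infeas_old, x = abs(h(x)), sub.x
        lam += rho * h(x)                     # multiplier update
        if abs(h(x)) < tol:
            break
        if abs(h(x)) > 0.25 * infeas_old:     # slow feasibility progress:
            rho *= 10.0                       # increase the penalty
    return x

print(augmented_lagrangian(np.array([0.0, 0.0])))   # ~ (0.25, 1.75)
```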

Relevance: 10.00%

Abstract:

The existence and stability of three-dimensional (3D) solitons in cross-combined linear and nonlinear optical lattices are investigated. In particular, starting from an optical lattice (OL) configuration that is linear in the x direction and nonlinear in the y direction, we consider the z direction either unconstrained (quasi-2D OL case) or carrying another linear OL (full 3D case). We perform this study both analytically and numerically: analytically by a variational approach based on a Gaussian ansatz for the soliton wavefunction, and numerically by relaxation methods and direct integrations of the corresponding Gross-Pitaevskii equation. We conclude that, while 3D solitons in the quasi-2D OL case are always unstable, the addition of another linear OL in the z direction allows us to stabilize 3D solitons for both attractive and repulsive mean interactions. From our results, we suggest the possible use of spatial modulations of the nonlinearity in one of the directions as a tool for the management of stable 3D solitons.
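
A minimal sketch of the variational step: minimize the Gaussian-ansatz energy over the three widths, here with linear lattices along x and z and, for brevity, a constant attractive nonlinearity instead of the paper's spatially modulated one. Units take hbar = m = 1, all constants are illustrative, and the local minimum found is the metastable soliton (for attractive coupling, a joint collapse of all widths always lowers the energy further):

```python
import numpy as np
from scipy.optimize import minimize

# Lattice depth, lattice wavenumber, and (attractive) coupling; illustrative.
V0, k, g = 2.0, 1.0, -1.5

def energy(widths):
    """Gaussian-ansatz GP energy per particle as a function of the widths."""
    ax, ay, az = np.abs(widths)
    kinetic = sum(1.0 / (4 * a ** 2) for a in (ax, ay, az))
    # <V0 sin^2(k x)> over a Gaussian of width a, lattices along x and z:
    lattice = sum(V0 * (1 - np.exp(-(k * a) ** 2)) / 2 for a in (ax, az))
    interaction = g / (2 * (2 * np.pi) ** 1.5 * ax * ay * az)
    return kinetic + lattice + interaction

res = minimize(energy, x0=[1.0, 1.0, 1.0], method="Nelder-Mead")
print("optimal widths:", np.abs(res.x), "energy:", res.fun)
```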

Relevance: 10.00%

Abstract:

Existing studies of on-line process control are concerned with economic aspects, and the parameters of the processes are optimized with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of the corrective adjustment is evaluated, because it is assumed that the equipment becomes "as good as new" after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating the following two dimensions: on-line process control and a corrective maintenance program. Both performances are objects of a single average-cost-per-item minimization. Adjustments are based on the location of the measurement of a quality characteristic of interest within three decision zones. Numerical examples illustrate the proposal. (c) 2012 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results are dependent on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations that respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors nor by failure probability constraints, but will depend on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
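
A minimal sketch of the risk optimization (RO) idea on a single design variable: total expected cost is manufacturing cost plus failure probability times failure cost, with the failure probability from a toy normal load-resistance margin; all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# One design variable d (member size). Resistance R ~ N(mu_R*d, (0.1*mu_R*d)^2),
# load S ~ N(10, 2^2); costs and strength are illustrative.
mu_R, c_mfg, c_fail = 5.0, 1.0, 500.0

def p_failure(d):
    mean = mu_R * d - 10.0                       # margin M = R - S
    std = np.hypot(0.1 * mu_R * d, 2.0)
    return norm.cdf(0.0, loc=mean, scale=std)    # P(M < 0)

def total_expected_cost(d):
    return c_mfg * d + c_fail * p_failure(d)

res = minimize_scalar(total_expected_cost, bounds=(1.0, 10.0),
                      method="bounded")
d_opt = res.x
print(f"RO optimum d = {d_opt:.2f}, Pf = {p_failure(d_opt):.2e}, "
      f"expected cost = {res.fun:.2f}")
# A DDO run would instead minimize c_mfg * d subject to a safety factor,
# and an RBDO run subject to p_failure(d) <= target.
```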