773 results for minimization


Relevance:

10.00%

Publisher:

Abstract:

In the current economic scenario, the constant search for improvements in production quality and for cost reduction is essential. Intense competition and technological innovation make customers increasingly demanding and push companies to seek multiple sources of improvement in production. This work applied the overall desirability function to optimize a machining experiment involving multiple responses. Cylindrical turning is one of the most common metal-cutting machining processes and involves several factors; the best combination of input factors was analysed, with surface roughness (Ra) and cutting length (Lc) as the response variables, both important measures of process efficiency and product quality. The method is a case study, since it examines a tool well covered in the literature. The data analysed come from the doctoral thesis of Ricardo Penteado, which used metaheuristics combined with different aggregation methods for the optimization of a multi-response turning process; here the desirability function was used as the analysis tool. Joint optimization by desirability proposed the following combination of input variables: cutting speed of 90 m/min (level -1), feed rate of 0.12 mm/rev (level -1), depth of cut of 1.6 mm (level +1), TP2500 insert grade (level -1), abundant cutting fluid (level +1) and laminated material (level +1), maximizing cutting length (Lc) and minimizing roughness (Ra).
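
For illustration only, a minimal Python sketch of Derringer-Suggs overall desirability, the kind of aggregation used in this joint optimization; the bounds and response values below are hypothetical, not the study's data:

    import numpy as np

    def d_larger_is_better(y, y_min, y_max, s=1.0):
        """Desirability for a response to maximize (e.g., cutting length Lc)."""
        d = (y - y_min) / (y_max - y_min)
        return np.clip(d, 0.0, 1.0) ** s

    def d_smaller_is_better(y, y_min, y_max, s=1.0):
        """Desirability for a response to minimize (e.g., roughness Ra)."""
        d = (y_max - y) / (y_max - y_min)
        return np.clip(d, 0.0, 1.0) ** s

    def overall_desirability(ds):
        """Geometric mean of individual desirabilities (Derringer-Suggs)."""
        ds = np.asarray(ds, dtype=float)
        return float(np.prod(ds) ** (1.0 / len(ds)))

    # Hypothetical responses predicted for one factor combination:
    lc = 2400.0   # cutting length, m (maximize)
    ra = 0.9      # surface roughness, um (minimize)
    D = overall_desirability([
        d_larger_is_better(lc, y_min=500.0, y_max=3000.0),
        d_smaller_is_better(ra, y_min=0.3, y_max=2.0),
    ])
    print(f"overall desirability D = {D:.3f}")  # higher is better

The optimizer then searches the factor space (speed, feed, depth, insert, fluid, material) for the combination with the highest D.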

Relevance:

10.00%

Publisher:

Abstract:

Bernoulli's model for the vibration of beams is often used to predict the bending modulus of elasticity from dynamic tests. However, this model ignores rotary inertia and shear. Such effects can be added to the solution of Bernoulli's equation by means of the correction proposed by Goens (1931) or by Timoshenko (1953). But to apply these corrections it is necessary to know the E/G ratio of the material. The objective of this paper is the determination of the E/G ratio of wood logs by adjusting the analytical solution of the Timoshenko beam model to dynamic testing data from 20 Eucalyptus citriodora logs. The dynamic testing was performed with the logs in free-free suspension. To find the stiffness properties of the logs, residue minimization was carried out using a genetic algorithm (GA). From the analysis of the results, one can reasonably assume E/G = 20 for wood logs.
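
A minimal sketch of the fitting idea, assuming a hypothetical surrogate for the analytical Timoshenko solution and made-up measured frequencies; SciPy's differential evolution stands in here for the paper's genetic algorithm:

    import numpy as np
    from scipy.optimize import differential_evolution

    # Hypothetical measured flexural frequencies of one log (Hz), free-free test.
    f_measured = np.array([92.0, 240.0, 445.0])

    def timoshenko_frequencies(E, G):
        """Placeholder for the analytical Timoshenko beam solution.

        In the actual study this would return the free-free natural
        frequencies of a log for given E (bending) and G (shear) moduli,
        geometry and density. Here: a toy surrogate, just to run the demo,
        in which shear flexibility softens the higher modes.
        """
        base = np.sqrt(E) * np.array([1.0, 2.7, 5.2]) * 1e-4
        return base / (1.0 + 0.05 * np.sqrt(E / G))

    def residue(params):
        E, G = params
        return np.sum((timoshenko_frequencies(E, G) - f_measured) ** 2)

    # Evolutionary minimization of the residue over plausible bounds (Pa).
    result = differential_evolution(residue, bounds=[(5e9, 30e9), (2e8, 3e9)],
                                    seed=1)
    E_opt, G_opt = result.x
    print(f"E/G ratio = {E_opt / G_opt:.1f}")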

Relevance:

10.00%

Publisher:

Abstract:

We compare experimental and predicted differential scanning calorimetry (DSC) curves for palm oil (PO), peanut oil (PeO) and grapeseed oil (GO). The predicted curves are computed from solid-liquid equilibrium modelling and direct minimization of the Gibbs free energy. For PO, the lower the scan rate, the better the agreement. The transition temperatures of PeO and GO were predicted with average deviations of -0.72 °C and -1.29 °C, respectively, relative to experimental data from the literature. However, the predicted curves showed peaks not reported experimentally, since computed DSC curves correspond to the equilibrium hypothesis, which is approached experimentally only at an infinitely small scan rate. The results revealed that transition temperatures predicted under equilibrium hypotheses can be useful in the pre-experimental evaluation of vegetable oil formulations seeking desired melting profiles.
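
As a hedged illustration of direct Gibbs energy minimization for solid-liquid equilibrium, the sketch below assumes a binary mixture with an ideal liquid solution and pure solid phases; the component properties are invented, not oil data:

    import numpy as np
    from scipy.optimize import minimize

    R = 8.314                        # J/(mol K)
    Tm = np.array([320.0, 280.0])    # hypothetical melting temperatures, K
    dH = np.array([50e3, 40e3])      # hypothetical enthalpies of fusion, J/mol
    z = np.array([0.6, 0.4])         # overall mole numbers

    def gibbs(l, T):
        """Dimensionless Gibbs energy: ideal liquid solution + pure solids."""
        l = np.clip(l, 1e-12, z)     # moles of each component in the liquid
        s = z - l                    # remainder in pure solid phases
        x = l / l.sum()              # liquid mole fractions
        dmu_solid = -dH * (1.0 - T / Tm) / (R * T)  # (mu_solid - mu_liq)/RT
        return float(np.sum(l * np.log(x)) + np.sum(s * dmu_solid))

    def solid_fraction(T):
        res = minimize(gibbs, x0=z * 0.5, args=(T,), method="L-BFGS-B",
                       bounds=[(1e-12, zi) for zi in z])
        return 1.0 - res.x.sum() / z.sum()

    for T in (260.0, 290.0, 310.0, 330.0):
        print(f"T = {T:5.1f} K  solid fraction = {solid_fraction(T):.3f}")

Sweeping T and differentiating the equilibrium enthalpy with respect to temperature is what yields a predicted (equilibrium) DSC curve.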

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a survey of evolutionary algorithms that are designed for decision-tree induction. In this context, most of the paper focuses on approaches that evolve decision trees as an alternative heuristic to the traditional top-down divide-and-conquer approach. Additionally, we present some alternative methods that make use of evolutionary algorithms to improve particular components of decision-tree classifiers. The paper's original contributions are the following. First, it provides an up-to-date overview that is fully focused on evolutionary algorithms and decision trees and does not concentrate on any specific evolutionary approach. Second, it provides a taxonomy, which addresses works that evolve decision trees and works that design decision-tree components by the use of evolutionary algorithms. Finally, a number of references are provided that describe applications of evolutionary algorithms for decision-tree induction in different domains. At the end of this paper, we address some important issues and open questions that can be the subject of future research.
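
A toy sketch, not from the survey, of the core loop such approaches share: a population of whole decision trees is evolved globally, instead of a single tree being grown greedily top-down. Representation, operators and data are deliberately simplistic placeholders:

    import random

    random.seed(0)

    # A tree is either a leaf (class label) or (feature, threshold, left, right).
    def random_tree(depth=2):
        if depth == 0 or random.random() < 0.3:
            return random.choice([0, 1])                    # leaf
        f, t = random.randrange(2), random.random()         # split node
        return (f, t, random_tree(depth - 1), random_tree(depth - 1))

    def classify(tree, x):
        while isinstance(tree, tuple):
            f, t, left, right = tree
            tree = left if x[f] <= t else right
        return tree

    def fitness(tree, data):                                # accuracy on the data
        return sum(classify(tree, x) == y for x, y in data) / len(data)

    def mutate(tree):                                       # crude: resample the tree
        return random_tree() if random.random() < 0.7 else tree

    # Toy dataset: class is 1 iff the first feature exceeds 0.5.
    X = [(random.random(), random.random()) for _ in range(100)]
    data = [(x, int(x[0] > 0.5)) for x in X]

    population = [random_tree() for _ in range(30)]
    for generation in range(20):
        population.sort(key=lambda t: fitness(t, data), reverse=True)
        parents = population[:10]                           # truncation selection
        population = parents + [mutate(random.choice(parents)) for _ in range(20)]

    best = max(population, key=lambda t: fitness(t, data))
    print("accuracy of best evolved tree:", fitness(best, data))

Real evolutionary decision-tree inducers use richer encodings, subtree crossover and mutation, and fitness functions that trade accuracy against tree size.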

Relevance:

10.00%

Publisher:

Abstract:

An optimal control strategy for the highly active antiretroviral therapy associated with the acquired immunodeficiency syndrome should be designed with regard to a comprehensive analysis of drug chemotherapy behavior in the host tissues, from major viral replication sites to viral sanctuary compartments. Such an approach is critical in order to efficiently explore synergistic, competitive and prohibitive relationships among drugs and, hence, to minimize therapy costs and side effects. In this paper, a novel mathematical model for HIV-1 drug chemotherapy dynamics in distinct host anatomic compartments is proposed and theoretically evaluated on fifteen conventional antiretroviral drugs. Rather than an interdependence between drug type and its concentration profile in a host tissue, simulation results suggest that the profile is strongly correlated with the host tissue under consideration. Furthermore, the drug accumulation dynamics are drastically affected by low patient compliance with pharmacotherapy, even when a single dose is missed.
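
A minimal sketch, not the paper's model, of how inter-compartment drug kinetics with periodic dosing can be simulated; the compartments, rate constants and the missed dose below are illustrative assumptions:

    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical rate constants (1/h): plasma<->sanctuary transfer, elimination.
    k_ps, k_sp, k_el = 0.15, 0.05, 0.20

    def kinetics(t, c):
        plasma, sanctuary = c
        return [-k_ps * plasma + k_sp * sanctuary - k_el * plasma,
                k_ps * plasma - k_sp * sanctuary]

    doses = {0.0, 12.0, 36.0, 48.0}   # the hour-24 dose is deliberately missed
    c = np.array([0.0, 0.0])
    trace = []
    for t0 in np.arange(0.0, 60.0, 12.0):
        if t0 in doses:
            c[0] += 1.0               # unit dose enters the plasma compartment
        sol = solve_ivp(kinetics, (t0, t0 + 12.0), c)
        c = sol.y[:, -1]
        trace.append((t0 + 12.0, *c))

    for t, cp, cs in trace:
        print(f"t={t:5.1f} h  plasma={cp:.3f}  sanctuary={cs:.3f}")

Comparing the trace with and without the missed dose shows how a single compliance lapse propagates into the slower sanctuary compartment.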

Relevance:

10.00%

Publisher:

Abstract:

The aim of this work was to investigate the effect of different feeding times (2, 4 and 6 h) and applied volumetric organic loads (4.5, 6.0 and 7.5 gCOD L^-1 day^-1) on the performance of an anaerobic sequencing batch biofilm reactor (AnSBBR) treating effluent from biodiesel production. Polyurethane foam cubes were used as inert support in the reactor, and mixing was accomplished by recirculating the liquid phase. The effect of feeding time on reactor performance proved to be more pronounced at higher values of applied volumetric organic load (AVOL). The highest organic material removal efficiency at an AVOL of 4.5 gCOD L^-1 day^-1 was 87% with 4-h feeding, against 84% with 2-h and 6-h feeding. At an AVOL of 6.0 gCOD L^-1 day^-1, the highest efficiencies, achieved with 4-h and 6-h feeding, were 84%, against 71% with 2-h feeding. At an AVOL of 7.5 gCOD L^-1 day^-1, the efficiency achieved with 4-h feeding was 77%. Hence, longer feeding times favored the minimization of total volatile acids concentration both during the cycle and in the effluent, guaranteeing process stability and safety.

Relevance:

10.00%

Publisher:

Abstract:

The single machine scheduling problem with a common due date and non-identical ready times for the jobs is examined in this work. Performance is measured by the minimization of the weighted sum of earliness and tardiness penalties of the jobs. Since this problem is NP-hard, the application of constructive heuristics that exploit specific characteristics of the problem to improve their performance is investigated. The proposed approaches are examined through a computational comparative study on a set of 280 benchmark test problems with up to 1000 jobs.
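
A small sketch of the objective being minimized, under assumed data: jobs are processed in a given sequence on one machine, respecting their ready times, against a common due date d (no inserted idle time beyond ready times is assumed here):

    # Each job: (processing_time, ready_time, earliness_weight, tardiness_weight).
    jobs = [(4, 0, 1.0, 2.0), (3, 2, 1.5, 1.0), (5, 1, 0.5, 3.0)]  # hypothetical
    d = 10  # common due date

    def weighted_earliness_tardiness(sequence):
        t, cost = 0, 0.0
        for p, r, we, wt in sequence:
            t = max(t, r) + p            # start no earlier than the ready time
            cost += we * max(0, d - t) + wt * max(0, t - d)
        return cost

    # A simple constructive idea: order jobs by ready time, then evaluate.
    sequence = sorted(jobs, key=lambda j: j[1])
    print(weighted_earliness_tardiness(sequence))

Constructive heuristics for this problem build the sequence incrementally, choosing at each step the job that least increases this cost.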

Relevance:

10.00%

Publisher:

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC_max algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC_max runs in linear time with respect to the image size |C|. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms to the realm of the graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1,∞]. Of these, the best known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, referred to often as the Graph Cut algorithm. We notice that a minimization problem for ‖F_P‖_q, q ∈ [1,∞), is identical to that for ‖F_P‖_1 when the original weight function w is replaced by w^q. Thus, any algorithm GC_sum solving the ‖F_P‖_1 minimization problem also solves the one for ‖F_P‖_q with q ∈ [1,∞), so just two algorithms, GC_sum and GC_max, are enough to solve all ‖F_P‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q-minimization problems converge to a solution of the ‖F_P‖∞-minimization problem (‖F_P‖∞ = lim_{q→∞} ‖F_P‖_q alone is not enough to deduce that). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. This concentrates on comparing the actual (as opposed to provable worst scenario) running times of the algorithms, as well as the influence of the choice of the seeds on the output.
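
In LaTeX, the reduction underlying the "two algorithms suffice" claim reads (notation as reconstructed above):

    % Cut energy of an object P with boundary edges e and weights w(e):
    \| F_P \|_q = \Big( \sum_{e \in \mathrm{bd}(P)} w(e)^q \Big)^{1/q},
    \qquad
    \| F_P \|_\infty = \max_{e \in \mathrm{bd}(P)} w(e).
    % Since x \mapsto x^{1/q} is increasing, minimizing \|F_P\|_q over P is
    % equivalent to minimizing \sum_e w(e)^q: the \|\cdot\|_1 problem with
    % weights w^q, which GC_sum (min-cut/max-flow) already solves.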

Relevance:

10.00%

Publisher:

Abstract:

Mining operations around the world make extensive use of blasthole sampling for short-term planning, which has two undisputed advantages: (1) blastholes are closely spaced providing relatively high sampling density per ton, and (2) there is no additional cost since the blastholes must be drilled anyway. However, blasthole sampling usually presents poor sampling precision, and the inconstant sampling bias caused by particle size and density segregation is an even more serious problem, generally precluding representativeness. One of the main causes of this bias is a highly varying loss of fines, which can lead to both under- and over-estimation of grade depending on the ore type and the gangue. This study validates a new, modified sectorial sampler, designed to reduce the loss of fines and thereby increase sampling accuracy for narrow-diameter blasthole sampling. First results show a significantly improved estimation of gold grade as well as the minimization of the loss of fines.

Relevance:

10.00%

Publisher:

Abstract:

At each outer iteration of standard Augmented Lagrangian methods one tries to solve a box-constrained optimization problem with some prescribed tolerance. In the continuous world, using exact arithmetic, this subproblem is always solvable. Therefore, the possibility of finishing the subproblem resolution without satisfying the theoretical stopping conditions is not contemplated in usual convergence theories. However, in practice, one might not be able to solve the subproblem up to the required precision. This may be due to different reasons. One of them is that the presence of an excessively large penalty parameter could impair the performance of the box-constrained optimization solver. In this paper a practical strategy for decreasing the penalty parameter in situations like the one mentioned above is proposed. More generally, the different decisions that may be taken when, in practice, one is not able to solve the Augmented Lagrangian subproblem are discussed. As a result, an improved Augmented Lagrangian method is presented, which takes numerical difficulties into account in a satisfactory way while preserving a suitable convergence theory. Numerical experiments are presented involving all the CUTEr collection test problems.
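
A schematic Python sketch of the outer loop described above, with the safeguarded penalty decrease as a hedged illustration; the toy problem, tolerances and update rules are placeholders, not the paper's exact algorithm:

    import numpy as np
    from scipy.optimize import minimize

    # Toy problem: minimize f(x) subject to h(x) = 0 and box bounds.
    f = lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2
    h = lambda x: x[0] + x[1] - 1.0
    bounds = [(-5.0, 5.0), (-5.0, 5.0)]

    x, lam, rho = np.zeros(2), 0.0, 10.0
    for outer in range(30):
        auglag = lambda x: f(x) + lam * h(x) + 0.5 * rho * h(x) ** 2
        res = minimize(auglag, x, method="L-BFGS-B", bounds=bounds,
                       options={"maxiter": 200})
        x = res.x
        if not res.success:
            rho = max(rho / 10.0, 1e-2)  # subproblem too hard: back off penalty
            continue
        if abs(h(x)) < 1e-8:
            break                        # feasible and subproblem solved: done
        lam += rho * h(x)                # first-order multiplier update
        rho *= 10.0                      # tighten feasibility
    print(x, lam, rho)

The design choice of interest is the `not res.success` branch: instead of insisting on an unsolvable subproblem, the penalty parameter is reduced and the subproblem retried.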

Relevance:

10.00%

Publisher:

Abstract:

Existing studies of on-line process control are concerned with economic aspects, and the parameters of the processes are optimized with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of the corrective adjustment is evaluated, because it is assumed that the equipment becomes "as good as new" after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating two dimensions: on-line process control and a corrective maintenance program. Both are designed to minimize the average cost per item. Adjustments are based on the location of the measurement of a quality characteristic of interest within one of three decision zones. Numerical examples illustrate the proposal.
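
A minimal sketch of the three-decision-zone rule implied above, with hypothetical zone limits: the measured quality characteristic decides between continuing production, making an on-line adjustment, or triggering corrective maintenance.

    def decide(x, target=10.0, adjust_limit=0.5, maintenance_limit=1.5):
        """Classify a measurement x into one of three decision zones.

        The target and zone limits are illustrative assumptions; in the
        economic model they would be chosen to minimize average cost per item.
        """
        deviation = abs(x - target)
        if deviation <= adjust_limit:
            return "continue"               # central zone: process on target
        if deviation <= maintenance_limit:
            return "adjust"                 # intermediate zone: on-line adjustment
        return "corrective_maintenance"     # outer zone: stop and repair

    for x in (10.2, 10.9, 12.0):
        print(x, "->", decide(x))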

Relevance:

10.00%

Publisher:

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results are dependent on the failure probabilities used as constraints in the analysis. Risk optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, the solution also reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration.
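
In LaTeX, a generic form of the three formulations compared above; the cost terms are assumed for illustration, not taken from the paper:

    % Total expected cost of a design d: construction, operation and
    % maintenance costs plus failure probabilities times failure consequences.
    C_{\mathrm{te}}(d) = C_{\mathrm{constr}}(d) + C_{\mathrm{oper}}(d)
        + C_{\mathrm{maint}}(d) + \sum_i P_{f,i}(d)\, C_{f,i}(d)
    % RO:   \min_d C_{\mathrm{te}}(d)
    % RBDO: \min_d C_{\mathrm{constr}}(d) \ \text{s.t.}\ P_f(d) \le P_f^{\ast}
    % DDO:  \min_d C_{\mathrm{constr}}(d) \ \text{s.t.}\ \text{safety factors} \ge \lambda^{\ast}

The paper's point is that fixing P_f* or the safety factors at the RO optimum does not, in general, reproduce the RO design, because the expected-cost term weighs failure modes differently.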

Relevance:

10.00%

Publisher:

Abstract:

We present the first numerical implementation of the minimal Landau background gauge for Yang-Mills theory on the lattice. Our approach is a simple generalization of the usual minimal Landau gauge and is formulated for the general SU(N) gauge group. We also report on preliminary tests of the method in the four-dimensional SU(2) case, using different background fields. Our tests show that the convergence of the numerical minimization process is comparable to the case of a null background. The uniqueness of the minimizing functional employed is briefly discussed.
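
For reference, the minimizing functional of the usual minimal Landau gauge on the lattice has the standard form below, up to normalization conventions; the background-field generalization used in the paper is not reproduced here:

    % Minimal Landau gauge: minimize, over gauge transformations g(x),
    E_U[g] = - \frac{1}{N d V} \sum_{x,\mu}
             \mathrm{Re}\,\mathrm{Tr}\,\big[ g(x)\, U_\mu(x)\, g^\dagger(x+\hat{\mu}) \big]
    % with U_mu(x) the SU(N) link variables, V the lattice volume and d the
    % dimension; stationarity of E_U[g] enforces the lattice Landau gauge condition.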

Relevance:

10.00%

Publisher:

Abstract:

This paper addresses the m-machine no-wait flow shop problem where the setup time of a job is separated from its processing time. The performance measure considered is the total flowtime. A new hybrid metaheuristic, Genetic Algorithm-Cluster Search, is proposed to solve the scheduling problem. The performance of the proposed method is evaluated and the results are compared with those of the best method reported in the literature. Experimental tests show the superiority of the new method on the test problem set with regard to solution quality.
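
A compact sketch of the objective, under the assumption of anticipatory (separable) setups: in a no-wait flow shop a job, once started, flows through all machines without waiting, so only its start time on machine 1 is free. The data are hypothetical:

    # p[j][k]: processing time of job j on machine k; s[j][k]: setup time
    # on machine k, assumed anticipatory (can run before the job arrives).
    p = [[3, 5, 2], [4, 2, 6], [2, 4, 3]]
    s = [[1, 1, 2], [2, 1, 1], [1, 2, 1]]

    def total_flowtime(seq):
        m = len(p[0])
        free = [0] * m                  # time each machine becomes free
        flowtime = 0
        for j in seq:
            cum = [0]                   # processing done before reaching machine k
            for k in range(m - 1):
                cum.append(cum[-1] + p[j][k])
            # earliest start on machine 1 such that, on every machine k, the
            # setup (which may begin once the machine is free) has finished.
            start = max(0, max(free[k] + s[j][k] - cum[k] for k in range(m)))
            for k in range(m):
                free[k] = start + cum[k] + p[j][k]   # no-wait occupation of k
            flowtime += free[m - 1]                  # completion on last machine
        return flowtime

    print(total_flowtime([0, 1, 2]))

The metaheuristic searches over permutations `seq` to minimize this quantity.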

Relevance:

10.00%

Publisher:

Abstract:

Augmented Lagrangian methods are effective tools for solving large-scale nonlinear programming problems. At each outer iteration, a minimization subproblem with simple constraints, whose objective function depends on updated Lagrange multipliers and penalty parameters, is approximately solved. When the penalty parameter becomes very large, solving the subproblem becomes difficult; therefore, the effectiveness of this approach is associated with the boundedness of the penalty parameters. In this paper, it is proved that under more natural assumptions than the ones employed until now, penalty parameters are bounded. For proving the new boundedness result, the original algorithm has been slightly modified. Numerical consequences of the modifications are discussed and computational experiments are presented.
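
For context, the subproblem objective in question typically has the Powell-Hestenes-Rockafellar form below; this is the standard form in this line of work, though the paper's exact variant may differ:

    % Augmented Lagrangian for h(x) = 0, g(x) <= 0, with simple constraints x in Omega:
    L_\rho(x, \lambda, \mu) = f(x)
      + \frac{\rho}{2} \left[
          \sum_i \left( h_i(x) + \frac{\lambda_i}{\rho} \right)^2
        + \sum_j \max\!\left( 0,\; g_j(x) + \frac{\mu_j}{\rho} \right)^2
      \right]
    % Each outer iteration: approximately minimize L_rho over Omega, then update
    % lambda and mu and, if feasibility is not improving, increase rho; the
    % boundedness result concerns how large rho must become.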