550 results for Ought to


Relevance:

20.00%

Publisher:

Abstract:

A heuristic algorithm that employs fuzzy logic is proposed for the power system transmission expansion planning problem. The algorithm is based on a divide-and-conquer strategy controlled by the fuzzy system. It provides high-quality solutions through fuzzy decision making, which uses nondeterministic criteria to guide the search. The fuzzy system provides a self-adjusting mechanism that eliminates the manual tuning of parameters for each system being solved. (C) 2010 Elsevier B.V. All rights reserved.
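
The abstract gives no implementation details, but the minimal Python sketch below illustrates the general idea of fuzzy decision making guiding a constructive expansion-planning heuristic; the membership functions, criteria, and candidate data are assumptions for illustration only, not the authors' algorithm.

```python
# Minimal sketch (not the authors' implementation): fuzzy decision making used to
# rank candidate circuit additions in a constructive expansion-planning heuristic.
# Membership functions, criteria and candidate values below are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_attractiveness(overload_relief, investment_cost):
    # "High relief" and "low cost" modelled as triangular fuzzy sets (assumed ranges).
    high_relief = tri(overload_relief, 0.2, 1.0, 1.8)
    low_cost = tri(investment_cost, -1.0, 0.0, 1.0)
    # Fuzzy AND (min operator): a candidate is attractive if relief is high AND cost is low.
    return min(high_relief, low_cost)

# Rank hypothetical candidate circuits (relief in p.u., cost normalised to [0, 1]).
candidates = {"line A-B": (0.9, 0.4), "line B-C": (0.5, 0.1), "line A-C": (1.2, 0.8)}
ranked = sorted(candidates, key=lambda k: fuzzy_attractiveness(*candidates[k]), reverse=True)
print(ranked)
```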

Relevance:

20.00%

Publisher:

Abstract:

Power loss reduction in distribution systems (DSs) is a nonlinear, multiobjective problem. Service restoration in DSs is even harder computationally, since it additionally requires a solution in real time. Both DS problems are computationally complex: for large-scale networks, the usual formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the solution simpler. In addition, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with the NDE (MEAN) is the proposed approach for solving DS problems in large-scale networks. Simulation results show that the MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, the MEAN exhibits sublinear running time as a function of system size. Tests with networks ranging from 632 to 5166 switches indicate that the MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively low running time.
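
As an illustration of the node-depth encoding idea (not the authors' code), the sketch below stores a radial feeder tree as a list of (node, depth) pairs produced by a depth-first traversal, so radiality is guaranteed by construction; the small 6-bus network is an assumed example.

```python
# Illustrative sketch of a node-depth encoding (NDE) for a radial feeder.

def node_depth_encoding(adjacency, root):
    """Return the NDE list of a tree rooted at `root` via iterative DFS."""
    nde, visited, stack = [], {root}, [(root, 0)]
    while stack:
        node, depth = stack.pop()
        nde.append((node, depth))
        for neigh in adjacency[node]:
            if neigh not in visited:
                visited.add(neigh)
                stack.append((neigh, depth + 1))
    return nde

# Assumed 6-bus feeder (bus 1 is the substation node).
feeder = {1: [2, 3], 2: [1, 4], 3: [1, 5, 6], 4: [2], 5: [3], 6: [3]}
print(node_depth_encoding(feeder, root=1))
# e.g. [(1, 0), (3, 1), (6, 2), (5, 2), (2, 1), (4, 2)]
```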

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the modeling of a weed infestation risk inference system that implements a collaborative inference scheme based on rules extracted from two Bayesian network classifiers. The first Bayesian classifier infers a categorical value for weed-crop competitiveness, using as input categorical variables for the total weed density and the corresponding proportions of narrow- and broad-leaved weeds. The inferred competitiveness values, along with three other categorical variables extracted from estimated maps of weed seed production and weed coverage, are then used as input to a second Bayesian network classifier that infers categorical values for the risk of infestation. Weed biomass and yield loss data samples are used to learn, in a supervised fashion, the probability relationships among the nodes of the first and second Bayesian classifiers, respectively. For comparison purposes, two types of Bayesian network structures are considered, namely an expert-based Bayesian classifier and a naive Bayes classifier. The inference system focuses on knowledge interpretation by translating a Bayesian classifier into a set of classification rules. The results obtained for risk inference in a corn-crop field are presented and discussed. (C) 2009 Elsevier Ltd. All rights reserved.
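
A minimal sketch of the two-stage classification scheme follows, using scikit-learn's CategoricalNB on synthetic data; the labelling rules, category counts, and variable encodings are assumptions, not the paper's dataset or network structure.

```python
# Two-stage naive Bayes chaining: stage 1 infers weed-crop competitiveness, and its
# output feeds stage 2 together with seed-production and coverage classes.
import numpy as np
from sklearn.naive_bayes import CategoricalNB

rng = np.random.default_rng(0)

# Stage 1 inputs: [weed density class, narrow-leaved proportion class, broad-leaved proportion class]
X1 = rng.integers(0, 3, size=(200, 3))
y_competitiveness = (X1.sum(axis=1) > 3).astype(int)          # assumed labelling rule
stage1 = CategoricalNB().fit(X1, y_competitiveness)

# Stage 2 inputs: [competitiveness (from stage 1), seed-production class, weed-coverage class]
seed_cov = rng.integers(0, 3, size=(200, 2))
X2 = np.column_stack([stage1.predict(X1), seed_cov])
y_risk = (X2.sum(axis=1) > 3).astype(int)                      # assumed labelling rule
stage2 = CategoricalNB().fit(X2, y_risk)

# Collaborative inference for a new field cell.
new_cell = np.array([[2, 1, 0]])
competitiveness = stage1.predict(new_cell)
risk = stage2.predict(np.column_stack([competitiveness, [[2, 1]]]))
print(competitiveness, risk)
```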

Relevance:

20.00%

Publisher:

Abstract:

Voltage and current waveforms of a distribution or transmission power system are not pure sinusoids. There are distortions in these waveforms that can be represented as a combination of the fundamental frequency, harmonics, and high-frequency transients. This paper presents a novel approach to identifying harmonics in distorted power system waveforms. The proposed method is based on Genetic Algorithms, an optimization technique inspired by genetics and natural evolution. GOOAL, an intelligent algorithm specially designed for optimization problems, was successfully implemented and tested. Two kinds of chromosome representation are used: binary and real. The results show that the proposed method is more precise than the traditional Fourier Transform, especially when the real representation of the chromosomes is considered.
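
The sketch below is a hedged illustration of harmonic identification with a real-coded genetic algorithm; it is not the GOOAL implementation, and the test waveform, harmonic orders, population size, and operators are assumptions.

```python
# Real-coded GA estimating the amplitudes of the fundamental, 3rd and 5th harmonics
# of a synthetic distorted waveform by minimising the mean squared reconstruction error.
import numpy as np

rng = np.random.default_rng(1)
f0, fs = 60.0, 15_360.0                                       # fundamental and sampling frequency [Hz]
t = np.arange(0.0, 1.0 / f0, 1.0 / fs)                        # one fundamental cycle
true_amps = np.array([1.00, 0.20, 0.08])                      # fundamental, 3rd, 5th (assumed)
signal = sum(a * np.sin(2 * np.pi * f0 * h * t) for a, h in zip(true_amps, (1, 3, 5)))

def fitness(amps):
    model = sum(a * np.sin(2 * np.pi * f0 * h * t) for a, h in zip(amps, (1, 3, 5)))
    return -np.mean((signal - model) ** 2)                    # maximise negative MSE

pop = rng.uniform(0.0, 1.5, size=(60, 3))                     # real-valued chromosomes
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-30:]]                   # truncation selection
    children = parents[rng.integers(0, 30, 60)] + rng.normal(0, 0.02, size=(60, 3))
    children[:30] = parents                                   # elitism
    pop = np.clip(children, 0.0, 1.5)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("estimated amplitudes:", np.round(best, 3))             # close to [1.0, 0.2, 0.08]
```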

Relevance:

20.00%

Publisher:

Abstract:

This work proposes a method based on both preprocessing and data mining with the objective of identifying harmonic current sources in residential consumers. In addition, the methodology can also be applied to identify linear and nonlinear loads. It should be emphasized that the entire database was obtained through laboratory experiments, i.e., real data were acquired from residential loads. The residential system created in the laboratory was fed by a configurable power source, and the loads and power quality analyzers were connected at its output (all measurements were stored in a microcomputer). The data were then submitted to preprocessing based on attribute selection techniques in order to reduce the complexity of identifying the loads. A new database was generated keeping only the selected attributes, and Artificial Neural Networks were trained to perform the identification of the loads. In order to validate the proposed methodology, the loads were fed both under ideal conditions (without harmonics) and by harmonic voltages within pre-established limits. These limits are in accordance with IEEE Std. 519-1992 and PRODIST (the procedures for energy distribution employed by Brazilian utilities). The results obtained support the proposed methodology and furnish a method that can serve as an alternative to conventional methods.
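
A minimal sketch of the described pipeline (attribute selection followed by a neural network classifier) is shown below on synthetic data; the feature count, selector, and network size are assumptions, not the laboratory database or the authors' exact configuration.

```python
# Attribute selection to reduce measured power-quality features, followed by an
# artificial neural network trained to identify the connected load class.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Stand-in for the laboratory measurements: 20 attributes (e.g. harmonic magnitudes,
# THD, power factor), 4 load classes.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    SelectKBest(f_classif, k=8),                 # keep only the most discriminative attributes
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 2))
```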

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this paper is to propose a multiobjective optimization approach for solving the manufacturing cell formation problem, explicitly considering the performance of the resulting manufacturing system. Cells are formed so as to simultaneously minimize three conflicting objectives, namely the level of work-in-process, the intercell moves, and the total machinery investment. A genetic algorithm performs a search in the design space in order to approximate the Pareto optimal set. The objective values of each candidate solution in a population are obtained by running a discrete-event simulation, whose model is automatically generated according to the number of machines and their distribution among cells implied by that solution. The potential of this approach is evaluated through its application to an illustrative example and to a case from the relevant literature. The results are analyzed and reviewed, and it is concluded that the approach is capable of generating a set of alternative manufacturing cell configurations that optimize multiple performance measures, greatly improving the decision-making process involved in planning and designing cellular systems. (C) 2010 Elsevier Ltd. All rights reserved.
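
The evaluation step can be pictured as in the sketch below: a stand-in simulation returns the three objective values for each candidate cell configuration, and only non-dominated configurations are kept as the approximation of the Pareto set. The `simulate` stub and the integer encoding are purely illustrative, not the authors' simulator or chromosome encoding.

```python
import random

def simulate(configuration):
    """Stub for the discrete-event simulation: returns (WIP, intercell moves, investment)."""
    random.seed(configuration)                       # deterministic placeholder values
    return tuple(round(random.uniform(1, 10), 2) for _ in range(3))

def dominates(a, b):
    """True if objective vector a is no worse than b in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

population = range(20)                               # candidate configurations (encoded as ints here)
scores = {c: simulate(c) for c in population}
pareto = [c for c in population
          if not any(dominates(scores[other], scores[c]) for other in population if other != c)]
print("non-dominated configurations:", pareto)
```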

Relevance:

20.00%

Publisher:

Abstract:

Semi-empirical models to estimate the flow boiling heat transfer coefficient, saturated CHF, and pressure drop in micro-scale channels have recently been proposed. Most of these models were developed for elongated-bubble and annular flows, in view of the fact that these flow patterns are predominant in smaller channels. In these models the liquid film thickness plays an important role, which emphasizes that accurate measurement of the liquid film thickness is a key point for validating them. Several techniques have been successfully applied to measure liquid film thickness during condensation and evaporation under macro-scale conditions. However, although this subject has been targeted by several leading laboratories around the world, there seems to be no conclusive result describing a technique capable of measuring the dynamic liquid film thickness during evaporation inside micro-scale round channels. This work presents a comprehensive literature review of the methods used to measure liquid film thickness in macro- and micro-scale systems. The methods are described and the main difficulties related to their use in micro-scale systems are identified. Based on this discussion, the most promising methods for measuring dynamic liquid film thickness in micro-scale channels are identified. (C) 2009 Elsevier Inc. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The activity of validating identified requirements for an information system helps to improve the quality of the requirements specification document and, consequently, the success of the project. Although various support tools for requirements engineering exist on the market, there is still a lack of automated support for the validation activity. In this context, the purpose of this paper is to make up for that deficiency by using an automated tool that provides the resources for an adequate validation activity. The contribution of this study is to enable an agile and effective follow-up of the scope established for the requirements, so as to lead the development to a solution that satisfies the real needs of the users, as well as to supply project managers with relevant information about the maturity of the analysts involved in requirements specification.

Relevance:

20.00%

Publisher:

Abstract:

This paper aims to formulate and investigate the application of various nonlinear H(infinity) control methods to a free-floating space manipulator subject to parametric uncertainties and external disturbances. From a tutorial perspective, a model-based approach and adaptive procedures based on linear parametrization, neural networks, and fuzzy systems are covered by this work. A comparative study is conducted based on experimental implementations performed with an actual underactuated fixed-base planar manipulator which, following the DEM concept, is dynamically equivalent to a free-floating space manipulator. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The machining of hardened steels has always been a great challenge in metal cutting, particularly in drilling operations. Drilling is generally the machining process that is most difficult to cool because of the tool's geometry. The aim of this work is to determine the heat flux and the convection coefficient in drilling using the inverse heat conduction method. Temperature was assessed during the drilling of hardened AISI H13 steel using the embedded thermocouple technique. Dry machining and two cooling/lubrication systems were used, and thermocouples were fixed at distances very close to the hole's wall. Tests were replicated for each condition and were carried out with new and worn drills. An analytical heat conduction model was used to calculate the temperature at the tool-workpiece interface and to determine the heat flux and the convection coefficient. In all tests with new and worn drills, taking the dry condition as reference, the lowest temperatures and the largest decrease in heat flux were observed with the flooded system, followed by MQL. The decrease in temperature was directly proportional to the amount of lubricant applied and was significant for the MQL system when compared to dry cutting. (C) 2011 Elsevier Ltd. All rights reserved.
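
As a hedged illustration of the inverse approach (not the paper's analytical model), the sketch below fits a heat flux and a convection coefficient to synthetic thermocouple readings with a least-squares inverse procedure; the lumped forward model and all numerical values are stand-ins.

```python
# Inverse estimation of (q, h): a forward conduction model predicts the thermocouple
# temperature for trial parameters, and a least-squares fit recovers them from data.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 5.0, 50)                    # time [s]
rho_c_V, A = 3.5e-3, 2.0e-5                      # assumed lumped capacitance [J/K] and area [m^2]
T_inf, T0 = 25.0, 25.0                           # ambient and initial temperature [deg C]

def forward(q, h):
    """Lumped-capacitance response to a constant flux q [W/m^2] with convection h [W/m^2 K]."""
    tau = rho_c_V / (h * A)
    return T_inf + (q / h) * (1.0 - np.exp(-t / tau)) + (T0 - T_inf) * np.exp(-t / tau)

# Synthetic "thermocouple" data generated with known parameters plus noise.
q_true, h_true = 4.0e4, 800.0
measured = forward(q_true, h_true) + np.random.default_rng(0).normal(0, 0.3, t.size)

fit = least_squares(lambda p: forward(*p) - measured, x0=[1.0e4, 200.0],
                    bounds=([1.0e3, 10.0], [1.0e6, 1.0e4]))
print("estimated q, h:", fit.x)                  # should approach (4e4, 800)
```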

Relevance:

20.00%

Publisher:

Abstract:

Product lifecycle management (PLM) innovates by defining both the product as a central element for aggregating enterprise information and the lifecycle as a new time dimension for information integration and analysis. Because of its potential to shorten innovation lead times and to reduce costs, PLM has attracted a lot of attention in industry and in research. However, the current PLM implementation stage at most organisations still does not apply the lifecycle management concepts thoroughly. In order to close this realisation gap, this article presents a process-oriented framework to support effective PLM implementation. The central point of the framework is a set of lifecycle-oriented business process reference models which link the necessary fundamental concepts, enterprise knowledge, and software solutions to effectively deploy PLM. (c) 2007 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

An accurate estimate of machining time is very important for predicting delivery time and manufacturing costs, and also to help production process planning. Most commercial CAM software systems estimate the machining time in milling operations simply by dividing the entire tool path length by the programmed feed rate. This estimate differs drastically from the real process time because the feed rate is not always constant, due to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for estimating milling time when machining free-form geometries. The method considers a variable called machine response time (MRT), which characterizes the real CNC machine's capacity to move at high feed rates in free-form geometries. MRT is a global performance feature that can be obtained for any type of CNC machine configuration by carrying out a simple test. To validate the methodology, a workpiece was used to generate NC programs for five different types of CNC machines, and a practical industrial case study was also carried out. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's capability; furthermore, the greater the MRT, the larger the difference between the predicted and the real milling time. The proposed method achieved an error range from 0.3% to 12% of the real machining time, whereas the CAM estimation error ranged from 211% to 1244%. The MRT-based procedure is also suggested as an instrument for machine tool benchmarking.
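
The sketch below contrasts the usual CAM estimate with an MRT-style estimate in which each short segment of a free-form path cannot be executed faster than the machine response time; this formulation and all numbers are illustrative assumptions, not the paper's exact equations.

```python
# Illustrative comparison: CAM-style estimate (path length / programmed feed rate)
# versus an MRT-limited per-segment estimate. Segment lengths, feed rate and MRT
# are assumed values for demonstration only.

segments_mm = [0.8, 1.2, 0.5, 2.0, 0.9, 1.5]      # free-form path discretised into short segments
feed_mm_per_min = 3000.0
mrt_s = 0.04                                      # assumed machine response time per segment

# CAM-style estimate: constant programmed feed over the whole path.
cam_time_s = sum(segments_mm) / feed_mm_per_min * 60.0

# MRT-style estimate: a segment takes at least MRT seconds regardless of its length.
mrt_time_s = sum(max(length / feed_mm_per_min * 60.0, mrt_s) for length in segments_mm)

print(f"CAM estimate: {cam_time_s:.3f} s, MRT-based estimate: {mrt_time_s:.3f} s")
```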

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this research is to analyze the contribution of human resource management throughout the evolutionary stages of environmental management in Brazilian companies. A theoretical framework concerning environmental management and its evolution, and the 'greening' of the functional and competitive dimensions of human resource management, was developed. A methodological triangulation was carried out in two complementary phases. In the first phase, data were collected from 94 Brazilian companies with ISO 14001 certification and analyzed using statistical techniques. The conclusions of the first phase supported the second phase of the empirical research, which consisted of a multiple-case study in four Brazilian companies. The results constitute the first known empirical evidence of the contributions of the human resource dimensions throughout the stages of environmental management in Brazilian manufacturing companies.

Relevance:

20.00%

Publisher:

Abstract:

This work presents a non-linear boundary element formulation applied to the analysis of contact problems. The boundary element method (BEM) is known as a robust and accurate numerical technique for handling this type of problem, because the contact among solids occurs along their boundaries. The proposed non-linear formulation is based on the use of singular or hyper-singular BEM integral equations for multi-region contact. When the contact occurs between crack surfaces, the dual version of the BEM is adopted, in which singular and hyper-singular integral equations are defined along opposite sides of the contact boundaries. The non-linear structural behaviour at the contact is described using Coulomb's friction law. The non-linear formulation is based on the tangent operator, in which the derivative of the set of algebraic equations is used to construct the corrections of the non-linear process. This implicit formulation has proved as accurate as the classical approach, but it is faster in computing the solution. Examples of simple and multi-region contact problems are shown to illustrate the applicability of the proposed scheme. (C) 2011 Elsevier Ltd. All rights reserved.
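
As a one-dimensional analogue of the tangent-operator iteration described above (not the BEM assembly itself), the sketch below drives a scalar residual to zero using its derivative to build each correction; the residual function is purely illustrative.

```python
# Newton-type iteration: correction = -R(u) / dR/du, repeated until convergence.
def residual(u):
    return u**3 + 2.0 * u - 4.0        # stand-in nonlinear algebraic equation R(u) = 0

def tangent(u):
    return 3.0 * u**2 + 2.0            # derivative of the residual (the "tangent operator")

u = 0.0                                # initial guess
for iteration in range(20):
    correction = -residual(u) / tangent(u)
    u += correction
    if abs(correction) < 1e-12:
        break
print(f"converged to u = {u:.6f} after {iteration + 1} iterations")
```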

Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the analysis of multiple random crack propagation in two-dimensional domains using the boundary element method (BEM). The BEM is known to be a robust and accurate numerical technique for analysing this type of problem. The formulation adopted in this work is based on the dual BEM, for which singular and hyper-singular integral equations are used. We propose an iterative scheme to predict the crack growth path and the crack length increment at each time step. The proposed scheme enables us to simulate localisation and coalescence phenomena, which is the main contribution of this paper. For the fracture mechanics analysis, the displacement correlation technique is applied to evaluate the stress intensity factors. The propagation angle and the equivalent stress intensity factor are calculated using the maximum circumferential stress theory. Examples of simple and multi-fractured domains, loaded up to rupture, are considered to illustrate the applicability of the proposed scheme. (C) 2010 Elsevier Ltd. All rights reserved.
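
As a worked illustration of the maximum circumferential stress criterion cited above, the sketch below computes the propagation angle and the equivalent stress intensity factor from given mode I and mode II stress intensity factors; the numerical KI, KII values are illustrative.

```python
# Maximum circumferential (hoop) stress criterion for mixed-mode crack growth.
import math

def mts_criterion(k1, k2):
    """Return (propagation angle [rad], equivalent SIF) from the hoop-stress criterion."""
    if abs(k2) < 1e-12:
        theta = 0.0                                   # pure mode I: crack grows straight ahead
    else:
        theta = 2.0 * math.atan((k1 - math.sqrt(k1**2 + 8.0 * k2**2)) / (4.0 * k2))
    k_eq = math.cos(theta / 2.0) * (k1 * math.cos(theta / 2.0) ** 2
                                    - 1.5 * k2 * math.sin(theta))
    return theta, k_eq

theta, k_eq = mts_criterion(k1=1.0, k2=0.4)           # MPa*m^0.5, illustrative values
print(f"propagation angle: {math.degrees(theta):.1f} deg, equivalent SIF: {k_eq:.3f}")
```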