914 results for computational study


Relevance: 30.00%

Abstract:

This chapter studies a two-level production planning problem where, on each level, a lot sizing and scheduling problem with parallel machines, capacity constraints and sequence-dependent setup costs and times must be solved. The problem can be found in soft drink companies where the production process involves two interdependent levels with decisions concerning raw material storage and soft drink bottling. Models and solution approaches proposed so far are surveyed and conceptually compared. Two different approaches have been selected to perform a series of computational comparisons: an evolutionary technique comprising a genetic algorithm and its memetic version, and a decomposition and relaxation approach. © 2008 Springer-Verlag Berlin Heidelberg.
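To make the evolutionary side of the comparison concrete, here is a minimal genetic algorithm for a toy single-level, single-machine lot-sizing instance. The demands, costs, capacity and GA settings are illustrative inventions, far simpler than the chapter's two-level soft drink problem (no parallel machines and no sequence-dependent setups):

```python
import random

DEMAND = [40, 0, 60, 30, 0, 50]   # illustrative per-period demands
SETUP, HOLD, CAP = 100, 1, 120    # illustrative setup cost, holding cost, lot capacity

def plan_cost(produce):
    """Cost of a plan: produce[t] == 1 means a setup in period t that
    makes everything due until the next setup. Infeasible plans cost inf."""
    cost, inv = 0, 0
    for t, on in enumerate(produce):
        if on:
            nxt = next((u for u in range(t + 1, len(produce)) if produce[u]),
                       len(produce))
            lot = sum(DEMAND[t:nxt])
            if lot > CAP:
                return float("inf")      # capacity violated
            inv += lot
            cost += SETUP
        if inv < DEMAND[t]:
            return float("inf")          # stockout
        inv -= DEMAND[t]
        cost += HOLD * inv
    return cost

def ga(pop_size=30, gens=80, seed=1):
    rng = random.Random(seed)
    n = len(DEMAND)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    best_seen = min(pop, key=plan_cost)
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            p = min(rng.sample(pop, 2), key=plan_cost)   # binary tournament
            q = min(rng.sample(pop, 2), key=plan_cost)
            cut = rng.randrange(1, n)                    # one-point crossover
            child = p[:cut] + q[cut:]
            if rng.random() < 0.2:                       # bit-flip mutation
                i = rng.randrange(n)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
        best_seen = min(pop + [best_seen], key=plan_cost)
    return best_seen

best = ga()
```

The memetic variant mentioned in the abstract would add a local search step (e.g. single-bit improvement) after mutation; the decomposition and relaxation approach is an entirely different exact/heuristic scheme not sketched here.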

Relevance: 30.00%

Abstract:

Due to growing urbanization and industrialization, the environment is suffering from pollution of rivers, degradation of soils and deteriorated air quality. Quality indices appear to be useful to evaluate the conditions of these media. The aim of this study was the development of a water quality index using a fuzzy inference system, since such an approach has proved advantageous in addressing problems that are subjective by nature or for which the data are uncertain. The methodology employed was based on this inference system, and considered the nine water quality parameters employed by CETESB (Companhia de Tecnologia de Saneamento Ambiental, São Paulo, Brazil) to evaluate water quality. After assessment of the data using the index, a comparison was made with the WQI (Water Quality Index), which is used for the monitoring of various water bodies, including in the study region. The results obtained using the index developed on the basis of fuzzy inference were found to be more useful than those derived from the method currently used by CETESB, since losses and/or omissions concerning individual parameters were minimized. © 2010 IEEE.
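A minimal sketch of the fuzzy-inference idea in Python. The trapezoidal membership functions, quality classes and centroids below are invented placeholders, not CETESB's nine parameters or the authors' tuned rule base:

```python
def trap(x, a, b, c, d):
    """Trapezoidal membership: ramps up a to b, flat b to c, ramps down c to d."""
    if x < a or x > d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x > c:
        return (d - x) / (d - c)
    return 1.0

# Invented quality classes on a 0-100 normalized parameter scale
SETS = {"poor": (0, 0, 20, 45), "fair": (30, 45, 55, 70), "good": (55, 80, 100, 100)}
CENTROID = {"poor": 15, "fair": 50, "good": 85}

def fuzzy_wqi(readings):
    """Average each reading's membership per class, then defuzzify as a
    centroid-weighted mean (a Sugeno-style shortcut, not full Mamdani)."""
    strength = {k: sum(trap(x, *SETS[k]) for x in readings) / len(readings)
                for k in SETS}
    den = sum(strength.values())
    return sum(strength[k] * CENTROID[k] for k in SETS) / den if den else 0.0
```

With all readings clearly "good" the index lands at the good-class centroid; mixed readings are blended smoothly instead of being truncated the way a crisp index can truncate individual parameters.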

Relevance: 30.00%

Abstract:

The reaction of 2,6-diformylpyridine-bis(benzoylhydrazone) [dfpbbh] and 2,6-diformylpyridine-bis(4-phenylsemicarbazone) [dfpbpsc] with lanthanide salts yielded the new chelate complexes [Eu(dfpbpsc-H+)2]NO3 (1), [Dy(fbhmp)2][Dy(dfpbbh-2H+)2]·2EtOH·2H2O (fbhmp = 2-formylbenzoylhydrazone-6-methoxide-pyridine; Ph = phenyl; Py = pyridine; Et = ethyl) and [Er2(dfpbbh-2H+)2(μ-NO3)(H2O)2(OH)]·H2O. X-ray diffraction analysis was employed for the structural characterization of the three chelate complexes. In the case of complex 1, optical, synthetic and computational methods were also exploited to determine the ground-state structure and the triplet energy level of the ligand and to perform HOMO-LUMO calculations, as well as for a detailed study of its luminescence properties. © 2010 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Based on a literature review, electronic system design largely employs a top-down methodology, which is vital for success in the synthesis and implementation of electronic systems. In this context, this paper presents a new computational tool, named BD2XML, to support electronic system design. From a mixed-signal system block diagram, it generates object code in the XML markup language. XML is attractive for its flexibility and readability. BD2XML was developed under the object-oriented paradigm. The AD7528 converter, modeled in MATLAB/Simulink, was used as a case study; MATLAB/Simulink was chosen as a target due to its wide dissemination in academia and industry. This case study demonstrates the functionality of BD2XML and prompts reflection on the design challenges. An automatic tool for electronic system design therefore reduces design time and cost.
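The translation step from block diagram to XML can be sketched as follows. The element and attribute names here are hypothetical, since BD2XML's actual XML schema is not given in the abstract:

```python
import xml.etree.ElementTree as ET

# Hypothetical in-memory block diagram: a DAC block (the AD7528 from the
# case study) wired to a summing block. Names are illustrative only.
blocks = [
    {"id": "dac1", "type": "AD7528"},
    {"id": "sum1", "type": "Sum"},
]
wires = [("dac1", "sum1")]

def diagram_to_xml(blocks, wires):
    """Serialize blocks and wires into a flat XML document."""
    root = ET.Element("diagram")
    for b in blocks:
        ET.SubElement(root, "block", id=b["id"], type=b["type"])
    for src, dst in wires:
        ET.SubElement(root, "wire", src=src, dst=dst)
    return ET.tostring(root, encoding="unicode")

xml_text = diagram_to_xml(blocks, wires)
```

A real tool would also carry block parameters (gains, bit widths, port names) as child elements or attributes, which is exactly where XML's flexibility pays off.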

Relevance: 30.00%

Abstract:

Digital data sets constitute rich sources of information, which can be extracted and evaluated with computational tools such as those for Information Visualization. Web-based applications, such as social network environments, forums and virtual environments for Distance Learning, are good examples of such sources. The sheer amount of data directly impacts processing and analysis tasks. This paper presents the computational tool Mapper, defined and implemented to use visual representations (maps, graphics and diagrams) to support decision making by analyzing data stored in the Virtual Learning Environment TelEduc-Unesp. © 2012 IEEE.

Relevance: 30.00%

Abstract:

Inferences about leaf anatomical characteristics have largely been made by manually measuring diverse leaf regions, such as the cuticle, epidermis and parenchyma, to evaluate differences caused by environmental variables. Here we tested an approach for data acquisition and analysis in ecological quantitative leaf anatomy studies based on computer vision and pattern recognition methods. A case study was conducted on Gochnatia polymorpha (Less.) Cabrera (Asteraceae), a Neotropical savanna tree species with high phenotypic plasticity. We obtained digital images of cross-sections of its leaves developed under different light conditions (sun vs. shade), in different seasons (dry vs. wet) and in different soil types (oxysoil vs. hydromorphic soil), and analyzed several visual attributes from the microscopic images, such as color, texture and tissue thickness in a perpendicular plane. The experimental results demonstrated that computational analysis is capable of distinguishing anatomical alterations in microscope images obtained from individuals growing under different environmental conditions. The methods presented here offer an alternative way to determine leaf anatomical differences. © 2013 Elsevier B.V.
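The pattern recognition pipeline can be caricatured in a few lines: extract simple intensity and texture descriptors, then classify by nearest centroid. The tiny 2x4-pixel "micrographs" and the assumption that sun leaves show higher-contrast texture are purely illustrative, not the paper's actual descriptors or data:

```python
from statistics import mean, pstdev

def features(img):
    """Tiny stand-ins for color/texture descriptors: mean intensity,
    intensity spread, and mean horizontal contrast between neighbors."""
    flat = [p for row in img for p in row]
    diffs = [abs(row[i] - row[i + 1]) for row in img for i in range(len(row) - 1)]
    return (mean(flat), pstdev(flat), mean(diffs))

def nearest(feat, centroids):
    """Nearest-centroid classification by squared distance in feature space."""
    return min(centroids, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(feat, centroids[c])))

# Synthetic training 'images': high-contrast vs. smooth texture (invented)
sun = [[200, 60, 200, 60], [60, 200, 60, 200]]
shade = [[120, 130, 125, 128], [126, 122, 131, 124]]
centroids = {"sun": features(sun), "shade": features(shade)}
sample = [[190, 70, 195, 65], [70, 190, 68, 188]]
label = nearest(features(sample), centroids)
```

Real pipelines would use richer texture descriptors (e.g. co-occurrence statistics) computed over full micrographs, but the structure, features then classifier, is the same.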

Relevance: 30.00%

Abstract:

Modal analysis is widely used in the classic theory of power system modelling. The technique is also applied to model multiconductor transmission lines and their self and mutual electrical parameters. However, the methodology has particularities and inaccuracies for specific applications that are not clearly described in the technical literature. This study provides a brief review of modal decoupling as applied in transmission line digital models, and thereafter a novel, simplified computational routine is proposed to overcome the possible errors introduced by modal decoupling into the simulation/modelling algorithm. © The Institution of Engineering and Technology 2013.
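For the textbook case of a perfectly transposed two-conductor line, modal decoupling reduces to a similarity transformation with a constant orthogonal matrix. The sketch below, with invented impedance values, shows the coupled phase-domain matrix being diagonalized into ground and aerial modes:

```python
import math

# Illustrative series impedances for a perfectly transposed two-conductor
# line: zs on the diagonal (self), zm off-diagonal (mutual)
zs, zm = complex(0.30, 1.20), complex(0.10, 0.45)
Z = [[zs, zm], [zm, zs]]

s = 1 / math.sqrt(2)
T = [[s, s], [s, -s]]      # Clarke-like modal matrix; orthogonal, equals its own inverse

def matmul(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Similarity transform T Z T (T is its own inverse) diagonalizes Z:
# mode 0 (ground) -> zs + zm, mode 1 (aerial) -> zs - zm
Zm_modal = matmul(matmul(T, Z), T)
```

For untransposed or frequency-dependent lines the modal matrix is no longer constant or real, which is exactly where the errors discussed in the study can creep into digital line models.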

Relevance: 30.00%

Abstract:

This paper presents a computational fluid dynamics (CFD) application to the design of the axial fan used in an agricultural spraying system, with a theoretical and experimental comparison between the characteristic curves of the fan at several rotation speeds and numerical results for the influence of blade attack-angle variation and the optimization of the spraying system, both at the same rotation speed. The flow was considered three-dimensional, turbulent, isothermal, viscous and incompressible, in a steady state, disregarding any influence of the gravity field. The averaged turbulent field was obtained by time averaging, with the k-ε model closing the set of equations. All coupled phenomena were resolved with the fluid dynamics package CFX, which uses the finite volume technique as its numerical method. To validate the theoretical analysis, an experiment was conducted in a circular section of a horizontal wind tunnel, using a Pitot tube for pressure readings. The main results demonstrate that the methodology used, based on CFD techniques, is able to reproduce the phenomenological behavior of an axial fan in a spraying system, with results very close to the experimentally measured ones.

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 30.00%

Abstract:

Purpose - The purpose of this paper is twofold: to analyze the computational complexity of the cogeneration design problem, and to present an expert system that solves it, comparing this approach with the traditional search methods available.
Design/methodology/approach - The complexity of the cogeneration problem is analyzed through a transformation of the well-known knapsack problem. Both problems are formulated as decision problems and it is proven that the cogeneration problem is NP-complete. Thus, several search approaches, such as population heuristics and dynamic programming, could be used to solve the problem. Alternatively, a knowledge-based approach is proposed by presenting an expert system and its knowledge representation scheme.
Findings - The expert system was executed on two case studies. In the first, a cogeneration plant should meet power, steam, chilled water and hot water demands; the expert system presented two different solutions based on high-complexity thermodynamic cycles. In the second, the plant should meet only power and steam demands; the system presented three different solutions, one of which had never before been considered by the consulting expert.
Originality/value - The expert system approach is not a "blind" method: it generates solutions based on actual engineering knowledge instead of the search strategies of traditional methods. This means the system is able to explain its choices, making the design rationale for each solution available, which is the main advantage of the expert system approach over traditional search methods. On the other hand, the expert system quite likely does not provide a true optimal solution; all it can provide is one or more acceptable solutions.
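The knapsack connection is worth making concrete. Although the knapsack decision problem is NP-complete, its optimization version admits the classic pseudo-polynomial dynamic program, representative of the "traditional" search methods the paper contrasts with the expert system:

```python
def knapsack(values, weights, cap):
    """0/1 knapsack by dynamic programming over capacities.
    O(n*cap) time: pseudo-polynomial, so NP-completeness is not contradicted,
    since cap is exponential in the size of its binary encoding."""
    best = [0] * (cap + 1)
    for v, w in zip(values, weights):
        for c in range(cap, w - 1, -1):   # descending: each item used at most once
            best[c] = max(best[c], best[c - w] + v)
    return best[cap]
```

Unlike the expert system, this solver can guarantee optimality on small instances, but it cannot explain *why* a configuration is good; it has no design rationale to expose.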

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

Modeling is a necessary step in finite element analysis. Different methods of model construction are reported in the literature, such as Bio-CAD modeling. The purpose of this study was to evaluate and apply two methods of Bio-CAD modeling of a human edentulous hemi-mandible in finite element analysis. A stereolithographic model was reconstructed from CT scans of a dried human skull. Two modeling methods were applied: an STL conversion approach combined with STL simplification (Model 1) and a reverse engineering approach (Model 2). The action of the lateral pterygoid muscle was used as the loading condition in the finite element analysis, to assess total displacement (D), equivalent von Mises stress (VM) and maximum principal stress (MP). The two models differed in geometry with respect to surface count (1834 for Model 1; 282 for Model 2). Differences were also observed in the finite element meshes (30428 nodes/16683 elements for Model 1; 15801 nodes/8410 elements for Model 2). The D, VM and MP stress areas presented similar distributions in the two models. The maximum and minimum values differed: D ranged from 0 to 0.511 mm (Model 1) and 0 to 0.544 mm (Model 2); VM stress from 6.36E-04 to 11.4 MPa (Model 1) and 2.15E-04 to 14.7 MPa (Model 2); and MP stress from -1.43 to 9.14 MPa (Model 1) and -1.2 to 11.6 MPa (Model 2). Of the two Bio-CAD modeling methods, reverse engineering gave the better anatomical representation compared to the STL conversion approach. The models presented differences in finite element mesh, total displacement and stress distribution.

Relevance: 30.00%

Abstract:

The wide variety of molecular architectures used in sensors and biosensors, and the large amount of data generated by some detection principles, have motivated the use of computational methods, such as information visualization techniques, not only to handle the data but also to optimize sensing performance. In this study, we combine projection techniques with micro-Raman scattering and atomic force microscopy (AFM) to address critical issues related to practical applications of electronic tongues (e-tongues) based on impedance spectroscopy. Experimentally, we used sensing units made with thin films of a perylene derivative (AzoPTCD acronym) coating Pt interdigitated electrodes to detect CuCl2 (Cu2+), methylene blue (MB) and saccharose in aqueous solutions, selected for their distinct molecular sizes and ionic character in solution. The AzoPTCD films were deposited, from monolayers up to 120 nm, via Langmuir-Blodgett (LB) and physical vapor deposition (PVD) techniques. Because the main aspects investigated were how the interdigitated electrodes are coated by the thin films (the architecture of the e-tongue) and the film thickness, the same material was employed for all sensing units. The capacitance data were projected into a 2D plot using the force scheme method, from which we could infer that at low analyte concentrations the electrical response of the units was determined by the film thickness. Concentrations of 10 μM or higher could be distinguished with thinner films, tens of nanometers at most, which could withstand the impedance measurements without significant changes in the Raman signal of the AzoPTCD film-forming molecules. The sensitivity to the analytes appears to be related to adsorption on the film surface, as inferred from Raman spectroscopy data using MB as the analyte and from the multidimensional projections. The analysis presented here may serve as a new route for selecting materials and molecular architectures for novel sensors and biosensors, in addition to suggesting ways to unravel the mechanisms behind the high sensitivity obtained in various sensors.
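The force scheme projection can be sketched generically. This is a textbook-style reimplementation of the idea (iteratively nudging 2-D positions so inter-point distances approach those in the original space), not the authors' code; the 4-D vectors standing in for capacitance spectra are invented:

```python
import math
import random

def dist(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def force_scheme(data, iters=100, frac=0.125, seed=0):
    """Each sweep moves every projected point along its offset to every
    other point by a fraction of the distance error (2-D vs. original)."""
    rng = random.Random(seed)
    proj = [[rng.random(), rng.random()] for _ in data]
    for _ in range(iters):
        for i in range(len(data)):
            for j in range(len(data)):
                if i == j:
                    continue
                d2 = dist(proj[i], proj[j]) or 1e-9
                move = frac * (dist(data[i], data[j]) - d2) / d2
                for k in (0, 1):
                    proj[j][k] += move * (proj[j][k] - proj[i][k])
    return proj

# Two well-separated clusters in 4-D, standing in for capacitance data
data = [[0, 0, 0, 0], [0.1, 0, 0, 0], [0, 0.1, 0, 0],
        [5, 5, 5, 5], [5.1, 5, 5, 5], [5, 5.1, 5, 5]]
proj = force_scheme(data)
```

After the sweeps, points that were close in the original space stay close in 2-D while the two clusters separate, which is how distinct analyte concentrations become visually distinguishable in the projected plot.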