889 results for systems engineering
Abstract:
A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well known that the usually large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of a linear dynamic part generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically by using a back-propagation-through-time technique. The expressions relative to the Kautz basis and to generalized orthonormal bases of functions (GOBF) are addressed; the ones related to the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost function that takes into account the error of estimation of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
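The orthonormal filter bank at the heart of such an OBF model can be illustrated with the Laguerre basis, the single-pole special case mentioned above. A minimal NumPy sketch (the function name and the orthonormality check are illustrative, not taken from the paper):

```python
import numpy as np

def laguerre_outputs(u, pole, n_filters):
    """Run input u through a bank of discrete Laguerre orthonormal filters.

    L1(z) = sqrt(1 - a^2) / (1 - a z^-1); each subsequent filter cascades
    the all-pass section (z^-1 - a) / (1 - a z^-1).
    """
    a = pole
    T = len(u)
    L = np.zeros((n_filters, T))
    gain = np.sqrt(1.0 - a * a)
    for t in range(T):
        # first-order low-pass section
        L[0, t] = gain * u[t] + (a * L[0, t - 1] if t > 0 else 0.0)
        # cascaded all-pass sections
        for k in range(1, n_filters):
            L[k, t] = -a * L[k - 1, t] + (
                a * L[k, t - 1] + L[k - 1, t - 1] if t > 0 else 0.0
            )
    return L

# Impulse responses of the filters are orthonormal for any |pole| < 1:
imp = np.zeros(200)
imp[0] = 1.0
G = laguerre_outputs(imp, 0.4, 3) @ laguerre_outputs(imp, 0.4, 3).T
```

In the Wiener structure described above, these filter outputs would feed a static polynomial mapping; the paper's contribution is computing exact gradients of the `L[k]` signals with respect to `pole` so the pole can be optimized from input-output data.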
Abstract:
In this article we address decomposition strategies especially tailored to perform strong coupling of dimensionally heterogeneous models, under the hypothesis that one wants to solve each submodel separately and implement the interaction between subdomains by boundary conditions alone. The novel methodology takes full advantage of the small number of interface unknowns in this kind of problem. Existing algorithms can be viewed as variants of the 'natural' staggered algorithm in which each domain transfers function values to the other and receives fluxes (or forces), and vice versa. This natural algorithm is known as Dirichlet-to-Neumann in the domain decomposition literature. Essentially, we propose a framework in which this algorithm is equivalent to applying Gauss-Seidel iterations to a suitably defined (linear or nonlinear) system of equations. It is then immediate to switch to other iterative solvers such as GMRES or other Krylov-based methods, which we assess through numerical experiments showing the significant gain that can be achieved. Indeed, the benefit is that an extremely flexible, automatic coupling strategy can be developed, which in addition leads to iterative procedures that are parameter-free and rapidly converging. Further, in linear problems they have the finite termination property. Copyright (C) 2009 John Wiley & Sons, Ltd.
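A toy instance makes the 'natural' Dirichlet-to-Neumann exchange concrete: take -u'' = 1 on (0, 1) with u(0) = u(1) = 0, split at x = 1/2. Each subdomain solve is available in closed form here, so the staggered exchange of interface value and flux reduces to a scalar fixed point. This is a hedged sketch; the relaxation factor and the closed-form solves are illustrative choices, not from the article:

```python
GAMMA = 0.5  # interface location

def left_dirichlet(lam):
    """Solve -u'' = 1 on (0, GAMMA) with u(0) = 0, u(GAMMA) = lam
    (closed form u = -x^2/2 + c x); return the interface flux u'(GAMMA)."""
    c = (lam + GAMMA ** 2 / 2) / GAMMA
    return -GAMMA + c

def right_neumann(q):
    """Solve -u'' = 1 on (GAMMA, 1) with u'(GAMMA) = q, u(1) = 0;
    return the interface trace u(GAMMA)."""
    c = q + GAMMA
    d = 0.5 - c
    return -GAMMA ** 2 / 2 + c * GAMMA + d

lam, theta = 0.0, 0.5  # interface unknown and relaxation factor
for _ in range(20):
    lam = theta * right_neumann(left_dirichlet(lam)) + (1 - theta) * lam
```

For this symmetric splitting the unrelaxed map is lam' = 1/4 - lam, which oscillates forever; with theta = 0.5 the iteration lands on the exact interface value u(1/2) = 1/8 at once. This sensitivity to a hand-tuned parameter is exactly what motivates reformulating the interface problem so that Krylov solvers can be applied instead.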
Abstract:
Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present some sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
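The column generation step can be sketched for the pure cutting stock part. With a restricted master containing only the 'homogeneous' patterns (pattern i cuts floor(W/w_i) pieces of item i), the duals are y_i = 1/floor(W/w_i), and pricing reduces to an unbounded knapsack over pattern counts. A hedged Python sketch with an illustrative instance (the function name and data are not from the paper):

```python
def price_pattern(widths, duals, W):
    """Pricing subproblem of Gilmore-Gomory column generation: an
    unbounded knapsack maximizing the total dual value of a cutting
    pattern; a value > 1 signals an improving column for the master."""
    best = [0.0] * (W + 1)
    take = [-1] * (W + 1)   # item chosen at capacity c (-1: carry from c-1)
    for c in range(1, W + 1):
        best[c] = best[c - 1]
        for i, (w, y) in enumerate(zip(widths, duals)):
            if w <= c and best[c - w] + y > best[c] + 1e-12:
                best[c], take[c] = best[c - w] + y, i
    pattern, c = [0] * len(widths), W
    while c > 0:
        if take[c] == -1:
            c -= 1
        else:
            pattern[take[c]] += 1
            c -= widths[take[c]]
    return best[W], pattern

# Toy instance: roll width 10, item widths 3 and 4; duals from the
# homogeneous restricted master are y_i = 1 / floor(W / w_i).
widths, W = [3, 4], 10
duals = [1.0 / (W // w) for w in widths]
value, pattern = price_pattern(widths, duals, W)
```

Here the pricer finds the pattern (2, 1) with reduced-cost value 7/6 > 1, so it would be added as a new column; the full method iterates this with a master LP re-solve until no improving pattern exists.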
Abstract:
The amount of textual information stored digitally is growing every day. However, our capability of processing and analyzing that information is not growing at the same pace. To overcome this limitation, it is important to develop semiautomatic processes to extract relevant knowledge from textual information, such as the text mining process. One of the main and most expensive stages of the text mining process is text pre-processing, where the unstructured text has to be transformed into a structured format such as an attribute-value table. The stemming process, i.e., linguistic normalization, is usually used to find the attributes of this table. However, the stemming process is strongly dependent on the language in which the original textual information is given. Furthermore, for most languages, the stemming algorithms proposed in the literature are computationally expensive. In this work, several improvements of the well-known Porter stemming algorithm for the Portuguese language, which exploit the characteristics of this language, are proposed. Experimental results show that the proposed algorithm executes in far less time without affecting the quality of the generated stems.
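To make the suffix-stripping idea concrete, here is a deliberately tiny rule table in the spirit of a Portuguese stemmer. The real algorithm has many more rules organized in phases, so treat this purely as an illustration of the mechanism:

```python
# (suffix, replacement, minimum remaining stem length) -- longest first.
RULES = [
    ("amento", "", 3),
    ("imento", "", 3),
    ("mente", "", 4),
    ("istas", "", 3),
    ("ções", "ção", 3),
    ("ista", "", 3),
    ("s", "", 2),
]

def stem(word):
    """Strip the first matching suffix whose remaining stem is long enough."""
    for suffix, repl, min_len in RULES:
        if word.endswith(suffix) and len(word) - len(suffix) >= min_len:
            return word[: len(word) - len(suffix)] + repl
    return word
```

The minimum-stem-length guard is what prevents over-stemming of short words; the paper's speed improvements come from reorganizing how such rules are tested, not from changing the linguistic rules themselves.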
Abstract:
Localization and mapping are two of the most important capabilities for autonomous mobile robots and have been receiving considerable attention from the scientific computing community over the last 10 years. One of the most efficient methods to address these problems is based on the use of the Extended Kalman Filter (EKF). The EKF simultaneously estimates a model of the environment (map) and the position of the robot based on odometric and exteroceptive sensor information. As this algorithm demands a considerable amount of computation, it is usually executed on high-end PCs coupled to the robot. In this work we present an FPGA-based architecture for the EKF algorithm that is capable of processing two-dimensional maps containing up to 1.8k features in real time (14 Hz), a three-fold improvement over a Pentium M 1.6 GHz and a 13-fold improvement over an ARM920T 200 MHz. The proposed architecture also consumes only 1.3% of the Pentium's and 12.3% of the ARM's energy per feature.
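The computation that such an architecture accelerates is the standard EKF predict/update cycle. A generic NumPy sketch (the Jacobians F and H are passed in precomputed here, a simplification of the full localization-and-mapping formulation, and nothing below is specific to the FPGA datapath):

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F, H, Q, R):
    """One EKF cycle: predict with motion model f, correct with
    measurement model h.  F, H are the Jacobians of f, h at the
    current estimate; Q, R are process and measurement noise."""
    x_pred = f(x, u)
    P_pred = F @ P @ F.T + Q
    y = z - h(x_pred)                      # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

The matrix products over a covariance that grows with the number of map features are what dominates the cost, which is why dedicated hardware pays off at the map sizes quoted above.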
Abstract:
Although the literature presents several alternatives, an approach based on electronic analogy had not yet been considered for an inductor-free realization of the double-scroll Chua's circuit. This paper presents a new inductor-free configuration of Chua's circuit based on electronic analogy. This proposal results in a versatile and functional inductorless implementation of Chua's circuit that offers new and interesting features for several applications. The analogous circuit is implemented and used to perform an experimental mapping of a large variety of attractors.
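Whatever the realization, inductor-based or analog-computer style, the dynamics being implemented are the classic three-state Chua equations with a piecewise-linear nonlinearity. A numerical sketch with the textbook double-scroll parameters (these values are the standard ones from the literature, not taken from this paper):

```python
import numpy as np

ALPHA, BETA = 15.6, 28.0        # classic double-scroll parameters
M0, M1 = -8.0 / 7.0, -5.0 / 7.0

def chua_rhs(s):
    """Dimensionless Chua equations with piecewise-linear diode f(x)."""
    x, y, z = s
    fx = M1 * x + 0.5 * (M0 - M1) * (abs(x + 1) - abs(x - 1))
    return np.array([ALPHA * (y - x - fx), x - y + z, -BETA * y])

def rk4_step(s, dt):
    k1 = chua_rhs(s)
    k2 = chua_rhs(s + dt / 2 * k1)
    k3 = chua_rhs(s + dt / 2 * k2)
    k4 = chua_rhs(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

s = np.array([0.7, 0.0, 0.0])
traj = [s]
for _ in range(20000):
    s = rk4_step(s, 0.005)
    traj.append(s)
traj = np.array(traj)
```

Plotting `traj[:, 0]` against `traj[:, 2]` shows the double-scroll attractor that the experimental mapping in the paper explores over a range of parameters.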
Abstract:
The aim of task scheduling is to minimize the makespan of applications while making the best possible use of shared resources. Applications have requirements which call for customized environments for their execution. One way to provide such environments is to use virtualization on demand. This paper presents two schedulers based on integer linear programming which schedule virtual machines (VMs) on grid resources and tasks on these VMs. The schedulers differ from previous work by the joint scheduling of tasks and VMs and by considering the impact of the available bandwidth on the quality of the schedule. Experiments show the efficacy of the schedulers in scenarios with different network configurations.
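The joint decision, placing VMs on resources and tasks on VMs at the same time, can be illustrated with a brute-force toy in place of the integer linear program. The cost model below (a VM pays an image-transfer delay of image_size/bandwidth before running its tasks) is an illustrative assumption, not the paper's formulation:

```python
from itertools import product

def best_makespan(task_times, bandwidths, image_size):
    """Exhaustively search placements of 2 VMs on resources and
    assignments of tasks to VMs; return the minimum makespan.
    An unused VM is simply not instantiated and costs nothing."""
    n = len(task_times)
    names = list(bandwidths)
    best = float("inf")
    for placement in product(names, repeat=2):       # VM -> resource
        for assign in product(range(2), repeat=n):   # task -> VM
            loads = [0.0, 0.0]
            for t, v in zip(task_times, assign):
                loads[v] += t
            spans = [image_size / bandwidths[placement[v]] + loads[v]
                     for v in range(2) if loads[v] > 0]
            best = min(best, max(spans))
    return best
```

On the toy instance below the optimum places both VMs on the fast resource and balances the tasks, which is the kind of bandwidth-aware trade-off the ILP schedulers capture at scale.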
Abstract:
The assessment of routing protocols for mobile wireless networks is a difficult task because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay-tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph-theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, in view of the construction of least-cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there is no study about the use of such theoretical results in practical situations. Therefore, the objective of our work is to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator to first implement an evolving-graph-based routing protocol, and then use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, at least those with predictable dynamics. In order to make this model widely applicable, however, some practical issues, such as adaptive algorithms, still have to be addressed and incorporated into the model. We also discuss such issues in this paper, as a result of our experience.
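The evolving-graph primitive behind such a benchmark protocol is the foremost-journey (earliest-arrival) computation. A hedged sketch in which edges are labeled with the times at which they are up and traversal takes one time unit (the data layout and unit traversal time are illustrative assumptions):

```python
import heapq

def earliest_arrival(edges, source, travel=1):
    """Foremost journeys in an evolving graph.

    edges: dict mapping (u, v) to a sorted list of departure times at
    which the directed link is up.  Returns the earliest arrival time
    at every reachable node (waiting at a node is allowed)."""
    arrival = {source: 0}
    pq = [(0, source)]
    while pq:
        t, u = heapq.heappop(pq)
        if t > arrival.get(u, float("inf")):
            continue  # stale queue entry
        for (a, b), times in edges.items():   # O(E) scan; fine for a sketch
            if a != u:
                continue
            dep = next((s for s in times if s >= t), None)
            if dep is None:
                continue  # link never up again after our arrival
            arr = dep + travel
            if arr < arrival.get(b, float("inf")):
                arrival[b] = arr
                heapq.heappush(pq, (arr, b))
    return arrival
```

Note how the indirect journey a→c→b below beats the direct a→b link because the direct link only comes up later, a situation impossible in a static graph and exactly what classical ad hoc protocols fail to exploit.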
Abstract:
Effluents from pesticide industries are very difficult to decontaminate and are characterized by high organic load and toxicity. The research group of the Center for Chemical Systems Engineering (CESQ) at the Department of Chemical Engineering of the Polytechnic School of the University of São Paulo and the Department of Chemical Engineering of the Federal University of Rio Grande do Norte has been applying Advanced Oxidation Processes (AOPs) to the degradation of various types of pollutants. These processes are based on the generation of hydroxyl radicals, highly reactive species. This dissertation aims to explore this kind of process, since it has proven to be quite effective in removing organic load. The photo-Fenton process was therefore applied to the degradation of the fungicide thiophanate-methyl in aqueous systems using an annular reactor (with a Philips HPLN 125 W lamp) and a solar reactor. Samples were collected during the experiments and analyzed for dissolved organic carbon (TOC) using Shimadzu TOC analyzers (Shimadzu 5050A and VCP). The Doehlert experimental design was used to evaluate the influence of ultraviolet radiation and of the concentrations of thiophanate-methyl (C12H14N4O4S2), hydrogen peroxide (H2O2) and iron ions (Fe2+). Among these parameters, the best experimental conditions were found to be [Fe2+] = 0.6 mmol/L and [H2O2] = 0.038 mol/L in experiments EXP 5 and SOL 5, yielding TOC removals of 60% in the annular reactor and 75% in the solar reactor.
Abstract:
Transport systems shape the use of territory in different Brazilian cities with regard to the occupation of road systems in urban areas. Systems engineering and transport infrastructure such as roads, signs, stops, stations and road complexes (bridges, viaducts and tunnels) are not used in the same way across the territory: subway users do not use the infrastructure in the same way as bus users, and vice versa. The time spent in travel, the access time and the number of trips made by passengers are not the same in each mode of transport. The use of transport systems in the territory thus takes place as a whole in the current technical-scientific-informational period. This work treats the used territory as a synonym of geographical space, analyzed through two categories of analysis: systems of objects, formed by fixed elements, and systems of actions, formed by flows. The system analyzed is public transport by bus and the displacement of the population that uses it between home and work, taking as its empirical cut the Lagoa Azul neighborhood, located in the northern administrative district of Natal/RN. The general objective of this research is to understand the extent to which public transport has contributed to the socio-spatial accessibility of the residents of Lagoa Azul, emphasizing the journey between home and the workplace. To reach this objective, a study was carried out combining empirical facts, statistical data and theoretical knowledge of the neighborhood's economic aspects, making use of the concepts of mobility and accessibility.
Abstract:
This article reviews the main codes used to detect and correct errors in data communication, specifically in computer networks. The Hamming code and the Cyclic Redundancy Check (CRC) are the focus of this article, together with a CRC hardware implementation. Each code is reviewed in detail in order to fill gaps in the literature and to make the subject accessible to computer science and engineering students, as well as to anyone interested in learning techniques for handling errors in data communication.
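Both codes are short enough to sketch directly: a bitwise CRC-32 (the reflected IEEE 802.3 polynomial, here in software rather than the article's hardware form) and a Hamming(7,4) encoder/decoder that corrects any single-bit error:

```python
def crc32(data: bytes) -> int:
    """Bitwise CRC-32 with the reflected IEEE 802.3 polynomial 0xEDB88320."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0xEDB88320 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

def hamming74_encode(nibble: int) -> list:
    """Encode 4 data bits into a 7-bit codeword (positions 1..7, data at
    3, 5, 6, 7); parity bits at 1, 2, 4 make the XOR of the positions of
    all set bits equal to zero."""
    bits = [0] * 8  # index 0 unused; positions 1..7
    for i, pos in enumerate((3, 5, 6, 7)):
        bits[pos] = (nibble >> i) & 1
    x = 0
    for pos in range(1, 8):
        if bits[pos]:
            x ^= pos
    bits[1], bits[2], bits[4] = x & 1, (x >> 1) & 1, (x >> 2) & 1
    return bits[1:]

def hamming74_decode(code: list) -> int:
    """Correct up to one flipped bit, then extract the 4 data bits."""
    bits = [0] + list(code)
    syndrome = 0
    for pos in range(1, 8):
        if bits[pos]:
            syndrome ^= pos
    if syndrome:                # nonzero syndrome = error position
        bits[syndrome] ^= 1
    return sum(bits[pos] << i for i, pos in enumerate((3, 5, 6, 7)))
```

For reference, crc32(b"123456789") yields the standard CRC-32 check value 0xCBF43926, which is a convenient self-test for any implementation, hardware or software.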
Abstract:
The main concern in Wireless Sensor Network (WSN) algorithms and protocols is energy consumption. Thus, the WSN lifetime is one of the most important metrics used to measure the performance of WSN approaches. Another important metric is spatial coverage, where the main goal is to obtain sensed data in a uniform way. This paper proposes an approach called the (m,k)-Gur Game that aims at a trade-off between quality of service and increased spatial-coverage diversity. Simulation results show the effectiveness of this approach. © 2012 IEEE.
Abstract:
The Internet of Things is a new paradigm in which smart embedded devices and systems are connected to the Internet. In this context, Wireless Sensor Networks (WSN) are becoming an important alternative for sensing and actuation in critical applications like industrial automation, remote patient monitoring and domotics. The IEEE 802.15.4 protocol has been adopted as a standard for WSN, and the 6LoWPAN protocol has been proposed to overcome the challenges of integrating WSN and Internet protocols. In this paper, the mechanisms of header compression and fragmentation of IPv6 datagrams proposed in the 6LoWPAN standard were evaluated through field experiments using a gateway prototype and IEEE 802.15.4 nodes.
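The fragmentation mechanism evaluated above can be sketched at the logical level: the first fragment (FRAG1) announces the datagram size and a tag, subsequent fragments (FRAGN) additionally carry an offset expressed in 8-byte units, so every fragment body except the last must hold a multiple of 8 bytes. The tuple-based headers below are a simplification for illustration, not the bit-exact 6LoWPAN encoding:

```python
def fragment(payload: bytes, tag: int, max_frag: int = 96):
    """Split an IPv6 datagram payload into 6LoWPAN-style fragments."""
    size = len(payload)
    body = max_frag - max_frag % 8   # keep FRAGN offsets 8-byte aligned
    frags, offset = [], 0
    while offset < size:
        chunk = payload[offset:offset + body]
        if offset == 0:
            frags.append(("FRAG1", size, tag, chunk))
        else:
            frags.append(("FRAGN", size, tag, offset // 8, chunk))
        offset += len(chunk)
    return frags

def reassemble(frags):
    """Reorder fragment bodies by offset and check the announced size."""
    chunks, size = {}, None
    for f in frags:
        if f[0] == "FRAG1":
            _, size, _tag, chunk = f
            chunks[0] = chunk
        else:
            _, size, _tag, off8, chunk = f
            chunks[off8 * 8] = chunk
    data = b"".join(chunks[k] for k in sorted(chunks))
    assert size is not None and len(data) == size
    return data
```

Because every fragment repeats the (size, tag) pair, the receiver can reassemble fragments arriving out of order and interleave datagrams from different senders, which is what the field experiments exercise over lossy 802.15.4 links.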
Abstract:
The use of computer-assisted technologies such as CAD (Computer-Aided Design), CAM (Computer-Aided Manufacturing), CAE (Computer-Aided Engineering) and CNC (Computer Numerical Control) is a priority for engineers and product designers. However, dimensional comparison between the virtual design and the real product still requires research and the dissemination of procedures among users. This work applies these technologies to the analysis and measurement of a CNC milling machine designed and assembled at the university. By 3D-scanning the machined samples and comparing the resulting images with the original virtual files, it was possible to contrast the dimensions of the samples with the original virtual dimensions; the deviations between real and virtual are within acceptable limits for this type of equipment. As a secondary objective, this work seeks to disseminate these technologies and make their use more accessible.