17 results for Trial and error

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

In industrial informatics, several attempts have been made to develop notations and semantics for classifying and describing different kinds of system behavior, particularly in the modeling phase. Such attempts provide the infrastructure to solve real engineering problems and to build practical systems that aim mainly to increase the productivity, quality, and security of the process. Despite the many studies that have attempted to develop friendly methods for programming industrial controllers, they are still programmed by conventional trial-and-error methods and, in practice, there is little written documentation on these systems. The ideal solution would be a computational environment that allows industrial engineers to implement the system using a high-level language that follows international standards. Accordingly, this work proposes a methodology for plant and control modeling of discrete event systems that include sequential, parallel and timed operations, using a formalism based on Statecharts, called Basic Statechart (BSC). The methodology also provides automatic procedures to validate and implement these systems. To validate our methodology, we present two case studies with typical examples from the manufacturing sector. The first shows a sequential control for a tagging machine and is used to illustrate dependencies between the devices of the plant. In the second, we discuss more than one strategy for controlling a manufacturing cell. The model with no control has 72 states (distinct configurations); the model with sequential control generated 20 different states, acting in only 8 distinct configurations; and the model with parallel control generated 210 different states, acting in only 26 distinct configurations, making this control strategy less restrictive than the previous one. Lastly, we present an example that highlights the modular character of our methodology, which is very important for application maintenance. In this example, the sensors that identify pieces in the plant were removed, so changes in the control model are needed to transmit the information from the input buffer sensor to the other positions of the cell.
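
The BSC formalism itself is not reproduced in the abstract, so the following is only a minimal Python sketch of the kind of sequential control model discussed: a flat state machine for a hypothetical tagging machine. State and event names are invented for illustration.

```python
# Minimal statechart-like sketch of a sequential plant operation.
# BSC is richer than this (parallel and timed states); all names
# here are hypothetical.

class StateMachine:
    def __init__(self, initial, transitions):
        # transitions maps (state, event) -> next state
        self.state = initial
        self.transitions = transitions

    def fire(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

# Sequential control for a hypothetical tagging machine:
# wait for a piece, clamp it, tag it, release it.
tagger = StateMachine("idle", {
    ("idle", "piece_detected"): "clamping",
    ("clamping", "clamped"): "tagging",
    ("tagging", "tag_done"): "releasing",
    ("releasing", "released"): "idle",
})

for event in ["piece_detected", "clamped", "tag_done", "released"]:
    print(event, "->", tagger.fire(event))
```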

Relevance:

100.00%

Publisher:

Abstract:

Two-dimensional periodic structures called frequency selective surfaces (FSS) have been widely investigated because of their filtering properties. Like filters that work at traditional radiofrequency bands, such structures can behave as band-stop or band-pass filters, depending on the elements of the array (patch or aperture, respectively), and can be used in a variety of applications, such as radomes, dichroic reflectors, waveguide filters, artificial magnetic conductors, and microwave absorbers. To provide high-performance filtering at microwave bands, electromagnetic engineers have investigated various types of periodic structures: reconfigurable frequency selective screens, multilayered selective filters, and periodic arrays printed on anisotropic dielectric substrates or composed of fractal elements. In general, there is no closed-form solution leading directly from a desired frequency response to a corresponding device; the analysis of its scattering characteristics requires rigorous full-wave techniques. Moreover, because of the computational cost of using a full-wave simulator to evaluate the scattering variables of a frequency selective surface, many electromagnetic engineers still rely on a trial-and-error process until a given design criterion is achieved. As this procedure is laborious and highly dependent on the designer, optimization techniques are required to design practical periodic structures with the desired filter specifications. Some authors have employed neural networks and natural optimization algorithms, such as genetic algorithms and particle swarm optimization, for frequency selective surface design and optimization. The objective of this work is a rigorous study of the electromagnetic behavior of periodic structures, enabling the design of efficient devices for the microwave band. To this end, artificial neural networks are used together with natural optimization techniques, allowing various types of frequency selective surfaces to be investigated accurately and efficiently, in a simple and fast manner, becoming a powerful tool for the design and optimization of such structures.
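
As a rough illustration of the ANN-plus-natural-optimization idea described above, the sketch below runs a basic particle swarm optimizer over a stand-in surrogate function; in the actual work the surrogate would be a neural network trained on full-wave simulations, and the geometric parameters, bounds and target frequency here are all hypothetical.

```python
# Hedged sketch: replacing full-wave trial and error with a fast
# surrogate inside a particle swarm optimizer.
import numpy as np

rng = np.random.default_rng(0)

def surrogate_resonance_ghz(x):
    # Stand-in for a trained ANN: maps (patch width, periodicity)
    # in mm to a predicted resonant frequency in GHz.
    w, p = x
    return 30.0 / w + 2.0 / p

def cost(x, target=10.0):
    return (surrogate_resonance_ghz(x) - target) ** 2

# Basic PSO over the two geometric parameters.
lo, hi = np.array([2.0, 5.0]), np.array([8.0, 15.0])
pos = rng.uniform(lo, hi, size=(30, 2))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(x) for x in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(100):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    c = np.array([cost(x) for x in pos])
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best geometry (mm):", gbest,
      "-> f_res ~", surrogate_resonance_ghz(gbest), "GHz")
```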

Relevance:

100.00%

Publisher:

Abstract:

Despite the emergence of other forms of artificial lift, sucker rod pumping systems remain dominant because of their operational flexibility and lower investment cost compared with other lifting techniques. Successful rod pumping sizing requires delivering the estimated flow while controlling the wear of the pumping equipment used in the mounted configuration. Balancing these elements is particularly challenging, especially for the many designers who still lack the experience needed to produce good pumping designs in reasonable time. Even with the various computer applications available to ease this task, they must face a grueling process of trial and error until the most appropriate combination of equipment for installation in the well is found. This thesis proposes an expert system for the design of sucker rod pumping systems. Its mission is to guide a petroleum engineer in selecting a range of equipment appropriate to the context given by the characteristics of the oil to be raised to the surface. Features such as the level of gas separation, the presence of corrosive elements, and the possibility of sand production and waxing are taken into account when selecting the pumping unit, sucker-rod string and subsurface pump, and their operation mode. The system approximates the inference process to human reasoning, which leads to results closer to those obtained by a specialist. To this end, its production rules are based on the theory of fuzzy sets, which can model the vague concepts typically present in human reasoning. The operating parameters of the pumping system are calculated by the API RP 11L method. From the input information, the system returns to the user a set of pumping configurations that meet a given design flow without subjecting the selected equipment to an effort beyond what it can bear.
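
To make the fuzzy production-rule idea concrete, here is a minimal sketch of one hypothetical rule; the membership shapes, thresholds and equipment recommendation are invented and do not come from the thesis's rule base, whose parameters the abstract does not give.

```python
# Hedged sketch of one fuzzy production rule for equipment selection.

def ramp_up(x, a, b):
    # Membership rises linearly from 0 at a to 1 at b (a "high" set).
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

# Fuzzify illustrative well conditions.
sand_high = ramp_up(180.0, 50.0, 300.0)  # sand production, ppm
h2s_high = ramp_up(15.0, 5.0, 20.0)      # corrosive H2S, ppm

# Mamdani-style conjunction: IF sand is high AND H2S is high THEN
# recommend a sand-tolerant pump with corrosion-resistant rods.
firing = min(sand_high, h2s_high)
print(f"rule strength: {firing:.2f}")
if firing > 0.5:
    print("suggest: sand-tolerant subsurface pump + coated rod string")
```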

Relevance:

80.00%

Publisher:

Abstract:

Droughts surfaced in 1877 as a crucial problem for the nascent Brazilian nation. Engineers, who formed the country's technical and scientific elite, took it upon themselves to study, understand and fight the problem through planned interventions in space. This work, based on proposals and discussions contained in engineering magazines and reports, aims to provide elements for understanding how these systematized actions against droughts, in the late nineteenth and early twentieth centuries, contributed to spatial analysis and to the formation of a (then-inexistent) regional and territorial planning discipline in Brazil. Engineers, by taking up the position of masterminds of the country's modernization, secured for themselves economic stability, social prestige and political power. By understanding nature, either as a resource to be exploited or as an adversary to national progress, they contributed to the delimitation of the region now known as the Northeast. By seeking to understand the drought phenomenon, they created knowledge about the space they sought to intervene in; by constructing their projects amid political and economic difficulty, they changed the organizational structures of city and countryside in the Northeast. The proposals for açudes (large water reservoirs) allowed populations to settle and resist droughts; the roads, both railroads and automotive roadways, connected the sertão to the capitals and the coast, speeding up relief to affected populations during droughts and allowing the circulation of goods so as to strengthen local economies in normal times. The practices and techniques adopted, adapted from foreign experience and developed through trial and improvement, were consolidated as an eminently spatial course of intervention, even though no theoretical body of regional or territorial planning was formed in Brazil. Regional planning proper was first applied in the country in the Northeast itself, in the 1950s, based on an economic view of reality aimed at development. The engineers' work prior to that date, however, cannot be disregarded. It was shown that, despite facing financial and political hurdles, engineers had a profound commitment to the problem and intended to act systematically to transform the economic and social relations in the region, in order to be victorious in their struggle against droughts.

Relevance:

80.00%

Publisher:

Abstract:

This work presents a proposal to detect the interface in atmospheric oil tanks by installing a differential pressure level transmitter to infer the oil-water interface. The main goal of the project is to maximize the quantity of free water delivered to the drainage line by controlling the interface. A fuzzy controller was implemented using the interface transmitter as the process variable. Two ladder routines were created to perform the control: one computes the error and the error variation, and the other implements the fuzzy controller itself. Applying its rules to these variables, the fuzzy controller sets the output, which is the position variation of the drainage valve. Although the ladder routines were implemented on an Allen-Bradley PLC of the ControlLogix family, they can be implemented on any brand of PLC.
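
A minimal sketch of the control scheme described, rendered in Python rather than ladder logic for readability: one routine computes the error and its variation, the other maps them through a fuzzy rule table to a valve-position increment. The membership ranges, rule outputs and sign convention are illustrative assumptions, not the plant's tuning.

```python
# Hedged sketch of an error / error-variation fuzzy controller whose
# output is a drainage-valve position increment.

def tri(x, a, b, c):
    # Triangular membership function.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_valve_step(error, d_error):
    # Fuzzify error (m) and its variation into NEG/ZERO/POS sets.
    e = {"N": tri(error, -2, -1, 0), "Z": tri(error, -1, 0, 1),
         "P": tri(error, 0, 1, 2)}
    de = {"N": tri(d_error, -0.3, -0.15, 0),
          "Z": tri(d_error, -0.15, 0, 0.15),
          "P": tri(d_error, 0, 0.15, 0.3)}
    # Rule table: (error set, d_error set) -> crisp valve step (%),
    # resolved by weighted average (zero-order Sugeno style).
    rules = {("N", "N"): -4, ("N", "Z"): -2, ("N", "P"): 0,
             ("Z", "N"): -1, ("Z", "Z"): 0, ("Z", "P"): 1,
             ("P", "N"): 0, ("P", "Z"): 2, ("P", "P"): 4}
    num = sum(min(e[i], de[j]) * out for (i, j), out in rules.items())
    den = sum(min(e[i], de[j]) for (i, j) in rules)
    return num / den if den else 0.0

prev_error = 0.0
for interface_level, setpoint in [(1.2, 1.0), (1.15, 1.0), (1.05, 1.0)]:
    error = interface_level - setpoint
    step = fuzzy_valve_step(error, error - prev_error)
    prev_error = error
    print(f"error={error:+.2f} m -> move valve by {step:+.2f}%")
```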

Relevance:

80.00%

Publisher:

Abstract:

The development of wireless sensor networks for control and monitoring functions has created a vibrant research scenario, covering topics from communication aspects to energy efficiency. When source sensors are endowed with cameras for visual monitoring, a new set of challenges arises, as transmission and monitoring requirements change considerably. In particular, visual sensors collect data following a directional sensing model, which alters the meaning of concepts such as vicinity and redundancy but allows source nodes to be differentiated by their sensing relevance to the application. In this context, we propose the combined use of two differentiation strategies as a novel QoS parameter, exploring the sensing relevance of source nodes and DWT image coding. This approach supports a new range of optimizations that improve the performance of visual sensor networks at the cost of a small reduction in the overall monitoring quality of the application. Besides defining a new concept of relevance and proposing mechanisms for its practical exploitation, we propose five different optimizations in the way images are transmitted in wireless visual sensor networks, aimed at energy saving, low-delay transmission and error recovery. Taken together, the proposed differentiation strategies and related optimizations open a relevant research trend in which the application's monitoring requirements guide a more efficient operation of sensor networks.
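
As a rough sketch of relevance-based differentiation combined with DWT coding, the example below zeroes detail subbands of a wavelet decomposition according to a node's sensing relevance, trading reconstruction quality for transmitted data. It uses the PyWavelets package; the relevance thresholds and level choices are invented.

```python
# Hedged sketch: low-relevance sources discard high-frequency DWT
# subbands before transmission.
import numpy as np
import pywt

def encode_for_relevance(image, relevance):
    # Two-level 2-D DWT; coeffs = [cA2, (cH2, cV2, cD2), (cH1, cV1, cD1)]
    coeffs = pywt.wavedec2(image, "haar", level=2)
    if relevance < 0.3:
        # Least relevant: keep only the coarse approximation.
        coeffs = [coeffs[0]] + [tuple(np.zeros_like(c) for c in band)
                                for band in coeffs[1:]]
    elif relevance < 0.7:
        # Mid relevance: drop only the finest detail level.
        coeffs[-1] = tuple(np.zeros_like(c) for c in coeffs[-1])
    return pywt.waverec2(coeffs, "haar")

frame = np.random.rand(64, 64)
for r in (0.1, 0.5, 0.9):
    out = encode_for_relevance(frame, r)
    print(f"relevance {r}: reconstruction error "
          f"{np.abs(out - frame).mean():.4f}")
```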

Relevance:

80.00%

Publisher:

Abstract:

This work describes the experimental implementation of a shunt active power filter applied to a three-phase induction generator. The control strategy of the active filter addresses both the excitation control of the machine and the reduction of harmonics in the generator output current. The work involved the implementation of digital PWM switching, and two techniques for obtaining the reference currents were compared. The first technique is based on the synchronous dq reference method and the second on the theory of instantaneous power. The comparison is performed through simulation and experimental results. To obtain the experimental results, a test bench was assembled, and the required control and communication were implemented on a TMS320F2812 DSP. The simulation results and experimental data confirmed the effectiveness of the filter, with the instantaneous power technique standing out.
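
A minimal numpy sketch of the instantaneous power (p-q) route to the filter reference currents, which the abstract singles out: Clarke transform, instantaneous real and imaginary powers, and compensation currents built from the oscillating real power plus all imaginary power. The test signals, the moving-average extraction of the mean power and all numeric values are illustrative.

```python
# Hedged sketch of p-q theory reference-current computation.
import numpy as np

def clarke(a, b, c):
    # Power-invariant Clarke (abc -> alpha-beta) transform.
    alpha = np.sqrt(2 / 3) * (a - 0.5 * b - 0.5 * c)
    beta = np.sqrt(2 / 3) * (np.sqrt(3) / 2) * (b - c)
    return alpha, beta

t = np.linspace(0, 0.1, 5000)  # 0.1 s sampled at 50 kHz
w = 2 * np.pi * 60
va, vb, vc = (np.sin(w * t + ph) for ph in (0, -2 * np.pi / 3, 2 * np.pi / 3))
# Distorted load currents: fundamental plus a 5th harmonic.
ia = np.sin(w * t - 0.3) + 0.2 * np.sin(5 * w * t)
ib = np.sin(w * t - 0.3 - 2 * np.pi / 3) + 0.2 * np.sin(5 * (w * t - 2 * np.pi / 3))
ic = -(ia + ib)  # three-wire system, no zero sequence

valpha, vbeta = clarke(va, vb, vc)
ialpha, ibeta = clarke(ia, ib, ic)
p = valpha * ialpha + vbeta * ibeta   # instantaneous real power
q = vbeta * ialpha - valpha * ibeta   # instantaneous imaginary power
p_osc = p - np.convolve(p, np.ones(833) / 833, mode="same")  # ~1-cycle mean

# The filter injects the oscillating real power and all imaginary power.
den = valpha**2 + vbeta**2
ic_alpha = (valpha * p_osc + vbeta * q) / den
ic_beta = (vbeta * p_osc - valpha * q) / den
print("reference current RMS (alpha):",
      round(float(np.sqrt(np.mean(ic_alpha**2))), 3))
```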

Relevance:

80.00%

Publisher:

Abstract:

In this work, calibration models were constructed to determine the total lipid and moisture content of powdered milk samples. For this, near-infrared diffuse reflectance spectroscopy was used, combined with multivariate calibration. Initially, the spectral data were submitted to multiplicative scatter correction (MSC) and Savitzky-Golay smoothing. The samples were then divided into subgroups by hierarchical cluster analysis (HCA) with the Ward linkage criterion. This made it possible to build partial least squares (PLS) regression models for the calibration and prediction of total lipid and moisture content, based on values obtained by the reference methods, Soxhlet extraction and oven drying at 105 °C, respectively. We conclude that NIR performed well for the quantification of powdered milk samples, mainly by minimizing analysis time, preserving the samples and generating no waste. The prediction model for total lipids achieved a correlation (R) of 0.9955 and an RMSEP of 0.8952, with an average error between Soxhlet and NIR of ±0.70%, while the moisture prediction model achieved an R of 0.9184, an RMSEP of 0.3778 and an error of ±0.76%.
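
A hedged sketch of the chemometric pipeline named above (MSC, Savitzky-Golay smoothing, PLS regression), using scipy and scikit-learn on synthetic data; in the thesis, X would hold the NIR diffuse reflectance spectra and y the Soxhlet or oven-drying reference values.

```python
# Hedged sketch: MSC + Savitzky-Golay preprocessing, then PLS.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression

def msc(spectra):
    # Multiplicative scatter correction against the mean spectrum:
    # regress each spectrum on the reference, remove offset and slope.
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        slope, intercept = np.polyfit(ref, s, 1)
        corrected[i] = (s - intercept) / slope
    return corrected

rng = np.random.default_rng(1)
X = rng.random((40, 200)) * 0.1 + np.linspace(0.2, 0.8, 200)  # fake spectra
y = X[:, 50] * 30 + rng.normal(0, 0.2, 40)                    # fake lipid %

X = savgol_filter(msc(X), window_length=11, polyorder=2, axis=1)
pls = PLSRegression(n_components=5).fit(X[:30], y[:30])
pred = pls.predict(X[30:]).ravel()
rmsep = np.sqrt(np.mean((pred - y[30:]) ** 2))
print(f"RMSEP on held-out samples: {rmsep:.3f}")
```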

Relevance:

80.00%

Publisher:

Abstract:

Context-aware applications are typically dynamic and use services provided by several sources, with different quality levels. Context information quality is expressed in terms of Quality of Context (QoC) metadata, such as precision, correctness, refreshment and resolution, while service quality is expressed via Quality of Service (QoS) metadata such as response time, availability and error rate. To assure that an application is using services and context information that meet its requirements, it is essential to continuously monitor this metadata. For this purpose, a QoS and QoC monitoring mechanism is needed that meets the following requirements: (i) to support measurement and monitoring of QoS and QoC metadata; (ii) to support synchronous and asynchronous operation, enabling the application both to periodically gather the monitored metadata and to be asynchronously notified whenever a given metadata item becomes available; (iii) to use ontologies to represent information, in order to avoid ambiguous interpretation. This work presents QoMonitor, a module for QoS and QoC metadata monitoring that meets the above requirements. The architecture and implementation of QoMonitor are discussed. To support asynchronous communication, QoMonitor uses two protocols: JMS and Light-PubSubHubbub. To illustrate QoMonitor in the development of ubiquitous applications, it was integrated into OpenCOPI (Open COntext Platform Integration), a middleware platform that integrates several context provision middleware systems. To validate QoMonitor we used two applications as proof of concept: an oil and gas monitoring application and a healthcare application. This work also validates the performance of QoMonitor in both synchronous and asynchronous requests.
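
To illustrate requirement (ii), the sketch below shows a plain-Python monitor that supports both synchronous polling and asynchronous callback notification; QoMonitor itself relies on JMS and Light-PubSubHubbub for this, so the class and method names here are purely illustrative.

```python
# Hedged sketch of dual synchronous/asynchronous metadata access.
from typing import Callable, Dict, List

class MetadataMonitor:
    def __init__(self):
        self._values: Dict[str, float] = {}
        self._subscribers: Dict[str, List[Callable[[str, float], None]]] = {}

    def publish(self, metadata: str, value: float):
        # A provider reports a fresh QoS/QoC measurement.
        self._values[metadata] = value
        for callback in self._subscribers.get(metadata, []):
            callback(metadata, value)  # asynchronous-style notification

    def get(self, metadata: str) -> float:
        # Synchronous access: the application polls on its own schedule.
        return self._values[metadata]

    def subscribe(self, metadata: str, callback):
        self._subscribers.setdefault(metadata, []).append(callback)

mon = MetadataMonitor()
mon.subscribe("response_time_ms", lambda k, v: print(f"notified: {k}={v}"))
mon.publish("response_time_ms", 42.0)   # triggers the callback
print("polled:", mon.get("response_time_ms"))
```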

Relevance:

80.00%

Publisher:

Abstract:

Formal methods and software testing are tools to obtain and control software quality. When used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the verification and validation of a system. Model-based testing techniques allow tests to be generated from other software artifacts such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an industrial example of a B specification, and from this case study we obtained the insights to improve it. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process and to use new coverage criteria. In addition, we implemented a tool to automate the method and applied it to more complex case studies.
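
A small sketch of the two underlying techniques, equivalence class partitioning and boundary value analysis, applied to a hypothetical operation whose precondition constrains an integer to an interval; the actual method derives these values from the B machine's invariant and operation precondition.

```python
# Hedged sketch: test values from a precondition of the form
# lo <= x <= hi (a hypothetical B operation precondition).

def partition_tests(lo: int, hi: int):
    # One representative per equivalence class: below the valid
    # range (negative test), inside it (positive test) and above
    # it (negative test), plus the boundaries and their neighbours.
    classes = {
        "invalid_below": lo - 10,
        "valid": (lo + hi) // 2,
        "invalid_above": hi + 10,
    }
    boundaries = [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]
    return classes, boundaries

# E.g. an operation with precondition 0 <= level <= 100:
classes, boundaries = partition_tests(0, 100)
print("class representatives:", classes)
print("boundary values:", boundaries)
```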

Relevance:

80.00%

Publisher:

Abstract:

Removing inconsistencies in a project is less expensive when done in the early steps of design. Formal methods improve the understanding of systems and offer various techniques, such as formal specification and verification, to identify these problems in the initial stages of a project. However, transforming a formal specification into a programming language is a non-trivial and error-prone task, especially when done manually. Tool support at this stage can bring great benefits to the final product. This work proposes the extension of a tool whose focus is the automatic translation of specifications written in CSPM into Handel-C. CSP is a formal description language suitable for concurrent systems, and CSPM is the machine-readable notation used by supporting tools. Handel-C is a programming language whose output can be compiled directly onto FPGAs. Our extension increases the number of CSPM operators accepted by the tool, allowing the user to define local processes, to rename channels in a process and to use Boolean guards on external choices. In addition, we propose the implementation of a communication protocol that eliminates some restrictions on the parallel composition of processes in the translation into Handel-C, allowing communication on a single channel between multiple processes to be mapped consistently and ensuring that improper communication on a channel, i.e., communication not allowed by the system specification, does not occur in the generated code.
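
As an illustration of one of the added operators, the sketch below models Boolean guards on an external choice: a branch only participates when its guard holds, which is the behavior the translator must reproduce in Handel-C. The process representation is ad hoc, not the tool's actual data structures.

```python
# Hedged sketch of a guarded external choice, CSP-style.
from dataclasses import dataclass
from typing import Callable

@dataclass
class GuardedBranch:
    guard: Callable[[], bool]   # Boolean guard on the branch
    channel: str                # channel whose event resolves the choice

def external_choice(branches, offered_event):
    # Only branches with a true guard are eligible; the environment's
    # offered event then selects among them.
    eligible = [b for b in branches if b.guard()]
    for b in eligible:
        if b.channel == offered_event:
            return f"chose branch on channel {b.channel}"
    return "no eligible branch accepts the event (choice blocks)"

tank_full = True
branches = [
    GuardedBranch(lambda: tank_full, "drain"),
    GuardedBranch(lambda: not tank_full, "fill"),
]
print(external_choice(branches, "drain"))   # guard true -> taken
print(external_choice(branches, "fill"))    # guard false -> blocks
```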

Relevance:

80.00%

Publisher:

Abstract:

In this work, we propose a two-stage algorithm for real-time fault detection and identification in industrial plants. Our proposal is based on the analysis of selected features using recursive density estimation and a new evolving classifier algorithm. More specifically, the detection stage is based on the concept of density in the data space, which is not the same as a probability density function but is a very useful measure for abnormality/outlier detection. This density can be expressed by a Cauchy function and calculated recursively, which makes it memory- and computation-efficient and, therefore, suitable for online applications. The identification/diagnosis stage is based on a self-developing (evolving) fuzzy rule-based classifier proposed in this work, called AutoClass. An important property of AutoClass is that it can start learning "from scratch": neither the fuzzy rules nor the number of classes needs to be prespecified, and the number of classes may grow as new class labels are added by the online learning process, in a fully unsupervised manner. If an initial rule base exists, AutoClass can evolve it further based on newly arrived faulty-state data. To validate our proposal, we present experimental results from a didactic level-control process, where control and error signals are used as features for the fault detection and identification systems; the approach is generic, however, and the number of features can be large, since the methodology is computationally lean, requiring neither covariance or more complex calculations nor the storage of old data. The results obtained are significantly better than those of the traditional approaches used for comparison.
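
A minimal sketch of recursive density estimation with a Cauchy-type kernel, in the spirit of the detection stage described above: the mean and mean squared norm are updated recursively, so no past data or covariance matrix is stored. The abnormality threshold and the injected fault are illustrative.

```python
# Hedged sketch: recursive (memory-less) density for fault detection.
import numpy as np

class RecursiveDensity:
    def __init__(self, dim):
        self.k = 0
        self.mean = np.zeros(dim)
        self.msq = 0.0  # running mean of ||x||^2

    def update(self, x):
        self.k += 1
        self.mean += (x - self.mean) / self.k
        self.msq += (x @ x - self.msq) / self.k
        # Cauchy-type density: 1 when x sits at the running mean,
        # decaying with distance from it and with data scatter.
        scatter = self.msq - self.mean @ self.mean
        return 1.0 / (1.0 + (x - self.mean) @ (x - self.mean) + scatter)

rde = RecursiveDensity(dim=2)
rng = np.random.default_rng(2)
for i in range(200):
    x = rng.normal(0, 0.1, 2)  # e.g. control and error signal features
    if i == 150:
        x += 3.0               # injected fault: features jump
    density = rde.update(x)
    if i > 10 and density < 0.3:  # illustrative abnormality threshold
        print(f"sample {i}: density {density:.3f} -> possible fault")
```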

Relevance:

80.00%

Publisher:

Abstract:

The soil heat flux and soil thermal diffusivity are important components of the surface energy balance, especially in arid and semi-arid regions. The objective of this work was to estimate the soil heat flux from the soil temperature measured at a single depth, based on the half-order time derivative method proposed by Wang and Bras (1999), and to establish a method, also based on the half-order derivative, capable of estimating the thermal diffusivity of the soil from time series of soil temperature at two depths. The estimated soil heat flux was compared with values measured by flux plates, and the estimated thermal diffusivity was compared with in situ measurements. The results showed excellent agreement between estimated and measured soil heat flux, with correlation (r), coefficient of determination (R²) and standard error of r = 0.99093, R² = 0.98194 and 2.56 W/m² for a 10-day period; r = 0.99069, R² = 0.98147 and 2.59 W/m² for a 30-day period; and r = 0.98974, R² = 0.97958 and 2.77 W/m² for a 120-day period. The thermal diffusivity values estimated by the proposed method proved coherent and consistent with the values measured in situ and with values found in the literature using conventional methods.
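
A hedged sketch of the single-depth flux estimate: one common piecewise-linear discretization of the half-order derivative convolution G(t) = √(kρc/π) · d^{1/2}T/dt^{1/2}, applied to a synthetic daily temperature wave. The discretization choice and the thermal-inertia value are assumptions for illustration, not necessarily those of Wang and Bras (1999).

```python
# Hedged sketch: soil heat flux from one temperature time series via
# a half-order time derivative.
import numpy as np

def soil_heat_flux(temps, dt, thermal_inertia):
    # thermal_inertia = sqrt(k * rho * c), in J m^-2 K^-1 s^-1/2.
    # Piecewise-linear discretization of the fractional integral:
    # each increment dT_i contributes
    # 2*dT_i / (sqrt(t_n - t_{i-1}) + sqrt(t_n - t_i)).
    n = len(temps)
    t = np.arange(n) * dt
    flux = np.zeros(n)
    for j in range(1, n):
        d_temp = np.diff(temps[: j + 1])
        weights = 2.0 / (np.sqrt(t[j] - t[:j]) + np.sqrt(t[j] - t[1 : j + 1]))
        flux[j] = thermal_inertia / np.sqrt(np.pi) * np.sum(d_temp * weights)
    return flux

# One day of synthetic soil temperature (sinusoidal), 30-min samples.
dt = 1800.0  # s
hours = np.arange(0, 24, 0.5)
temps = 25 + 8 * np.sin(2 * np.pi * (hours - 8) / 24)
G = soil_heat_flux(temps, dt, thermal_inertia=1500.0)
print("peak soil heat flux ~ %.1f W/m^2" % G.max())
```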

Relevance:

80.00%

Publisher:

Abstract:

Wireless communication is a trend in today's industrial environment, and within this trend the WirelessHART technology stands out. It is therefore natural to seek improvements in the technology, and such improvements relate directly to the routing and scheduling algorithms. In this thesis, we present a literature review of the main routing and scheduling solutions for WirelessHART. The thesis also proposes a new scheduling algorithm, called Flow Scheduling, that aims to improve superframe utilization and flexibility. For validation purposes, we developed a simulation module for Network Simulator 3 (NS-3) that models aspects such as positioning, signal attenuation and energy consumption, and provides per-link error configuration. The module also allows the scheduling superframe to be created using the Flow and Han algorithms. To validate the new algorithm, we executed a series of comparative tests and evaluated the algorithms' performance in terms of link allocation, delay and superframe occupation. To validate the physical layer of the simulation module, we statically configured the routing and scheduling aspects and performed reliability and energy-consumption tests using various topologies and error probabilities from the literature.
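
To fix ideas about what a superframe schedule is, the sketch below greedily allocates time slots to the links of each flow under the constraint that two links sharing a node cannot use the same slot; this is a generic illustration, not the thesis's Flow Scheduling algorithm.

```python
# Hedged sketch of superframe slot allocation for multihop flows.

def schedule(flows, superframe_len):
    # flows: list of hop lists, e.g. [("s1", "r1"), ("r1", "gw")].
    slots = [[] for _ in range(superframe_len)]  # links per time slot

    def conflicts(slot, link):
        # Two links conflict if they involve a common node.
        return any(set(link) & set(other) for other in slot)

    for path in flows:
        t = 0
        for link in path:  # hops of a flow must be scheduled in order
            while t < superframe_len and conflicts(slots[t], link):
                t += 1
            if t == superframe_len:
                raise RuntimeError("superframe too short for this flow set")
            slots[t].append(link)
            t += 1
    return slots

flows = [
    [("s1", "r1"), ("r1", "gw")],
    [("s2", "r1"), ("r1", "gw")],
]
for t, links in enumerate(schedule(flows, superframe_len=6)):
    print(f"slot {t}: {links}")
```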
