77 results for "Desenvolvimento Orientado a Modelos" (Model-Driven Development)
Abstract:
This work studies the development, implementation, and improvement of a macroscopic model to describe the behavior of a continuously fed spouted bed dryer for pastes and suspensions. The model is based on the CST model (Freire et al., 2009) and on the model of Fernandes (2005), whose theoretical foundation rests on macroscopic mass and heat balances for the three phases involved in the process: gas, liquid, and solid. Because this technique is highly relevant, modeling and simulation studies of spouted bed drying are essential to the analysis of the process as a whole: through them it is possible to predict and understand the behavior of the process, which contributes significantly to more efficient design and operation. Understanding of the phenomena involved in the drying process can be obtained by comparing experimental data with computer simulations. Such knowledge is critical for properly choosing the process conditions in order to obtain good drying efficiency. Over the past few years, research on the drying of pastes and suspensions in spouted beds has been gaining ground in Brazil. The Particulate Systems Laboratory at the Universidade Federal do Rio Grande do Norte has been conducting several studies and generating a large collection of experimental data on the drying of fruit pulps, vegetable pastes, goat milk, and suspensions of agro-industrial residues. From this collection, data on the drying of goat milk and of acerola (Malpighia glabra L.) residue were selected. For the first time, these data were used for the development and validation of a model that can describe the behavior of the spouted bed dryer. Thus, it was possible to model the dryer and to evaluate the influence of process variables (paste feed rate, temperature, and flow rate of the drying air) on the drying dynamics. We also performed water evaporation experiments in order to understand and study the behavior of the dryer wall temperature and the evaporation rate. All these analyses will contribute to future work involving the implementation of control strategies for the drying of pastes and suspensions. The results obtained in the transient analysis were compared with experimental data, indicating that the model represents the process well.
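The abstract stays at the level of the balance structure; as a minimal illustrative sketch (generic symbols assumed here, not the CST model's actual notation), a lumped moisture balance on the gas phase of such a dryer can be written as:

```latex
% Hypothetical lumped moisture balance on the gas phase:
% accumulation = moisture in - moisture out + evaporation from the liquid phase
\frac{d}{dt}\big(m_g Y\big) = \dot{m}_{g,\mathrm{in}}\, Y_{\mathrm{in}}
  - \dot{m}_{g,\mathrm{out}}\, Y_{\mathrm{out}} + \dot{m}_{\mathrm{ev}}
```

where \(m_g\) is the gas holdup in the bed, \(Y\) the air humidity, and \(\dot{m}_{\mathrm{ev}}\) the evaporation rate; analogous macroscopic balances are written for the liquid and solid phases and for energy.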
Abstract:
This work first establishes the fundamental thermodynamic relationships that govern phase equilibrium and the models used to describe the non-ideal behavior of the liquid and vapor phases at low pressures. It addresses the determination of vapor-liquid equilibrium (VLE) data for a series of multicomponent mixtures of saturated aliphatic hydrocarbons, prepared synthetically from analytical-grade substances, and the development of a new dynamic cell with circulation of the vapor phase. The apparatus and experimental procedures developed are described and applied to the determination of VLE data. Isobaric VLE data were obtained with a Fischer ebulliometer with circulation of both phases for the systems pentane + dodecane, heptane + dodecane, and decane + dodecane. Using the two new specially designed dynamic cells with circulation of the vapor phase, which are easy to operate and inexpensive, data were measured for the systems heptane + decane + dodecane, acetone + water, Tween 20 + dodecane, and phenol + water, as well as distillation curves of an additive-free gasoline. The compositions of the equilibrium phases were determined by densimetry, chromatography, and total organic carbon analysis. Calibration curves of density versus composition were prepared from synthetic mixtures, and the behavior of the excess volumes was evaluated. The VLE data obtained experimentally for the hydrocarbon and aqueous systems were subjected to thermodynamic consistency tests based on the Gibbs-Duhem equation, as were data for other binary systems obtained from the literature, mainly from the DDB (Dortmund Data Bank), yielding a satisfactory database. The results of the thermodynamic consistency tests for the binary and ternary systems were evaluated in terms of deviations for applications such as model development. The approved data sets were then used in the KijPoly program to determine the binary kij parameters of the original Peng-Robinson cubic equation of state and of its version with the expanded alpha function. These parameters can be applied, through simulators, to the simulation of petroleum reservoir conditions and of the various distillation processes found in the petrochemical industry. The two dynamic cells designed for the determination of VLE data, built with Brazilian technology, proved successful, efficient, and inexpensive. Multicomponent systems, mixtures of components with different molecular weights, and dilute solutions may be studied in the developed VLE cells.
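For reference, the binary parameters \(k_{ij}\) mentioned above enter the Peng-Robinson equation of state through the classical van der Waals one-fluid mixing rules; the standard form (shown here without the expanded alpha function specific to the thesis) is:

```latex
P = \frac{RT}{v - b} - \frac{a(T)}{v(v + b) + b(v - b)}, \qquad
a = \sum_i \sum_j x_i x_j \sqrt{a_i a_j}\,\big(1 - k_{ij}\big), \qquad
b = \sum_i x_i b_i
```

so that each fitted \(k_{ij}\) corrects the geometric-mean combining rule for one specific binary pair.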
Abstract:
This work evaluates the Cruzeta Irrigated Perimeter, RN, with regard to the efficient use of water for agricultural production. The goal is to determine the quantity of water available to supply the demands of crops that are suitable and economically viable for the region. It is assumed that the community's drinking water is supplied by pipelines from sources located outside the region. From this study, the implementation of crop management arrangements suited to the availability of water resources and other constraints is recommended. The intensity of rainy and drought seasons must be considered in order to adjust the cultivated area and the equipment to be operated, along with the use of operating models and simulations to establish alert levels and, eventually, reductions of the irrigated area. Based on the data obtained, the cultivation of different types of non-permanent crops is proposed, so that temporary crops are produced extensively in periods of abundant reservoir storage, transforming stored water into storable crop products, with little or no production in severe drought periods. This is the basic premise for sustainable agricultural development in the Brazilian semiarid region.
Abstract:
Digital Elevation Models (DEMs) are numerical representations of a portion of the Earth's surface. Among the several factors that affect the quality of a DEM, particular attention should be given to the input data and to the choice of the interpolation algorithm. In parallel, several numerical models are used nowadays to characterize nearshore hydrodynamics and morphological changes in coastal areas, and their validation relies on field data collection. Regardless of the complexity of the physical processes being modeled, little attention has been given to the bathymetric interpolation built into the numerical models of each specific application. Therefore, this study investigates and quantifies the influence of the bathymetry, as obtained from a DEM, on a hydrodynamic circulation model of a coastal stretch off the coast of the State of Rio Grande do Norte, Northeast Brazil. This coastal region is characterized by strong hydrodynamic and littoral processes, resulting in a very dynamic morphology with shallow coastal bathymetry. Important economic activities, such as oil exploitation and production, fisheries, salt ponds, shrimp farms, and tourism, impact the local ecosystems and themselves influence the local hydrodynamics. This makes the region one of the most important for the development of the State, but it also increases the possibility of serious environmental accidents. As the hydrodynamic model, SisBaHiA® (Sistema Básico de Hidrodinâmica Ambiental, the Environmental Hydrodynamics System) was chosen, since it has been successfully employed at several locations along the Brazilian coast. This model was developed by the Coastal and Oceanographical Engineering Group of the Ocean Engineering Program at the Federal University of Rio de Janeiro. Several interpolation methods were tested for the construction of the DEM, namely Natural Neighbor, Kriging, Triangulation with Linear Interpolation, Inverse Distance to a Power, Nearest Neighbor, and Minimum Curvature, all implemented within the software Surfer®. The reference bathymetry for the DEM was obtained from nautical charts provided by the Brazilian Hydrographic Service of the Brazilian Navy and from a field survey conducted in 2005. Changes in flow velocity and free surface elevation were evaluated under three aspects: a spatial view along three profiles perpendicular to the coast and one profile parallel to the coast; a temporal view at three central nodes of the grid over 30 days; and a hodograph analysis of the U and V velocity components over different tidal cycles. Small, negligible variations in sea surface elevation were identified. However, the differences in velocity magnitude and direction were significant, depending on the DEM.
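To illustrate one of the interpolation methods compared above (Inverse Distance to a Power), a minimal Python sketch follows; the function and parameter names are assumptions for this example and do not reproduce Surfer®'s implementation:

```python
import numpy as np

def idw_interpolate(xy_known, z_known, xy_query, power=2.0):
    """Inverse Distance to a Power interpolation of scattered depths.

    xy_known : (n, 2) array of surveyed positions
    z_known  : (n,) array of depths at those positions
    xy_query : (m, 2) array of grid nodes to estimate
    """
    z_est = np.empty(len(xy_query))
    for i, q in enumerate(xy_query):
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d == 0):           # query node coincides with a sounding
            z_est[i] = z_known[np.argmin(d)]
            continue
        w = 1.0 / d**power           # closer soundings weigh more
        z_est[i] = np.dot(w, z_known) / w.sum()
    return z_est

# Toy bathymetry: four soundings interpolated onto one grid node.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
depths = np.array([5.0, 6.0, 7.0, 8.0])
print(idw_interpolate(pts, depths, np.array([[0.5, 0.5]])))  # -> [6.5]
```

Because each interpolator fills the gaps between soundings differently, the resulting DEMs, and hence the model bathymetry, can differ exactly where the flow is most sensitive.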
Abstract:
The great majority of analytical models for extragalactic radio sources assume self-similarity and can be classified into three types: I, II, and III. We have developed a model that generalizes most models found in the literature and shown that these three types are particular cases of it. The model assumes that the area of the head of the jet varies with the jet size according to a power law and that the jet luminosity is a function of time. As is usually done, the basic hypothesis is that there is an equilibrium between the pressure exerted by the head of the jet and by the cocoon walls, on one side, and the ram pressure of the ambient medium, on the other. The equilibrium equations and the energy conservation equation allow us to express the size and width of the source and the pressure in the cocoon as power laws and to find the respective exponents. These assumptions can be used to calculate the evolution of the source size, width, and radio luminosity, which can then be compared with the observed width-size relation for radio lobes and with the power-size (P-D) diagram of both compact (GPS and CSS) and extended sources from the 3CR catalogue. In this work we introduce two important improvements over previous work: (1) we have put together a larger sample of both compact and extended radio sources
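The equilibrium hypothesis described above is commonly expressed as a ram-pressure balance at the working surface of the jet head; in generic notation (an illustrative sketch, not necessarily the exact formulation adopted in this work):

```latex
% Thrust of the jet over the head area balances the ambient ram pressure;
% the ambient density and the head area are taken as power laws of the size D.
\frac{L_j}{v_j\, A_h} \simeq \rho_a(D)\, v_h^{2},
\qquad \rho_a(D) \propto D^{-\beta},
\qquad A_h \propto D^{\epsilon}
```

Combining this balance with energy conservation in the cocoon yields the power-law evolution of the source size, width, and pressure referred to in the abstract.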
Abstract:
Among the numerous policy changes that the world has experienced in recent years, the quest for greater transparency in public agencies occupies a prominent place. Transparency has been an important tool for State accountability, promoting greater participation by society through the provision of information that was previously restricted to public agencies. Following this trend, Brazil enacted the Access to Information Act in May 2012, which seeks to disclose the actions of the State at all levels and in all public administration agencies. On the day the law came into force, a website was made available to society, enabling citizens to submit information requests to government agencies. At that time, the Federal University of Rio Grande do Norte had no tool to assist it in managing this demand. This project describes, builds, and implements a solution to this problem, using Design Science Research as its methodology. As a result, the solution built in this research became a new module of the institution's ERP, making it capable of controlling the entire process, and it will be helpful to other partner institutions that use the same ERP system.
Abstract:
This study investigates the chemical species in water produced from the reservoir zones of oil-producing wells in the Monte Alegre field (onshore production), with the aim of developing a model for identifying produced water from different zones or groups of zones. Using the concentrations of anions and cations in the produced water as input parameters for Linear Discriminant Analysis, it was possible to estimate and compare model predictions, respecting the particularities of each validation method, in order to ascertain which would be most appropriate. The Resubstitution, Holdout, and Lachenbruch methods were used for fitting and for the overall evaluation of the models built. Among the models estimated for wells producing water from a single zone, the most suitable was the one validated by the Holdout method, with a hit rate of 90%. The discriminant functions (CV1, CV2, and CV3) estimated in this model were used to model new functions for samples of artificial mixtures of produced water (prepared in our laboratory) and for samples of actual produced water mixtures (water collected in wells producing from more than one zone). The experiment with these mixtures was carried out according to a simplex-centroid mixture experimental design; the presence of water from steam injection in these reservoirs was also simulated for part of the samples. Using two- and three-dimensional plots, it was possible to estimate the proportion of water coming from each production zone.
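As a flavor of the classification-and-holdout workflow described above, a minimal Python sketch follows; the ion choices, concentrations, zone labels, and split ratio are all invented for the example and do not reproduce the thesis data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Stand-in data: three ion concentrations per sample (e.g. chloride,
# sulfate, sodium), one producing zone per class. All values are made up.
rng = np.random.default_rng(0)
centers = {"zone_A": [1200.0, 90.0, 310.0],
           "zone_B": [420.0, 33.0, 125.0],
           "zone_C": [800.0, 58.0, 205.0]}
X = np.vstack([rng.normal(c, [30.0, 3.0, 8.0], size=(10, 3))
               for c in centers.values()])
y = np.repeat(list(centers), 10)

# Holdout evaluation: fit on half the samples, score the hit rate on the rest.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                          stratify=y, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print("hit rate:", lda.score(X_te, y_te))

# lda.transform(X) projects samples onto the discriminant axes
# (the canonical variables CV1, CV2 mentioned in the abstract).
```

The Resubstitution and Lachenbruch (leave-one-out) evaluations differ only in how the data are split between fitting and scoring.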
Abstract:
Smart card applications represent a growing market. Applications of this kind usually manipulate and store critical information that requires some level of security, such as financial or confidential data. The quality and trustworthiness of smart card software can be improved through a rigorous development process that embraces formal software engineering techniques. In this work we propose the BSmart method, a specialization of the B formal method dedicated to the development of smart card Java Card applications. The method describes how a Java Card application can be generated through a B refinement process from its formal abstract specification. The development is supported by a set of tools that automate the generation of some required refinements and the translation to the Java Card client (host) and server (applet) applications. With respect to verification, the method's development process was itself formalized and verified in the B method, using the Atelier B tool [Cle12a]. We emphasize that the Java Card application is translated from the last refinement stage, named implementation. This translation process was specified in ASF+SDF [BKV08], describing the grammars of both languages (SDF) and the code transformations through rewrite rules (ASF). This specification was an important support during the development of the translator and contributes to the tool's documentation. We also emphasize the KitSmart library [Dut06, San12], an essential component of BSmart, containing models of all 93 classes/interfaces of the Java Card API 2.2.2 and of the Java/Java Card data types, as well as machines that can be useful to the specifier but are not part of the standard Java Card library. In order to validate the method, its tool support, and KitSmart, we developed an electronic passport application following the BSmart method. We believe that the results achieved in this work contribute to Java Card development, allowing the generation of complete (client and server components) Java Card applications that are less subject to errors.
Abstract:
Middleware platforms have been widely used as the underlying infrastructure for the development of distributed applications. They provide distribution and heterogeneity transparency and a set of services that ease the construction of distributed applications. Nowadays, middleware platforms accommodate an increasing variety of requirements in order to satisfy distinct application domains. This broad range of application requirements increases the complexity of the middleware, due to the introduction of many crosscutting concerns in the architecture, which are not properly modularized by traditional programming techniques, resulting in the tangling and scattering of these concerns in the middleware code. The presence of these crosscutting concerns limits middleware scalability, and the aspect-oriented paradigm has been used successfully to improve the modularity, extensibility, and customization capabilities of middleware. This work presents AO-OiL, an aspect-oriented (AO) middleware architecture based on a reference architecture for AO middleware. This middleware follows the philosophy that middleware functionality must be driven by the application requirements. AO-OiL consists of an AO refactoring of the OiL (Orb in Lua) middleware in order to separate basic and crosscutting concerns. The proposed architecture was implemented in Lua and RE-AspectLua. To evaluate the impact of the refactoring on the middleware architecture, this work presents a comparative performance analysis of AO-OiL and OiL.
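AO-OiL itself is written in Lua with RE-AspectLua; as a language-neutral illustration of how an aspect modularizes a crosscutting concern (here, logging woven around a middleware invocation), a small Python sketch follows, with all names hypothetical:

```python
import functools

def logging_aspect(func):
    """Aspect-style advice: weaves a logging concern around a join point
    (the decorated method) without touching the basic concern's code."""
    @functools.wraps(func)
    def around(*args, **kwargs):
        print(f"[aspect] before {func.__name__}")
        result = func(*args, **kwargs)
        print(f"[aspect] after {func.__name__} -> {result!r}")
        return result
    return around

class Orb:
    """Stand-in for a middleware invocation component (basic concern)."""
    @logging_aspect            # crosscutting concern applied declaratively
    def invoke(self, operation):
        return f"dispatched {operation}"

Orb().invoke("ping")
```

Keeping the logging code out of `invoke` is precisely the kind of separation that the AO refactoring of OiL pursues at architectural scale.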
Abstract:
Research on Wireless Sensor Networks (WSNs) has evolved, with potential applications in several domains. However, the building of WSN applications is hampered by the need to program with the low-level abstractions provided by sensor operating systems and by the need for specific knowledge about each application domain and each sensor platform. We propose an MDA (Model-Driven Architecture) approach to develop WSN applications. This approach allows domain experts to contribute directly to the development of applications without needing low-level knowledge of WSN platforms and, at the same time, allows network experts to program WSN nodes to meet the application requirements without specific knowledge of the application domain. Our approach also promotes the reuse of the developed software artifacts, allowing an application model to be reused across different sensor platforms and a platform model to be reused for different applications.
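To make the separation of roles concrete, a toy model-to-text transformation in Python follows: a platform-independent application model (the domain expert's artifact) is combined with interchangeable platform templates (the network expert's artifacts). Every field, template, and platform name here is a hypothetical illustration, not the dissertation's tooling:

```python
# Platform-independent model (PIM): what to sense and how often.
app_model = {"sensor": "temperature", "period_s": 60, "aggregate": "mean"}

# Platform-specific templates (PSM): one per target sensor platform.
platform_templates = {
    "tinyos_like":  "read {sensor} every {period_s}s; reduce with {aggregate}",
    "contiki_like": "set timer {period_s}s; sample {sensor}; fold {aggregate}",
}

def generate(app, platform):
    """Model-to-text transformation: weave the PIM into a PSM template."""
    return platform_templates[platform].format(**app)

# The same application model is reused across both platforms.
for name in platform_templates:
    print(name, "->", generate(app_model, name))
```

The reuse claim of the abstract corresponds to the two axes of this sketch: one application model against many platform templates, and one template against many application models.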
Abstract:
In the development of synthetic agents for education, there is still doubt about what behavior can be considered truly plausible for this type of agent, what makes the agent effective in transmitting knowledge, and what role emotions play in this process. This work is investigative in nature: it attempts to discover which aspects are important for consistent behavior, through the practical development of a chatterbot acting as a virtual tutor in the context of learning algorithms. The study covers the basics of agents, Intelligent Tutoring Systems, bots, and chatterbots, and discusses how such systems must behave credibly. Models of emotion, personality, and mood for computational agents are also covered, as well as previous studies by other researchers in the area. Finally, the prototype is detailed, together with the research conducted, a summary of the results achieved, the architectural model of the system, and a macro view of the features implemented.
Abstract:
In agile methods, requirements engineering is seen as a bureaucratic activity that makes the process less agile. However, the lack of documentation in agile development environments is identified as one of the main challenges of the methodology. There is thus a contradiction between what the agile methodology claims and what actually happens in real environments. For example, in agile methods user stories are widely used to describe requirements. However, this way of describing requirements is still not enough, because a user story is too narrow an artifact to represent and detail the requirements. Activities such as verifying the software context and the dependencies between stories are also limited when only this artifact is used. In the context of requirements engineering, there are goal-oriented approaches that bring benefits to requirements documentation, including requirements completeness, analysis of alternatives, and support for requirements rationale. Among these approaches, the i* modeling technique stands out, providing a graphical view of the actors involved in the system and their dependencies. This work proposes an additional resource aimed at reducing this documentation gap in agile methods. The objective is to provide a graphical view of the software requirements and their relationships through i* models, thus enriching the requirements in agile methods. To this end, we propose a set of heuristics to map requirements expressed as user stories into i* models. These models can be used as a form of documentation in agile environments, since, once mapped to i* models, the requirements are viewed more broadly and with their proper relationships to the business environment they serve.
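As a flavor of what such heuristics can look like, the sketch below maps the canonical "As a <role>, I want <goal>, so that <benefit>" template onto i* elements (role to actor, want to goal, benefit to softgoal). The rule, the story format, and the element names are illustrative assumptions, not the thesis's actual heuristic set:

```python
import re

STORY = re.compile(
    r"As an? (?P<actor>.+?), I want (?P<goal>.+?)(?:, so that (?P<softgoal>.+))?$",
    re.IGNORECASE)

def story_to_istar(story: str) -> dict:
    """One illustrative heuristic: role -> i* actor, want -> goal,
    benefit -> softgoal contributing to that goal."""
    m = STORY.match(story.strip().rstrip("."))
    if not m:
        raise ValueError("story does not follow the expected template")
    elements = {"actor": m["actor"], "goal": m["goal"]}
    if m["softgoal"]:
        elements["softgoal"] = m["softgoal"]
    return elements

print(story_to_istar(
    "As a librarian, I want to register book loans, "
    "so that returns can be tracked."))
# {'actor': 'librarian', 'goal': 'to register book loans',
#  'softgoal': 'returns can be tracked'}
```

A full mapping would also need heuristics for dependencies between stories, which is where the graphical i* view adds the most over a flat backlog.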
Abstract:
The development of software systems with domain-specific languages has become increasingly common. Domain-specific languages (DSLs) increase domain expressiveness and raise the abstraction level by facilitating the generation of models or low-level source code, thus increasing the productivity of systems development. Consequently, methods for the development of software product lines and software system families have also proposed the adoption of domain-specific languages. Recent studies have investigated the limitations of feature model expressiveness and have proposed the use of DSLs as a complement to or substitute for feature models. However, in complex projects, a single DSL is often insufficient to represent the different views and perspectives of development, making it necessary to work with multiple DSLs. In order to address the new challenges of this context, such as managing consistency between DSLs and the need for methods and tools that support development with multiple DSLs, several approaches for building generative approaches have been proposed over the past years. However, none of them considers matters related to the composition of DSLs. To address this problem, the main objectives of this dissertation are: (i) to investigate the integrated use of feature models and DSLs in the domain and application engineering of generative approaches; (ii) to propose a method for the development of generative approaches with DSL composition; and (iii) to investigate and evaluate the use of modern technology based on model-driven engineering to implement strategies for integrating feature models and DSL composition.