17 results for Hydrologic Modeling Catchment and Runoff Computations
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The management of water resources at the river basin level, as defined by Law No. 9,433/97, requires effective knowledge of a basin's hydrological processes, derived from studies based on consistent series of hydrological data that reflect the basin's characteristics. In this context, the objective of this work was to model the catchment basin of the Jundiaí River (RN) and to study flood attenuation by the Tabatinga dam, by means of a project for monitoring the basin's hydrological and climatological data, with a view to fostering research that applies unified, appropriate methodologies to hydrological studies in the transition region between the semiarid interior and the coastal forest zone of Rio Grande do Norte. To study the basin's hydrological characteristics, the Jundiaí River basin was automatically delineated with the aid of geoprocessing software, and a daily hydrological model, the NRCS model, was adopted; it is a deterministic, lumped model. Using this model required determining some of its parameters, such as the Curve Number (CN). Since this is the first study conducted in the basin with this model, a sensitivity analysis of its results was performed by adopting different CN values within a range appropriate to the land use, land occupation and soil characteristics of the basin. As the study also aimed to develop a simulation model of the operation of the Tabatinga dam, and thereby control the flooding caused in the city of Macaíba, a mathematical water balance model was developed for use in Microsoft Excel.
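The NRCS (SCS) Curve Number method mentioned above converts an event rainfall depth into a direct runoff depth. A minimal sketch in Python, using the standard metric form of the method; the CN and rainfall values below are illustrative, not the basin's calibrated parameters:

```python
def scs_cn_runoff(p_mm: float, cn: float, lam: float = 0.2) -> float:
    """Direct runoff depth (mm) from the NRCS (SCS) Curve Number method."""
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = lam * s               # initial abstraction, conventionally 0.2 * S
    if p_mm <= ia:
        return 0.0             # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Illustrative sensitivity sweep: runoff from a 50 mm event for several CN values
for cn in (70, 80, 90):
    print(cn, round(scs_cn_runoff(50.0, cn), 1))
```

Lowering CN (more permeable soil or denser cover) raises the retention S and shrinks the computed runoff, which is exactly the behaviour the sensitivity analysis explores.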
The simulation was conducted in two phases. In the first, a daily water balance was carried out, which allowed a sensitivity analysis of the model with respect to the flood-control (waiting) volume, as well as determination of the period with the highest mean daily discharges. From this point, the second stage consisted of determining the hydrograph of outflows through the gates, obtained by means of an hourly water balance based on outflows generated by a mathematical equation whose parameters were fitted to the daily hydrograph. The analyses showed that the Tabatinga dam can only attenuate floods by maintaining a flood-control volume, which entails a loss of approximately 56.5% of the dam's storage capacity: to produce the attenuation effect, the reservoir level must remain more than 5 m below the sill, representing at least 50,582,927 m³. The results obtained with the modeling represent a first step towards improving the level of hydrological information on the behaviour of semiarid basins. To monitor the Jundiaí River basin quantitatively, it will be necessary to install a recording rain gauge next to the Tabatinga dam and a pressure transducer for regular flow measurements in the dam's reservoir. The climatological data will be collected by the complete automatic weather station installed at the Jundiaí Agricultural School.
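The daily water balance behind the flood-attenuation study can be sketched as a simple level-pool routing loop: storage is updated from inflow minus a capped release, and anything above capacity spills over the sill. This is a hedged illustration only; the function name, the constant release cap and the figures in the example are hypothetical, not values from the Tabatinga study.

```python
def route_flood(inflows_m3s, storage_m3, release_cap_m3s, capacity_m3, dt_s=86400.0):
    """Simplified daily reservoir water balance (level-pool routing).

    inflows_m3s: daily mean inflows (m3/s). The release is capped to protect
    the downstream city; storage above capacity becomes uncontrolled spill.
    """
    outflows, spills = [], []
    for q_in in inflows_m3s:
        # Cannot release more water than is flowing in plus what is stored.
        q_out = min(release_cap_m3s, q_in + storage_m3 / dt_s)
        storage_m3 += (q_in - q_out) * dt_s
        spill = max(0.0, storage_m3 - capacity_m3) / dt_s
        storage_m3 = min(storage_m3, capacity_m3)
        outflows.append(q_out)
        spills.append(spill)
    return outflows, spills, storage_m3
```

Two days of a hypothetical 100 m³/s flood, routed with a 50 m³/s release cap, accumulate 8,640,000 m³ in the flood-control volume; sweeping the initial storage is the kind of sensitivity test the first phase describes.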
Abstract:
COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A. Specification and Runtime Verification of Java Card Programs. Electronic Notes in Theoretical Computer Science. [S.l.: s.n.], 2009.
Abstract:
In the teaching of architecture and urbanism in Brazil, educational legislation treats modeling laboratories and workshops as an indispensable part of the infrastructure required for the proper functioning of any architecture course. Although the international development of information technology has created new possibilities for the digital production of architectural models, with research in this field under way since the early 1990s, it was only from 2007 onwards that such technologies began to be incorporated into the teaching of architecture and urbanism in Brazil, through the pioneering experience at LAPAC/FEC/UNICAMP. It is therefore a recent experiment, whose challenges can be highlighted through the following examples: (i) digital prototyping laboratories are still rare in Brazilian undergraduate courses of architecture and urbanism; (ii) as a new and developing field, with few references and little application in undergraduate programs, it is hard to define methodological procedures that suit pedagogical curricula already implemented or consolidated over the years; (iii) the new digital ways of producing three-dimensional models have specificities that make them difficult to fit within the existing structures of model laboratories and workshops. In view of the above, this thesis discusses the three-dimensional model as a tool that may contribute to developing students' ability to perceive, understand and represent three-dimensional space. The relation between different forms of models and the teaching of architectural design is analysed, with emphasis on the design process.
Starting from the concept of the word "model" as used in architecture and urbanism, an attempt is made to identify the types of three-dimensional models used in the design process, produced both in the traditional, manual way and digitally. It is also explained how new technologies for the digital production of models through prototyping are being introduced into undergraduate programs of architecture and urbanism in Brazil, together with a review of recent academic publications in the area. Based on the paradigm of reflective practice in teaching as conceived by Schön (2000), the experiment was undertaken in the integrated architectural design studio courses of the undergraduate program of architecture and urbanism at Universidade Federal do Rio Grande do Norte. Throughout the experiment, physical modeling, geometric modeling and digital prototyping were used at distinct moments of the design process in order to observe the suitability of each kind of model to the project's phases. The procedures used are very close to the Action Research methodology, whose main purpose is to produce theoretical knowledge by improving practice. The process was repeated over three consecutive semesters, and reflection on the results achieved in each cycle helped to enhance the next one. As a result, a methodological procedure is proposed that defines the three-dimensional model as the integrating element for the contents studied in a given academic semester. The teaching of architectural design as developed in the fifth semester of the architecture and urbanism undergraduate program at UFRN is taken as a reference.
Abstract:
This work proposes an environment for programming the programmable logic controllers (PLCs) of oil wells that use the BCP (progressing cavity pump) artificial lift method. The environment provides an editor based on the Sequential Function Chart (SFC) language for PLC programming. This language was chosen because it is high-level and accepted by the international standard IEC 61131-3. Running these control programs on a real PLC is made possible by an intermediate representation based on the PLCopen TC6 XML specification. For testing and validating the control programs, an area is available for viewing variables obtained through communication with a real PLC. Thus, the main contribution of this work is a computational environment that allows modeling, testing and validating controls represented in SFC and applied to oil wells with the BCP artificial lift method.
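The SFC execution semantics that such an editor must respect can be illustrated with a minimal interpreter: a set of active steps, and transitions that fire during a scan cycle when their source step is active and their condition holds. This is a hypothetical sketch, not part of the described environment or of the PLCopen TC6 schema:

```python
class SFC:
    """Toy Sequential Function Chart: active steps plus guarded transitions."""

    def __init__(self, initial_step):
        self.active = {initial_step}
        self.transitions = []  # (source_step, target_step, condition)

    def add_transition(self, source, target, condition):
        self.transitions.append((source, target, condition))

    def scan(self, inputs):
        """One PLC scan cycle: transitions are evaluated in declaration order;
        an enabled transition deactivates its source and activates its target."""
        for source, target, cond in self.transitions:
            if source in self.active and cond(inputs):
                self.active.discard(source)
                self.active.add(target)
```

A usage example with a hypothetical well-start transition: the chart stays in `idle` until the `start` input is true, then moves to `pumping`.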
Abstract:
In industrial informatics, several attempts have been made to develop notations and semantics for classifying and describing different kinds of system behaviour, particularly in the modeling phase. Such attempts provide the infrastructure to solve real engineering problems and to build practical systems that aim, above all, to increase the productivity, quality and safety of the process. Despite the many studies that have attempted to develop friendly methods for programming industrial controllers, they are still programmed by conventional trial-and-error methods and, in practice, there is little written documentation on these systems. The ideal solution would be a computational environment that allows industrial engineers to implement the system in a high-level language that follows international standards. Accordingly, this work proposes a methodology for plant and control modeling of discrete event systems that include sequential, parallel and timed operations, using a formalism based on Statecharts, called Basic Statechart (BSC). The methodology also provides automatic procedures to validate and implement these systems. To validate the methodology, two case studies with typical examples from the manufacturing sector are presented. The first shows a sequential control for a tagged machine, used to illustrate the dependencies between the devices of the plant. In the second, more than one strategy for controlling a manufacturing cell is discussed. The uncontrolled model has 72 states (distinct configurations); the model under sequential control generated 20 different states but acts in only 8 distinct configurations, while the model under parallel control generated 210 different states acting in only 26 distinct configurations, a control strategy less restrictive than the previous one.
Lastly, an example is presented to highlight the modular nature of the methodology, which is very important for the maintenance of applications. In this example, the sensors that identify pieces in the plant were removed, so the control model had to be changed to transmit the information of the input buffer sensor to the other positions of the cell.
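The configuration counts quoted above arise from composing the local states of the plant's devices: an uncontrolled plant has as many configurations as the Cartesian product of its devices' state sets. A sketch of that counting (the 6 × 4 × 3 split below is hypothetical; the abstract only reports the total of 72):

```python
from itertools import product

def plant_configurations(*device_state_sets):
    """All configurations of a plant: the Cartesian product of each
    device's local states (before any control restricts the behaviour)."""
    return list(product(*device_state_sets))

# Hypothetical cell: a 6-state device, a 4-state device and a 3-state device
# give 6 * 4 * 3 = 72 configurations, matching the uncontrolled model's total.
configs = plant_configurations(range(6), range(4), range(3))
```

A supervisor then restricts which of these configurations are reachable, which is why the controlled models above act in far fewer configurations than the raw product.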
Abstract:
This paper describes the design, implementation and application of a system for industrial process control based on fuzzy logic, developed in Java with support for industrial communication through the OPC (OLE for Process Control) protocol. Apart from requiring the Java platform, the software is completely platform-independent. It provides friendly, functional tools for modeling, building and editing complex fuzzy inference systems, and uses these systems to control a wide variety of industrial processes. The main requirements of the developed system were flexibility, robustness, reliability and ease of extension.
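A fuzzy inference system of the kind such a tool edits reduces to three ingredients: membership functions, rule firing strengths, and defuzzification. A toy single-input sketch, not the described Java system; the membership shapes and the two-rule base are illustrative:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_controller(error):
    """Toy controller: 'error negative -> increase output',
    'error positive -> decrease output', weighted-average defuzzification."""
    mu_neg = tri(error, -2.0, -1.0, 0.0)
    mu_pos = tri(error, 0.0, 1.0, 2.0)
    # Singleton consequents: +1.0 for the first rule, -1.0 for the second.
    num = mu_neg * 1.0 + mu_pos * (-1.0)
    den = mu_neg + mu_pos
    return num / den if den else 0.0
```

With zero error neither rule fires and the action is zero; as the error grows in either direction, the corresponding rule dominates and the action saturates towards ±1.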
Abstract:
The objective of this dissertation was the kinematic modeling of a robotic wheelchair using virtual chains, which allows the wheelchair to be modeled as a set of robotic manipulator arms forming a cooperative parallel kinematic chain. This document presents the development of a robotic wheelchair for transporting people with special needs that overcomes obstacles such as street curbs and other accessibility barriers in streets and avenues, including the study of assistive technology, parallel architectures and kinematic modeling, and the construction and assembly of the robot prototype, together with a checklist of accessibility problems and barriers surveyed on several routes, based on existing rules, ordinances and laws. Simulations of the chair were performed in various states of operation to accomplish the task of going up and down steps of different heights, with proportional control based on the kinematics. To verify the simulated results, a prototype robotic wheelchair was built. This project was conceived to provide a better quality of life for people with disabilities.
Abstract:
The Pitimbu River Watershed (PRW), which belongs to the metropolitan area of the Potiguar capital, State of Rio Grande do Norte, serves, among other purposes, human consumption and animal watering. This watershed is extremely important because, besides supplying approximately 30% of the freshwater of the southern part of Natal (South, East and West zones), it contributes to the equilibrium of the riparian ecosystem. In this context, this study evaluates the dynamics of urban development in the PRW, applying cellular automata as a modeling instrument, and simulates future urban scenarios between 2014 and 2033 using the SLEUTH simulation program. In the calibration phase, urban footprints for the years 1984, 1992, 2004 and 2013 were used, at 100 m resolution. The simulation revealed a predominance of organic growth, with the PRW expanding from existing urban centres. Spontaneous growth occurred throughout the watershed, although the probability of effective growth should not exceed 21%. A 68% increase was found for the period between 2014 and 2033, corresponding to an expansion area of 1,778 ha. By 2033, the area of the Pitimbu River springs and the surroundings of Jiqui Lake will have grown by more than 78%. Finally, an exogenous (outside-in) urban growth tendency was observed in the watershed. As a result of this growth, water resources will become scarcer.
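The organic (edge) growth that dominated the simulation can be illustrated with one step of a toy SLEUTH-like cellular automaton: a non-urban cell with enough urban neighbours urbanises with some probability. This is a hedged simplification; SLEUTH's actual growth rules and coefficients are richer, and the 0.21 default below merely echoes the 21% figure for illustration:

```python
import random

def grow_step(grid, p_spread=0.21, rng=random):
    """One edge-growth step of a toy urban CA on a 0/1 grid: a non-urban
    cell with at least two urban 8-neighbours urbanises with p_spread."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]           # next state computed from the old grid
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                continue
            urban_neighbours = sum(
                grid[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if (rr, cc) != (r, c)
            )
            if urban_neighbours >= 2 and rng.random() < p_spread:
                new[r][c] = 1
    return new
```

Iterating `grow_step` over a seed map of urban centres produces the outward, cluster-hugging expansion the study calls organic growth.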
Abstract:
This work discusses aspects of mathematical language and its understanding, in particular by students in the final years of elementary school. We aimed to develop a teaching proposal, grounded in mathematical modeling and reading activities, that affords elementary school students a better understanding of the mathematical language of proportion. We also aim to build and propose parameters for assessing students' reading proficiency in this language, analysing, during the modeling process, their ability to develop and improve that proficiency. To this end, we conducted qualitative research with action research procedures, analysing the data through Content Analysis. Our epistemological and didactic references include Piaget (1975, 1990), Vygotsky (1991, 2001), Bakhtin (2006), Freire (1974, 1994), Bicudo and Garnica (2006), Smole and Diniz (2001), Barbosa (2001), Burak (1992), Biembengut (2004), Bassanezi (2002), Carrasco (2006), Becker (2010), and Zuin and Reyes (2010), among others. We understand that to acquire new knowledge one must learn to read and read to learn; this process is essential for developing a person's reading proficiency. Modeling, in turn, enables contact with different forms of reading, providing elements favourable to that development. The evaluation parameters we used to analyse the level of reading proficiency in mathematical language proved effective; they are therefore a valuable tool that allows the teacher to carry out an efficient evaluation whose results can better guide the planning and execution of their practice.
Abstract:
The development of interactive systems involves several professionals, and their integration normally relies on common artifacts, such as models, that drive the development process. In the model-driven development approach, the interaction model is an artifact that captures most aspects of what the user can do, and how, while interacting with the system. Furthermore, the interaction model may be used to identify usability problems at design time. The central problem addressed by this thesis is therefore twofold: first, interaction modeling, from a perspective that helps the designer make explicit to the developer, who will implement the interface, the aspects related to the interaction process; and second, the early identification of usability problems, which aims to reduce the final cost of the application. To achieve these goals, this work presents (i) the ALaDIM language, which helps the designer conceive, represent and validate interactive message models; (ii) the ALaDIM editor, built using the EMF (Eclipse Modeling Framework) and its technologies standardized by the OMG (Object Management Group); and (iii) the ALaDIM inspection method, which allows the early identification of usability problems using ALaDIM models. The ALaDIM language and editor were respectively specified and implemented using the OMG standards, and they can be used in MDA (Model Driven Architecture) activities. Additionally, we evaluated both the language and the editor through a CDN (Cognitive Dimensions of Notations) analysis. Finally, this work reports an experiment that validated the ALaDIM inspection method.
Abstract:
This study proposes a new computational model, efficient and easy to apply in everyday design situations, for evaluating the interaction between masonry panels and their supporting structure. The proposed model simulates the behaviour of the wall exclusively with frame finite elements, thus composing an equivalent frame. Validation was performed in two ways: first, through the analysis of several panels from generic floor plans, comparing the results of the equivalent frame model with those of a reference model that discretizes the walls with shell finite elements; and second, through comparison with the results of Rosenhaupt's experimental model. The analyses assumed linear elastic behaviour for the materials and consisted basically of evaluating vertical displacements, internal forces in the support beams, and stresses at the base of the walls. Plane and three-dimensional models of walls from a real project were also used to evaluate important aspects of wall-beam interaction, e.g. the presence of door and window openings in any position; the support and connection conditions of the beams; the interference of interlocking between walls; and the consideration of wind action. The results demonstrated the efficiency of the proposed modeling, since the distributions of stresses and internal forces are very similar, with intensities always slightly larger than those of the reference and experimental models.
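The building block of an equivalent frame is the standard 2D Euler-Bernoulli frame element, whose local stiffness matrix combines axial and bending terms. A sketch of that generic textbook matrix (not the thesis's specific implementation), with degrees of freedom ordered as axial, transverse and rotation at each node:

```python
def frame_element_stiffness(E, I, A, L):
    """Local 6x6 stiffness matrix of a 2D Euler-Bernoulli frame element.

    DOF order: (u1, v1, theta1, u2, v2, theta2).
    E: Young's modulus, I: second moment of area, A: cross-section area, L: length.
    """
    a = E * A / L           # axial stiffness
    b = 12 * E * I / L**3   # transverse (shear-free bending) stiffness
    c = 6 * E * I / L**2    # bending-rotation coupling
    d = 4 * E * I / L       # rotational stiffness, same node
    e = 2 * E * I / L       # rotational stiffness, opposite node
    return [
        [ a,  0,  0, -a,  0,  0],
        [ 0,  b,  c,  0, -b,  c],
        [ 0,  c,  d,  0, -c,  e],
        [-a,  0,  0,  a,  0,  0],
        [ 0, -b, -c,  0,  b, -c],
        [ 0,  c,  e,  0, -c,  d],
    ]
```

Assembling these element matrices for the bars that stand in for wall segments and support beams yields the global system whose solution gives the displacements and internal forces compared above.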
Abstract:
Climate and air pollution, among other factors, are responsible for the increased health vulnerability of populations living in urban centres. Climate variations combined with high concentrations of atmospheric pollutants are usually associated with respiratory and cardiovascular diseases. The main objective of this research is to model, in different ways, the relation between climate and health, specifically for the child and elderly populations of São Paulo. To this end, data on meteorological variables, air pollutants, and hospitalizations and deaths from respiratory and cardiovascular diseases over an 11-year period (2000-2010) were used. Modeling via generalized estimating equations yielded the relative risk; dynamic regression made it possible to predict the number of deaths from the atmospheric variables; and a beta-binomial-Poisson model was able to estimate the number of deaths and simulate scenarios. The results showed that the risk of hospitalization due to asthma is approximately twice as high for children exposed to high concentrations of particulate matter as for children who are not exposed. The risk of death by acute myocardial infarction in the elderly increases by 3%, 6%, 4% and 9% under high concentrations of CO, SO2, O3 and PM10, respectively. Regarding dynamic regression, the results showed that deaths from respiratory diseases can be predicted consistently. The beta-binomial-Poisson model was able to reproduce the average number of deaths by heart failure: in the region of Santo Amaro the observed number was 2,462 and the simulated one was 2,508, while in the Sé region 4,308 were observed and 4,426 simulated, which allowed the generation of scenarios that may serve as parameters for decision-making.
With these results, it is possible to contribute to methodologies that improve the understanding of the relation between climate and health and provide support to managers in environmental planning and public health policies.
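The relative risk reported above compares disease incidence between an exposed and an unexposed group. A minimal sketch of the computation; the counts in the example are made up, not São Paulo data:

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk: incidence in the exposed group divided by
    incidence in the unexposed group (RR > 1 means exposure raises risk)."""
    return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

# Hypothetical example: 20 hospitalizations among 100 exposed children versus
# 10 among 100 unexposed gives RR = 2.0, i.e. roughly twice the risk.
rr = relative_risk(20, 100, 10, 100)
```

In the study itself the RR estimates come from generalized estimating equations fitted to the time series, which additionally adjust for confounders; the ratio above is only the underlying quantity being estimated.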
Abstract:
The maintenance and evolution of software systems has become a highly critical task in recent years due to the diversity and high demand of features, devices and users. Understanding and analysing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for preventing their quality from deteriorating during evolution. This thesis proposes an automated approach for analysing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources, namely commits and issues, of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results of different releases; and (iv) repository mining, identifying the issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study assessed the feasibility of applying the approach to systems from different domains, automatically identifying source code elements with performance variation and the changes that affected those elements during an evolution. This study analysed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty.
This study analysed 21 releases (seven of each system), totalling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate which commit properties are most likely to cause performance degradation. In total, 997 commits were mined: 103 were retrieved from degraded source code elements and 19 from optimised ones, while 875 had no impact on execution time. The number of days before the release and the day of the week emerged as the most relevant variables of performance-degrading commits in our model. The area under the ROC (Receiver Operating Characteristic) curve of the regression model is 60%, meaning that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
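At its core, the variation-analysis phase compares a scenario's execution times across two releases and flags relative changes above a threshold. A simplified sketch; the function name, the mean-based comparison and the 10% threshold are illustrative assumptions, not the framework's actual significance criterion:

```python
import statistics

def flag_variation(old_times, new_times, threshold=0.10):
    """Flag a scenario whose mean execution time changed by more than
    `threshold` (relative) between two releases.

    Returns (flagged, relative_change); positive change means the
    scenario got slower in the new release (a degradation candidate).
    """
    old_mean = statistics.mean(old_times)
    new_mean = statistics.mean(new_times)
    change = (new_mean - old_mean) / old_mean
    return abs(change) > threshold, change
```

Scenarios flagged this way are then handed to the repository-mining phase, which searches the commit history between the two releases for the changes and issues that plausibly caused the variation.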