Abstract:
Multi-agent system designers need to determine the quality of their systems in the earliest phases of the development process. The architectures of the agents are part of the design of these systems and therefore also need to have their quality evaluated. Motivated by the important role that emotions play in our daily lives, embodied-agent researchers have aimed to create agents capable of affective, natural interaction with users that produces beneficial or desirable results. Several studies have proposed agent architectures with emotions, but without accompanying methods for assessing these architectures. The objective of this study is to propose a methodology for evaluating emotional-agent architectures, one that assesses the quality attributes of the architectural design and also, with respect to human-computer interaction, the effects on the subjective experience of users of applications that implement them. The methodology is based on a model of well-defined metrics. In assessing the quality of the architectural design, the attributes evaluated are extensibility, modularity and complexity. In assessing the effects on users' subjective experience, which involves implementing the architecture in an application (we suggest the domain of computer games), the metrics are: enjoyment, felt support, warmth, caring, trust, cooperation, intelligence, interestingness, naturalness of emotional reactions, believability, reduction of frustration and likeability, along with the average time and average number of attempts. We experimented with this approach by evaluating five emotional-agent architectures: BDIE, DETT, Camurri-Coglio, EBDI and Emotional-BDI. Two of the architectures, BDIE and EBDI, were implemented in a version of the game Minesweeper and evaluated for human-computer interaction. In the results, DETT stood out with the best architectural design. Users who played the version of the game with emotional agents performed better than those who played without agents. In assessing the subjective experience of users, the differences between the architectures were insignificant.
Abstract:
When crosscutting-concern identification is performed from the beginning of development, in the activities of requirements engineering, there are many gains in terms of quality, cost and efficiency throughout the software development lifecycle. This early identification supports the evolution of requirements, detects possible flaws in the requirements specification, improves traceability among requirements, provides better software modularity and prevents rework. Despite these advantages, however, crosscutting-concern identification during requirements engineering faces several difficulties, such as the lack of systematization and of tools to support it. Furthermore, it is difficult to justify why some concerns are identified as crosscutting and others are not, since this identification is most often made without any methodology that systematizes and grounds it. In this context, this work proposes an approach based on Grounded Theory, called GT4CCI, for systematizing and grounding the process of identifying crosscutting concerns in the requirements document, at the initial stages of the software development process. Grounded Theory is a renowned methodology for qualitative data analysis. Through the use of GT4CCI it is possible to better understand, track and document concerns, adding gains in terms of quality, reliability and modularity throughout the software lifecycle.
Abstract:
Problems related to the scattering and tangling phenomena, such as the difficulty of maintaining a system, are increasingly frequent. One way to address them is to identify crosscutting concerns. To maximize its benefits, this identification must be performed from the early stages of the development process, but several works report that this is not done in most cases, leaving system development susceptible to errors and prone to later refactoring, which directly affects the quality and cost of the system. PL-AOVgraph is a goal-oriented requirements modeling language that supports the representation of relationships among requirements and provides separation of crosscutting concerns through the representation of crosscutting relationships. This work therefore presents a semi-automatic method for identifying crosscutting concerns in requirements specifications written in PL-AOVgraph. An adjacency matrix is used to identify the contribution relationships among the elements, and crosscutting concerns are identified through fan-out analysis of the contribution relationships recorded in the adjacency matrix; when identified, the corresponding crosscutting relationships are created. The method is implemented as a new module of the ReqSys-MDD tool.
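The fan-out analysis itself is simple to picture. Below is a minimal sketch, assuming a boolean adjacency matrix of contribution relationships; the element names and the threshold are illustrative, not the ones fixed by the PL-AOVgraph method or the ReqSys-MDD module.

```python
# Sketch: flagging candidate crosscutting concerns by fan-out analysis of a
# contribution-relationship adjacency matrix (illustrative, not the actual
# PL-AOVgraph/ReqSys-MDD implementation).

def fanout_candidates(elements, adjacency, threshold=2):
    """adjacency[i][j] is True when element i contributes to element j.

    An element whose contributions reach more than `threshold` distinct
    targets is flagged as a candidate crosscutting concern.
    """
    candidates = []
    for i, source in enumerate(elements):
        fan_out = sum(1 for j in range(len(elements)) if adjacency[i][j])
        if fan_out > threshold:
            candidates.append((source, fan_out))
    return candidates

elements = ["security", "logging", "checkout", "catalog"]
adjacency = [
    [False, False, True, True],   # security contributes to checkout, catalog
    [False, False, True, True],   # logging contributes to checkout, catalog
    [False, False, False, False],
    [False, False, False, False],
]
print(fanout_candidates(elements, adjacency, threshold=1))
# [('security', 2), ('logging', 2)]
```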
Abstract:
The main goal of Regression Testing (RT) is to reuse the test suite of the latest version of a software system in its current version, in order to maximize the value of the tests already developed and to ensure that old features continue to work after new changes. Even with reuse, it is common that not all tests need to be executed again. For this reason, Regression Test Selection (RTS) techniques are encouraged: they aim to select, from the full suite, only the tests that reveal faults, which reduces costs and makes this an interesting practice for testing teams. Several recent studies evaluate the quality of the selections performed by RTS techniques, identifying which presents the best results as measured by metrics such as inclusion and precision. RTS techniques should search the System Under Test (SUT) for tests that reveal faults; however, because this problem has no viable solution, they instead search for tests that reveal changes, where faults may occur. Nevertheless, these changes may modify the execution flow of the program itself, so that some tests no longer exercise the same code. In this context, this dissertation investigates whether changes performed in a SUT affect the quality of the test selection performed by an RTS technique and, if so, which characteristics of the changes cause errors, leading the technique to wrongly include or exclude tests. For this purpose, a tool was developed in Java to automate the measurement of the inclusion and precision averages achieved by a regression test selection technique for a particular type of change. To validate this tool, an empirical study was conducted to evaluate the RTS technique Pythia, based on textual differencing, on a large web information system, analyzing the types of tasks performed to evolve the SUT.
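For reference, a minimal sketch of the inclusion and precision metrics follows, in their usual Rothermel-Harrold formulation; this is an assumed formulation for illustration, not the dissertation's Java tool, whose exact definitions may differ in detail.

```python
# Sketch: inclusion and precision of a regression test selection, in the
# usual Rothermel-Harrold sense (assumed formulation).

def inclusion(selected: set, revealing: set) -> float:
    """Percentage of change-revealing tests that the technique selected."""
    if not revealing:
        return 100.0
    return 100.0 * len(selected & revealing) / len(revealing)

def precision(selected: set, revealing: set, suite: set) -> float:
    """Percentage of non-revealing tests that the technique omitted."""
    non_revealing = suite - revealing
    if not non_revealing:
        return 100.0
    return 100.0 * len(non_revealing - selected) / len(non_revealing)

suite = {"t1", "t2", "t3", "t4", "t5"}
revealing = {"t1", "t2"}          # tests that actually reveal the change
selected = {"t1", "t3"}           # tests picked by the RTS technique
print(inclusion(selected, revealing))        # 50.0
print(precision(selected, revealing, suite)) # ~66.7
```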
Abstract:
Reconfigurable architectures appeared as an alternative to ASICs and general-purpose processors (GPPs), keeping a balance between flexibility and performance. This work presents a proposal for modeling reconfigurable architectures with Chu spaces, covering the main subjects of this theme. The proposed solution consists of a modeling that uses a generalization of Chu spaces, called Chu nets, to model the configurations of a reconfigurable architecture. To validate the models, three algorithms were developed and implemented: to compose configurable logic blocks, and to detect controllability and observability in applications for reconfigurable architectures modeled by Chu nets.
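As a rough illustration of the underlying formalism: a Chu space over a set K is a triple (A, r, X) with a matrix r : A × X → K. The sketch below encodes that triple directly; it is hypothetical and does not reproduce Chu nets, the generalization actually used in this work.

```python
# Sketch: a plain Chu space (A, r, X) as a data structure, with the matrix
# r : A x X -> K. Illustrative only; Chu nets generalize this.

class ChuSpace:
    def __init__(self, points, states, matrix):
        self.points = points          # set A (e.g., logic-block resources)
        self.states = states          # set X (e.g., configurations)
        self.matrix = matrix          # matrix[a][x] is a value in K

    def entry(self, a, x):
        return self.matrix[self.points.index(a)][self.states.index(x)]

# Two resources observed across three configurations, over K = {0, 1}:
cs = ChuSpace(["lut0", "lut1"], ["cfgA", "cfgB", "cfgC"],
              [[1, 0, 1],
               [0, 1, 1]])
print(cs.entry("lut0", "cfgC"))  # 1
```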
Abstract:
This research studies the evolution of the accounting-principle terminology present in the accounting conceptual framework. The setting of this research is the North American School of Accounting. The terminology was chosen because of its relevance to the study of Accounting Theory. To understand the evolution of accounting thought, the following are addressed: the influence of the feudal system and of mercantilism on the European economic conception; the importance of the Industrial Revolution in the beginning of accounting standards; and the influence of England on the formation of the North American School of Accounting. With respect to the USA, the development of the economic and financial scene of American society is evaluated, focusing on its contribution to the construction of a theoretical framework applied to Accounting. The economic and financial development of the USA gave rise to new users with specific needs. The users' need for information useful for decision making triggered a process of research directed toward the establishment of an applied accounting terminology. In this process, the role played by the accounting bodies responsible for accounting standards is addressed, as well as that of the professional associations which invested in research aiming to elaborate a body of accounting principles and to adjust accounting procedures to the needs of users. To reach the research objective, a bibliographical review of the specialized literature is carried out, adopting the historical method, over the period comprising the development of the North American School of Accounting. As a result, it can be concluded that the evolution of the terminology under study presents a structural logical problem: the impossibility of constructing a theoretical framework based on the principle terminology. This impossibility arose from the scope attributed to the term, which hindered its application in the elaboration of accounting procedures.
Abstract:
At this beginning of the twenty-first century, Geology moves along new paths that demand the capacity to work with different information and new tools. It is within this context that analog characterization becomes important for predicting and understanding lateral changes in geometry and facies distribution. In the present work a methodology was developed for integrating geological and geophysical data from recent transitional deposits, for modeling petroleum reservoirs, and for calculating volumes and the uncertainties associated with those volumes. For this purpose, planialtimetric and geophysical (Ground Penetrating Radar) surveys were carried out in three areas of the Parnaíba River. With this information, it was possible to visualize the overlap of different estuarine channels and to delimit the channel geometry (width and thickness). For three-dimensional visualization and modeling, two of the main reservoir modeling software packages were used. These studies were performed with the collected parameters and with data from two reservoirs. The first was created with Potiguar Basin well data available in the literature, corresponding to the Açu IV unit. In the second case, a real database from the North Sea was used. In the reservoir modeling procedures, different workflows were created, generating five study cases with their volume calculations. Afterwards, an analysis was carried out to quantify the uncertainties in the geological modeling and their influence on the volume. This analysis was oriented toward testing the seed generation and the use of analog data in the model construction.
Abstract:
This thesis presents the results of applying SWAN (Simulating WAves Nearshore), a third-generation numerical model that simulates the propagation and dissipation of sea-wave energy, to the northern continental shelf of Rio Grande do Norte, in order to determine the wave climate, calibrate and validate the model, and assess its potential and limitations for the region of interest. After validation of the wave climate, the results were integrated with information on the submarine relief and on the plan-view morphology of the beach and barrier-island systems. In the second phase, the objective was to analyze the evolution of the waves and their interaction with the shallow seabed along three transverse profiles oriented from N to S, distributed along the longitudinal parallels X = 774000-W, 783000-W and 800000-W. Subsequently, the directional wave and wind values were extracted for every month between November 2010 and November 2012, to analyze the impact of these forcings on the active zone and thus understand the behavior of the morphological variations according to the temporal variability of the year. Based on the modeling results and their integration with correlated data and with the planimetric variations of the Soledade and Minhoto beach systems and the Ponta do Tubarão and Barra do Fernandes barrier-island systems, the following conclusions were reached. SWAN was able to reproduce and determine the wave climate on the northern continental shelf of RN: the results show a trend similar to the measured temporal variations of significant height (Hs, m) and mean wave period (Tmed, s); however, the parametric statistics were low for the estimates of the maximum values in most of the analyzed periods compared with the data of PT 1 and PT 2 (measurement points), with alternation of significant wave heights, at times overestimated, and occasional superposition of swell episodes. By analyzing the spatial distribution of the wave climate and its interaction with the underwater compartmentalization, it was concluded that wave propagation interacts with the seafloor, with changes in significant height whenever the waves meet seafloor features (beachrocks, symmetric and asymmetric longitudinal dunes, paleochannels, among others) in the outer, middle and inner shelf. Finally, it is concluded that the study of stability areas allows the identification of the most unstable regions, confirming that a greater range of variation indicates greater instability and consequent sensitivity to the hydrodynamic processes operating in the coastal region, with positive or negative variation, especially in the Ponta do Tubarão and Barra do Fernandes barrier-island systems, which are more susceptible to wave impacts, as evidenced by shoreline retreat.
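A minimal sketch of the kind of model-measurement comparison described above follows; the arrays are illustrative, not the PT 1 / PT 2 data, and the thesis's own statistics may include other measures.

```python
# Sketch: comparing modeled and measured significant wave height (Hs) at a
# measurement point, via bias and RMSE (illustrative data).
import math

def bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

hs_model = [0.9, 1.1, 1.4, 1.2, 1.0]  # SWAN output (m)
hs_obs   = [0.8, 1.0, 1.6, 1.1, 0.9]  # buoy measurements (m)
print(f"bias = {bias(hs_model, hs_obs):+.2f} m")
print(f"rmse = {rmse(hs_model, hs_obs):.2f} m")
```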
Abstract:
This work studies the asymptotic behavior of Pearson's (1900) statistic, the theoretical apparatus behind the well-known chi-squared test, also commonly denoted the χ² test. We first study the behavior of the distribution of Pearson's (1900) chi-squared statistic in a sample {X1, X2, ..., Xn} when n → ∞ and p_i = p_i^0 for all n. We then detail the arguments used in Billingsley (1960), which prove the convergence in distribution of a statistic, similar to Pearson's, based on a sample from a stationary, ergodic Markov chain with finite state space S.
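For reference, Pearson's statistic in its standard form (the notation may differ slightly from the dissertation's):

```latex
% Pearson's (1900) statistic, standard form. With n observations classified
% into k categories, observed counts O_i and hypothesized probabilities p_i^0:
\[
  \chi^2 \;=\; \sum_{i=1}^{k} \frac{\left(O_i - n\,p_i^0\right)^2}{n\,p_i^0},
\]
% which converges in distribution to a chi-squared law with k - 1 degrees of
% freedom as n \to \infty under H_0 : p_i = p_i^0.
```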
Abstract:
Genetic Algorithms (GA) and Simulated Annealing (SA) are algorithms built to find the maximum or minimum of a function representing some characteristic of the process being modeled. Both have mechanisms that let them escape local optima, yet they evolve over time in completely different ways. In its search, SA works with a single point, always generating from it a new solution that is tested and may or may not be accepted, whereas the GA works with a set of points, called a population, from which it generates another population that is always accepted. What the two algorithms have in common is that the way the next point or the next population is generated obeys stochastic properties. In this work we show that the mathematical theory describing the evolution of these algorithms is the theory of Markov chains: the GA is described by a homogeneous Markov chain, while SA is described by a non-homogeneous Markov chain. Finally, some computational examples comparing the performance of the two algorithms are presented.
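A minimal sketch of a single SA transition under the Metropolis acceptance rule, which is what makes the process a Markov chain (and the decreasing temperature is what makes it non-homogeneous); the objective, neighborhood and cooling schedule below are illustrative:

```python
# Sketch: one simulated-annealing transition with Metropolis acceptance
# (minimization). Illustrative objective, neighborhood and cooling schedule.
import math, random

def sa_step(x, f, temperature, neighbor):
    """One Markov transition of SA at the given temperature."""
    y = neighbor(x)
    delta = f(y) - f(x)
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        return y          # accept the candidate
    return x              # reject: stay at the current point

f = lambda x: (x - 2.0) ** 2                   # toy objective
neighbor = lambda x: x + random.uniform(-1, 1)
x, t = 10.0, 5.0
for k in range(1, 2001):
    x = sa_step(x, f, t, neighbor)
    t = 5.0 / math.log(k + 1)                  # slowly decreasing temperature
print(round(x, 2))  # typically close to the minimizer 2.0
```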
Abstract:
In this work we present a mathematical and computational model of electrokinetic phenomena in electrically charged porous media. We consider the porous medium as composed of three different scales (nanoscopic, microscopic and macroscopic). On the microscopic scale the domain is composed of a porous matrix and a solid phase. The pores are filled with an aqueous phase consisting of fully diluted ionic solutes, and the solid matrix consists of electrically charged particles. We first present the mathematical model that governs the electrical double layer, in order to quantify the electric potential, electric charge density, ion adsorption and chemical adsorption at the nanoscopic scale. We then derive the microscopic model, where the adsorption of ions due to the electric double layer, the protonation/deprotonation reactions, and the zeta potential obtained in the nanoscopic modeling appear at the microscopic scale through interface conditions in the Stokes and Nernst-Planck equations, which respectively govern the movement of the aqueous solution and the transport of ions. We carry out the upscaling of the nano/microscopic problem using the homogenization technique for periodic structures, deducing the macroscopic model together with the respective cell problems for the effective parameters of the macroscopic equations. Considering a clayey porous medium consisting of parallel kaolinite clay plates, we rewrite the macroscopic model in a one-dimensional version. Finally, using a sequential algorithm, we discretize the macroscopic model via the finite element method, together with the iterative Picard method for the nonlinear terms. Numerical simulations in the transient regime with variable pH are presented for the one-dimensional case, aiming at the computational modeling of the electroremediation of contaminated clay soils.
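For reference, the Nernst-Planck equation in a standard form; the microscopic model of this work additionally carries interface conditions from the double-layer problem, which are not shown here.

```latex
% Nernst--Planck transport equation, standard form (for reference).
% For ionic species i with concentration c_i, diffusivity D_i, valence z_i,
% fluid velocity v (from Stokes) and electric potential \varphi:
\[
  \frac{\partial c_i}{\partial t}
  + \nabla \cdot \Big( c_i \, v
  - D_i \nabla c_i
  - \frac{z_i F D_i}{R T}\, c_i \nabla \varphi \Big) = 0 ,
\]
% where F is Faraday's constant, R the gas constant and T the temperature.
```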
Abstract:
This work presents an integrated study of a modern analog of the fluvial reservoirs of the Açu Formation (Unit 3). The modern analog studied was the Assu River, located in the city of the same name, Rio Grande do Norte State, Northeast Brazil. A new methodology was developed to parameterize the fluvial geological bodies using GPR profiles (acquired with central-frequency antennas of 50, 100 and 200 MHz). The main parameters obtained were width and thickness. Also as part of the parameterization, orthophotomaps were used to calculate the channel sinuosity and braiding parameters of the Assu River. This information was integrated in a database to supply input data for 3D geological models of fluvial reservoirs. An architectural characterization of the deposit was carried out through trench descriptions, GPR profile interpretation and the study of natural exposures, in order to recognize and describe the facies and their associations, external and internal geometries, bounding surfaces and architectural elements. Finally, a three-dimensional model was built using all the acquired data in association with real well data from a reservoir to which the Assu River is considered analogous. For the facies simulations, simple kriging (a deterministic algorithm) and the stochastic SIS and Boolean (object-based) algorithms were used; for modeling porosity, the stochastic SGS algorithm was used.
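For reference, simple kriging in its standard formulation (the thesis applies it through reservoir-modeling software rather than by hand):

```latex
% Simple kriging, standard form (for reference). With known stationary mean m
% and covariance model C, the estimate at location u_0 from data
% Z(u_1), ..., Z(u_n) is
\[
  Z^{*}(u_0) \;=\; m + \sum_{i=1}^{n} \lambda_i \,\big( Z(u_i) - m \big),
\]
% with weights \lambda_i solving the simple-kriging system
\[
  \sum_{j=1}^{n} \lambda_j \, C(u_i - u_j) \;=\; C(u_i - u_0),
  \qquad i = 1, \dots, n .
\]
```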
Abstract:
The area between the Galinhos and São Bento do Norte beaches, on the northern coast of Rio Grande do Norte State, is subject to intense and constant littoral and aeolian transport processes, causing erosion, alterations in the sediment balance and modifications of the shoreline. Beyond these natural factors, human interference is significant in the surroundings due to the nearby Guamaré Petroliferous Pole, the largest onshore oil producer in Brazil. Given these characteristics, the MAMBMARE and MARPETRO projects were organized with the main objective of carrying out geo-environmental monitoring of the coastal areas in the northern portion of RN. There is a large amount of data on the study area, such as multitemporal geological and geophysical data, hydrodynamic measurements, multitemporal remote sensing images and thematic maps, among others; it is therefore extremely important to build a Geographic Database (GD), one of the main components of a Geographic Information System (GIS), to store this information and make it accessible to researchers and users. The first part of this work consisted of building a GD to store the data on the area between the cities of Galinhos and São Bento do Norte. The main goal was to use the potential of GIS as a decision-support tool in the environmental monitoring of this region, a valuable target for oil exploration, salt companies and shrimp farms. The collected data were stored as a virtual library to support decision making, with the results presented as digital thematic maps, tables and reports, useful as data sources for preventive planning and as guidelines for future research themes in both regional and local contexts. The second stage of this work consisted of producing Oil-Spill Environmental Sensitivity Maps. These maps, based on the Environmental Sensitivity Index Maps for Oil Spills developed by the Ministry of the Environment, are cartographic products that supply complete information for decision making, contingency planning and assessment in case of an oil spill anywhere in the area. They represent the sensitivity of the areas to oil spills through basic data such as geology, geomorphology, oceanography, socio-economics and biology. Parameters such as hydrodynamic data, sampling data, coastal type, beach-face slope, types of resources at risk (biological, economic, human or cultural) and land use are some of the essential information used in producing environmental sensitivity maps. Using the available data, it was possible to develop sensitivity maps of the study area for different dates (June 2000 and December 2000) and to observe a difference in the sensitivity indices generated: the area was more sensitive to oil in December than in June, because the hydrodynamic data (wave and tide energy) allowed faster natural cleaning in June. The use of GIS for sensitivity mapping proved to be a powerful tool, since it was possible to manipulate geographic data accurately and to produce more precise maps with a higher level of detail for the study area, which presented a medium index (3 to 4) along the shoreline and a high index (10) for the mangrove areas, highly vulnerable to oil spills.
Abstract:
Geological modeling allows the simulation, at laboratory scale, of the geometric and kinematic evolution of geological structures. Knowledge of these structures becomes more important when we consider their role in creating traps for, or conduits of, oil and water. In the present work we simulated the formation of folds and faults in an extensional environment through physical and numerical modeling, using a sandbox apparatus and the MOVE2010 software. The physical modeling of structures developed in the hangingwall of a listric fault showed the formation of active and inactive axial zones. In agreement with the literature, we verified the formation of a rollover between these two axial zones. The crestal collapse of the anticline formed grabens, limited by secondary faults perpendicular to the extension, with a curvilinear aspect. Adjacent to these faults we recorded the formation of transversal folds, parallel to the extension, characterized by a syncline in the fault hangingwall. We also observed drag folds near the fault surfaces, parallel to the fault surface, with an anticline in the footwall and a syncline in the hangingwall. To observe the influence of geometric variations (dip and width) in the flat of a flat-ramp fault, we performed two experimental series, the first with the flat varying in dip and width, and the second keeping the flat varying in width but horizontal. These experiments developed secondary faults, perpendicular to the extension, that fall into three sets: i) antithetic faults with curvilinear geometry and synthetic faults with a more rectilinear geometry, both nucleated at the base of the sedimentary pile, the normal antithetic faults being able to rotate during the extension, presenting a pseudo-inverse kinematics; ii) faults nucleated at the top of the sedimentary pile, which propagate through the coalescence of segments, sometimes forming relay ramps; iii) reverse faults, nucleated at the flat-ramp interface. Comparing the two models, we verified that the dip of the flat favors a differentiated nucleation of faults at the two extremities of the master fault. These two flat-ramp models also generated an anticline-syncline pair, drag folds and transversal folds. The anticline formed above the flat, sub-parallel to the master fault plane, while the syncline formed in areas more distal to the fault. From the geometric variation of these two folds we defined three structural domains. Using the physical experiments as a template, we also ran numerical modeling experiments with flat-ramp faults presenting variation in the flat. Secondary antithetic, synthetic and reverse faults were generated in both models. The numerical modeling formed two folds, an anticline above the flat and a syncline farther away from the master fault, and the geometric variation of these two folds allowed the definition of three structural domains parallel to the extension. These data reinforce the physical models. Comparisons between natural data from a flat-ramp fault in the Potiguar Basin and the data from the physical and numerical simulations showed that, in both cases, variation in the geometry of the flat produces variation in the hangingwall geometry.